+ 1
Weird floats in JS?
Hello, coders! I've got a possibly very stupid question about JavaScript that most certainly has an answer somewhere out there. Here goes... Suppose I have two vars: i = 6 and di = 0.2. Every time a button is pressed, the following bit of code runs: i = i + di; As a result I get a series of numbers: 6, 6.2, 6.4, 6.6000000000000005, 6.800000000000001, etc... Why is this happening? Can I somehow limit my floats to one decimal? Or do I have to round var 'i' on each iteration? I know it would probably be easier to give up floats altogether in this particular case (by multiplying by 10), but I'd like to understand what's going on at least.
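To make it concrete, this is roughly what runs; the button wiring below is just for illustration, not my actual markup:

    var i = 6;
    var di = 0.2;

    // hypothetical button, just to show where the increment happens
    document.querySelector('button').addEventListener('click', function () {
      i = i + di;
      console.log(i); // 6.2, 6.4, 6.6000000000000005, 6.800000000000001, ...
    });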
6 Answers
+ 8
Found this explanation (it's from the Python docs, but the same applies to JavaScript):
https://docs.python.org/release/2.5.1/tut/node16.html
+ 5
you can use toPrecision
https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_objects/Number/toPrecision
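For example (just a sketch: toPrecision counts total significant digits, while toFixed counts digits after the decimal point, which is probably closer to what you want here; both return strings):

    var i = 6.6000000000000005;

    console.log(i.toPrecision(2)); // "6.6"  (2 significant digits)
    console.log(i.toFixed(1));     // "6.6"  (1 digit after the decimal point)

    // both return strings, so convert back if you want to keep doing math
    i = Number(i.toFixed(1));      // 6.6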
+ 5
💪
+ 2
This comes down to how computers inaccurately represent floating-point numbers in binary. It's not your fault. You can google more on this subject.
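You can see it right in the console; the digits below are what standard double-precision floats produce:

    // 0.2 has no exact binary representation, so the stored value is already slightly off
    console.log((0.2).toFixed(20)); // "0.20000000000000001110"

    // those tiny errors show up once you start doing arithmetic
    console.log(0.1 + 0.2);         // 0.30000000000000004
    console.log(6.4 + 0.2);         // 6.6000000000000005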
+ 2
Oh! Thanks a lot! =) It was very confusing for a moment there, but I see it now.
It's just that I never quite had this problem in C++, although to be fair that's probably because there you can easily specify how many decimals you want to print. So I suppose I'll have to find a little workaround to achieve a similar effect.
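Probably something along these lines, keeping the real value in 'i' and only rounding when displaying it, a bit like setting the print precision in C++ (the 'label' element here is hypothetical):

    i = i + di;
    label.textContent = i.toFixed(1); // shows "6.6" instead of 6.6000000000000005

    // or, if 'i' itself should stay at one decimal:
    i = Math.round((i + di) * 10) / 10;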
+ 2
Much appreciated, Burey =) I'll get right on it.