0

Why does 0.1 + 0.2 == 0.3 evaluate to False?

24th May 2020, 8:27 PM
Vovo4ko
5 Answers
+ 1
Sorry, I didn't understand what you mean, but this link may give you an idea why: https://www.quora.com/Why-is-0-1+0-2-not-equal-to-0-3-in-most-programming-languages
24th May 2020, 9:22 PM
Abhay
+ 1
0.1 isn't represented exactly. Try this and you will see why the result is False:

from decimal import Decimal
print(Decimal(0.1))
print(Decimal(0.2))
print(Decimal(0.3))
print(Decimal(0.1 + 0.2))
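The same inexactness can also be seen without the decimal module, by printing each float with more digits than its default repr shows (a quick sketch, not part of the original answer):

```python
# Print the values actually stored for 0.1, 0.2, 0.3, and 0.1 + 0.2
# with 20 decimal digits; none of them is the exact decimal it looks like.
for x in (0.1, 0.2, 0.3, 0.1 + 0.2):
    print(f"{x:.20f}")

# The comparison fails because the two sides land on different floats.
print(0.1 + 0.2 == 0.3)  # False
```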
24th May 2020, 8:44 PM
Abhay
+ 1
Isn't (0.1) already an exact float to begin with?
24th May 2020, 8:52 PM
Vovo4ko
+ 1
Thank you, bro. Now I've figured it out.
24th May 2020, 11:57 PM
Vovo4ko
0
By the way, this quirk is common to many programming languages. To work around it, it's enough to round the result to the required number of digits after the decimal point.
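A minimal sketch of that rounding workaround; the math.isclose check is my own addition, a common tolerance-based alternative:

```python
import math

total = 0.1 + 0.2

# Direct comparison fails because of binary rounding error.
print(total == 0.3)              # False

# Rounding to one decimal place makes the comparison succeed.
print(round(total, 1) == 0.3)    # True

# A tolerance-based check avoids choosing a digit count at all.
print(math.isclose(total, 0.3))  # True
```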
6th Nov 2020, 6:41 PM
Вадим