
Why does 0.1 + 0.2 = 0.30000000000000004 in JavaScript?

var num1 = 0.1, num2 = 0.2, num3 = 0.3; console.log(num1 + num2 == num3); prints false!!!!!! because 0.1 + 0.2 evaluates to 0.30000000000000004. Somebody explain why 🤔🤔🤔
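The surprise is easy to reproduce in any JavaScript console. A minimal sketch (the variable name `sum` is my own):

```javascript
// 0.1 and 0.2 are each stored as the nearest binary double,
// so their sum is not exactly 0.3.
const sum = 0.1 + 0.2;

console.log(sum);          // 0.30000000000000004
console.log(sum === 0.3);  // false
console.log(sum.toFixed(20)); // prints the decimal expansion of the stored binary value
```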

13th Sep 2017, 3:20 PM
Mahmud amen
1 Answer
It's not that 0.1 is irrational (it's a perfectly rational number); the issue is that 0.1 has no finite representation in binary, just as 1/3 has no finite representation in decimal. JavaScript numbers are IEEE 754 double-precision floats, so 0.1 and 0.2 are each stored as the nearest representable binary fraction, and those tiny rounding errors carry through the addition, leaving a sum slightly above 0.3. This is why comparing computed floating-point values with == is unreliable: instead, check whether the two numbers differ by less than a small tolerance (an epsilon).
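The epsilon comparison described above can be sketched like this (the helper name `nearlyEqual` is my own; `Number.EPSILON` is the standard built-in constant, the gap between 1 and the next representable double):

```javascript
// Compare two floats within a tolerance instead of using ==.
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(0.1 + 0.2 === 0.3);           // false
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```

Note that a fixed epsilon only suits numbers near 1; for values of very different magnitudes, the tolerance is usually scaled to the operands (e.g. relative to the larger absolute value).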
13th Sep 2017, 3:43 PM
Gao Xiangshuai