Python - Data Science - Standard deviation
What's wrong with this?

>>>>>>>>>>>>>CODE<<<<<<<<<<<<<<<<<<<<<
players = [180, 172, 178, 185, 190, 195, 192, 200, 210, 190]

mean = (180+172+178+185+190+195+192+200+210+190)/10
print("Mean is ", mean)

sums_sq = int()
sums_sq = (180-mean)**2+(172-mean)**2+(178-mean)**2+(185-mean)**2+(190-mean)**2+(195-mean)**2+(192-mean)**2+(200-mean)**2+(210-mean)**2+(190-mean)**2
print("Sums of diffs are", sums_sq)

var = (sums_sq/10) ** (1/2)
print("Variance is", var)

n = 0
for player in players:
    print(player)
    if (player >= 178.638) and (player <= 199.762):
        print("In the one std div")
        n = n + 1
    else:
        print("oops...OUT")
        n = n

print("No of players in the range of one std div:", n)
<<<<<<<<<<<<<<<<>>>>>>>>>>>>>>>>>>>>>>>

It gives the answer 6, which is right, but it doesn't pass the test. How?
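For reference, here is the same calculation written against the players list using the standard library's statistics module; this is only a minimal sketch for comparison, assuming the exercise wants the population standard deviation (pstdev), which is what the hard-coded bounds 178.638 and 199.762 correspond to (mean 189.2 plus or minus about 10.56). It prints the same count of 6.

from statistics import mean, pstdev

players = [180, 172, 178, 185, 190, 195, 192, 200, 210, 190]

mu = mean(players)        # 189.2
sigma = pstdev(players)   # ~10.56, the population standard deviation (not the variance)

# Count players within one standard deviation of the mean
within_one_std = [p for p in players if mu - sigma <= p <= mu + sigma]

print("Mean:", mu)
print("Std dev:", sigma)
print("Players within one std dev:", len(within_one_std))   # 6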