0
Works fine on console, but displays error when run from file
This program runs fine on the console, but displays an error when run from a file: https://code.sololearn.com/cArl2Qbw3Lng
It gives the error: ValueError: invalid literal for int() with base 10:
8 Answers
+ 1
Note that Python 3's input() works differently from Python 2's... Try running this code and see which type your variable has after input:
https://code.sololearn.com/c68Tj7FWWynD/?ref=app
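For reference, here is a minimal sketch of the kind of type check the linked code presumably does (an assumption, since the actual code is only visible behind the SoloLearn link):

integer1 = input("int: ")    # on Python 3, input() always returns a str
float1 = input("float: ")
print("integer1 is of class", type(integer1))
print("float1 is of class", type(float1))

On Python 3 both lines report <class 'str'>, no matter what was typed.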
+ 1
Are you executing on the same Python version in both cases (from the file and line by line)?
+ 1
Please ignore my comment! I might have read that wrongly.
I tried it again just now to take a screenshot, and I saw that the console is now also showing the same error.
Thanks for all the help.
+ 1
You are welcome 👍👍👍
0
Wow, looks like I need to change
integer2 = int(float1)
to
integer2 = int(float(float1))
That's absurd! Why convert a float to a float again? Moreover, without that, the program works on the console. Looks like a bug in Python!
Ref: https://stackoverflow.com/questions/1841565/valueerror-invalid-literal-for-int-with-base-10
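For context, a small self-contained example (the variable names are just illustrative) of why the extra float() call is needed: int() rejects a string that contains a decimal point, while float() accepts it.

float1 = "2.0"                 # what input() returns on Python 3 is a str
# int(float1)                  # raises ValueError: invalid literal for int() with base 10: '2.0'
integer2 = int(float(float1))  # float("2.0") -> 2.0, then int(2.0) -> 2
print(integer2)                # prints 2

In other words, the inner float() is not converting a float to a float; it is first converting the string that input() returned into a float.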
0
Maybe it's because float1 is a string and not a float.
0
Nah, float1 is a variable with a floating-point value assigned to it... And as I said, it does not show any error when running the code line by line on the console...
0
Interesting!
Python version: 3.6
int: 17
float: 2.0
integer1 is of class <class 'str'>
float1 is of class <class 'str'>
So why does it work without error on the console? Does the console work like Py2?
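If it helps, here is a rough illustration of the Py2/Py3 difference being asked about (runnable on Python 3, simulating both behaviours): on Python 2, input() evaluates what you type, roughly eval(raw_input()), so entering 2.0 yields a float and int() succeeds; on Python 3, input() returns the raw string.

typed = "2.0"              # what gets typed at the float prompt
# Python 3: input() returns this raw string, so int(typed) raises
#   ValueError: invalid literal for int() with base 10: '2.0'
py2_style = eval(typed)    # Python 2's input() is roughly eval(raw_input())
print(int(py2_style))      # prints 2 with no error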