Recursion: decimal to binary
How can I fix this code so it converts decimal to binary? NB: without changing the third line! https://code.sololearn.com/cr0Haqu5ePUF/?ref=app
1 Answer
You're just missing the base case.
def convert(num):
    if num >= 1:
        return (num % 2 + 10 * convert(num // 2))
    return num

print(convert(int(input())))
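To see why the base case matters, here is a hand trace for the input 13 (the input value is just an example, not part of the original post):

convert(13)  # 13 % 2 + 10 * convert(6)  -> 1 + 10 * 110 = 1101
convert(6)   #  6 % 2 + 10 * convert(3)  -> 0 + 10 * 11  = 110
convert(3)   #  3 % 2 + 10 * convert(1)  -> 1 + 10 * 1   = 11
convert(1)   #  1 % 2 + 10 * convert(0)  -> 1 + 10 * 0   = 1
convert(0)   # base case: num < 1, so it just returns 0 and stops the recursion

So for the input 13 the program prints 1101. Note that the result is an int whose decimal digits spell out the binary representation, not a true base-2 value.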