+ 2

Why is an unsigned positive integer treated as smaller than a signed negative character or integer in C?

Consider the comparisons below for unsigned int i = 23, signed int j = -23, and signed char c = -23:

    i < j   // Outputs 1 (True)
    i > j   // Outputs 0 (False)
    i > c   // Outputs 0 (False)
    i < c   // Outputs 1 (True)

I observed this in the following code from one of the C challenges:

    #include <stdio.h>

    int main() {
        unsigned int i = 23;
        signed char c = -23;
        if (i > c)
            printf("Yes\n");
        else
            printf("No\n");
        return 0;
    }

The code above prints No, meaning that i = 23 compares as less than c = -23. The same thing happens whenever a negative signed char (or int) is compared with a positive unsigned int, regardless of their absolute values. The comparison gives the expected result when both the signed and the unsigned values are positive. I have seen the same results in both C and C++. Is there a valid reason or explanation for this? If so, please share it here. Also, does the same comparison happen in other languages such as Java, JavaScript, Ruby, Swift, etc.?
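For reference, here is a minimal sketch reproducing all four comparisons in one program (the variable names i, j, and c match the ones above; the expected output is an assumption based on the behaviour described in this question):

    #include <stdio.h>

    int main(void) {
        unsigned int i = 23;
        signed int j = -23;
        signed char c = -23;
        // Each comparison result is an int (0 or 1)
        printf("%d %d %d %d\n", i < j, i > j, i > c, i < c);  // prints: 1 0 0 1
        return 0;
    }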

24th May 2019, 3:28 AM
Harshit Gupta
1 Answer
+ 5
Please use the search bar; this question has been asked and answered several times: https://www.sololearn.com/Discuss/243562/?ref=app
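In short, when an unsigned int is compared with a signed value in C, the usual arithmetic conversions convert the signed operand to unsigned int, so -23 becomes a very large positive value and the comparison goes the "wrong" way. A minimal sketch of this, assuming a 32-bit int (the exact printed value depends on the platform's int width):

    #include <stdio.h>

    int main(void) {
        unsigned int i = 23;
        signed char c = -23;
        // c is first promoted to int, then converted to unsigned int for the comparison
        printf("%u\n", (unsigned int)c);   // 4294967273 with a 32-bit int
        printf("%d\n", i > c);             // 0: 23 > 4294967273 is false
        printf("%d\n", (int)i > c);        // 1: both operands signed, compares as expected
        return 0;
    }

Casting the unsigned operand to a signed type (or avoiding mixed signed/unsigned comparisons altogether) gives the intuitive result.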
24th May 2019, 4:39 AM
✳AsterisK✳