+ 1
Doubt in C
I don't understand how the output comes about: https://code.sololearn.com/c3RpBdMiCNXp/?ref=app
13 Answers
+ 8
Adding on to Ipang's answer
Neither operand is converted to the other's type here: the *short* isn't turned into a *char*, nor the *char* into a *short*.
Instead, the integer promotions convert both of them to *int* before the addition (and the size of *int* is 4 bytes here), which is why `sizeof` reports 4.
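A quick way to see the promotion in action (a minimal sketch; `c` and `i` are assumed names, and the exact sizes depend on the platform):

#include <stdio.h>

int main(void) {
    char  c = 1;
    short i = 2;

    /* _Generic (C11) reports which type the expression c + i actually has. */
    const char *type = _Generic(c + i,
                                char:  "char",
                                short: "short",
                                int:   "int",
                                long:  "long",
                                default: "something else");

    /* sizeof yields size_t, so %zu is the matching format specifier. */
    printf("c + i has type %s, sizeof = %zu\n", type, sizeof(c + i));
    return 0;
}

On a typical platform this prints `c + i has type int, sizeof = 4`.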
+ 7
Speculatively speaking, the expression `c + i` is implicitly converted to `int`, so the `sizeof` operator returns 4, which is the last argument printed by `printf`.
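The linked code isn't reproduced here, but judging from the rest of this thread it is presumably along these lines (variable names and initial values are assumptions):

#include <stdio.h>

int main(void) {
    short i = 10;
    char  c = 'A';

    /* Sizes of the two variables, then of the promoted expression c + i. */
    printf("%zu %zu %zu\n", sizeof(i), sizeof(c), sizeof(c + i));
    /* On a typical platform (2-byte short, 4-byte int) this prints: 2 1 4 */
    return 0;
}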
+ 7
Ipang I suspect it must be a challenge question
+ 6
Yogeshwaran
Honestly I can't tell you why, I just (as I wrote, speculatively) figured that adding a `char` and a `short` requires a promotion to a larger data type (assuming `int`).
My speculation came from an attempt to cast `c + i` to various integral types, e.g. `short` and `long long`, which gave me the impression that unless explicitly specified otherwise, `int` is the type promoted to.
printf("%lu %lu %lu", sizeof(i), sizeof(c), sizeof((short)c + i));
printf("%lu %lu %lu", sizeof(i), sizeof(c), sizeof((long long)c + i));
I didn't understand what you mean by "predict the result manually" BTW ...
(Edit)
Arsenic is right 👍
+ 5
Arsenic
Idk bro, maybe it is ...
Anyways great answer about casting possibilities 👍
+ 5
Martin Taylor yes, that makes more sense.
+ 3
Ipang why is `c + i` in `sizeof(c + i)` implicitly converted to int?
What is the reason behind it?
And how can I predict the result manually in a case like the one above, with two other data types instead of the ones used here?
+ 3
Yogeshwaran
Honestly I don't know how to predict the return value of the `sizeof` operator; I guess it differs by compiler implementation and/or system platform (32/64-bit) somehow.
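One way to at least see what the numbers will be on a given compiler/platform is to print the basic sizes directly (a minimal sketch):

#include <stdio.h>

int main(void) {
    /* Widths of the basic integer types on this compiler/platform;
       the size of a promoted expression like c + i will match one of
       these (normally int). */
    printf("char:      %zu\n", sizeof(char));
    printf("short:     %zu\n", sizeof(short));
    printf("int:       %zu\n", sizeof(int));
    printf("long:      %zu\n", sizeof(long));
    printf("long long: %zu\n", sizeof(long long));
    return 0;
}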
+ 2
Ipang it's ok no problem....☺️👍
+ 1
Thank you so much Martin Taylor ☺️
+ 1
One clarification: `sizeof` is an operator, not a function, so there is no prototype demanding a type parameter, and compilers don't error or warn about `sizeof(c + i)`.
Agreed, the result differs between platforms because the widths of the types differ, but `sizeof` accepts either a parenthesised type name or an expression. When handed an expression it doesn't evaluate it; it simply reports the size of the expression's type, which here is `int` after the promotions (whatever size `int` is for the compiler/system it's on).
In other words, `sizeof(c + i)` is the size of one promoted value, not a total of the two operands.
Which is why (sizeof(c) + sizeof(i)) is the way to get the sum of both.
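For contrast, a minimal sketch of the difference (variable names assumed to match the thread):

#include <stdio.h>

int main(void) {
    char  c = 'A';
    short i = 10;

    /* Size of the expression's (promoted) type: int, so typically 4. */
    printf("sizeof(c + i)         = %zu\n", sizeof(c + i));

    /* Sum of the two objects' sizes: typically 1 + 2 = 3. */
    printf("sizeof(c) + sizeof(i) = %zu\n", sizeof(c) + sizeof(i));
    return 0;
}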