[Please read through the description.] Why does this C program accept more inputs than the array size it defines?
CODE:

    #include <stdio.h>

    int main() {
        int marks[5], i, n, sum = 0, average;

        printf("Enter number of elements: ");
        scanf("%d", &n);

        for (i = 0; i < n; ++i) {
            printf("Enter number%d: ", i + 1);
            scanf("%d", &marks[i]);
            sum += marks[i];
        }

        average = sum / n;
        printf("Average = %d", average);
        return 0;
    }

OUTPUT:

Output 1: When I enter an n larger than what is defined (say n = 7, which is more than 5, the declared number of elements), the average is always wrong:

    Enter number of elements: 7
    Enter number1: 11
    Enter number2: 11
    Enter number3: 11
    Enter number4: 11
    Enter number5: 11
    Enter number6: 11
    Enter number7: 11
    Average = 3

3 cannot be the average of all 11s, so is this a garbage value?

Output 2:

    Enter number of elements: 5
    Enter number1: 11
    Enter number2: 11
    Enter number3: 11
    Enter number4: 11
    Enter number5: 11
    Average = 11

Only when I enter n = 5 or less does the average come out correct. That part is obvious. My question is: why does the program keep taking input for more elements than the array actually has, as in Output 1?
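For reference, here is a bounds-checked variant I sketched while experimenting (MAX_MARKS is my own name, not in the original code); I assume rejecting n larger than the array size is what prevents the program from writing past the end of marks:

    #include <stdio.h>

    #define MAX_MARKS 5   /* assumed limit, matching the original marks[5] */

    int main(void) {
        int marks[MAX_MARKS], i, n, sum = 0, average;

        printf("Enter number of elements: ");
        if (scanf("%d", &n) != 1 || n < 1 || n > MAX_MARKS) {
            /* Refusing n outside 1..MAX_MARKS keeps the loop from
               writing past the end of marks[]. */
            printf("Please enter a value between 1 and %d\n", MAX_MARKS);
            return 1;
        }

        for (i = 0; i < n; ++i) {
            printf("Enter number%d: ", i + 1);
            scanf("%d", &marks[i]);
            sum += marks[i];
        }

        average = sum / n;
        printf("Average = %d\n", average);
        return 0;
    }

With this guard the wrong-average case from Output 1 never happens, but I would still like to understand why the original version happily reads numbers 6 and 7 instead of stopping or crashing.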