So, a really odd question, brought about accidentally by a freshman CS student.
I'm TAing the intro to programming class, and the student created an array of size [digit-1], where digit is guaranteed to be at least one due to how the program executes.
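For context, the student's declaration was roughly along these lines (a sketch from memory; the variable name digit and the surrounding logic are my guesses, not the actual assignment code):

Code:
#include <stdio.h>

int main(void) {
    int digit = 1;            /* in the real program this comes from earlier
                                 logic and is guaranteed to be at least 1 */
    int extra[digit - 1];     /* a VLA whose length is 0 when digit == 1 */
    (void)extra;              /* silence the unused-variable warning */
    printf("declared an array of length %d\n", digit - 1);
    return 0;
}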
However, a 0-sized array doesn't make any sense to me. I created a test program and compiled it, and at least with GCC on the Mac it seems to work. I did a little searching on the interwebs, and others say it's technically undefined behavior according to the C standard, so I doubt this code is portable. However, I can't find anything definitive on this.
Any C standard gurus care to chime in on this? Am I just crazy? How is this working?
Here is the quick and dirty sample I wrote:
Code:
#include <stdio.h>

int main(void) {
    int r[0];             /* the zero-sized array in question */
    r[0] = 78;
    printf("%d\n", r[0]);
    return 0;
}