Question

I know that negative-length arrays have undefined behaviour, but since the standard defines E1[E2] as identical to (*((E1)+(E2))), I expected something to work.

So in this instance I create an array with a length of -32, expecting the accessible indices to run from -31 up to 0.

I am using 8-bit unsigned chars (0 to 255) and signed chars (-128 to +127), but this happens with 32-bit and 64-bit integers too.

I use C99's variable-length arrays to construct the negative-spanning array; specifically, I am compiling to the GNU99 C standard.

I assign these values to the indices and print them out as I go; all seems to work fine.

It goes strange when I make a pointer to the value at array index [-31] and then loop through that pointer, 0 to 31, printing the values.

const signed char length = 32;
const signed char negativeLength = -length;

signed char array[negativeLength];
for ( signed char ii = 0; ii > negativeLength; ii-- ) {
    array[ii] = ii;
    printf( "array %d\n", array[ii] ); /* Prints out expected values */
}

printf( "==========\n" );

signed char * const pointer = &array[negativeLength + 1];
for ( unsigned char ii = 0; ii < length; ii++ ) {
    printf( "pointer %d\n", pointer[ii] ); /* Begins printing expected values then goes funky */
}

I get different results every time, but with a length of 32 it generally starts out okay for the first 3 values, then goes funky, printing out -93 up to +47, and then at pointer[8] (i.e. array[-23]) it goes fine again.

I am running this on an iPad 2.

What exactly is going on here? Is the iPad messing with the pointer or the array when it detects the negative spanning array length?

Was it helpful?

Solution

I sometimes advocate understanding the behavior observed in some C implementations in situations where the C standard does not define the behavior, because it can be illuminating about how certain implementations work or how computers work. In this case, however: Do not do that.

To access an array with arbitrary integer indices, from X (inclusive) to Y (exclusive), do this:

ElementType ArrayMemory[Y-X], *Array = ArrayMemory - X;

If X <= 0 <= Y and X < Y, the behavior of this is defined by the C standard.

Other tips

Why would you expect something to work when you've done something with undefined behaviour?

While E1[E2] being equivalent to *(E1 + E2) is well defined, the data you're accessing is not well defined so all bets are off.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow