Question

Absolutely flummoxed over this:

if the array length is set to 5 in the declaration, int a[5], then I get the expected result from sizeof(a) / sizeof(a[0]). However, with int a[10]; as below, sizeof(a) returns 50 and sizeof(a) / sizeof(a[0]) returns 12. I can't make sense of this. Has my compiler gone bad? Using clang on OS X 10.9.

#include <stdio.h>

int main(void) {

    int a[10];
    size_t n = sizeof(a) / sizeof(a[0]);

    printf("sizeof(a): %lo\n", sizeof(a));
    printf("sizeof(a[0]): %lo\n", sizeof(a[0]));
    printf("%lo\n", n);

    return 0;

}

Solution

You're printing the numbers in octal notation!

050 (octal) == 40 (decimal), and 012 == 10 as expected.
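To see the conversion in isolation, here is a minimal sketch; the value 40 is just an assumption, matching sizeof(a) for int a[10] on a platform with 4-byte int:

#include <stdio.h>

int main(void) {
    unsigned bytes = 40;             /* assumed: sizeof(a) for int a[10], 4-byte int */
    printf("octal:   %o\n", bytes);  /* prints 50 */
    printf("decimal: %u\n", bytes);  /* prints 40 */
    return 0;
}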

Change all of your %lo to %ld to display the values in decimal.

Other tips

50 in octal = 40 in decimal

These lines print the numbers in octal.

printf("sizeof(a): %lo\n", sizeof(a));
printf("sizeof(a[0]): %lo\n", sizeof(a[0]));
printf("%lo\n", n);

Change them to the following to print in decimal.

printf("sizeof(a): %ld\n", sizeof(a));
printf("sizeof(a[0]): %ld\n", sizeof(a[0]));
printf("%ld\n", n);
Licensed under: CC-BY-SA with attribution