Question

When I use sizeof inside a printf call, it returns the length, but if I assign the result to an int variable it doesn't give the length of the array; instead the program stops. Here's the code:

#include <stdio.h>

int main(void) {
  int a[] = {1, 3, 5, 6, 23, 55, 93, 923, 112, 33, 5, 23};
  int len = sizeof(a) / sizeof(a[0]);   // This gives an unexpected integer
  int len2 = sizeof(a);
  // mergesort(a, 0, len); // Update: because of this call, the program stops.
  printf("%d\t%d\n", sizeof(a)/sizeof(a[0]), len);
  printf("%d", len2);  // The program stops when using this.
}

Update: Sorry for the question, it was my mistake. The code above works fine; I'm working on a mergesort algorithm, and I think the error is in my mergesort.


Solution

@Acme has a good point: using the printf() format specifier %d for a size_t value is wrong.

On my Linux amd64 system, sizeof(int) = 4 and sizeof(size_t) = 8. That might be a problem.

It seems to work fine for me (perhaps the lack of a newline after the second printf() statement is confusing you?), but because of the size_t vs. int format-specifier mismatch, the behavior is undefined.

$ ./size
12  12
48$
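For reference, a minimal corrected sketch of the same program (assuming a C99-or-later compiler, where %zu is the conversion specifier for size_t) might look like this:

#include <stdio.h>

int main(void) {
    int a[] = {1, 3, 5, 6, 23, 55, 93, 923, 112, 33, 5, 23};
    size_t len  = sizeof(a) / sizeof(a[0]);  /* number of elements (12)                   */
    size_t len2 = sizeof(a);                 /* total size in bytes (48 with a 4-byte int) */

    /* %zu matches size_t, so the values print correctly on both 32- and 64-bit systems. */
    printf("%zu\t%zu\n", sizeof(a) / sizeof(a[0]), len);
    printf("%zu\n", len2);  /* trailing newline keeps the output off the shell prompt */

    return 0;
}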