Question

I'm reading the code of an iPhone sample project (Xcode IDE, Apple LLVM compiler 4.2). In a header file of an external library (written in C) for that iPhone sample project, there are some events declared in an enumeration type:

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG,
    EVENT_RELEASE_TOUCH,
    EVENT_ROTATE_0,
    EVENT_ROTATE_90,
    EVENT_ROTATE_180,
    EVENT_ROTATE_270
} Application_Events;

I don't understand what kind of values are assigned to those events. Is 0x80000000 supposed to be a big positive integer (2147483648), or negative zero, or a negative integer (-2147483648)?

I inspected them in the Xcode debugger (with the compiler being Apple LLVM compiler 4.2): EVENT_EXIT equals (int) -2147483648, EVENT_RELEASE_TOUCH equals (int) -2147483645, and so on.

Apparently, they're treated as two's complement representations. A related post can be found here.

But what I'm not sure about now are these:

(1) Is the underlying data type for 0x80000000 always int, or can it be something else in other situations? Does this depend on the compiler or platform?

(2) If I assign a hexadecimal value to a signed integer like this, is it always interpreted as the two's complement representation? Does this depend on the compiler or platform? A related post can be found here. Another reference can be found here.

Please share some ideas. Thank you all :D


Solution

Like many things in C-like languages, an enumeration is just an integer. Setting the first value like this causes the compiler to increment from there, guaranteeing that all of the enumeration values have the high bit set (and, read as a signed integer in two's complement, a set high bit indicates a negative number, which is why they appear as values less than 0).

Likely, the programmers chose this value so that they could send various kinds of event codes without these colliding with the others.

In a nutshell, don't worry about the actual value; it's just a number. Use the name, and understand that the name carries the meaning in the context of the calls that use or return those codes.
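For instance, here is a minimal sketch (abbreviating the enum from the question; the signed view printed at the end is implementation-defined, but this is what a two's complement platform like the iPhone shows):

#include <stdio.h>

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,            /* 0x80000001 */
    EVENT_DRAG              /* 0x80000002 */
} Application_Events;

int main(void)
{
    /* Printed as unsigned 32-bit patterns, the values simply count up
       from the explicitly assigned starting point. */
    printf("EVENT_EXIT  = 0x%08X\n", (unsigned int)EVENT_EXIT);
    printf("EVENT_TOUCH = 0x%08X\n", (unsigned int)EVENT_TOUCH);

    /* Viewed as a signed 32-bit int (what the debugger shows), the same
       bit pattern reads as a negative number. */
    printf("(int)EVENT_EXIT = %d\n", (int)EVENT_EXIT);
    return 0;
}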

OTHER TIPS

The base type of the enumeration is implementation-defined. In this case, the base type should be unsigned int, because the standard requires the compiler to pick a base type that is wide enough to hold all enumeration values. From the C99 standard, section 6.7.2.2, paragraph 4:

Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined,108) but shall be capable of representing the values of all the members of the enumeration. The enumerated type is incomplete until after the } that terminates the list of enumerator declarations.

108) An implementation may delay the choice of which integer type until all enumeration constants have been seen.
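If you want to see which base type your compiler actually picked, a quick sketch like this works (the results are implementation-defined; the enum is abbreviated from the question):

#include <stdio.h>

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,   /* forces a base type able to hold 2^31 */
    EVENT_TOUCH
} Application_Events;

int main(void)
{
    /* The base type must be wide enough for every enumerator... */
    printf("sizeof(Application_Events) = %zu\n", sizeof(Application_Events));

    /* ...and if it is unsigned, converting -1 to it yields a huge positive
       value, so this comparison reports "unsigned". */
    printf("base type is %s\n",
           (Application_Events)-1 > 0 ? "unsigned" : "signed");
    return 0;
}

With Apple's Clang one would expect this to report a 4-byte unsigned type, but the standard leaves that choice to the compiler.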

  1. The type for an enum is int.
  2. 0x80000000 is just a number, but in hexadecimal notation. Its value is whatever you confirmed in the debugger (or any hexadecimal-to-decimal converter).
  3. The way enums work is that the values get assigned incrementally from any explicitly assigned value. So, in this case, the enums are getting assigned as EVENT_EXIT=0x80000000, EVENT_TOUCH=0x80000001, EVENT_DRAG=0x80000002, and so on, as the sketch below demonstrates.
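A sketch of that incremental assignment (abbreviating the enum from the question; each assert holds because the compiler adds 1 to the previous enumerator):

#include <assert.h>

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG,
    EVENT_RELEASE_TOUCH
} Application_Events;

int main(void)
{
    /* Each enumerator after the explicit one is the previous value plus 1. */
    assert(EVENT_TOUCH         == EVENT_EXIT + 1);   /* 0x80000001 */
    assert(EVENT_DRAG          == EVENT_EXIT + 2);   /* 0x80000002 */
    assert(EVENT_RELEASE_TOUCH == EVENT_EXIT + 3);   /* 0x80000003 */
    return 0;
}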

The underlying type of the enum depends on the values it needs to hold. The compiler has some latitude in how that type is ultimately defined. In your case, the underlying type of Application_Events is likely unsigned int, because 0x80000000 is greater than INT_MAX, assuming that int is 32 bits wide (which is the size an enum generally has). But for something like:

enum foo_t {
   FOO_Start,
   FOO_Thing,
   FOO_Another_Thing,
   FOO_End
};

The type of enum foo_t could be int or unsigned int.

However, enumeration constants (e.g., EVENT_EXIT, FOO_Start, etc.) are of type int. That's what you're seeing in the debugger. If you do something like

Application_Events foo = EVENT_EXIT;

the type of foo could be unsigned; a quick way to check is sketched below.
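As a sketch (the outcome is implementation-defined; the enum is abbreviated from the question):

#include <stdio.h>

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH
} Application_Events;

int main(void)
{
    Application_Events foo = EVENT_EXIT;

    /* Whether foo is signed or unsigned is the compiler's choice of base
       type; with an unsigned base type this prints 0. */
    printf("foo < 0  -> %d\n", foo < 0);

    /* Casting to int reproduces the value the debugger displays; the result
       of this conversion is implementation-defined. */
    printf("(int)foo -> %d\n", (int)foo);
    return 0;
}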

This question has changed a little, I think, so:

1) For the iPhone, the constant 0x80000000 is probably unsigned (the iPhone's ARM processor has 32-bit ints). Its type, and hence how it is interpreted, depends on the platform and the version of C used.

2) In practice, you can assume that your processor uses two's complement arithmetic, since most platforms do. The C language itself does not guarantee that, however; other arithmetic schemes (ones' complement, sign-and-magnitude) are allowed. A small check of both points is sketched below.
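A small program illustrating both points (a sketch: the signed conversion is implementation-defined, but on a two's complement platform such as the iPhone it produces the values shown in the comments):

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    /* With 32-bit ints, 0x80000000 does not fit in int, so C99 gives the
       hexadecimal constant the type unsigned int: a large positive number,
       not a negative one. */
    printf("0x80000000 < 0       -> %d\n", 0x80000000 < 0);     /* 0 */

    /* Converting it to int is implementation-defined; here it yields the
       negative value seen in the debugger. */
    printf("(int)0x80000000      -> %d\n", (int)0x80000000);    /* -2147483648 */

    /* The two's complement reading made explicit: copy the raw 32 bits
       into a signed integer and print it. */
    uint32_t bits = 0x80000000u;
    int32_t  as_signed;
    memcpy(&as_signed, &bits, sizeof as_signed);
    printf("same bits as int32_t -> %d\n", (int)as_signed);     /* -2147483648 */
    return 0;
}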

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow