Question

In the IDL for a COM object I do the following:

enum TxMyFlags
{
    flagOption = 1,
    flagOtherOption = 2,
    flagMoreOption = 4,
    flagAnotherFlag = 8,
    flagExtra = 128
// etc.
};

and have functions that can take the sum (or bitwise-OR, same thing) of flags, e.g. (in IDL)

HRESULT _stdcall a_method([in] enum TxMyFlags);

with an example of intended usage being:

a_method( flagExtra | flagMoreOption );

It seems to work but is this actually permitted, or is it possible that the RPC transport or whatever would reject values of an enum parameter that are not exactly in the enum definition?


Solution

If your client and server are in-process (so no real marshaling is happening and no RPC is involved), you will not see any problem, as the enum, however you define it, will be treated as its int/long/whatever integral equivalent in size.
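To make the in-process point concrete, here is a minimal C++ sketch (no COM or marshaling involved; the names are taken from the question) showing that the OR'ed value is just an ordinary integer as far as the compiler is concerned:

#include <cstdio>

enum TxMyFlags
{
    flagOption = 1,
    flagOtherOption = 2,
    flagMoreOption = 4,
    flagAnotherFlag = 8,
    flagExtra = 128
};

// In-process the value is just an ordinary integer; nothing ever checks
// whether it matches one of the declared enumerators.
void a_method(TxMyFlags flags)
{
    printf("flags = %d (sizeof = %zu)\n", (int)flags, sizeof(TxMyFlags));
}

int main()
{
    // In C++ the OR of two enumerators is an int, so a cast is needed to pass
    // it as the enum type (or use an operator | like the one shown below).
    a_method((TxMyFlags)(flagExtra | flagMoreOption)); // prints 132
    return 0;
}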

So it's the out-of-process (or cross-apartment marshaling) case that may be an issue. In that case, as stated in the documentation for the enum attribute (and in your comments):

Objects of type enum are int types, and their size is system-dependent. By default, objects of enum types are treated as 16-bit objects of type unsigned short when transmitted over a network. Values outside the range 0 - 32,767 cause the run-time exception RPC_X_ENUM_VALUE_OUT_OF_RANGE. To transmit objects as 32-bit entities, apply the [v1_enum] attribute to the enum typedef.

So you basically have two options for using enums in IDL: 1) use an enum without a typedef, or 2) use a typedef'd enum with the v1_enum attribute. In the first case, you just declare the method parameter as the integral type you want on the wire (int below); in the second, you use the typedef'd type itself (hence the v1_enum attribute):

enum MY_ENUM
{
    MY_ENUM_FIRST = 1,
    MY_ENUM_SECOND = 2,
};

typedef [v1_enum] enum
{
    MY_ENUM_TYPE_FIRST = 1,
    MY_ENUM_TYPE_SECOND = 2
} MY_ENUM_TYPE;

[object, uuid(15A7560E-901B-44D2-B841-58620B1A76C5)]
interface IMyInterface : IUnknown
{
    HRESULT MyMethod1([in] int myParam);
    HRESULT MyMethod2([in] MY_ENUM_TYPE myParam);
};

used like this:

IMyInterface *p = ...;
p->MyMethod1(MY_ENUM_FIRST | MY_ENUM_SECOND);
p->MyMethod2(MY_ENUM_TYPE::MY_ENUM_TYPE_FIRST | MY_ENUM_TYPE::MY_ENUM_TYPE_SECOND);

If you declare a method like this:

 HRESULT MyMethod1(enum MY_ENUM myParam);

Then you will use a 16-bit enum on the wire (and you can't add v1_enum to a non-typedef'd enum), so that's not good (unless you're OK with the 0-32767 limit).
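To see when that limit actually bites, here is a small illustration (the flag values below are made up for the purpose, not taken from the question):

// Hypothetical flag values, chosen only to illustrate the default 16-bit transmission.
enum MY_16BIT_FLAGS
{
    FLAG_LOW   = 0x0001,
    FLAG_HIGH  = 0x4000,   // 16384
    FLAG_UPPER = 0x8000    // 32768 -- already outside 0..32767
};

// FLAG_HIGH | FLAG_UPPER == 0xC000 == 49152, which is > 32767: marshaled as a
// default (16-bit) enum this raises RPC_X_ENUM_VALUE_OUT_OF_RANGE at run time,
// whereas a [v1_enum] typedef'd enum is transmitted as 32 bits and goes through.

With flag values as small as the ones in the question (up to 128), OR'ed combinations stay well under the limit, so they marshal fine either way.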

Note that I also declared this operator to make OR'ing the typedef'd enum as flags work in the second call (in C++ the bitwise-OR of two enumerators yields an int, which doesn't implicitly convert back to the enum type):

MY_ENUM_TYPE operator | (MY_ENUM_TYPE l, MY_ENUM_TYPE r) { return (MY_ENUM_TYPE)((int)l | (int)r); }

Well, the typedef way seems a bit overkill to me, but it has the advantage of being typed. If you scan Microsoft's own .IDL files in the Windows SDK, you'll see they basically use both...

Licensed under: CC-BY-SA with attribution