When implementing some interface, as given in a header file, what is the best way to prevent dangerous mismatches between the compilation of the library implementation and the compilation of code that includes the header file?
Details: a library interface is provided by a header file, say `foo.h`, and its implementation by some source file, say `foo.cc`. The latter is compiled to create the library, say `libfoo.so`, while the former is `#include`d by an application, which is linked against `libfoo.so`.
Now, suppose `foo.h` contains

```cpp
// foo.h
#include <cstdint>

namespace foo {
  class bar
  {
#ifdef SomeOption
    std::int32_t x[2];
#else
    std::int64_t x[2];
#endif
    bar const* ptr;
    /* ... */
  };
}
```
Then the offset of `ptr` is either 8 or 16 bytes (on a typical 64-bit platform), depending on `SomeOption`, and `sizeof(bar)` also differs. Now, if the library was compiled with a different value of `SomeOption` than the application, then obviously serious trouble will ensue, which is hard to debug for the unaware.
A solution? So, I came up with the following idea:
```cpp
// foo.h
namespace foo {
  enum { hasSomeOption = 1 };

  inline int options_flags()
  {
    return 0
#ifdef SomeOption
      | hasSomeOption
#endif
      ;
  }

  class bar
  {
    /* ... data members as before ... */
    bar(some_args, int flags);              // private: takes the caller's flags
  public:
    bar(some_args)
    : bar(some_args, options_flags()) {}    // forwards the application's flags
    /* ... */
  };
}
```
and

```cpp
// foo.cc
#include "foo.h"
#include <cassert>

namespace {
  // flags in effect when the library source was compiled
  const int src_flags = foo::options_flags();
}

namespace foo {
  bar::bar(some_args, int app_flags)
  {
    assert(app_flags == src_flags);
    /* ... */
  }
}
```
with the idea that the `assert` will catch any inconsistency. However, this doesn't work: the compiler (or linker) seems to optimize my check away, and the `assert` never triggers, even if `SomeOption` differs between the library and application compilations.
Question: Is there a recommended best method for this type of problem?