Question

I am working with a code base that has a convention I find unclear: pointer types are defined using a _PTR_ macro instead of *. So, all the function prototypes and typedefs look like:

extern FILE_PTR    _io_fopen(const char _PTR_, const char _PTR_);

I wonder what the rationale behind this could be, since it seems excessive to me.

EDIT

By the way, for double indirection I found:

_io_strtod(char _PTR_, char _PTR_ _PTR_);

Solution

It is possible that the definition is for compatibility with DOS.

#ifdef DOS
#define _PTR_ far *
#else
#define _PTR_ *
#endif

The near / far keywords allow pointers to address memory inside / outside the current segment, allowing programs to address more than 64 KiB of memory while still keeping the benefits of 16-bit pointers: faster code and lower memory usage.
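As a sketch of how such a macro would be used (the function below is a hypothetical example, not from the original code base; the DOS branch only compiles with 16-bit compilers such as Turbo C or Open Watcom that accept the non-standard far keyword):

#include <stddef.h>

/* Relies on the _PTR_ definition above.
   On DOS this prototype expands to:
       size_t my_strlen(const char far *s);
   elsewhere it expands to the usual:
       size_t my_strlen(const char *s);      */
size_t my_strlen(const char _PTR_ s)
{
    size_t n = 0;
    while (*s++)
        n++;
    return n;
}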

It is more typical to exclude the * from the macro definition. For example, in libpng you can see definitions like:

typedef png_color FAR * png_colorp;
typedef png_color FAR * FAR * png_colorpp;

On most platforms, FAR will be #defined to nothing.

Although DOS is long gone, some modern embedded architectures have similar concerns. On Harvard architecture processors, program memory and data memory must be accessed with different instructions, so pointers to them have different types. Other processors have different "data models", and it is not uncommon to see special pointer types for addresses below 2^24, 2^16, or 2^8.
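To make the Harvard architecture case concrete, here is a minimal sketch assuming an AVR target built with avr-gcc and avr-libc (this example is not from the original answer):

#include <avr/pgmspace.h>

/* String stored in program (flash) memory rather than RAM. */
static const char msg[] PROGMEM = "hello";

char read_flash_char(const char *p)
{
    /* A plain *p would read RAM at the same numeric address;
       flash needs its own load instruction, which avr-libc
       wraps in pgm_read_byte(). */
    return pgm_read_byte(p);
}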

Other tips

That's a good convention to easily (for sufficiently small definitions of "easily") differentiate between multiplication and indirection:

int _PTR_ arr = malloc(42 * sizeof _PTR_ arr);
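After macro expansion this is just the idiomatic int *arr = malloc(42 * sizeof *arr); the _PTR_ spelling makes it obvious that the operator inside the sizeof operand is a dereference rather than another multiplication.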