From my understanding, each octal digit in a number maps to a 3-bit segment, so octal will never exactly represent a binary string whose length is 2^n.

Out of curiosity: when/why/how is this useful? :-)

If the base of the primary representation of a word in an architecture is a prime(!) power of 2, at least make sure it's even ;-).


Solution

Octal can exactly represent a binary string of any length: just pad it with leading zeros to a multiple of three bits. It's true that with 8-bit bytes and byte addressing, hexadecimal seems more natural. Historically, however...

  • a lot of machines had 36-bit words, where octal made a lot of sense, and

  • on the PDP-11 (on which the first C compilers ran), the machine instructions were divided into 3-bit groups: the high bit flagged whether the operation was on bytes or words, then came a 3-bit op-code and two 6-bit operand addresses, each with its first 3 bits the addressing mode and its second 3 the register involved (see the sketch after this list).
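
A minimal C sketch of that decoding, assuming the standard PDP-11 two-operand layout just described. Note how each 3-bit field lines up with exactly one digit of the instruction word when printed in octal; the example word 010001 is the conventional encoding of MOV R0,R1.

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t insn = 010001;                 /* octal literal: MOV R0,R1 */

    unsigned byte_op  = (insn >> 15) & 01;  /* 1 = byte op, 0 = word op */
    unsigned opcode   = (insn >> 12) & 07;  /* 3-bit op-code            */
    unsigned src_mode = (insn >> 9)  & 07;  /* source addressing mode   */
    unsigned src_reg  = (insn >> 6)  & 07;  /* source register          */
    unsigned dst_mode = (insn >> 3)  & 07;  /* destination mode         */
    unsigned dst_reg  = insn         & 07;  /* destination register     */

    /* In octal, each field is one digit of the printed word. */
    printf("insn = %06o\n", insn);
    printf("byte=%o op=%o src=%o%o dst=%o%o\n",
           byte_op, opcode, src_mode, src_reg, dst_mode, dst_reg);
    return 0;
}
```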

At the time C was first invented, octal was probably used more often than hexadecimal, and so the authors of the language provided for it. (I can't recall seeing it actually used in recent code for a very long time, however.)
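
For reference, the notation they provided still works the same way today: a leading 0 marks an integer literal as octal, and printf's %o conversion prints in octal. The Unix permission mask 0755 is perhaps the best-known surviving use.

```c
#include <stdio.h>

int main(void)
{
    /* A leading 0 makes an integer literal octal in C. */
    int n = 0755;                      /* octal 755 == decimal 493 */
    printf("%d %o %#o\n", n, n, n);    /* prints: 493 755 0755     */
    return 0;
}
```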

Other tips

At the time of Unix development there were machines (DEC) with 36-bit words. A 36-bit word is made of four 9-bit bytes, each of them represented by three octal digits.
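
A short sketch of that arithmetic; a uint64_t stands in for the 36-bit word here, since standard C has no exactly-36-bit type.

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t word = 0123456701234;   /* 12 octal digits = 36 bits */

    /* Split the word into four 9-bit bytes, most significant first;
     * each 9-bit byte is exactly three octal digits. */
    for (int i = 3; i >= 0; i--) {
        unsigned byte = (unsigned)(word >> (9 * i)) & 0777;
        printf("%03o ", byte);
    }
    printf("\n");                    /* prints: 123 456 701 234 */
    return 0;
}
```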
