Question

Are there many computers that use Big Endian? I just tested on 5 different computers, purchased in different years and of different models, and each one uses Little Endian. Is Big Endian still used nowadays, or was it for older processors such as the Motorola 6800?

Edit:

Thank you TreyA, intel.com/design/intarch/papers/endian.pdf is a very nice and handy article. It covers all of the answers below, and also expands upon them.


Solution

There are many processors in use today that are big endian, or that allow the endian mode to be switched between big and little endian (e.g. SPARC, PowerPC, ARM, Itanium...).

It depends on what you mean by "care about endianness". You usually don't need to care much about endianness if you just program against the data you need. Endianness matters when you communicate with the outside world, such as reading/writing a file or sending data over a network, and you do that by reading/writing integers larger than 1 byte directly to/from memory.

When you do need to deal with external data, you need to know its format. Part of knowing its format is knowing, for example, how an integer is encoded in that data. If the format specifies that the first byte of a 4-byte integer is the most significant byte of that integer, you read that byte and place it in the most significant byte of the integer in your program, and you can accomplish that with code that runs fine on both little and big endian machines.
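
For illustration, here is a minimal C sketch of that approach (the function name read_u32_be is made up for this example). It assembles the value arithmetically instead of reinterpreting raw memory, so it behaves identically on any host:

    #include <stdint.h>

    /* Read a 32-bit big-endian integer from a byte buffer.
     * Works the same on little- and big-endian hosts because the
     * value is built by shifting, not by reinterpreting memory. */
    uint32_t read_u32_be(const uint8_t *buf)
    {
        return ((uint32_t)buf[0] << 24) |
               ((uint32_t)buf[1] << 16) |
               ((uint32_t)buf[2] <<  8) |
               ((uint32_t)buf[3]);
    }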

So it's not so much about the processor's endianness as about the data you need to deal with. That data might have its integers stored in either endianness; you need to know which, and different data formats will use different endianness depending on some specification, or on the whim of whoever came up with the format.

OTHER TIPS

Big endian is still by far the most used, in terms of the number of different architectures. In fact, outside of Intel and the old DEC computers, I don't know of a little-endian one: SPARC, PowerPC (IBM Unix machines), HP's Unix platforms, IBM mainframes, etc. are all big endian. But does it matter? About the only time I've had to consider endianness was when implementing some low-level system routines, like modf. Otherwise, int is an integer value in a certain range, and that's it.
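
If you ever do need to know which endianness your own machine uses, one common (if implementation-defined) trick is to inspect the first byte of a known multi-byte value; a minimal sketch:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t x = 1;
        /* On a little-endian host the least significant byte comes
           first in memory, so the first byte of x is 1. */
        uint8_t first = *(uint8_t *)&x;
        printf("host is %s-endian\n", first == 1 ? "little" : "big");
        return 0;
    }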

The following common platforms use big-endian encoding:

  • Java
  • Network data in TCP/UDP packets (and in the IP headers themselves; network byte order is big-endian)

The x86/x64 CPUs are little-endian. If you are going to exchange binary data between the two, you should definitely be aware of this.
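
For network data specifically, the POSIX socket headers provide htonl/ntohl (and htons/ntohs for 16-bit values) to convert between host order and big-endian network order; a minimal sketch:

    #include <stdint.h>
    #include <stdio.h>
    #include <arpa/inet.h>  /* POSIX; Windows has the same functions in winsock2.h */

    int main(void)
    {
        uint32_t host = 0x12345678;
        uint32_t wire = htonl(host);  /* host order -> big-endian network order */
        uint32_t back = ntohl(wire);  /* network order -> host order */
        /* On a big-endian host both calls are no-ops. */
        printf("host=0x%08x round-trip=0x%08x\n",
               (unsigned)host, (unsigned)back);
        return 0;
    }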

There is not enough context in the question. In general, you should simply be aware of endianness at all times, but you do not need to stress over it in everyday coding. If you plan on manipulating the individual bytes of your integers, start worrying about endianness. If you plan on doing standard math on your integers, don't worry about it.

The two big places where endianness pops up are networking (where big endian is the standard) and binary file formats (where you have to check whether integers are stored big endian or little endian).

This qualifies more as a comment than an answer, but I can't comment, and the article is such a great read that I think it's worth posting anyway.

This is a classic on endianness by Danny Cohen, dating from 1980: ON HOLY WARS AND A PLEA FOR PEACE

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow