Question

This is a silly question but I'll ask anyway: Is a Gigabit = 1000000000 bits or is a Gigabit = 1073741824 bits?

A Google search on "1 gigabit to bit" gives me 1073741824 bits, but the Wikipedia article says it's a billion bits. Which is right?


Solution 2

From Wikipedia:

1 gigabit = 10^9 bits = 1,000,000,000 bits.

The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude, which is equal to 2^30 bits = 1,073,741,824 bits, or approximately 7% larger than the gigabit.
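
As a quick sanity check on that "approximately 7%" figure, here is a minimal Python sketch (standard library only, names are my own) comparing the two definitions:

    # Compare the decimal (SI) gigabit with the binary gibibit.
    gigabit = 10**9   # SI prefix "giga": 1,000,000,000 bits
    gibibit = 2**30   # binary prefix "gibi": 1,073,741,824 bits

    difference = gibibit - gigabit
    ratio = gibibit / gigabit

    print(f"gigabit = {gigabit:,} bits")
    print(f"gibibit = {gibibit:,} bits")
    print(f"gibibit is {difference:,} bits ({(ratio - 1) * 100:.1f}%) larger")

Running it prints a difference of 73,741,824 bits, i.e. the gibibit is about 7.4% larger.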

OTHER TIPS

It depends on the prefix: SI prefixes or binary prefixes. You can find an explanation here:

Wikipedia article on Octet

Gibibit = 2^30 bits (1,073,741,824 bits)
Gigabit = 10^9 bits (1,000,000,000 bits)

Also, for reference:

Terabit = 10^12 bits (1,000,000,000,000 bits)
Megabit = 10^6 bits (1,000,000 bits)
Kilobit = 10^3 bits (1,000 bits)
8 bits = 1 byte (a byte is simply a group of 8 binary digits)
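
If you want to play with these conversions yourself, here is a minimal Python sketch (the prefix values are just the ones listed above; the function name is my own):

    # SI (decimal) and binary prefixes for bit quantities.
    SI_PREFIXES = {"kilo": 10**3, "mega": 10**6, "giga": 10**9, "tera": 10**12}
    BINARY_PREFIXES = {"kibi": 2**10, "mebi": 2**20, "gibi": 2**30, "tebi": 2**40}

    BITS_PER_BYTE = 8

    def describe(bits: int) -> None:
        """Print a bit count in gigabits, gibibits, and bytes."""
        print(f"{bits:,} bits")
        print(f"  = {bits / SI_PREFIXES['giga']:.3f} gigabits (Gbit)")
        print(f"  = {bits / BINARY_PREFIXES['gibi']:.3f} gibibits (Gibit)")
        print(f"  = {bits // BITS_PER_BYTE:,} bytes")

    # Example: one "gigabit" as usually advertised by network hardware.
    describe(10**9)

For 10^9 bits this prints 1.000 gigabits but only about 0.931 gibibits, which is exactly the kind of mismatch the question is about.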
Licensed under: CC-BY-SA with attribution