Question
I'm doing an exercise, but I'm confused about how to calculate the megabytes.
A “genome” can be represented as a sequence of “bases”, each of which can be encoded as a two-bit value. Assuming that a megabit is 1,000,000 bits, how many megabytes are required to encode a 16 million base genome?
Solution
The line about "a megabit is 1,000,000 bits" seems to suggest:
- 8 bits in a byte
- 1,000 bytes in a kilobyte
- 1,000 kilobytes in a megabyte

Therefore:
- 1,000,000 bytes in a megabyte; or
- 8,000,000 bits in a megabyte
Other tips
16,000,000 bases × 2 bits/base = 32,000,000 bits = 32 megabits. 32 megabits ÷ 8 bits/byte = 4 megabytes.
1 byte = 8 bits. Today, at least.
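
As a quick sanity check, here is a minimal Python sketch of the same arithmetic (variable names are illustrative, and it assumes the exercise's decimal definition of a megabit/megabyte):

```python
# Each base is encoded as a 2-bit value; the exercise defines
# 1 megabit = 1,000,000 bits (decimal, not binary, units).
bases = 16_000_000

bits = bases * 2              # 32,000,000 bits
megabits = bits / 1_000_000   # 32 megabits
megabytes = megabits / 8      # 8 bits per byte -> 4 megabytes

print(megabytes)  # 4.0
```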