I'm doing an exercise, but I'm confused about how to calculate the megabytes.

A “genome” can be represented as a sequence of “bases”, each of which can be encoded as a two-bit value. Assuming that a megabit is 1,000,000 bits, how many megabytes are required to encode a 16 million base genome?


Solution

The line about "a megabit is 1,000,000 bits" seems to suggest:

  • 8 bits in a byte
  • 1,000 bytes in a kilobyte
  • 1,000 kilobytes in a megabyte

Therefore:

  • 1,000,000 bytes in a megabyte; or
  • 8,000,000 bits in a megabyte
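
For anyone who wants to sanity-check the arithmetic, here is a minimal Python sketch under those assumptions (2 bits per base, 8 bits per byte, 1,000,000 bytes per megabyte):

BITS_PER_BASE = 2
BITS_PER_BYTE = 8
BYTES_PER_MEGABYTE = 1_000_000   # as the exercise defines it

bases = 16_000_000
total_bits = bases * BITS_PER_BASE          # 32,000,000 bits
total_bytes = total_bits // BITS_PER_BYTE   # 4,000,000 bytes
megabytes = total_bytes / BYTES_PER_MEGABYTE

print(megabytes)  # 4.0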

Other tips

16 million bases × 2 bits per base = 32,000,000 bits = 32 Mbit
32 Mbit ÷ 8 bits per byte = 4,000,000 bytes = 4 MB

1 byte = 8 bits. Today, at least.
