Problem

I'm working through an exercise, but I'm confused about how to calculate the megabytes.

A “genome” can be represented as a sequence of “bases”, each of which can be encoded as a two-bit value. Assuming that a megabit is 1,000,000 bits, how many megabytes are required to encode a 16 million base genome?


Solution

The line about "a megabit is 1,000,000 bits" seems to suggest:

  • 8 bits in a byte
  • 1,000 bytes in a kilobyte
  • 1,000 kilobytes in a megabyte

Therefore:

  • 1,000,000 bytes in a megabyte; or
  • 8,000,000 bits in a megabyte
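
If it helps to see the conversion spelled out, here is a minimal Python sketch of the same arithmetic (the constants and variable names are just illustrative, following the decimal definition the exercise gives):

```python
bases = 16_000_000                 # 16 million bases
bits_per_base = 2                  # each base encoded as a two-bit value
bits_per_byte = 8
bytes_per_megabyte = 1_000_000     # decimal (SI) megabyte, per the exercise

total_bits = bases * bits_per_base           # 32,000,000 bits
total_bytes = total_bits / bits_per_byte     # 4,000,000 bytes
megabytes = total_bytes / bytes_per_megabyte
print(megabytes)                             # 4.0
```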

Other tips

16 million bases × 2 bits/base = 32,000,000 bits = 32 Mbit; 32 Mbit ÷ (8 Mbit/MB) = 4 MB.

1 byte = 8 bits. Today, at least.
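
Since 1 byte = 8 bits, four two-bit bases fit in one byte. As a rough sketch of what such a packing could look like, assuming a hypothetical A=00, C=01, G=10, T=11 mapping (the exercise does not specify one):

```python
# Hypothetical two-bit codes; the exercise only says "two-bit value".
CODES = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(genome: str) -> bytes:
    """Pack a base string into bytes, four bases per byte."""
    out = bytearray()
    byte = 0
    for i, base in enumerate(genome):
        byte = (byte << 2) | CODES[base]   # append two bits
        if i % 4 == 3:                     # four bases fill one byte
            out.append(byte)
            byte = 0
    if len(genome) % 4:                    # flush a partial final byte
        byte <<= 2 * (4 - len(genome) % 4)
        out.append(byte)
    return bytes(out)

print(len(pack("ACGT" * 4)))  # 16 bases -> 4 bytes
```

At this rate, 16 million bases take 4 million bytes, which matches the 4 MB figure above.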

License: CC-BY-SA with attribution