Question

In my book it says that transmission delay = (length of the packet) / (transmission rate). However, none of the study problems seem to follow this logic. For example, one asks for the transmission delay of a 1,000 byte packet over a 1 Mbps link. I get 1 millisecond, but somehow they get 8 milliseconds. Am I missing something?


Solution

Because a byte is not a bit. A 1,000-byte packet is 8,000 bits, and 8,000 bits / 1,000,000 bits per second = 0.008 s, i.e. 8 ms rather than 1 ms.
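
To make the arithmetic concrete, here is a minimal Python sketch of the formula (the function name transmission_delay is just for illustration, not anything from the book):

    def transmission_delay(packet_bytes, link_bps):
        # Transmission delay (seconds) = packet length in bits / link rate in bits per second
        packet_bits = packet_bytes * 8      # a byte is 8 bits
        return packet_bits / link_bps

    # 1,000-byte packet over a 1 Mbps link
    print(transmission_delay(1000, 1_000_000))  # 0.008 s = 8 ms

Dividing the raw byte count by the bit rate is what gives the 1 ms figure; converting to bits first gives the expected 8 ms.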

License: CC-BY-SA with attribution