How do I find the transmission delay of a network?
14-11-2019
Problem
My book says that transmission delay = (length of the packet) / (transmission speed). However, none of the study problems seem to follow this logic. For example, one asks for the transmission delay of a 1,000 byte packet over a 1 Mbps connection. I get 1 microsecond, but somehow they get 8. Am I missing something?
Solution
Because a byte is not a bit. The formula wants the packet length in bits, and a 1,000 byte packet is 8,000 bits. So the delay is 8,000 bits / 1,000,000 bits per second = 0.008 s, i.e. 8 milliseconds.
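As a minimal sketch of the calculation (the function name is made up for illustration), note the byte-to-bit conversion before dividing:

```python
def transmission_delay_s(packet_bytes: int, rate_bps: int) -> float:
    """Transmission delay in seconds: packet length in bits / link rate in bits/s."""
    bits = packet_bytes * 8  # convert bytes to bits first -- the step the asker skipped
    return bits / rate_bps

# 1,000-byte packet over a 1 Mbps link:
print(transmission_delay_s(1_000, 1_000_000))  # 0.008 seconds, i.e. 8 ms
```

Forgetting the factor of 8 gives 0.001 s instead, which is where the asker's answer of "1" comes from.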