Question

In my book it says that transmission delay = (length of the packet) / (transmission speed). However, in all of the study problems they do not follow this logic. For example, they ask for the transmission delay of a 1,000-byte packet over a 1 Mbps connection. I get 1 millisecond, but somehow they get 8. Am I missing something?


Solution

Because a byte is not a bit. The formula expects the packet length in bits, so 1,000 bytes = 8,000 bits, and 8,000 bits / 1,000,000 bits per second = 0.008 s = 8 ms.
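If you want to sanity-check the arithmetic, here is a minimal sketch (the variable names are my own, not from the problem set), assuming the 1,000-byte packet and 1 Mbps link from the question:

```python
# Transmission delay = packet length in bits / link rate in bits per second.
PACKET_BYTES = 1_000
LINK_RATE_BPS = 1_000_000  # 1 Mbps = 10^6 bits per second
BITS_PER_BYTE = 8

packet_bits = PACKET_BYTES * BITS_PER_BYTE          # 8,000 bits
transmission_delay_s = packet_bits / LINK_RATE_BPS  # 0.008 seconds

print(f"{transmission_delay_s * 1_000:.0f} ms")     # prints "8 ms"
```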
