Question

I'm developing a network support framework (initially for Android) that includes three basic services: Host Discovery, Network Communication, and QoS Monitoring.

For the last service, I'm trying to implement a method that returns the maximum Messages Per Second (MPS) a single host can send periodically to another host.

Based on the size of the object to be sent and the network speed, I can easily get a rough estimate of the ideal MPS that can be sent through the network. The problem appears when I try to include the Signal Strength (SS) in the equation.

protected int getMPS(NetworkApplicationData message, Context context) {
    // Payload size in bits (MemoryUtils.sizeOf returns bytes)
    int messageSizeBits  = MemoryUtils.sizeOf(message) * 8;
    // Nominal link speed reported by the platform, in Mbps
    int networkSpeedMbps = getNetworkSpeed(context);
    float signalStrength = getNetworkSignalStrength(context);
    // FIXME: what about signalStrength?
    // Network speeds use decimal units: 1 Mbps = 1,000,000 bits/s
    return networkSpeedMbps * 1000000 / messageSizeBits;
}

So the basic question here is: is there any established study on the impact of signal strength on the speed of a wireless network?

Doing some tests, I've noticed that the reported network speed changes depending on the Signal Strength. For instance, with a "normalized" 100% SS, the Android API returns a network speed of 54 Mbps; with a 40% SS, it returns 7 Mbps. Should I rely only on the network speed value that the Android API returns? In that case, I will mostly get an overestimated MPS.
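For reference, the two helpers used in getMPS() are thin wrappers over the standard WifiManager API (android.net.wifi), roughly like this; ACCESS_WIFI_STATE permission is required:

protected int getNetworkSpeed(Context context) {
    WifiManager wifi = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    WifiInfo info = wifi.getConnectionInfo();
    return info.getLinkSpeed(); // nominal link speed in Mbps (WifiInfo.LINK_SPEED_UNITS)
}

protected float getNetworkSignalStrength(Context context) {
    WifiManager wifi = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    int rssi = wifi.getConnectionInfo().getRssi(); // received signal strength in dBm
    // Normalize the RSSI into [0, 1] with the platform helper (101 levels -> 0..100)
    return WifiManager.calculateSignalLevel(rssi, 101) / 100f;
}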

In any case, I need to know the correct approach to this problem, and I'd like to base the calculations on formal studies.


Solution

This is a well-studied topic in EE and CS. WiFi network throughput depends mainly on two factors: the signal-to-noise ratio (SNR) and network congestion. The former is a physical-layer (PHY) issue, while the latter is a MAC-layer issue. The data rates measured at the two layers are definitely different.

At the PHY layer, the data rate is determined by SS (or, more precisely, by the signal-to-noise ratio): a higher SNR allows a higher data rate, per Shannon's information theory. The PHY data rate assumes a single WiFi device transmitting continuously without any contention, but that is not the case in the real world. 54 Mbps is a typical PHY data rate at high SS (it is the top 802.11a/g rate).
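As a concrete illustration of the PHY-layer relationship, here is a small sketch of the Shannon-Hartley capacity bound, C = B * log2(1 + SNR). The 20 MHz bandwidth and 25 dB SNR in the example are assumed values for a typical 802.11g channel; real PHY rates sit well below this bound because of modulation, coding, and protocol overhead:

// Shannon-Hartley bound: C = B * log2(1 + SNR), with SNR as a linear ratio.
// This is an upper bound on the PHY data rate, not a prediction of it.
static double shannonCapacityBps(double bandwidthHz, double snrDb) {
    double snrLinear = Math.pow(10.0, snrDb / 10.0);            // dB -> linear
    return bandwidthHz * Math.log1p(snrLinear) / Math.log(2.0); // B * log2(1 + SNR)
}

// Example (assumed values): a 20 MHz channel at 25 dB SNR gives
// shannonCapacityBps(20e6, 25) ~ 166 Mbps, comfortably above the 54 Mbps PHY rate.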

At the MAC layer, the data rate is the PHY data rate multiplied by the fraction of time a WiFi device wins contention under the CSMA protocol. Even with only one pair of WiFi devices, the data rate measured at the MAC layer (which is almost the same as that measured at the application layer) is roughly half the PHY data rate.

Back to your question: in a clear WiFi network, divide the network speed returned by the Android API in half for a rough estimate of the real data rate. In a busy network, your real data rate can only be derived by online measurement, and it changes all the time.
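Applied to the getMPS() method from the question, a minimal sketch of that rule of thumb (the 0.5 MAC-efficiency factor is the rough uncontended-network estimate from above, not a measured constant):

protected int getMPS(NetworkApplicationData message, Context context) {
    int messageSizeBits  = MemoryUtils.sizeOf(message) * 8;
    // The reported link speed already reflects signal strength (see the
    // 54 Mbps vs. 7 Mbps observation in the question).
    int networkSpeedMbps = getNetworkSpeed(context);
    // Rough MAC efficiency on a clear, uncontended network; on a busy
    // network this must be replaced by an online measurement.
    final double MAC_EFFICIENCY = 0.5;
    double usableBitsPerSecond = networkSpeedMbps * 1000000.0 * MAC_EFFICIENCY;
    return (int) (usableBitsPerSecond / messageSizeBits);
}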

Licensed under: CC-BY-SA with attribution