How to calculate network throughput based on bandwidth and latency?
How does one calculate the throughput of a network with a given SNR when latency is taken into account?
The only formula I know of from information theory is the Shannon–Hartley channel capacity formula, whose unit is bits per second (bps):
C = B log_2(1 + SNR), where B is the bandwidth in Hz, SNR is the linear (not dB) signal-to-noise ratio, and C is the capacity in bps.
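For concreteness, here is a minimal sketch of that formula in Python (the function name and the example channel values are mine, chosen only for illustration):

```python
import math

def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 20 MHz channel at 30 dB SNR.
# Convert dB to a linear ratio first: 10**(30/10) = 1000.
snr_db = 30
snr_linear = 10 ** (snr_db / 10)

print(channel_capacity_bps(20e6, snr_linear))  # ~1.993e8, i.e. about 199.3 Mbps
```

Note that latency appears nowhere in this calculation, which is exactly my problem.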
Is there a formula that also takes latency into account?
Thank you!