I have been measuring the performance of a few target servers located around the world over the course of a week, sending 120 ping packets four times a day, at six-hour intervals.
I have a lot of data to analyse and I'm trying to figure out the answers to a few questions that don't make sense to me at the moment:
I'm thinking there should be a correlation between large RTTs and packet loss events, since if a packet does not return before the timeout expires it is counted as lost. Is that right?
Based on that theory, I expected packets to be lost mainly when the measured RTTs are close to their maximum values. However, a lot of my results don't show that: losses sometimes occur when the RTT max values are relatively low.
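For reference, this is roughly how I was planning to check whether loss actually tracks the RTT maximum. It's just a minimal sketch in Python: the file name and column names (rtt_max_ms, packets_lost) are placeholders for my per-run summaries, not my actual data layout.

```python
# Sketch: per measurement run I have a max RTT (ms) and a loss count out of
# the 120 packets sent. File and column names below are placeholders.
import csv
from statistics import correlation  # Pearson's r, Python 3.10+

max_rtts, losses = [], []
with open("ping_summary.csv", newline="") as f:
    for row in csv.DictReader(f):
        max_rtts.append(float(row["rtt_max_ms"]))
        losses.append(int(row["packets_lost"]))

# A clearly positive r would support "losses happen when RTT max is high";
# a value near zero would match what I'm actually seeing in the data.
print(f"Pearson r between max RTT and loss count: {correlation(max_rtts, losses):.3f}")
```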
Am I right in thinking that these losses are influenced by the network path and the ping frequency?
I hope I have explained myself well enough; any help you can provide is much appreciated!