On Monday, April 02, 2012 00:23:36 dan wrote:
I am making some assumptions. I assume that the link will at some point become saturated. If we simply track the maximum throughput, then we can advertise the amount still available.
This would result in a metric that optimizes paths for the highest throughput ever recorded. In reality, many links have variable throughput: sometimes you get a spike of high throughput even though the average speed is lower, or your wifi environment changes with a negative impact on the throughput.
How do we identify this? Well, if the C radio has historically transferred 10Mbit/s (on the interface closest to the gateway) and we have tracked it, we can take the current 5Mbit/s away from that and see that there is 5Mbit/s remaining. This also assumes that a specific interface has a consistent speed.
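A rough sketch of the bookkeeping this would need (plain C, all names made up purely for illustration):

#include <stdint.h>

struct link_throughput {
	uint32_t max_seen; /* highest throughput ever observed, kbit/s */
	uint32_t current;  /* throughput measured in the last interval, kbit/s */
};

/* Feed in a fresh measurement and keep the historic maximum. */
static void link_tp_update(struct link_throughput *lt, uint32_t sample)
{
	lt->current = sample;
	if (sample > lt->max_seen)
		lt->max_seen = sample;
}

/* The amount we could advertise: historic maximum minus current load,
 * e.g. max_seen = 10000, current = 5000 -> 5000 kbit/s remaining. */
static uint32_t link_tp_available(const struct link_throughput *lt)
{
	if (lt->current >= lt->max_seen)
		return 0;
	return lt->max_seen - lt->current;
}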
That is not a safe assumption.
I don't suggest making throughput the #1 route selection method, only that it would be used if similar-quality links were available. In this case, A<>B and A<>F are very similar in quality, so we would use available throughput in the decision making. Have a tunable threshold for the TQ difference before this load balancing is taken into account.
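To illustrate the tie-breaking (hypothetical names, TQ on the usual 0..255 scale):

#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>

struct route_candidate {
	uint8_t tq;        /* link quality, 0..255 as in batman-adv */
	uint32_t tp_avail; /* advertised available throughput, kbit/s */
};

/* TQ stays the primary metric; available throughput only decides when
 * two candidates' TQ values are within the tunable threshold. */
static bool route_prefer(const struct route_candidate *a,
			 const struct route_candidate *b,
			 uint8_t tq_threshold)
{
	if (abs((int)a->tq - (int)b->tq) > tq_threshold)
		return a->tq > b->tq;

	return a->tp_avail > b->tp_avail;
}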
Interesting idea. Have to think about this a little bit.
I have another thought on how to determine maximum speed, but it is more 'destructive': have batman-adv run a test on each link for tx, rx, and bi-directional traffic, store the results, and consider these the interface's potential. Also identify whether an interface is full-duplex (FD) or half-duplex (HD). Retest on an interval, and/or when the TQ on a link is consistently worse than when last tested. If the test were thorough enough, it could identify at what throughput ping times and packet loss spike, giving an effective 'safe' maximum versus the absolute maximum.
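The retest trigger could be as simple as this (numbers and names invented, just to show the shape of it):

#include <stdbool.h>
#include <stdint.h>

#define RETEST_INTERVAL	3600 /* seconds between scheduled link tests */
#define TQ_BAD_SAMPLES	5    /* consecutive worse samples before an early retest */

struct link_test_state {
	uint32_t tested_at;  /* timestamp of the last test, seconds */
	uint8_t tq_at_test;  /* TQ measured when the test ran */
	uint8_t bad_count;   /* samples in a row worse than tq_at_test */
};

static bool link_needs_retest(struct link_test_state *st,
			      uint32_t now, uint8_t tq_sample)
{
	if (tq_sample < st->tq_at_test)
		st->bad_count++;
	else
		st->bad_count = 0;

	return (now - st->tested_at >= RETEST_INTERVAL) ||
	       (st->bad_count >= TQ_BAD_SAMPLES);
}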
Yes, we still have the "costly" way of detecting the link throughput ourselves. What do you think about the idea of asking the wifi rate algorithm for the link speed?
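Something along these lines, assuming today's cfg80211 API (flag macros and fields vary between kernel versions, so take it as a sketch, not tested code):

#include <net/cfg80211.h>

/* Ask the wireless stack for an estimate of the link speed towards @peer,
 * in 100kbit/s units, or 0 if the driver does not report one. */
static u32 wifi_link_throughput(struct net_device *dev, const u8 *peer)
{
	struct station_info sinfo = {};

	if (cfg80211_get_station(dev, peer, &sinfo) < 0)
		return 0;

	/* Preferred: the rate control algorithm's own throughput estimate
	 * (reported in kbit/s). */
	if (sinfo.filled & BIT_ULL(NL80211_STA_INFO_EXPECTED_THROUGHPUT))
		return sinfo.expected_throughput / 100;

	/* Fallback: convert the current tx rate into a bitrate. */
	if (sinfo.filled & BIT_ULL(NL80211_STA_INFO_TX_BITRATE))
		return cfg80211_calculate_bitrate(&sinfo.txrate);

	return 0;
}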
Regards, Marek