Hi
First, an introduction - I am David Johnson. I've been working on mesh protocols for a few years now and have built a 49-node wireless grid in South Africa to benchmark mesh protocols: http://wirelessafrica.meraka.org.za/wiki/index.php/49-node_Indoor_Mesh
Elektra and I ran some experiments with batman (do I have to use the dots @#!!!) and olsr at the beginning of the year, and we wrote a paper showing that batman outperforms olsr: lower CPU usage and better throughput due to less route flapping. The results were very impressive!
I'm now in the USA starting a PhD on mesh networking, and I've decided to work on batman in one of my projects. Our idea is to improve gateway selection by measuring real throughput at the gateway and then letting nodes pick the gateway that will give them the best throughput to the Internet, but in a fair manner - i.e. one user mustn't disadvantage another. A rough sketch of what I mean follows.
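To make that concrete, here is the kind of scoring I have in mind - purely illustrative, all of these names (gw_candidate, gw_score, the fairness discount) are my own invention and not anything that exists in batmand:

/* Illustrative sketch only - nothing here is real batmand code.
 * Each node would rank gateways by the throughput the gateway
 * actually measured, discounted by path quality and by how many
 * nodes already use that gateway, so one user doesn't crowd
 * out another. */

struct gw_candidate {
    unsigned int measured_kbit;  /* real throughput measured at the gateway */
    unsigned int link_quality;   /* path quality to the gateway, 0-100 */
    unsigned int active_clients; /* nodes currently using this gateway */
};

/* Higher score = better choice for this node. */
static unsigned int gw_score(const struct gw_candidate *gw)
{
    /* Expected share of the gateway's capacity if we join it too. */
    unsigned int fair_share = gw->measured_kbit / (gw->active_clients + 1);

    /* Discount by path quality so a fast but poorly reachable
     * gateway doesn't win. */
    return (fair_share * gw->link_quality) / 100;
}

Whether the fairness discount should be a plain 1/(n+1) share or something smarter is exactly what I want to investigate.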
To start, I would like to understand how the "fast internet connection" gateway selection mode (batmand -r 1 [interface]) works.
The documentation says it considers link quality and the advertised gateway class - what formula does it use to combine the two so that gateways can be compared against each other?
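My naive guess - and this is only a guess from reading the docs, not the actual batmand code - would be something multiplicative along these lines:

/* My GUESS at how -r 1 might fold link quality and the advertised
 * gateway class into one comparable number - NOT the real batmand
 * code. Squaring the link quality would make the path dominate, so
 * a high gateway class can't win over a lossy route. */
static unsigned int gw_factor(unsigned int packet_count,  /* OGMs received in window */
                              unsigned int window_size,   /* sliding window size */
                              unsigned int download_kbit) /* from advertised gw class */
{
    unsigned int tq = (packet_count * 100) / window_size; /* link quality in % */
    return tq * tq * (download_kbit / 64);
}

Is that roughly the shape of it, or does the real code weight the two quite differently?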
Thanks, and I look forward to becoming part of the discussion.
David