Hello David,
On Wed, Oct 29, 2008 at 11:49:30PM -0700, david johnson wrote:
> Hi
>
> First an introduction - I am David Johnson. I've been working on mesh protocols for a few years now, and I've built a 49-node wireless grid in South Africa to benchmark mesh protocols: http://wirelessafrica.meraka.org.za/wiki/index.php/49-node_Indoor_Mesh
>
> Elektra and I did some experiments with batman (do I have to use the dots @#!!!) and olsr at the beginning of the year, and we wrote a paper showing that batman outperforms olsr in CPU usage and achieves better throughput due to less route flapping. The results were very impressive!
Nice. :)
> I'm now in the USA starting a PhD in mesh networking, and I've decided to work on batman in one of my networking projects. We have an idea to improve gateway selection performance by measuring real throughput at the gateway and then allowing nodes to optimally pick the gateways that give them the best throughput to the Internet, but in a fair manner - i.e. one user mustn't disadvantage another.
That sounds interesting! Please keep us informed. ;)
> I would like to understand how the "fast internet connection" gateway selection mode (batmand -r 1 [interface]) works:
>
> It says it considers link quality and advertised gateway class - what formula does it use to combine these two so that gateways can be compared against each other?
You can look it up at batman.c:393 (in the current trunk). The formula used is:
max( TQ^2 * download_speed )
(with a constant factor which is not relevant for the maximum, so I skipped it). TQ is the average transmit quality to the gateway, and download_speed is its advertised download speed.
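
To make this concrete, here is a minimal sketch in C of what such a
selection loop could look like. This is not the actual batmand code;
the struct fields and the function name are illustrative only:

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative sketch, not the batmand implementation. */
    struct gateway {
        uint8_t  tq_avg;         /* average transmit quality, 0-255 */
        uint32_t download_speed; /* advertised download speed, kbit/s */
    };

    /* Return the index of the gateway maximising TQ^2 * download_speed,
     * or -1 if no usable gateway is found. */
    static int choose_gateway(const struct gateway *gw, size_t n)
    {
        uint64_t best = 0;
        int best_idx = -1;
        size_t i;

        for (i = 0; i < n; i++) {
            uint64_t score = (uint64_t)gw[i].tq_avg * gw[i].tq_avg *
                             gw[i].download_speed;
            if (score > best) {
                best = score;
                best_idx = (int)i;
            }
        }
        return best_idx;
    }

A quick worked example: a gateway with TQ 200 advertising 2048 kbit/s
scores 200^2 * 2048 = 81,920,000 and wins against a gateway with a
perfect TQ of 255 advertising only 1024 kbit/s (255^2 * 1024 =
66,585,600), but once the link to the faster gateway degrades to
TQ 100 it scores 100^2 * 2048 = 20,480,000 and loses - squaring the
TQ makes link quality dominate as it gets worse.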
best regards,
Simon