On Wed, Oct 09, 2013 at 06:10:28PM +0200, Antonio Quartulli wrote:
On Wed, Oct 09, 2013 at 04:49:53PM +0100, David Laight wrote:
All CRCs are linear, because '(a + b) mod c' is the same as '((a mod c) + (b mod c)) mod c'.
The CRC of a buffer is the XOR of the CRCs generated for each '1' bit. The CRC for each bit depends on how far it is from the end of the buffer.
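To make that concrete, here is a minimal standalone sketch (a naive bit-by-bit CRC32 over the reflected IEEE polynomial; the crc32_update() name is just for illustration, not the kernel's crc32()/crc32c() API) showing the linearity property: with a zero seed, the CRC of the XOR of two equal-length buffers equals the XOR of their CRCs.

#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* naive bit-by-bit CRC32, reflected IEEE polynomial 0xEDB88320 */
static uint32_t crc32_update(uint32_t crc, const uint8_t *buf, size_t len)
{
	size_t i;
	int j;

	for (i = 0; i < len; i++) {
		crc ^= buf[i];
		for (j = 0; j < 8; j++)
			crc = (crc & 1) ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
	}
	return crc;
}

int main(void)
{
	uint8_t a[] = { 0x12, 0x34, 0x56, 0x78 };
	uint8_t b[] = { 0xDE, 0xAD, 0xBE, 0xEF };
	uint8_t x[4];
	size_t i;

	for (i = 0; i < 4; i++)
		x[i] = a[i] ^ b[i];

	/* linearity: CRC(a ^ b) == CRC(a) ^ CRC(b) when the seed is 0 */
	assert(crc32_update(0, x, 4) ==
	       (crc32_update(0, a, 4) ^ crc32_update(0, b, 4)));
	return 0;
}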
In our tables we cannot make any assumption about the order of the entries: the node that generated the table may store the entries in a different order than the one we have. This is why I did not implement it as a simple CRC of the whole GlobalTable/buffer, but CRC'd each MAC+VID on its own.
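Roughly, what I mean is something like the sketch below (the tt_entry layout with a 6-byte MAC plus a 2-byte VID and all names are only assumptions for illustration, not the real batman-adv structures; it reuses the crc32_update() helper from the sketch above, and the per-entry seed of 0 is just an example): each entry gets its own CRC and the per-entry CRCs are XOR'd together, so the result does not depend on the entry order.

struct tt_entry {
	uint8_t mac[6];
	uint16_t vid;
};

/* CRC each entry on its own, then XOR the per-entry CRCs together */
static uint32_t tt_crc_per_entry(const struct tt_entry *tab, size_t n)
{
	uint32_t total = 0;
	size_t i;

	for (i = 0; i < n; i++) {
		uint32_t crc = 0;

		crc = crc32_update(crc, tab[i].mac, sizeof(tab[i].mac));
		crc = crc32_update(crc, (const uint8_t *)&tab[i].vid,
				   sizeof(tab[i].vid));
		total ^= crc;	/* XOR makes the entry order irrelevant */
	}
	return total;
}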
Presetting the CRC to all-ones generates a value that is dependent on the length of the buffer - otherwise missing/extra leading zeros are not detected.
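For example (reusing the crc32_update() helper from the sketch above; these lines would go inside its main()): with a zero seed a leading zero byte leaves the register at zero and is therefore invisible, while an all-ones seed makes the result change with the buffer length.

uint8_t msg[]  = { 0xAA, 0xBB };
uint8_t zmsg[] = { 0x00, 0xAA, 0xBB };	/* same data, one extra leading zero */

/* zero seed: the extra leading zero byte goes undetected */
assert(crc32_update(0x00000000, msg, 2) == crc32_update(0x00000000, zmsg, 3));
/* all-ones seed: the extra byte changes the result */
assert(crc32_update(0xFFFFFFFF, msg, 2) != crc32_update(0xFFFFFFFF, zmsg, 3));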
Given what I said above (that we cannot make assumptions about the order of the entries), what is your suggestion?
Given that XOR is commutative (so the order of the entries does not matter), the easiest answer to this question is probably to XOR all the entries together and then compute the CRC32 once (and, when computing the CRC, I should start with 0xFFFFFFFF rather than 0), right?
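Something along these lines (again only a sketch, reusing the tt_entry layout and crc32_update() helper from above; it assumes the 8-byte entry has no padding): XOR all entries into one accumulator first, which is order-independent, then run a single CRC32 over it, preset to all-ones.

static uint32_t tt_crc_xor_then_crc(const struct tt_entry *tab, size_t n)
{
	uint8_t acc[sizeof(struct tt_entry)] = { 0 };
	const uint8_t *p;
	size_t i, j;

	for (i = 0; i < n; i++) {
		p = (const uint8_t *)&tab[i];
		for (j = 0; j < sizeof(acc); j++)
			acc[j] ^= p[j];	/* order of entries does not matter */
	}
	/* single CRC pass, seeded with all-ones as suggested */
	return crc32_update(0xFFFFFFFF, acc, sizeof(acc));
}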
Regards,