crc - How long should a checksum be?


TCP segments, IP datagrams and UDP packets all include a 16-bit checksum field. At the link layer, 802.3 and 802.4 frames carry a 32-bit (4-byte) CRC.

At one extreme of the spectrum, computing a single parity bit lets a lot of corruption through: a packet can look valid when in fact it is not, because an even number of bits have been altered. At the other extreme, carrying a 16-bit checksum where 4 bits would suffice is a waste of memory, bandwidth and money.
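To get a feel for that trade-off, here is a small simulation sketch. It is an illustration under assumed conditions, not any protocol's real behavior: the toy `sum16` checksum and the "flip 1 to 8 random bits" corruption model are my own choices for the example. It measures how often a 1-bit parity, a 16-bit additive checksum and a 32-bit CRC fail to notice a corrupted 64-byte packet.

```python
import random
import zlib

def parity_bit(data: bytes) -> int:
    """Single parity bit over all bits of the message."""
    p = 0
    for byte in data:
        p ^= bin(byte).count("1") & 1
    return p

def sum16(data: bytes) -> int:
    """Toy 16-bit additive checksum (carries simply discarded)."""
    total = 0
    for i in range(0, len(data), 2):
        hi = data[i]
        lo = data[i + 1] if i + 1 < len(data) else 0
        total = (total + ((hi << 8) | lo)) & 0xFFFF
    return total

def corrupt(data: bytes, rng: random.Random) -> bytes:
    """Flip between 1 and 8 randomly chosen bits."""
    out = bytearray(data)
    for _ in range(rng.randint(1, 8)):
        bit = rng.randrange(len(out) * 8)
        out[bit // 8] ^= 1 << (bit % 8)
    return bytes(out)

rng = random.Random(42)
msg = bytes(rng.randrange(256) for _ in range(64))
ref = (parity_bit(msg), sum16(msg), zlib.crc32(msg))

trials = 20_000
missed = [0, 0, 0]  # undetected corruptions per detector
for _ in range(trials):
    bad = corrupt(msg, rng)
    if bad == msg:
        continue  # the flips happened to cancel out; nothing to detect
    checks = (parity_bit(bad), sum16(bad), zlib.crc32(bad))
    for i in range(3):
        if checks[i] == ref[i]:
            missed[i] += 1

for name, m in zip(("1-bit parity", "16-bit sum", "32-bit CRC"), missed):
    print(f"{name}: {m / trials:.2%} of corrupted packets slipped through")
```

Under this corruption model, parity misses roughly half of all corruptions (any even number of flips preserves parity), while the wider checks miss far fewer; that gap is what the extra checksum bits buy.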

How do you evaluate how many bits a checksum should be?

PS: I have taken the Internet stack as an example, but this applies to any protocol/software, really.

PS2: I'm not sure which forum to use for this question.

First, a quick nomenclature correction: checksums and CRCs are two different approaches that try to solve the same problem, namely detecting bit errors that occur during data transmission over noisy channels. In general, CRCs are more powerful at detecting errors, at the expense of more complexity.
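To make the distinction concrete, here is a small sketch showing a classic failure mode of additive checksums. The one's-complement sum below is written along the lines of the Internet checksum, and `zlib.crc32` stands in for a CRC: swapping two 16-bit words fools the additive checksum, because addition is commutative, but a CRC is position-sensitive and catches it.

```python
import zlib

def inet_checksum(data: bytes) -> int:
    """One's-complement sum of 16-bit words, Internet-checksum style."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

original = b"ABCDEFGH"
swapped = b"CDABEFGH"  # first two 16-bit words exchanged

# The additive checksum cannot see the reordering: same words, same sum.
print(hex(inet_checksum(original)), hex(inet_checksum(swapped)))

# The CRC depends on bit positions, so the swap changes its value.
print(hex(zlib.crc32(original)), hex(zlib.crc32(swapped)))
```

The swap here is confined to the first 32 bits, and a 32-bit CRC detects every burst error no longer than its own width, which is why the CRC values are guaranteed to differ.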

Selecting the right error-detection scheme requires knowledge of the channel (e.g. error probability) and of the noise characteristics (e.g. impulsive, bursty). There are papers out there in which this problem is analyzed in detail and guidance is given on how to select an error-detection method. I suggest this introductory presentation as a starting point:

http://www.ece.cmu.edu/~koopman/pubs/koopmancrcwebinar9may2012.pdf

It will give you a better understanding of the complexity of this beautiful area and provide links to other learning materials.
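As a back-of-the-envelope illustration of that kind of analysis, you can bound the undetected-error probability by the binomial tail of error patterns the code is not guaranteed to catch. All the numbers below are assumptions chosen for the example (a 1514-byte frame, an independent bit-error rate of 1e-6, and a code with Hamming distance 4 at that length, i.e. one that detects every pattern of up to 3 bit errors):

```python
from math import comb

def p_at_least(n_bits: int, ber: float, d: int, terms: int = 10) -> float:
    """Approximate P(>= d bit errors) in an n-bit frame with independent
    bit-error rate `ber`.  The binomial tail is truncated after `terms`
    terms, which is fine here: successive terms shrink by ~ n * ber / m."""
    return sum(
        comb(n_bits, m) * ber**m * (1 - ber) ** (n_bits - m)
        for m in range(d, d + terms)
    )

# A distance-4 code detects all 1-, 2- and 3-bit errors, so only patterns
# of >= 4 errors can possibly slip through; this tail bounds that rate.
bound = p_at_least(n_bits=1514 * 8, ber=1e-6, d=4)
print(f"undetected-error probability per frame <= {bound:.3e}")
```

Varying the assumed bit-error rate and the code's Hamming distance in this formula is exactly the kind of trade-off the presentation above walks through when choosing how many check bits a channel needs.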

