How to cite this WIREs title: WIREs Comp Stat

Coding theory


Abstract: Coding theory is the branch of information theory concerned with the explicit representation of data as a sequence of symbols, usually a sequence of bits. The entropy of a probability distribution measures the information content of the data, giving a lower bound on the number of bits needed to encode them. Source coding also defines a measure, the divergence, of the cost of using a suboptimal representation. Channel coding describes representations that maximize the rate at which information can be communicated through a noisy medium. Important applications of coding theory include data compression, signal processing, and the comparison of statistical models. Copyright © 2009 Wiley Periodicals, Inc.

This article is categorized under: Applications of Computational Statistics > Signal and Image Processing and Coding
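The entropy bound and the divergence described in the abstract can be illustrated with a short sketch. The function names `entropy` and `kl_divergence` below are my own; the formulas are the standard Shannon entropy and Kullback-Leibler divergence, in bits.

```python
import math

def entropy(p):
    """Shannon entropy in bits: a lower bound on the average
    number of bits per symbol achievable by any lossless code."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits: the expected
    extra coding cost of using a code optimized for q on data
    actually distributed as p."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]      # true symbol distribution
q = [1/3, 1/3, 1/3]        # suboptimal (uniform) model
print(entropy(p))           # 1.5 bits per symbol
print(kl_divergence(p, q))  # extra bits per symbol paid for using q
```

For this distribution, an optimal code averages 1.5 bits per symbol, and assuming the uniform model costs about 0.085 extra bits per symbol.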

Arithmetic coding recursively identifies a location in the cumulative distribution.


Binary symmetric channel.
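The binary symmetric channel in the figure can be sketched in a few lines: each transmitted bit is flipped independently with crossover probability p, and the channel's capacity is 1 − H(p), where H is the binary entropy function. The function names here are illustrative, not from the article.

```python
import math
import random

def bsc(bits, p, rng=None):
    """Binary symmetric channel: flip each bit independently
    with crossover probability p."""
    rng = rng or random.Random()
    return [b ^ int(rng.random() < p) for b in bits]

def bsc_capacity(p):
    """Capacity of the BSC in bits per channel use: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h

noisy = bsc([0, 1, 1, 0, 1], p=0.1, rng=random.Random(0))
print(bsc_capacity(0.11))  # roughly 0.5 bit per channel use
```

At p = 0.5 the output is independent of the input and the capacity drops to zero; at p = 0 or p = 1 the channel is deterministic and carries a full bit per use.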


Arithmetic coding uses the position of a string in the cumulative distribution of five‐character messages to efficiently encode the string as a sequence of bits.
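The interval-narrowing step behind the figure can be sketched as follows, assuming an i.i.d. symbol model. Each symbol shrinks the current interval [low, high) to the sub-interval given by its cumulative probabilities; the final interval's width is the product of the symbol probabilities, so naming a point inside it takes about −log2(width) bits, close to the entropy bound. The function names are mine, for illustration.

```python
import math

def arithmetic_interval(message, probs):
    """Narrow [low, high) to the sub-interval of the cumulative
    distribution occupied by `message` under an i.i.d. model."""
    # Build the cumulative distribution: symbol -> (cum_low, cum_high).
    cum, c = {}, 0.0
    for s, p in probs.items():
        cum[s] = (c, c + p)
        c += p
    low, high = 0.0, 1.0
    for s in message:
        width = high - low
        lo_s, hi_s = cum[s]
        low, high = low + width * lo_s, low + width * hi_s
    return low, high

def code_length(low, high):
    """Bits needed to name a point inside the interval:
    ceil(-log2(width)) + 1, within about two bits of optimal."""
    return math.ceil(-math.log2(high - low)) + 1

probs = {'a': 0.5, 'b': 0.25, 'c': 0.25}
low, high = arithmetic_interval("aabc", probs)
print(low, high, code_length(low, high))
```

For "aabc" the interval width is 0.5 × 0.5 × 0.25 × 0.25 = 2⁻⁶, so the message can be encoded in about 7 bits, matching its information content of 6 bits up to the constant overhead.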


Binary entropy function.
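The binary entropy function plotted in the figure is H(p) = −p log2 p − (1 − p) log2 (1 − p). A minimal implementation (the function name is my own):

```python
import math

def binary_entropy(p):
    """Entropy of a biased coin with heads probability p, in bits.
    Zero at p = 0 and p = 1; maximal (1 bit) at p = 0.5."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0
print(binary_entropy(0.25))  # about 0.811
```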

