This complete introduction to two-dimensional (2-D) information theory and coding provides the key techniques for modeling data and estimating their information content. Throughout, special emphasis is placed on applications to the transmission, storage, compression, and error protection of graphic information. The book begins with a self-contained introduction to information theory, including the concepts of entropy and channel capacity, which requires minimal mathematical background. It then introduces error-correcting codes, particularly Reed-Solomon codes, the basic methods of error correction, and codes applicable to data organized in 2-D arrays. Common techniques for data compression, including the compression of 2-D data by application of basic source coding, are also covered, together with an advanced chapter dedicated to 2-D constrained coding for storage applications. Numerous worked examples illustrate the theory, whilst end-of-chapter exercises test the reader's understanding, making this an ideal book for graduate students and for practitioners in the telecommunications and data storage industries.
Author(s): Jørn Justesen, Søren Forchhammer
Edition: 1
Publisher: Cambridge University Press
Year: 2009
Language: English
Pages: 185
Half-title......Page 3
Title......Page 5
Copyright......Page 6
Contents......Page 7
Preface......Page 13
1.2 Entropy of discrete sources......Page 15
1.3 Mutual information and discrete memoryless channels......Page 18
1.4 Source coding for discrete sources......Page 19
1.4.2 Huffman coding......Page 20
1.5 Arithmetic codes......Page 22
1.5.1 Ideal arithmetic coding......Page 23
1.5.2 Finite-precision arithmetic coding......Page 24
1.5.3 Decoding arithmetic codes......Page 25
1.5.4 Universal codes......Page 26
1.6 Entropy of images......Page 27
1.7 Notes......Page 28
Exercises......Page 29
References......Page 30
2.2 Finite-state sources......Page 31
2.2.2 Counting constrained sequences......Page 32
2.3 Markov sources and Markov chains......Page 33
2.3.3 Markov sources......Page 34
2.4.2 Entropy of Markov chains and Markov sources......Page 35
2.5 Maximum-entropy Markov chains......Page 36
2.7.1 Structure in two-dimensional sources......Page 38
2.7.3 Causal models......Page 40
2.8.1 Independence assumptions......Page 42
2.8.3 Constructing a two-dimensional random field from a Markov chain......Page 43
2.8.4 Symmetric Pickard random fields......Page 46
2.9 Markov random fields......Page 47
Exercises......Page 48
References......Page 49
3.2.1 Discrete memoryless channels and capacity......Page 50
3.2.2 Codes for discrete memoryless channels......Page 52
3.3.1 Linear codes and vector spaces......Page 53
3.3.2 Decodable errors......Page 54
3.3.3 Hamming codes and their duals......Page 56
3.3.4 Cyclic codes and generator polynomials......Page 57
3.3.5 Orthogonal sequences and arrays for synchronization......Page 58
3.3.6 Product codes......Page 59
3.4.1 The union bound for block codes......Page 60
3.4.2 A bound for rates close to capacity......Page 61
3.4.3 Performance of specific codes on the BSC......Page 63
3.5.1 Rate-distortion functions......Page 64
3.5.2 Encoding of correlated sources......Page 65
Exercises......Page 66
References......Page 67
4.2 Finite fields......Page 68
4.3 Definition of Reed-Solomon codes......Page 69
4.4 Decoding by interpolation......Page 70
4.5 Decoding from syndromes......Page 72
4.6 Reed-Solomon codes over the fields F(2^m)......Page 75
4.7 Faster decoding methods......Page 76
4.8 Reed-Solomon codes and q-ary channels......Page 78
Exercises......Page 79
5 Source coding......Page 80
5.1 Context-based adaptive coding of sources with memory......Page 81
5.1.2 Context-adaptive code length......Page 82
5.1.3 Context-based adaptive image coding......Page 84
5.1.4 Context-based coding of binary images......Page 85
5.2 Universal coding......Page 86
5.2.1.1 The LZ78 principle......Page 87
5.2.2 Algorithm Context......Page 90
5.3 Dynamic context algorithms......Page 94
5.4 Code lengths and estimation of entropy......Page 96
5.5 Notes......Page 97
Exercises......Page 98
References......Page 99
6.2 Images and documents as two-dimensional media......Page 100
6.2.1 Mixed media and document layout......Page 101
6.3 Media components......Page 102
6.3.2 Maps and line drawings......Page 103
6.3.2.1 Layered raster-graphic maps......Page 104
6.3.2.2 Models of discrete line graphics......Page 105
6.3.3 Natural images......Page 107
6.3.3.1 A model of halftone images......Page 108
6.3.3.2 Error diffusion......Page 109
6.4 Context-based coding of two-dimensional media......Page 110
6.4.1 Lossless context-based coding......Page 111
6.4.2.1 Relative pixel patterns......Page 112
6.4.2.2 Optimal context quantization......Page 114
6.4.3 Lossy context-based coding......Page 115
6.5.1 Refinement coding for quality progression......Page 117
6.5.2 Multi-resolution context-based coding......Page 118
6.5.3.1 Coding of layered graphics......Page 119
6.5.3.2 Results for coding of maps......Page 120
6.6 Model-based coding of two-dimensional document components......Page 121
6.6.1.2 Soft pattern matching......Page 122
6.6.2 Coding of halftone images......Page 123
6.7.1 Predictive image coding......Page 124
6.7.1.1 Lossy predictive image coding......Page 126
6.7.2 Transform-based image coding......Page 127
6.7.3 Subband- and wavelet-based image coding......Page 128
6.8.1 Coding document segments......Page 130
6.9 Notes......Page 131
References......Page 132
7.2 Two-dimensional fields with a finite constraint......Page 133
7.3 Counting two-dimensional patterns with constraints......Page 135
7.4 Bands of finite height......Page 137
7.4.1 Bounds for first-order-symmetric constraints......Page 138
7.4.2 Markov chains on bands......Page 140
7.5.1 Basic concepts......Page 141
7.6 Pickard random fields......Page 142
7.6.1 General Pickard random fields......Page 143
7.6.2 Remarks on causal models......Page 144
7.7 Markov random fields with maximum entropy......Page 145
7.8 Block Pickard construction......Page 146
7.8.1 Constructing Pickard random fields from two Markov chains......Page 147
7.8.2 Block Pickard random fields for two-dimensional constraints......Page 150
7.8.3 Block Pickard random fields - no isolated bit......Page 151
7.8.4 Tiling with dominoes using Pickard random fields......Page 154
7.9 Notes......Page 156
Exercises......Page 157
References......Page 159
8.2 Binary images of Reed-Solomon codes......Page 160
8.3 Concatenated codes......Page 162
8.4 Product codes......Page 163
8.5 Iterated decoding of product codes......Page 164
8.6 Graph codes for two-dimensional error-correction......Page 166
8.6.1 Bipartite graphs from geometries......Page 167
8.6.2 Minimum distances of graph codes......Page 168
8.7.1 Defects in physical media or in the writing/printing process......Page 169
8.8 Notes......Page 170
Exercises......Page 171
A1.1 Multiplication-free arithmetic coding......Page 172
B2 Maximum-entropy memoryless sources......Page 174
B4 Markov sources with a given structure......Page 175
Appendix C Decoding of Reed-Solomon codes in F(16)......Page 177
Index......Page 184