Foreign Science and Technology Books: Introduction

Title: Information and Life

Author: Gérard Battail

ISBN/ISSN: 9789400770393, 9400770391

Year of publication: 2014

Publisher: Springer

Classification: Biological Sciences


Preface

Communication, one of the most important functions of life, occurs at every spatial scale, from the molecular level up to populations and ecosystems, and at every time scale, from fast chemical reactions up to geological ages. Information theory, the mathematical science of communication initiated by Shannon in 1948, has been very successful in engineering, but it has been largely ignored by biologists.
This book aims to bridge that gap. It proposes an abstract definition of information, based on the engineers' experience, that makes the concept usable in the life sciences, and it expounds information theory and its by-product, error-correcting codes, as simply as possible. The fundamental biological problem of heredity is then examined. It is shown that biology does not adequately account for the conservation of genomes over geological ages, which can be understood only if genomes are assumed to be made resilient to casual errors by proper coding. Moreover, the good conservation of very old parts of genomes, such as the HOX genes, implies that the assumed genomic codes have a nested structure which makes information more resilient to errors the older it is.
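The claim that proper coding can protect a message against casual errors can be illustrated with the simplest error-correcting code, a repetition code with majority-vote decoding. The sketch below is purely illustrative (the message length, error probability, and repetition factor are arbitrary choices, not parameters from the book):

```python
import random

def encode(bits, r=3):
    # Add redundancy: repeat each bit r times.
    return [b for b in bits for _ in range(r)]

def channel(bits, p, rng):
    # Noisy channel: flip each bit independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(received, r=3):
    # Majority vote over each block of r repeats.
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(1000)]

# Uncoded transmission vs. coded transmission over the same channel.
raw_errors = sum(a != b for a, b in zip(msg, channel(msg, 0.05, rng)))
decoded = decode(channel(encode(msg), 0.05, rng))
coded_errors = sum(a != b for a, b in zip(msg, decoded))
print(raw_errors, coded_errors)
```

With a 5% error rate per symbol, the coded message comes through with far fewer residual errors than the uncoded one, because a block is corrupted only when a majority of its repeats are hit. The genomic codes hypothesized in the book would have to be far more efficient than this toy code, but the principle is the same: redundancy plus decoding buys resilience.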
The consequences that information theory draws from these hypotheses match very basic but as yet unexplained biological facts, e.g., the existence of successive generations, that of discrete species, and the trend of evolution towards complexity. Being necessarily inscribed on physical media, information appears as a bridge between the abstract and the concrete. Recording, communicating, and using information occur exclusively in the living world. Information is thus coextensive with life and delineates the border between the living and the inanimate.


Contents

1 Introduction 1

1.1 Aim of the Book 1

1.2 About the Method 5

1.3 On the Book Content 6

References 7

Part I Information as a Scientific Entity

2 What is Information? 11

2.1 Information in a Usual Meaning 11

2.2 Features of Information as a Scientific Entity 13

2.3 Comments on the Definitions of Information 16

2.4 An Information as a Nominable Entity 18

      2.4.1 Naming and Counting 18

      2.4.2 Defining and Representing Natural Integers 19

      2.4.3 Concept of Nominable Entity 22

      2.4.4 Representatives of Nominable Entities Need to be Protected 26

2.5 A Short History of Communication Engineering 27

2.6 Communication Over Space or Over Time 30

References 31

3 Basic Principles of Communication Engineering 33

3.1 Physical Inscription of a Single Symbol 33

3.2 Physical Inscription of a Sequence 36

      3.2.1 Symbols and Sequences 36

      3.2.2 Representing a Sequence of Symbols by a Sequence of Signals 37

3.3 Receiving a Binary Symbol in the Presence of Noise 39

3.4 Communicating Sequences in the Presence of Noise: Channel Coding 43

      3.4.1 Channel Coding is Needed 43

      3.4.2 Redundancy Enables Channel Coding 45

References 49

4 Information Theory as the Science of Literal Communication 51

4.1 Shannon’s Paradigm and its Variants 52

      4.1.1 Basic Paradigm 52

      4.1.2 Variants of Shannon’s Paradigm 53

      4.1.3 Functions and Limits of the Coding Processes 55

4.2 Quantitative Measures of Information 58

      4.2.1 Principle of Information Measurement 58

      4.2.2 Proper and Mutual Information 59

      4.2.3 Entropy and Average Mutual Information 62

      4.2.4 Properties of the Entropy and of the Mean Mutual Information 66

      4.2.5 Information Rates; Extension of a Source 69

      4.2.6 Cross-Entropy 70

      4.2.7 Comments on the Measurement of Information 74

4.3 Source Coding 76

      4.3.1 Source Models 76

      4.3.2 Representation of a Code by a Tree, Kraft Inequality 78

      4.3.3 Fundamental Theorem of Source Coding 80

      4.3.4 Source Coding by the Huffman Algorithm 82

      4.3.5 Some Comments About Source Coding 85

References 91

5 Channel Capacity and Channel Coding 93

5.1 Channel Models 94

5.2 Capacity of a Channel 95

      5.2.1 Defining the Capacity of a Channel 95

      5.2.2 Capacity of Simple Discrete Input Channels 97

      5.2.3 Capacity of the Additive White Gaussian Noise Channel 97

      5.2.4 Kolmogorov’s ε-entropy 99

5.3 Channel Coding Needs Redundancy 100

5.4 On the Fundamental Theorem of Channel Coding 101

      5.4.1 A Geometrical Interpretation of Channel Coding 102

      5.4.2 Random Coding, its Geometrical Interpretation 104

      5.4.3 Random Coding for the Binary Erasure Channel 107

      5.4.4 Largest Minimum Distance of Error-Correcting Codes 107

      5.4.5 General Case: Feinstein’s Lemma 108

5.5 Error-Correcting Codes 109

      5.5.1 Defining an Error-Correcting Code 109

      5.5.2 Using Error-Correcting Codes: Decoding and Regeneration 110

      5.5.3 Designing Error-Correcting Codes 111

      5.5.4 Recursive Convolutional Codes 112

      5.5.5 Turbocodes 116

      5.5.6 Low-Density Parity-Check Codes 118

      5.5.7 Decoding Random-Like Codes: Principles 119

      5.5.8 Decoding an LDPC Code 120

      5.5.9 Decoding a Turbocode 123

      5.5.10 Variants and Comments 128

      5.5.11 Error-Correcting Codes Defined by Non-Mathematical Constraints: Soft Codes 129

References 130

6 Information as a Fundamental Entity 133

6.1 Algorithmic Information Theory 134

6.2 Emergent Information in Populations 139

6.3 Physical Entropy and Information 141

      6.3.1 Thermodynamics and Physical Entropy 141

      6.3.2 Boltzmann Constant as a Signal-to-Noise Ratio 145

      6.3.3 Exorcizing Laplace’s Demon 147

      6.3.4 Information is not a Physical Entity 148

6.4 Information Bridges the Abstract and the Concrete 150

References 151

Part II Information is Coextensive with Life

7 An Introduction to the Second Part 155

7.1 Relationship with Biosemiotics 155

7.2 Content and Spirit of the Second Part 156

References 159

8 Heredity as a Communication Problem 161

8.1 The Enduring Genome 161

      8.1.1 A Blatant Contradiction 162

      8.1.2 An Upper Bound on the DNA Channel Capacity 165

      8.1.3 Main Hypothesis: Genomic Error-Correcting Codes must Exist 167

      8.1.4 Subsidiary Hypothesis: Nested Codes 168

8.2 Consequences of the Hypotheses Meet Biological Reality 170

      8.2.1 Genomes are Redundant 170

      8.2.2 Discrete Species Exist with a Hierarchical Taxonomy 172

      8.2.3 Nature Proceeds with Successive Generations 172

      8.2.4 Evolution is Contingent and Saltationist 174

      8.2.5 Evolution Trends Towards Increasing Complexity 176

      8.2.6 Some Comments About the Consequence of the Hypotheses 178

8.3 A Toy Living World 179

      8.3.1 A Toy Living World in Order to Mimic the Real World 179

      8.3.2 Permanence of a ‘Genome’ 180

      8.3.3 Populations of Individuals Within Species 181

      8.3.4 An Illustrative Simulation 181

      8.3.5 Natural Selection in the Toy Living World 186

8.4 Identifying Genomic Error-Correcting Codes 187

References 191

9 Information is Specific to Life 193

9.1 Information and Life are Indissolubly Linked 193

9.2 Semantic Feedback Loops 194

      9.2.1 Semantic Feedback Loops and Genetic Mapping 194

      9.2.2 Semantic Feedbacks Implement Barbieri’s Organic Codes 197

      9.2.3 Semantic Feedback Loops are Compatible with Evolution 201

      9.2.4 Conjecture About the Origin of Semantic Feedback Loops 202

9.3 Information as a Fundamental Entity 203

      9.3.1 Information is an Abstract Entity 203

      9.3.2 On the Epistemological Status of Information 204

9.4 Nature as an Engineer 206

References 209

10 Life Within the Physical World 211

10.1 A Poorly Understood Divide 211

10.2 Maxwell’s Demon in Physics and in Life 214

10.3 A Measurement as a Means for Acquiring Information 219

References 221

11 Conclusion 223

Appendix A: Tribute to Shannon 225

Appendix B: Some Comments about Mathematics 237

Appendix C: A Short Glossary of Molecular Genetics 247

Index 255


Holding institution

National Science Library, Chinese Academy of Sciences (中科院文献情报中心)