Principles of Neural Information Theory

Contents (220 pages)

1. In the Light of Evolution
1.1 Introduction
1.2 All That We See
1.3 In the Light of Evolution
1.4 In Search of General Principles
1.5 Information Theory and Biology
1.6 An Overview of Chapters

2. Information Theory
2.1 Introduction
2.2 Finding a Route, Bit by Bit
2.3 Information and Entropy
2.4 Maximum Entropy Distributions
2.5 Channel Capacity
2.6 Mutual Information
2.7 The Gaussian Channel
2.8 Fourier Analysis
2.9 Summary

3. Measuring Neural Information
3.1 Introduction
3.2 The Neuron
3.3 Why Spikes?
3.4 Neural Information
3.5 Gaussian Firing Rates
3.6 Information About What?
3.7 Does Timing Precision Matter?
3.8 Rate Codes and Timing Codes
3.9 Summary

4. Pricing Neural Information
4.1 Introduction
4.2 The Efficiency-Rate Trade-Off
4.3 Paying With Spikes
4.4 Paying With Hardware
4.5 Paying With Power
4.6 Optimal Axon Diameter
4.7 Optimal Distribution of Axon Diameters
4.8 Axon Diameter and Spike Speed
4.9 Optimal Mean Firing Rate
4.10 Optimal Distribution of Firing Rates
4.11 Optimal Synaptic Conductance
4.12 Summary

5. Encoding Colour
5.1 Introduction
5.2 The Eye
5.3 How Aftereffects Occur
5.4 The Problem With Colour
5.5 A Neural Encoding Strategy
5.6 Encoding Colour
5.7 Why Aftereffects Occur
5.8 Measuring Mutual Information
5.9 Maximising Mutual Information
5.10 Principal Component Analysis
5.11 PCA and Mutual Information
5.12 Evidence for Efficiency
5.13 Summary

6. Encoding Time
6.1 Introduction
6.2 Linear Models
6.3 Neurons and Wine Glasses
6.4 The LNP Model
6.5 Estimating LNP Parameters
6.6 The Predictive Coding Model
6.7 Estimating Predictive Parameters
6.8 Evidence for Predictive Coding
6.9 Summary

7. Encoding Space
7.1 Introduction
7.2 Spatial Frequency
7.3 Do Ganglion Cells Decorrelate Images?
7.4 Optimal Receptive Fields: Overview
7.5 Receptive Fields and Information
7.6 Measuring Mutual Information
7.7 Maximising Mutual Information
7.8 van Hateren's Model
7.9 Predictive Coding of Images
7.10 Evidence For Predictive Coding
7.11 Is Receptive Field Spacing Optimal?
7.12 Summary

8. Encoding Visual Contrast
8.1 Introduction
8.2 The Compound Eye
8.3 Not Wasting Entropy
8.4 Measuring the Eye's Response
8.5 Maximum Entropy Encoding
8.6 Efficiency of Maximum Entropy Encoding
8.7 Summary

9. The Neural Rubicon
9.1 Introduction
9.2 The Darwinian Cost of Efficiency
9.3 Crossing the Neural Rubicon

Further Reading

Appendices
A. Glossary
B. Mathematical Symbols
C. Correlation and Independence
D. A Vector Matrix Tutorial
E. Neural Information Methods
F. Key Equations

References

Index