THEORETICAL NEUROSCIENCE
Peter Dayan and L.F. Abbott
Preface
PART I - ANALYZING AND MODELING NEURAL RESPONSES
Chapter 1 - Neural Encoding I: Firing Rates and Spike Statistics
Introduction
Properties of Neurons
Recording Neuronal Responses
From Stimulus to Response
Spike Trains and Firing Rates
Measuring Firing Rates
Tuning Curves
Spike-Count Variability
What Makes a Neuron Fire?
Describing the Stimulus
The Spike-Triggered Average
White-Noise Stimuli
Multiple-Spike-Triggered Averages and Spike-Triggered Correlations
Spike Train Statistics
The Homogeneous Poisson Process
The Spike-Train Autocorrelation Function
The Inhomogeneous Poisson Process
The Poisson Spike Generator
Comparison with Data
The Neural Code
Independent-Spike, Independent-Neuron, and Correlation Codes
Temporal Codes
Chapter Summary
Appendices
A) The Power Spectrum of White Noise
B) Moments of the Poisson Distribution
C) Inhomogeneous Poisson Statistics
Annotated Bibliography
Chapter 2 - Neural Encoding II: Reverse Correlation and Receptive Fields
Introduction
Estimating Firing Rates
The Most Effective Stimulus
Static Nonlinearities
Introduction to the Early Visual System
file:///E|/Media_folder/Books/books.pdox.net/Physics/Theoretical_Neuroscience/TOC.htm (1 of 7) [15-02-2002 0:32:12]
The Retinotopic Map
Visual Stimuli
The Nyquist Frequency
Reverse Correlation Methods - Simple Cells
Spatial Receptive Fields
Temporal Receptive Fields
Response of a Simple Cell to a Counterphase Grating
Space-Time Receptive Fields
Nonseparable Receptive Fields
Static Nonlinearities - Simple Cells
Static Nonlinearities - Complex Cells
Receptive Fields in the Retina and LGN
Constructing V1 Receptive Fields
Chapter Summary
Appendices
A) The Optimal Kernel
B) The Most Effective Stimulus
C) Bussgang's Theorem
Annotated Bibliography
Chapter 3 - Neural Decoding
Encoding and Decoding
Discrimination
ROC Curves
ROC Analysis of Motion Discrimination
The Likelihood Ratio Test
Population Decoding
Encoding and Decoding Direction
Optimal Decoding Methods
Fisher Information
Optimal Discrimination
Spike Train Decoding
Chapter Summary
Appendices
A) The Neyman-Pearson Lemma
B) The Cramér-Rao Bound
C) The Optimal Spike-Decoding Filter
Annotated Bibliography
Chapter 4 - Information Theory
Entropy and Mutual Information
Entropy
Mutual Information
Entropy and Mutual Information for Continuous Variables
Information and Entropy Maximization
Entropy Maximization for a Single Neuron
Populations of Neurons
Application to Retinal Ganglion Cell Receptive Fields
The Whitening Filter
Filtering Input Noise
Temporal Processing in the LGN
Cortical Coding
Entropy and Information for Spike Trains
Chapter Summary
Appendix
Positivity of the Kullback-Leibler Divergence
Annotated Bibliography
PART II - MODELING NEURONS AND NETWORKS
Chapter 5 - Model Neurons I: Neuroelectronics
Levels of Neuron Modeling
Electrical Properties of Neurons
Intracellular Resistance
Membrane Capacitance and Resistance
Equilibrium and Reversal Potentials
The Membrane Current
Single-Compartment Models
Integrate-and-Fire Models
Spike-Rate Adaptation and Refractoriness
Voltage-Dependent Conductances
Persistent Conductances
Transient Conductances
Hyperpolarization-Activated Conductances
The Hodgkin-Huxley Model
Modeling Channels
Synaptic Conductances
The Postsynaptic Conductance
Release Probability and Short-Term Plasticity
Synapses on Integrate-and-Fire Neurons
Regular and Irregular Firing Modes
Chapter Summary
Appendices
A) Integrating the Membrane Potential
B) Integrating the Gating Variables
Annotated Bibliography
Chapter 6 - Model Neurons II: Conductances and Morphology
Levels of Neuron Modeling
Conductance-Based Models
The Connor-Stevens Model
Postinhibitory Rebound and Bursting
The Cable Equation
Linear Cable Theory
An Infinite Cable
An Isolated Branching Node
The Rall Model
The Morphoelectrotonic Transform
Multi-Compartment Models
Action Potential Propagation Along an Unmyelinated Axon
Propagation Along a Myelinated Axon
Chapter Summary
Appendices
A) Gating Functions for Conductance-Based Models
Connor-Stevens Model
Transient Ca²⁺ Conductances
Ca²⁺-Dependent K⁺ Conductances
B) Integrating Multi-Compartment Models
Annotated Bibliography
Chapter 7 - Network Models
Introduction
Firing-Rate Models
Feedforward and Recurrent Networks
Continuously Labeled Networks
Feedforward Networks
Neural Coordinate Transformations
Recurrent Networks
Linear Recurrent Networks
Selective Amplification
Input Integration
Continuous Linear Recurrent Networks
Nonlinear Recurrent Networks
Nonlinear Amplification
A Recurrent Model of Simple Cells in Primary Visual Cortex
A Recurrent Model of Complex Cells in Primary Visual Cortex
Winner-Take-All Input Selection
Gain Modulation
Sustained Activity
Maximum Likelihood and Network Recoding
Network Stability
Associative Memory
Excitatory-Inhibitory Networks
Homogeneous Excitatory and Inhibitory Populations
Phase-Plane Methods and Stability Analysis
The Olfactory Bulb
Oscillatory Amplification
Stochastic Networks
Chapter Summary
Appendix
Lyapunov Function for the Boltzmann Machine
Annotated Bibliography
PART III - PLASTICITY AND LEARNING
Chapter 8 - Plasticity and Learning
Introduction
Stability and Competition
Synaptic Plasticity Rules
The Basic Hebb Rule
The Covariance Rule
The BCM Rule
Synaptic Normalization
Subtractive Normalization
Multiplicative Normalization and the Oja Rule
Timing-Based Rules
Unsupervised Learning
Single Postsynaptic Neuron
Principal Component Projection
Hebbian Development and Ocular Dominance
Hebbian Development of Orientation Selectivity
Temporal Hebbian Rules and Trace Learning
Multiple Postsynaptic Neurons
Fixed Linear Recurrent Connections
Competitive Hebbian Learning
Feature-Based Models
Anti-Hebbian Modification
Timing-Based Plasticity and Prediction
Supervised Learning
Supervised Hebbian Learning
Classification and the Perceptron
Function Approximation
Supervised Error-Correcting Rules
The Perceptron Learning Rule
The Delta Rule
Contrastive Hebbian Learning