Detail View

Pulsed neural networks (loaned 2 times)

Material type
Monograph
Personal Author
Maass, Wolfgang, 1949 August 21- ; Bishop, Christopher M.
Title Statement
Pulsed neural networks / edited by Wolfgang Maass, Christopher M. Bishop.
Publication, Distribution, etc
Cambridge, Mass. : MIT Press, c1999.
Physical Medium
xxix, 377 p. : ill. ; 26 cm.
ISBN
0262133504 (hc. : alk. paper)
General Note
"A Bradford book."  
Bibliography, Etc. Note
Includes bibliographical references.
Subject Added Entry-Topical Term
Neural networks (Computer science).
000 00000cam u2200205 a 4500
001 000045975744
005 20190315103755
008 190314s1999 maua b 000 0 eng d
010 ▼a 98038511
020 ▼a 0262133504 (hc. : alk. paper)
035 ▼a (KERIS)REF000005066178
040 ▼a DLC ▼c DLC ▼d DLC ▼d 211009
050 0 0 ▼a QA76.87 ▼b .P85 1999
082 0 0 ▼a 006.3/2 ▼2 23
084 ▼a 006.32 ▼2 DDCK
090 ▼a 006.32 ▼b P9822
245 0 0 ▼a Pulsed neural networks / ▼c edited by Wolfgang Maass, Christopher M. Bishop.
260 ▼a Cambridge, Mass. : ▼b MIT Press, ▼c c1999.
300 ▼a xxix, 377 p. : ▼b ill. ; ▼c 26 cm.
500 ▼a "A Bradford book."
504 ▼a Includes bibliographical references.
650 0 ▼a Neural networks (Computer science).
700 1 ▼a Maass, Wolfgang, ▼d 1949 August 21-.
700 1 ▼a Bishop, Christopher M.
945 ▼a KLPA

Holdings Information

No.	Location	Call Number	Accession No.	Availability	Due Date
1	Science & Engineering Library/Sci-Info(Stacks2)	006.32 P9822	121248276	Available	-

Contents Information

Author Introduction

Christopher Bishop (Editor)

He is Deputy Director of Microsoft Research Cambridge and holds the Chair of Computer Science at the University of Edinburgh. He is also a Fellow of Darwin College, Cambridge, and of the Royal Academy of Engineering. Chris earned a BA in Physics from St. Catherine's College, Oxford, and a PhD in theoretical physics from the University of Edinburgh with a thesis on quantum theory.

Wolfgang Maass (Editor)

Information Provided By: Aladin

Table of Contents

Section	Section Description	Page Number
Foreword	
Neural Pulse Coding	
Spike Timing	
Population Codes	
Hippocampal Place Field	
Hardware Models	
References	
Preface	
The Isaac Newton Institute	
Overview of the Book	
Acknowledgments	
Contributors	
Part I	Basic Concepts and Models	
1	    Spiking Neurons	
1.1	        The Problem of Neural Coding	
1.1.1	            Motivation	
1.1.2	            Rate Codes	
1.1.2.1	                Rate as a Spike Count (Average over Time)	
1.1.2.2	                Rate as a Spike Density (Average over Several Runs)	
1.1.2.3	                Rate as Population Activity (Average over Several Neurons)	
1.1.3	            Candidate Pulse Codes	
1.1.3.1	                Time-to-First-Spike	
1.1.3.2	                Phase	
1.1.3.3	                Correlations and Synchrony	
1.1.3.4	                Stimulus Reconstruction and Reverse Correlation	
1.1.4	            Discussion: Spikes or Rates?	
1.2	        Neuron Models	
1.2.1	            Simple Spiking Neuron Model	
1.2.2	            First Steps towards Coding by Spikes	
1.2.3	            Threshold-Fire Models	
1.2.3.1	                Spike Response Model -- Further Details	
1.2.3.2	                Integrate-and-Fire Model	
1.2.3.3	                Models of Noise	
1.2.4	            Conductance-Based Models	
1.2.4.1	                Hodgkin-Huxley Model	
1.2.4.2	                Relation to the Spike Response Model	
1.2.4.3	                Compartmental Models	
1.2.5	            Rate Models	
1.3	        Conclusions	
        References	
2	    Computing with Spiking Neurons	
2.1	        Introduction	
2.2	        A Formal Computational Model for a Network of Spiking Neurons	
2.3	        McCulloch-Pitts Neurons versus Spiking Neurons	
2.4	        Computing with Temporal Patterns	
2.4.1	            Coincidence Detection	
2.4.2	            RBF-Units in the Temporal Domain	
2.4.3	            Computing a Weighted Sum in Temporal Coding	
2.4.4	            Universal Approximation of Continuous Functions with Spiking Neurons	
2.4.5	            Other Computations with Temporal Patterns in Networks of Spiking Neurons	
2.5	        Computing with a Space-Rate Code	
2.6	        Computing with Firing Rates	
2.7	        Computing with Firing Rates and Temporal Correlations	
2.8	        Networks of Spiking Neurons for Storing and Retrieving Information	
2.9	        Computing on Spike Trains	
2.10	        Conclusions	
        References	
3	    Pulse-Based Computation in VLSI Neural Networks	
3.1	        Background	
3.2	        Pulsed Coding: A VLSI Perspective	
3.2.1	            Pulse Amplitude Modulation	
3.2.2	            Pulse Width Modulation	
3.2.3	            Pulse Frequency Modulation	
3.2.4	            Phase or Delay Modulation	
3.2.5	            Noise, Robustness, Accuracy and Speed	
3.3	        A MOSFET Introduction	
3.3.1	            Subthreshold Circuits for Neural Networks	
3.4	        Pulse Generation in VLSI	
3.4.1	            Pulse Intercommunication	
3.5	        Pulsed Arithmetic in VLSI	
3.5.1	            Addition of Pulse Stream Signals	
3.5.2	            Multiplication of Pulse Stream Signals	
3.5.3	            MOS Transconductance Multiplier	
3.5.4	            MOSFET Analog Multiplier	
3.6	        Learning in Pulsed Systems	
3.7	        Summary and Issues Raised	
        References	
4	    Encoding Information in Neuronal Activity	
4.1	        Introduction	
4.2	        Synchronization and Oscillations	
4.3	        Temporal Binding	
4.4	        Phase Coding	
4.5	        Dynamic Range and Firing Rate Codes	
4.6	        Interspike Interval Variability	
4.7	        Synapses and Rate Coding	
4.8	        Summary and Implications	
        References	
Part II	Implementations	
5	    Building Silicon Nervous Systems with Dendritic Tree Neuromorphs	
5.1	        Introduction	
5.1.1	            Why Spikes?	
5.1.2	            Dendritic Processing of Spikes	
5.1.3	            Tunability	
5.2	        Implementation in VLSI	
5.2.1	            Artificial Dendrites	
5.2.2	            Synapses	
5.2.3	            Dendritic Non-Linearities	
5.2.4	            Spike-Generating Soma	
5.2.5	            Excitability Control	
5.2.6	            Spike Distribution -- Virtual Wires	
5.3	        Neuromorphs in Action	
5.3.1	            Feedback to Threshold-Setting Synapses	
5.3.2	            Discrimination of Complex Spatio-Temporal Patterns	
5.3.3	            Processing of Temporally Encoded Information	
5.4	        Conclusions	
        Acknowledgments	
        References	
6	    A Pulse-Coded Communications Infrastructure for Neuromorphic Systems	
6.1	        Introduction	
6.2	        Neuromorphic Computational Nodes	
6.3	        Neuromorphic aVLSI Neurons	
6.4	        Address Event Representation (AER)	
6.5	        Implementations of AER	
6.6	        Silicon Cortex	
6.6.1	            Basic Layout	
6.7	        Functional Tests of Silicon Cortex	
6.7.1	            An Example Neuronal Network	
6.7.2	            An Example of Sensory Input to SCX	
6.8	        Future Research on AER Neuromorphic Systems	
        Acknowledgements	
        References	
7	    Analog VLSI Pulsed Networks for Perceptive Processing	
7.1	        Introduction	
7.2	        Analog Perceptive Nets Communication Requirements	
7.2.1	            Coding Information with Pulses	
7.2.2	            Multiplexing of the Signals Issued by Each Neuron	
7.2.3	            Non-Arbitered PFM Communication	
7.3	        Analysis of the NAPFM Communication Systems	
7.3.1	            Statistical Assumptions	
7.3.2	            Detection	
7.3.2.1	                Detection by Time-Windowing	
7.3.2.2	                Direct Interpulse Time Measurement	
7.3.3	            Performance	
7.3.3.1	                Detection by Time-Windowing	
7.3.3.2	                Direct Interpulse Time Measurement	
7.3.4	            Data Dependency of System Performance	
7.3.5	            Discussion	
7.3.5.1	                Detection by Time-Windowing	
7.3.5.2	                Detection by Direct Interpulse Time Measurement	
7.4	        Address Coding	
7.5	        Silicon Retina Equipped with the NAPFM Communication System	
7.5.1	            Circuit Description	
7.5.2	            Noise Measurement Results	
7.6	        Projective Field Generation	
7.6.1	            Overview	
7.6.2	            Anisotropic Current Pulse Spreading in a Nonlinear Network	
7.6.3	            Analysis of the Spatial Response of the Nonlinear Network	
7.6.4	            Analysis of the Size and Shape of the Bubbles Generable by the Nonlinear Network	
7.7	        Description of the Integrated Circuit for Orientation Enhancement	
7.7.1	            Overview	
7.7.2	            Circuit Description	
7.7.3	            System Measurement Results	
7.7.4	            Other Applications	
7.7.4.1	                Weighted Projective Field Generation	
7.7.4.2	                Complex Projective Field Generation	
7.8	        Display Interface	
7.9	        Conclusion	
        References	
8	    Preprocessing for Pulsed Neural VLSI Systems	
8.1	        Introduction	
8.2	        A Sound Segmentation System	
8.3	        Signal Processing in Analog VLSI	
8.3.1	            Continuous Time Active Filters	
8.3.2	            Sampled Data Active Switched Capacitor (SC) Filters	
8.3.3	            Sampled Data Active Switched Current (SI) Filters	
8.3.4	            Discussion	
8.4	        Palmo -- Pulse Based Signal Processing	
8.4.1	            Basic Palmo Concepts	
8.4.1.1	                The Palmo Signal Representation	
8.4.1.2	                The Analog Palmo Cell	
8.4.1.3	                A Palmo Signal Processing System	
8.4.1.4	                Sources of Harmonic Distortion in a Palmo System	
8.4.2	            A CMOS Analog Palmo Cell Implementation	
8.4.2.1	                The Analog Palmo Cell: Details of Circuit Operation	
8.4.3	            Interconnecting Analog Palmo Cells	
8.4.4	            Results from a Palmo VLSI Device	
8.4.5	            Digital Processing of Palmo Signals	
8.4.6	            CMOS Analog Palmo Cell: Performance	
8.5	        Conclusions	
8.6	        Further Work	
8.7	        Acknowledgements	
        References	
9	    Digital Simulation of Spiking Neural Networks	
9.1	        Introduction	
9.2	        Implementation Issues of Pulse-Coded Neural Networks	
9.2.1	            Discrete-Time Simulation	
9.2.2	            Requisite Arithmetic Precision	
9.2.3	            Basic Procedures of Network Computation	
9.3	        Programming Environment	
9.4	        Concepts of Efficient Simulation	
9.5	        Mapping Neural Networks on Parallel Computers	
9.5.1	            Neuron-Parallelism	
9.5.2	            Synapse-Parallelism	
9.5.3	            Pattern-Parallelism	
9.5.4	            Partitioning of the Network	
9.6	        Performance Study	
9.6.1	            Single PE Workstations	
9.6.2	            Neurocomputer	
9.6.3	            Parallel Computers	
9.6.4	            Results of the Performance Study	
9.6.5	            Conclusions	
        References	
Part III	Design and Analysis of Pulsed Neural Systems	
10	    Populations of Spiking Neurons	
10.1	        Introduction	
10.2	        Model	
10.3	        Population Activity Equation	
10.3.1	            Integral Equation for the Dynamics	
10.3.2	            Normalization	
10.4	        Noise-Free Population Dynamics	
10.5	        Locking	
10.5.1	            Locking Condition	
10.5.2	            Graphical Interpretation	
10.6	        Transients	
10.7	        Incoherent Firing	
10.7.1	            Determination of the Activity	
10.7.2	            Stability of Asynchronous Firing	
10.8	        Conclusions	
        References	
11	    Collective Excitation Phenomena and Their Applications	
11.1	        Introduction	
11.1.1	            Two Variable Formulation of IAF Neurons	
11.2	        Synchronization of Pulse Coupled Oscillators	
11.3	        Clustering via Temporal Segmentation	
11.4	        Limits on Temporal Segmentation	
11.5	        Image Analysis	
11.5.1	            Image Segmentation	
11.5.2	            Edge Detection	
11.6	        Solitary Waves	
11.7	        The Importance of Noise	
11.8	        Conclusions	
        Acknowledgment	
        References	
12	    Computing and Learning with Dynamic Synapses	
12.1	        Introduction	
12.2	        Biological Data on Dynamic Synapses	
12.3	        Quantitative Models	
12.4	        On the Computational Role of Dynamic Synapses	
12.5	        Implications for Learning in Pulsed Neural Nets	
12.6	        Conclusions	
        References	
13	    Stochastic Bit-Stream Neural Networks	
13.1	        Introduction	
13.2	        Basic Neural Modelling	
13.3	        Feedforward Networks and Learning	
13.3.1	            Probability Level Learning	
13.3.2	            Bit-Stream Level Learning	
13.4	        Generalization Analysis	
13.5	        Recurrent Networks	
13.6	        Applications to Graph Colouring	
13.7	        Hardware Implementation	
13.7.1	            The Stochastic Neuron	
13.7.2	            Calculating Output Derivatives	
13.7.3	            Generating Stochastic Bit-Streams	
13.7.4	            Recurrent Networks	
13.8	        Conclusions	
        References	
14	    Hebbian Learning of Pulse Timing in the Barn Owl Auditory System	
14.1	        Introduction	
14.2	        Hebbian Learning	
14.2.1	            Review of Standard Formulations	
14.2.2	            Spike-Based Learning	
14.2.3	            Example	
14.2.4	            Learning Window	
14.3	        Barn Owl Auditory System	
14.3.1	            The Localization Task	
14.3.2	            Auditory Localization Pathway	
14.4	        Phase Locking	
14.4.1	            Neuron Model	
14.4.2	            Phase Locking -- Schematic	
14.4.3	            Simulation Results	
14.5	        Delay Tuning by Hebbian Learning	
14.5.1	            Motivation	
14.5.2	            Selection of Delays	
14.6	        Conclusions	
        References	

New Arrivals in Related Fields

Baumer, Benjamin (2021)
Harrison, Matt (2021)
데이터분석과인공지능활용편찬위원회 (2021)