
Algorithms and Architectures (Neural Network Systems Techniques and Applications) PDF

485 Pages·1997·18.8 MB·English

Preview Algorithms and Architectures (Neural Network Systems Techniques and Applications)

Algorithms and Architectures
Neural Network Systems Techniques and Applications
Edited by Cornelius T. Leondes

VOLUME 1. Algorithms and Architectures
VOLUME 2. Optimization Techniques
VOLUME 3. Implementation Techniques
VOLUME 4. Industrial and Manufacturing Systems
VOLUME 5. Image Processing and Pattern Recognition
VOLUME 6. Fuzzy Logic and Expert Systems Applications
VOLUME 7. Control and Dynamic Systems

Algorithms and Architectures
Edited by Cornelius T. Leondes, Professor Emeritus, University of California, Los Angeles, California
VOLUME 1 OF Neural Network Systems Techniques and Applications

ACADEMIC PRESS
San Diego, London, Boston, New York, Sydney, Tokyo, Toronto

This book is printed on acid-free paper.
Copyright © 1998 by ACADEMIC PRESS. All Rights Reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher.

Academic Press, a division of Harcourt Brace & Company
525 B Street, Suite 1900, San Diego, California 92101-4495, USA
http://www.apnet.com

Academic Press Limited
24-28 Oval Road, London NW1 7DX, UK
http://www.hbuk.co.uk/ap/

Library of Congress Card Catalog Number: 97-80441
International Standard Book Number: 0-12-443861-X
PRINTED IN THE UNITED STATES OF AMERICA
97 98 99 00 01 02 ML 9 8 7 6 5 4 3 2 1

Contents

Contributors xv
Preface xix

Statistical Theories of Learning in Radial Basis Function Networks
Jason A. S. Freeman, Mark J. L. Orr, and David Saad
I. Introduction 1
   A. Radial Basis Function Network 2
II. Learning in Radial Basis Function Networks 4
   A. Supervised Learning 4
   B. Linear Models 5
   C. Bias and Variance 9
   D. Cross-Validation 11
   E. Ridge Regression 13
   F. Forward Selection 17
   G. Conclusion 19
III. Theoretical Evaluations of Network Performance 21
   A. Bayesian and Statistical Mechanics Approaches 21
   B. Probably Approximately Correct Framework 31
   C. Approximation Error/Estimation Error 37
   D. Conclusion 39
IV. Fully Adaptive Training—An Exact Analysis 40
   A. On-Line Learning in Radial Basis Function Networks 41
   B. Generalization Error and System Dynamics 42
   C. Numerical Solutions 43
   D. Phenomenological Observations 45
   E. Symmetric Phase 47
   F. Convergence Phase 49
   G. Quantifying the Variances 50
   H. Simulations 52
   I. Conclusion 52
V. Summary 54
Appendix 55
References 57

Synthesis of Three-Layer Threshold Networks
Jung Hwan Kim, Sung-Kwon Park, Hyunseo Oh, and Youngnam Han
I. Introduction 62
II. Preliminaries 63
III. Finding the Hidden Layer 64
IV. Learning an Output Layer 73
V. Examples 77
   A. Approximation of a Circular Region 77
   B. Parity Function 80
   C. 7-Bit Function 83
VI. Discussion 84
VII. Conclusion 85
References 86

Weight Initialization Techniques
Mikko Lehtokangas, Petri Salmela, Jukka Saarinen, and Kimmo Kaski
I. Introduction 87
II. Feedforward Neural Network Models 89
   A. Multilayer Perceptron Networks 89
   B. Radial Basis Function Networks 90
III. Stepwise Regression for Weight Initialization 90
IV. Initialization of Multilayer Perceptron Networks 92
   A. Orthogonal Least Squares Method 92
   B. Maximum Covariance Method 93
   C. Benchmark Experiments 93
V. Initial Training for Radial Basis Function Networks 98
   A. Stepwise Hidden Node Selection 98
   B. Benchmark Experiments 99
VI. Weight Initialization in Speech Recognition Application 103
   A. Speech Signals and Recognition 103
   B. Principle of the Classifier 104
   C. Training the Hybrid Classifier 106
   D. Results 109
VII. Conclusion 116
Appendix I: Chessboard 4X4 116
Appendix II: Two Spirals 117
Appendix III: GaAs MESFET 117
Appendix IV: Credit Card 117
References 118

Fast Computation in Hamming and Hopfield Networks
Isaac Meilijson, Eytan Ruppin, and Moshe Sipper
I. General Introduction 123
II. Threshold Hamming Networks 124
   A. Introduction 124
   B. Threshold Hamming Network 126
   C. Hamming Network and an Optimal Threshold Hamming Network 128
   D. Numerical Results 132
   E. Final Remarks 134
III. Two-Iteration Optimal Signaling in Hopfield Networks 135
   A. Introduction 135
   B. Model 137
   C. Rationale for Nonmonotone Bayesian Signaling 140
   D. Performance 142
   E. Optimal Signaling and Performance 146
   F. Results 148
   G. Discussion 151
IV. Concluding Remarks 152
References 153

Multilevel Neurons
J. Si and A. N. Michel
I. Introduction 155
II. Neural System Analysis 157
   A. Neuron Models 158
   B. Neural Networks 160
   C. Stability of an Equilibrium 162
   D. Global Stability Results 164
III. Neural System Synthesis for Associative Memories 167
   A. System Constraints 168
   B. Synthesis Procedure 170
IV. Simulations 171
V. Conclusions and Discussions 173
Appendix 173
References 178

Probabilistic Design
Sumio Watanabe and Kenji Fukumizu
I. Introduction 181
II. Unified Framework of Neural Networks 182
   A. Definition 182
   B. Learning in Artificial Neural Networks 185
III. Probabilistic Design of Layered Neural Networks 189
   A. Neural Network That Finds Unknown Inputs 189
   B. Neural Network That Can Tell the Reliability of Its Own Inference 192
   C. Neural Network That Can Illustrate Input Patterns for a Given Category 196
IV. Probability Competition Neural Networks 197
   A. Probability Competition Neural Network Model and Its Properties 198
   B. Learning Algorithms for a Probability Competition Neural Network 203
   C. Applications of the Probability Competition Neural Network Model 210
V. Statistical Techniques for Neural Network Design 218
   A. Information Criterion for the Steepest Descent 218
   B. Active Learning 225
VI. Conclusion 228
References 228

Short Time Memory Problems
M. Daniel Tom and Manoel Fernando Tenorio
I. Introduction 231
II. Background 232
III. Measuring Neural Responses 233
IV. Hysteresis Model 234
V. Perfect Memory 237
VI. Temporal Precedence Differentiation 239
VII. Study in Spatiotemporal Pattern Recognition 241
VIII. Conclusion 245
Appendix 246
References 260

Reliability Issue and Quantization Effects in Optical and Electronic Network Implementations of Hebbian-Type Associative Memories
Pau-Choo Chung and Ching-Tsorng Tsai
I. Introduction 261
II. Hebbian-Type Associative Memories 264
   A. Linear-Order Associative Memories 264
   B. Quadratic-Order Associative Memories 266
III. Network Analysis Using a Signal-to-Noise Ratio Concept 266
IV. Reliability Effects in Network Implementations 268
   A. Open-Circuit Effects 269
   B. Short-Circuit Effects 274

Description:
This volume is the first diverse and comprehensive treatment of algorithms and architectures for the realization of neural network systems. It presents techniques and diverse methods in numerous areas of this broad subject. The book covers major neural network systems structures for achieving effect…

