
29721562-Zurada-Introduction-to-Artificial-Neural-Systems-WPC-1992 PDF

764 Pages·2007·33.39 MB·English
by Jacek M. Zurada

Preview 29721562-Zurada-Introduction-to-Artificial-Neural-Systems-WPC-1992

Introduction to Artificial Neural Systems

JACEK M. ZURADA
Professor of Electrical Engineering and of Computer Science and Engineering

WEST PUBLISHING COMPANY
St. Paul  New York  Los Angeles  San Francisco

Text Design: Geri Davis, Quadrata
Composition and Art: Technique Typesetting
Copyediting: Loretta Palagi
Indexing: Barbara Farabaugh

WEST'S COMMITMENT TO THE ENVIRONMENT
In 1906, West Publishing Company began recycling materials left over from the production of books. This began a tradition of efficient and responsible use of resources. Today, up to 95 percent of our legal books and 70 percent of our college texts are printed on recycled, acid-free stock. West also recycles nearly 22 million pounds of scrap paper annually, the equivalent of 181,717 trees. Since the 1960s, West has devised ways to capture and recycle waste inks, solvents, oils, and vapors created in the printing process. We also recycle plastics of all kinds, wood, glass, corrugated cardboard, and batteries, and have eliminated the use of styrofoam book packaging. We at West are proud of the longevity and the scope of our commitment to our environment.

Production, Prepress, Printing and Binding by West Publishing Company.

COPYRIGHT © 1992 By WEST PUBLISHING COMPANY
610 Opperman Drive, P.O. Box 64526, St. Paul, MN 55164-0526
All rights reserved
Printed in the United States of America
99 98 97 96 95 94 93 92   8 7 6 5 4 3 2 1 0

Library of Congress Cataloging-in-Publication Data
Zurada, Jacek M.
  Introduction to artificial neural systems / Jacek M. Zurada
  p. cm.
  Includes index.
  ISBN 0-314-93391-3 (alk. paper)
  1. Neural networks (Computer science)  I. Title
  QA76.87.Z87  1992
  006.3'4--dc20  92-712  CIP

To Anna, Joanna, and Mark

Contents

Preface

1  Artificial Neural Systems: Preliminaries  1
1.1  Neural Computation: Some Examples and Applications  3
     Classifiers, Approximators, and Autonomous Drivers  4
     Simple Memory and Restoration of Patterns  10
     Optimizing Networks  14
     Clustering and Feature Detecting Networks  16
1.2  History of Artificial Neural Systems Development  17
1.3  Future Outlook  21
     References  22

2  Fundamental Concepts and Models of Artificial Neural Systems  25
2.1  Biological Neurons and Their Artificial Models  26
     Biological Neuron  27
     McCulloch-Pitts Neuron Model  30
     Neuron Modeling for Artificial Neural Systems  31
2.2  Models of Artificial Neural Networks  37
     Feedforward Network  37
     Feedback Network  42
2.3  Neural Processing  53
2.4  Learning and Adaptation  55
     Learning as Approximation or Equilibria Encoding  55
     Supervised and Unsupervised Learning  56
2.5  Neural Network Learning Rules  59
     Hebbian Learning Rule  60
     Perceptron Learning Rule  64
     Delta Learning Rule  66
     Widrow-Hoff Learning Rule  69
     Correlation Learning Rule  69
     Winner-Take-All Learning Rule  70
     Outstar Learning Rule  71
     Summary of Learning Rules  72
2.6  Overview of Neural Networks  74
2.7  Concluding Remarks  76
     Problems  78
     References  89

3  Single-Layer Perceptron Classifiers
3.1  Classification Model, Features, and Decision Regions  94
3.2  Discriminant Functions  99
3.3  Linear Machine and Minimum Distance Classification  106
3.4  Nonparametric Training Concept  114
3.5  Training and Classification Using the Discrete Perceptron: Algorithm and Example  120
3.6  Single-Layer Continuous Perceptron Networks for Linearly Separable Classifications  132
3.7  Multicategory Single-Layer Perceptron Networks  142
3.8  Concluding Remarks  152
     Problems  153
     References  161

4  Multilayer Feedforward Networks
4.1  Linearly Nonseparable Pattern Classification  165
4.2  Delta Learning Rule for Multiperceptron Layer  175
4.3  Generalized Delta Learning Rule  181
4.4  Feedforward Recall and Error Back-Propagation Training  185
     Feedforward Recall  185
     Error Back-Propagation Training  186
     Example of Error Back-Propagation Training  190
     Training Errors  195
     Multilayer Feedforward Networks as Universal Approximators  196
4.5  Learning Factors  206
     Initial Weights  208
     Cumulative Weight Adjustment versus Incremental Updating  208
     Steepness of the Activation Function  209
     Learning Constant  210
     Momentum Method  211
     Network Architectures Versus Data Representation  214
     Necessary Number of Hidden Neurons  216
4.6  Classifying and Expert Layered Networks  220
     Character Recognition Application  221
     Expert Systems Applications  225
     Learning Time Sequences  229
4.7  Functional Link Networks  230
4.8  Concluding Remarks  234
     Problems  235
     References  248

5  Single-Layer Feedback Networks
5.1  Basic Concepts of Dynamical Systems  253
5.2  Mathematical Foundations of Discrete-Time Hopfield Networks  254
5.3  Mathematical Foundations of Gradient-Type Hopfield Networks  264
5.4  Transient Response of Continuous-Time Networks  276
5.5  Relaxation Modeling in Single-Layer Feedback Networks  283
5.6  Example Solutions of Optimization Problems  287
     Summing Network with Digital Outputs  287
     Minimization of the Traveling Salesman Tour Length  294
5.7  Concluding Remarks  299
     Problems  301
     References  310

6  Associative Memories  313
6.1  Basic Concepts  314
6.2  Linear Associator  320
6.3  Basic Concepts of Recurrent Autoassociative Memory  325
     Retrieval Algorithm  327
     Storage Algorithm  328
     Performance Considerations  336
6.4  Performance Analysis of Recurrent Autoassociative Memory  339
     Energy Function Reduction  342
     Capacity of Autoassociative Recurrent Memory  343
     Memory Convergence versus Corruption  345
     Fixed Point Concept  349
     Modified Memory Convergent Toward Fixed Points  351
     Advantages and Limitations  354
6.5  Bidirectional Associative Memory  354
     Memory Architecture  355
     Association Encoding and Decoding  357
     Stability Considerations  359
     Memory Example and Performance Evaluation  360
     Improved Coding of Memories  363
     Multidirectional Associative Memory  368
6.6  Associative Memory of Spatio-temporal Patterns  370
6.7  Concluding Remarks  375
     Problems  377
     References  386

7  Matching and Self-Organizing Networks
7.1  Hamming Net and MAXNET  391
7.2  Unsupervised Learning of Clusters  399
     Clustering and Similarity Measures  399
     Winner-Take-All Learning  401
     Recall Mode  406
     Initialization of Weights  406
     Separability Limitations  409
7.3  Counterpropagation Network  410
7.4  Feature Mapping  414
7.5  Self-organizing Feature Maps  423
7.6  Cluster Discovery Network (ART1)  432
7.7  Concluding Remarks  444
     Problems  445
     References  452

8  Applications of Neural Algorithms and Systems  455
8.1  Linear Programming Modeling Network  456
8.2  Character Recognition Networks  464
     Multilayer Feedforward Network for Printed Character Classification  464
     Handwritten Digit Recognition: Problem Statement  476
     Recognition Based on Handwritten Character Skeletonization  478
     Recognition of Handwritten Characters Based on Error Back-propagation Training  482
8.3  Neural Networks Control Applications  485
     Overview of Control Systems Concepts  485
     Process Identification  489
     Basic Nondynamic Learning Control Architectures  494
     Inverted Pendulum Neurocontroller  499
     Cerebellar Model Articulation Controller  504
     Concluding Remarks  511
8.4  Networks for Robot Kinematics  513
     Overview of Robot Kinematics Problems  514
     Solution of the Forward and Inverse Kinematics Problems  516
     Comparison of Architectures for the Forward Kinematics Problem  519
     Target Position Learning  523
8.5  Connectionist Expert Systems for Medical Diagnosis  527
     Expert System for Skin Diseases Diagnosis  528
     Expert System for Low Back Pain Diagnosis  532
     Expert System for Coronary Occlusion Diagnosis  537
     Concluding Remarks  539
8.6  Self-organizing Semantic Maps  539
8.7  Concluding Remarks  546
     Problems  548
     References  559

9  Neural Networks Implementation  565
9.1  Artificial Neural Systems: Overview of Actual Models  566
     Node Numbers and Complexity of Computing Systems  567
     Neurocomputing Hardware Requirements  569
     Digital and Analog Electronic Neurocomputing Circuits  575
9.2  Integrated Circuit Synaptic Connections  585
     Voltage-controlled Weights  587
     Analog Storage of Adjustable Weights  592
     Digitally Programmable Weights  595
     Learning Weight Implementation  605
9.3  Active Building Blocks of Neural Networks  608
     Current Mirrors  610
     Inverter-based Neuron  613
     Differential Voltage Amplifiers  617
     Scalar Product and Averaging Circuits with Transconductance Amplifiers  624
     Current Comparator  626
     Template Matching Network  628
9.4  Analog Multipliers and Scalar Product Circuits  630
     Depletion MOSFET Circuit  631
     Enhancement Mode MOS Circuit  636
     Analog Multiplier with Weight Storage  638
     Floating-Gate Transistor Multipliers  640

Description:
Introduction to Artificial Neural Systems. JACEK M. ZURADA, Professor of Electrical Engineering and of Computer Science and Engineering. West Publishing Company, 1992.

