
Neural Networks for Perception: Computation, Learning, and Architectures

370 pages · 1992 · English


NEURAL NETWORKS for PERCEPTION
Volume 2: Computation, Learning, and Architectures
Edited by Harry Wechsler, George Mason University, Fairfax, Virginia

ACADEMIC PRESS, INC.
Harcourt Brace Jovanovich, Publishers
Boston San Diego New York London Sydney Tokyo Toronto

This book is printed on acid-free paper.

Copyright © 1992 by Academic Press, Inc. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher.

Cover design by Elizabeth E. Tustian

ACADEMIC PRESS, INC., 1250 Sixth Avenue, San Diego, CA 92101
United Kingdom Edition published by ACADEMIC PRESS LIMITED, 24-28 Oval Road, London NW1 7DX

Library of Congress Cataloging-in-Publication Data
Neural networks for perception / edited by Harry Wechsler.
p. cm.
Includes bibliographical references and index.
Contents: v. 1. Human and machine perception — v. 2. Computation, learning, and architectures.
ISBN 0-12-741251-4 (v. 1). — ISBN 0-12-741252-2 (v. 2)
1. Neural networks (Computer science) 2. Perception. I. Wechsler, Harry.
QA76.87.N485 1991
006.3—dc20 91-24207 CIP

Printed in the United States of America
91 92 93 94 9 8 7 6 5 4 3 2 1

To my daughter, Gabriela Anya

Contents of Volume 1

PART I: Human Perception
I. Introduction — H. Wechsler
I.1 Visual Cortex: Window on the Biological Basis of Learning and Memory — L.N. Cooper
I.2 A Network Model of Object Recognition in Human Vision — S. Edelman
I.3 A Cortically Based Model for Integration in Visual Perception — L.H. Finkel, G.N. Reeke, and G.M. Edelman
I.4 The Symmetric Organization of Parallel Cortical Systems for Form and Motion Perception — S. Grossberg
I.5 The Structure and Interpretation of Neuronal Codes in the Visual System — B.J. Richmond and L.M. Optican
I.6 Self-Organization of Functional Architecture in the Cerebral Cortex — S. Tanaka
I.7 Filters versus Textons in Human and Machine Texture Discrimination — D. Williams and B. Julesz
I.8 Two-Dimensional Maps and Biological Vision: Representing Three-Dimensional Space — G.L. Zimmerman

PART II: Machine Perception
II. Introduction — H. Wechsler
II.1 WISARD and Other Weightless Neurons — I. Aleksander
II.2 Multi-Dimensional Linear Lattice for Fourier and Gabor Transforms, Multiple-Scale Gaussian Filtering, and Edge Detection — J. Ben-Arie
II.3 Aspects of Invariant Pattern and Object Recognition — T. Caelli, M. Ferraro, and E. Barth
II.4 A Neural Network Architecture for Fast On-Line Supervised Learning and Pattern Recognition — G.A. Carpenter, S. Grossberg, and J. Reynolds
II.5 Neural Network Approaches to Color Vision — A.C. Hurlbert
II.6 Adaptive Sensory-Motor Coordination Through Self-Consistency — M. Kuperstein
II.7 Finding Boundaries in Images — J. Malik and P. Perona
II.8 Compression of Remotely Sensed Images Using Self-Organizing Feature Maps — M. Manohar and J.C. Tilton
II.9 Self-Organizing Maps and Computer Vision — E. Oja
II.10 Region Growing Using Neural Networks — T.R. Reed
II.11 Vision and Space-Variant Sensing — G. Sandini and M. Tistarelli
II.12 Learning and Recognizing 3D Objects from Multiple Views in a Neural System — M. Seibert and A.M. Waxman
II.13 Hybrid Symbolic-Neural Methods for Improved Recognition Using High-Level Visual Features — G.G. Towell and J.W. Shavlik
II.14 Multiscale and Distributed Visual Representations and Mappings for Invariant Low-Level Perception — H. Wechsler
II.15 Symmetry: A Context-Free Cue for Foveated Vision — Y. Yeshurun, D. Reisfeld, and H. Wolfson
II.16 A Neural Network for Motion Processing — Y.T. Zhou and R. Chellappa

Contributors
Numbers in parentheses indicate pages on which the authors' contributions begin.

Dana Z. Anderson (214), Department of Physics and Joint Institute for Laboratory Astrophysics, University of Colorado and National Institute of Standards and Technology, Boulder, Colorado 80309
Dana H. Ballard (8), Department of Computer Science, University of Rochester, Rochester, New York 14627
Claus Benkert (214), Department of Physics and Joint Institute for Laboratory Astrophysics, University of Colorado and National Institute of Standards and Technology, Boulder, Colorado 80309
David Casasent (253), Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213
H. John Caulfield (282), Center for Applied Optics, University of Alabama, Huntsville, Alabama 35899
Vladimir Cherkassky (40), Department of Electrical Engineering, University of Minnesota, Minneapolis, Minnesota 55455
David D. Crouch (214), University of Colorado and National Institute of Standards and Technology, Boulder, Colorado 80309
Robert Hecht-Nielsen (65), HNC, Inc., 5501 Oberlin Drive, San Diego, California 92121
Charles Hester (282), Teledyne Brown Engineering, 300 Sparkman Drive, Mailstop 60, Huntsville, Alabama 35807
Ho-in Jeon (282), Center for Applied Optics, University of Alabama, Huntsville, Alabama 35899
R. Barry Johnson (282), Center for Applied Optics, University of Alabama, Huntsville, Alabama 35899
Behrooz Kamgar-Parsi (94), Code 5510, Naval Research Laboratory, Washington, DC 20375
Behzad Kamgar-Parsi (94), Code 5510, Naval Research Laboratory, Washington, DC 20375
Jason Kinser (282), Teledyne Brown Engineering, 300 Sparkman Drive, Mailstop 60, Huntsville, Alabama 35807
Hossein Lari-Najafi (40), Department of Electrical Engineering, University of Minnesota, Minneapolis, Minnesota 55455
Jim Mann (310), Digital Integrated Circuits Group, MIT Lincoln Lab, 244 Wood Street, Lexington, Massachusetts 02173
Wolfgang Pölzleitner (111), Joanneum Research Center, Wastiangasse 6, A-8010 Graz, Austria
Jack Raffel (310), Digital Integrated Circuits Group, MIT Lincoln Lab, 244 Wood Street, Lexington, Massachusetts 02173
Joseph Shamir (282), Center for Applied Optics, University of Alabama, Huntsville, Alabama 35899
Victor M. Stern (128), Intelligent Systems Technology Inc., 12048 Winding Creek Court, Clifton, Virginia 22024
Mark Temmen (282), Teledyne Brown Engineering, 300 Sparkman Drive, Mailstop 60, Huntsville, Alabama 35807
Leonard Uhr (147), Department of Computer Sciences, University of Wisconsin, Madison, Wisconsin 53706
Santosh S. Venkatesh (173), Moore School of Electrical Engineering, University of Pennsylvania, Philadelphia, Pennsylvania 19104
Steven D. Whitehead (8), Department of Computer Science, University of Rochester, Rochester, New York 14627
Stephen S. Wilson (335), Applied Intelligent Systems, Inc., Ann Arbor, Michigan 48103

Foreword

Neural Networks for Perception explores perception and the recent research in neural networks that has advanced our understanding of both human and machine perception. Perception is a major facet of our senses and provides us with the essential information needed to broaden our horizons and to connect us to the surrounding world, enabling safe movement and advantageous manipulation.
Far beyond being merely a scientific challenge, the possibility of emulating the human sense of perception would revolutionize countless technologies, such as visual tracking and object recognition, robotics and flexible manufacturing, automation and control, and autonomous navigation for future space missions. As Aristotle noted, "All men, by nature, desire to know. An indication of this is the delight we take in our senses, for even apart from their usefulness they are loved for themselves, and above all others the sense of sight. For not only with a view to action, but even when we are not going to do anything we prefer seeing to everything else. The reason is that this, most of all senses, makes us know and brings to light many differences between things." Indeed, reflecting the intricate connection between perception and purposeful activity, many of the papers in this book deal with meaningful tasks.

Meanwhile, we are witnessing the rapid growth of neural network research as a novel and viable approach to emulating intelligence in general and to achieving the recognition and perceptual learning functions of vision. Neural network research is a synergetic endeavor that draws from the cognitive and neural sciences, physics, signal processing, and pattern recognition. Neural networks (NN), also known as artificial neural systems (ANS), are implemented as parallel and distributed processing (PDP) models of computation consisting of dense interconnections among computational processing elements (PEs, or "neurons"). The competitive processes that take place among the PEs enable neural networks to display fault tolerance and robustness with respect to noisy and/or incomplete sensory inputs, while allowing graceful degradation with respect to faulty memory storage and internal processing.
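The PE computation sketched in the foreword, a dense interconnection of simple units each forming a weighted combination of its inputs, can be illustrated with a minimal Python sketch. The function names and the choice of a sigmoid nonlinearity are illustrative assumptions, not code or notation from this book:

```python
import math

def processing_element(inputs, weights, bias):
    """One PE ("neuron"): a weighted sum of its inputs passed through
    a sigmoid nonlinearity (the sigmoid is an illustrative choice)."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

def layer(inputs, weight_rows, biases):
    """A densely interconnected layer: every PE sees every input."""
    return [processing_element(inputs, w, b)
            for w, b in zip(weight_rows, biases)]

# With zero weights and bias, the PE output sits at the sigmoid midpoint.
print(processing_element([1.0, -1.0], [0.0, 0.0], 0.0))  # 0.5
```

The robustness the foreword describes can be read off this structure qualitatively: each output is a sum over many inputs, so corrupting a single weight or input value perturbs the output rather than destroying it.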
Neural Networks for Perception showcases the work of preeminent practitioners in the field of neural networks and enhances our understanding of what neural networks are and how they can be gainfully employed. It is organized into two volumes: the first, subtitled Human and Machine Perception, focuses on models for understanding human perception in terms of distributed computation and on examples of PDP models for machine perception; the second, subtitled Computation, Learning, and Architectures, examines computational and adaptation problems related to the use of neuronal systems, and the corresponding hardware architectures capable of implementing neural networks for perception and of coping with the complexity inherent in massively distributed computation.

Perception is just one of the capabilities needed to implement machine intelligence, so the discussion of perception involves, by default, the full range of dialectics on the fundamentals of both human and machine intelligence. Normal science and technological development are always conducted within some predefined paradigm, and this work is no exception. The paradigms attempt to model the everlasting dichotomy of brain and matter using specific metaphors. One of the metaphors for neural networks is statistical physics and thermodynamics; nonetheless, some thoughts on the feasibility and future use of evolution and quantum mechanics are contemplated as well. NN advancements parallel those underway in artificial intelligence toward the development of perceptual systems. Consequently, the possibility of hybrid systems, consisting of NN and AI components, is also considered. Many have postulated arguments about what intelligence is and how it impinges on perception.
Recognition is a basic biological function: biological systems must recognize specific patterns and respond appropriately. Antibodies attack foreign intruders; our ears capture sound and speech; animals have to locate edible plants; and the sensory-motor interactions involved in navigation and manipulation are predicated on adequate recognition capabilities. Failure to recognize can be fatal; recognition should therefore be the ultimate goal of the perceptual system, and indeed, it probably underlies much of what intelligence is.

Albert Szent-Györgyi said that "The brain is not an organ of thinking but an organ of survival, like claws and fangs. It is made in such a way as to make us accept as truth that which is only advantage. It is an exceptional, almost pathological constitution one has, if one follows thoughts logically through, regardless of consequences. Such people make martyrs, apostles, or scientists, and mostly end on the stake, or in a chair, electric or academic." The concepts of recognition and reasoning by analogy underlie recent views on both planning and learning as espoused by the case-based reasoning methodology.

Perception involves information processing, and one has to address those descriptive levels along which visual tasks of varying complexity can be ana-
