
NEURAL NETWORKS SATISFYING STONE-WEIERSTRASS THEOREM AND APPROXIMATING ... (PDF)

129 Pages·2004·1.21 MB·English


University of Central Florida STARS
Electronic Theses and Dissertations, 2004-2019

2004

Neural Networks Satisfying Stone-Weierstrass Theorem And Approximating

Pinal Thakkar, University of Central Florida
Part of the Mathematics Commons

Find similar works at: https://stars.library.ucf.edu/etd
University of Central Florida Libraries http://library.ucf.edu

This Masters Thesis (Open Access) is brought to you for free and open access by STARS. It has been accepted for inclusion in Electronic Theses and Dissertations, 2004-2019 by an authorized administrator of STARS. For more information, please contact [email protected].

STARS Citation
Thakkar, Pinal, "Neural Networks Satisfying Stone-weiestrass Theorem And Approximating" (2004). Electronic Theses and Dissertations, 2004-2019. 247. https://stars.library.ucf.edu/etd/247

NEURAL NETWORKS SATISFYING STONE-WEIERSTRASS THEOREM AND APPROXIMATING SCATTERED DATA BY KOHONEN NEURAL NETWORKS

by

PINAL B. THAKKAR
B.S. Maharaja Sayajirao University of Baroda, 2000
M.S. Maharaja Sayajirao University of Baroda, 2001

A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in the Department of Mathematics in the College of Arts and Sciences at the University of Central Florida

Orlando, Florida
Fall Term 2004

ABSTRACT

Neural networks are an attempt to build computer networks of artificial neurons that imitate the activities of the human brain. Their origin dates back to 1943, when neurophysiologist Warren McCulloch and logician Walter Pitts produced the first artificial neuron. Since then there has been tremendous development of neural networks and of their applications to pattern and optical character recognition, speech processing, time series prediction, image processing, and scattered data approximation.
Since it has been shown that neural nets can approximate all but pathological functions, Neil Cotter considered a neural network architecture based on the Stone-Weierstrass Theorem. Using exponential functions, polynomials, rational functions, and Boolean functions, one can follow the method given by Cotter to obtain neural networks that can approximate bounded measurable functions. Another problem of current research in computer graphics is to construct curves and surfaces from scattered spatial points by using B-splines and NURBS or Bezier surfaces. Hoffman and Varady used Kohonen neural networks to construct appropriate grids. This thesis is concerned with two types of neural networks, viz. those which satisfy the conditions of the Stone-Weierstrass theorem, and Kohonen neural networks. We have used self-organizing maps for scattered data approximation. The Neural Network Toolbox from MATLAB is used to develop the required grids for approximating scattered data in one and two dimensions.

ACKNOWLEDGMENTS

In the process of my graduate study, I have benefited from the faculty of the Department of Mathematics of the University of Central Florida. My gratitude goes to all the professors who helped to shape my graduate career. I would like to express my gratitude in particular to Dr. Ram Mohapatra, not only for providing guidance in the preparation of my thesis but also for sharing in my excitement of studying such a "strange" and wonderful field. My thanks go to my loving husband, Vipul, for his interest and aid in the computer programming aspects of my research, and also for enduring the many long hours involved in the advancement of my education. Finally, I would like to thank my parents and my brother for encouraging me and for their support.
TABLE OF CONTENTS

LIST OF FIGURES

CHAPTER 1: INTRODUCTION
    Human Neurons v/s Artificial Neurons
    Different Architectures Of ANNs
        Perceptron
        Feed Forward Network
        Single Layer Feed Forward Network
        Multilayer Feed Forward Network
        Recurrent Network
        Hopfield Network
    Training Algorithms
        Supervised Learning
        Unsupervised Learning
    Applications
    Training Of Neural Network
    Basic Learning Rules
        Hebbian Learning Law
        Perceptron Learning Law
        Delta Learning Law
        Widrow And Hoff LMS Learning Law
        Correlation Learning Law
        Instar (Winner-Take-All) Learning Law
        Outstar Learning Law
    Multilayer Neural Networks
        Delta Learning Rule For Multi-perceptron Layer
    Summary Of The Subsequent Chapters

CHAPTER 2: STONE-WEIERSTRASS THEOREM IN NEURAL NETWORKS
    Stone-Weierstrass Theorem
    Applying The Stone-Weierstrass Theorem
        Generic Network Architecture
        Networks Satisfying The Stone-Weierstrass Theorem
        Advantage Of Neural Networks Satisfying Stone-Weierstrass Theorem
    Conclusion

CHAPTER 3: B-SPLINES
    Definition And Properties Of B-Spline Basis Function
        Properties Of The B-Spline Basis Functions
    Definition And Properties Of B-Spline Curves
        Properties Of B-Spline Curves
    Definition And Properties Of B-Spline Surfaces
        Properties Of B-Spline Surfaces

CHAPTER 4: KOHONEN NEURAL NETWORK
    The Kohonen Neural Network And The Training Procedure
        Numerical Controls Of The Training: The Gain Term And The Neighborhood Function
    The Dynamic Kohonen Network
        Insertion Of New Neuron
    Conclusion

CHAPTER 5: CREATING A SELF-ORGANIZING MAP USING MATLAB (NEURAL NETWORK TOOLBOX)
    Self-Organizing Feature Map Architecture
    Training Of SOFM Network
    Examples
        One-Dimensional Self-Organizing Map
        Two-Dimensional Self-Organizing Map

APPENDIX A: PROGRAM CODE
APPENDIX B: INPUT MATRIX
LIST OF REFERENCES

LIST OF FIGURES

Figure 1: Model of the Artificial Neuron
Figure 2: Single Layer Feed Forward Network
Figure 3: Multi-Layer Feed Forward Network
Figure 4: A Hopfield Network with five nodes
Figure 5: Neurons organized in a layer
Figure 6: Outstar Learning is related to a group of units in a layer
Figure 7: Single-Layer Network with Continuous Perceptron
Figure 8: Tree Structured Network for Modified Logistic Network
Figure 9: The Application of the Kohonen Network for Surface Fitting
Figure 10: Self-Organizing Feature Map Architecture
Figure 11: Input Curve
Figure 12: Network created by NEWSOM is plotted using PLOTSOM
Figure 13: Self-Organizing Map after 30 epochs
Figure 14: Scattered Input Data
Figure 15: Network created by NEWSOM is plotted using PLOTSOM
Figure 16: Self-Organizing Map after 10 epochs
Figure 17: Scattered Input Data
Figure 18: Network created by NEWSOM is plotted using PLOTSOM
Figure 19: Self-Organizing Map after 100 epochs

CHAPTER 1: INTRODUCTION

Although the concept of a neural network seems to have a recent history, its origin traces back to 1943, when neurophysiologist Warren McCulloch and logician Walter Pitts produced the first artificial neuron.

Human Neurons v/s Artificial Neurons

A human brain neuron collects various information through fine structures called dendrites and sends out electrical impulses through a structure called the axon. The axon in turn is extensively branched, and the end of each branch is called a synapse. The synapse creates electrical impulses, which may inhibit or excite the activities of connected neurons.
When the extent of excitation sufficiently exceeds the extent of inhibition, the neuron transfers electrical impulses to its axon. When we apply this concept to computation, we need to design a network of such neurons to make up what is called an Artificial Neural Network. Work on artificial neural networks, commonly referred to simply as "neural networks", is motivated by analogy with the brain, which computes in an entirely different way from conventional digital computers. Biological neurons, believed to be the structural constituents of the brain, are much slower than silicon logic gates, yet inference in biological neural networks is faster than in the fastest computer.

A biological neural network is a nonlinear, highly parallel device characterized by robustness and fault tolerance:
• It can learn by adapting its synaptic weights to changes in the surroundings.
• It can handle imprecise, fuzzy, noisy, and probabilistic information.
• It can generalize from known tasks or examples to unknown ones.

An ARTIFICIAL NEURAL NETWORK is an attempt to form a model including some or all of these characteristics.

[Figure 1: Model of the Artificial Neuron — inputs x1, ..., xm, weighted by synaptic weights wk1, ..., wkm, are combined with a bias θ in a summing unit; the sum passes through an activation function F(.) to produce the output yk.]

Description:
1) A set of synapses, or connecting links, with weights wkj, j = 1, 2, ..., m, each of which is characterized by a weight or strength of its own.
2) An adder for summing the input signals, weighted by the respective synapses of the neuron.
3) An activation function for limiting the amplitude of the output of the neuron.
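The neuron model just described (synaptic weights, an adder, and an amplitude-limiting activation function) can be sketched in a few lines. The thesis itself works in MATLAB's Neural Network Toolbox; the following is only an illustrative Python version, with hypothetical inputs, weights, and bias, using the logistic sigmoid as the activation function:

```python
import math

def neuron_output(x, w, theta):
    """One artificial neuron as in Figure 1: a weighted sum of the
    inputs plus a bias, passed through an activation function F(.)."""
    v = sum(wj * xj for wj, xj in zip(w, x)) + theta  # the adder (summing unit)
    return 1.0 / (1.0 + math.exp(-v))                 # sigmoid limits the output to (0, 1)

# Hypothetical inputs, weights, and bias, for illustration only:
y = neuron_output(x=[1.0, 0.5, -1.0], w=[0.2, 0.4, 0.1], theta=0.1)
print(round(y, 4))  # v = 0.4, sigmoid(0.4) ≈ 0.5987
```

In symbols, the output is y_k = F(Σ_j w_kj x_j + θ); any bounded, monotone activation F could be substituted for the sigmoid used here.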

