Artificial Neural Networks in Pattern Recognition: 5th INNS IAPR TC3 GIRPR Workshop, ANNPR 2012, Trento, Italy, September 17–19, 2012. Proceedings


Lecture Notes in Artificial Intelligence 7477
Subseries of Lecture Notes in Computer Science

LNAI Series Editors
Randy Goebel, University of Alberta, Edmonton, Canada
Yuzuru Tanaka, Hokkaido University, Sapporo, Japan
Wolfgang Wahlster, DFKI and Saarland University, Saarbrücken, Germany

LNAI Founding Series Editor
Joerg Siekmann, DFKI and Saarland University, Saarbrücken, Germany

Nadia Mana, Friedhelm Schwenker, Edmondo Trentin (Eds.)

Artificial Neural Networks in Pattern Recognition
5th INNS IAPR TC3 GIRPR Workshop, ANNPR 2012
Trento, Italy, September 17–19, 2012
Proceedings

Volume Editors
Nadia Mana, Fondazione Bruno Kessler (FBK), 38123 Trento, Italy. E-mail: [email protected]
Friedhelm Schwenker, University of Ulm, Institute of Neural Information Processing, 89069 Ulm, Germany. E-mail: [email protected]
Edmondo Trentin, Università di Siena, Dipartimento di Ingegneria dell'Informazione, 53100 Siena, Italy. E-mail: [email protected]

ISSN 0302-9743, e-ISSN 1611-3349
ISBN 978-3-642-33211-1, e-ISBN 978-3-642-33212-8
DOI 10.1007/978-3-642-33212-8
Springer Heidelberg Dordrecht London New York
Library of Congress Control Number: 2012945738
CR Subject Classification (1998): I.2.6, I.5.1-3, I.4, J.3, H.5
LNCS Sublibrary: SL 7 – Artificial Intelligence

© Springer-Verlag Berlin Heidelberg 2012
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable to prosecution under the German Copyright Law.
The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Typesetting: Camera-ready by author, data conversion by Scientific Publishing Services, Chennai, India
Printed on acid-free paper
Springer is part of Springer Science+Business Media (www.springer.com)

Preface

This 5th INNS IAPR TC3 GIRPR Workshop on Artificial Neural Networks in Pattern Recognition (ANNPR 2012), whose proceedings are presented in this volume, endeavored to bring together recent novel research in this area and to provide a forum for further discussion. The workshop was held at the Fondazione Bruno Kessler (FBK) in Trento, Italy, during September 17–19, 2012.

ANNPR 2012 was supported by the International Neural Network Society (INNS), the International Association for Pattern Recognition (IAPR), the IAPR Technical Committee on Neural Networks and Computational Intelligence (TC3), and the Gruppo Italiano Ricercatori in Pattern Recognition (GIRPR). IAPR-TC3 is one of the 20 Technical Committees of the IAPR, focusing on the application of computational intelligence to pattern recognition.

The workshop featured regular oral presentations and a poster session, plus three IAPR invited talks:

– "Developmental Vision Agents", given by Marco Gori of the Università di Siena, Dipartimento di Ingegneria dell'Informazione
– "NeuCube EvoSpike Architecture for Spatio-Temporal Modelling and Pattern Recognition of Brain Signals", given by Nikola (Nik) Kasabov of the Auckland University of Technology (AUT)
– "Classifier Fusion with Belief Functions", given by Günther Palm of the University of Ulm, Institute of Neural Information Processing

It is our firm conviction that all the papers included in the present book are of high quality and significance to the areas of neural-network-based and machine-learning-based pattern recognition. We sincerely hope that readers of this volume will, in turn, enjoy it and draw inspiration from its different contributions.

We would like to acknowledge that the organization of the workshop took its first steps within the framework of the Vigoni Project for international exchanges between the universities of Siena (Italy) and Ulm (Germany). We also wish to acknowledge the generosity of the ANNPR 2012 sponsors: INNS, IAPR, IAPR-TC3, GIRPR, the University of Ulm, the Dipartimento di Ingegneria dell'Informazione (DII) of the University of Siena, and the Fondazione Bruno Kessler, which hosted the event.

We are grateful to all the authors who submitted a paper to the workshop. Special thanks go to the local chairs and organization staff, in particular to Oswald Lanz, Stefano Messelodi and Moira Osti. The contribution of the members of the Program Committee in promoting the event and reviewing the papers is gratefully acknowledged. Finally, we wish to express our gratitude to Springer for publishing these proceedings within their LNCS/LNAI series, and for their constant support.
July 2012

Nadia Mana
Friedhelm Schwenker
Edmondo Trentin

Organization

Organizing Committee
Nadia Mana, Fondazione Bruno Kessler, Trento, Italy
Friedhelm Schwenker, University of Ulm, Germany
Edmondo Trentin, University of Siena, Italy

Program Committee
Shigeo Abe (Japan)
Amir Atiya (Egypt)
Erwin Bakker (The Netherlands)
Yoshua Bengio (Canada)
Ludovic Denoyer (France)
Neamat El Gayar (Canada)
Antonino Freno (France)
Markus Hagenbuchner (Australia)
Barbara Hammer (Germany)
Tom Heskes (The Netherlands)
Lakhmi Jain (Australia)
Nik Kasabov (New Zealand)
Hans A. Kestler (Germany)
Oswald Lanz (Italy)
Marco Loog (The Netherlands)
Simone Marinai (Italy)
Stefano Messelodi (Italy)
Heiko Neumann (Germany)
Erkki Oja (Finland)
Günther Palm (Germany)
Lionel Prevost (France)
Raul Rojas (Germany)
Stefan Scherer (USA)
Alessandro Sperduti (Italy)
Ah-Chung Tsoi (Macau)
Ian Witten (New Zealand)

Local Arrangements
Oswald Lanz
Stefano Messelodi
Moira Osti

Sponsoring Institutions
International Neural Network Society (INNS)
International Association for Pattern Recognition (IAPR)
Technical Committee 3 (TC3) of the IAPR
Gruppo Italiano Ricercatori in Pattern Recognition (GIRPR)
Fondazione Bruno Kessler (FBK), Trento, Italy
University of Ulm, Germany
DII, University of Siena, Italy

Table of Contents

Learning Algorithms

How to Quantitatively Compare Data Dissimilarities for Unsupervised Machine Learning? .......... 1
  Bassam Mokbel, Sebastian Gross, Markus Lux, Niels Pinkwart, and Barbara Hammer
Kernel Robust Soft Learning Vector Quantization .......... 14
  Daniela Hofmann and Barbara Hammer
Incremental Learning by Message Passing in Hierarchical Temporal Memory .......... 24
  Davide Maltoni and Erik M. Rehn
Representative Prototype Sets for Data Characterization and Classification .......... 36
  Ludwig Lausser, Christoph Müssel, and Hans A. Kestler
Feature Selection by Block Addition and Block Deletion .......... 48
  Takashi Nagatani and Shigeo Abe
Gradient Algorithms for Exploration/Exploitation Trade-Offs: Global and Local Variants .......... 60
  Michel Tokic and Günther Palm
Towards a Novel Probabilistic Graphical Model of Sequential Data: Fundamental Notions and a Solution to the Problem of Parameter Learning .......... 72
  Edmondo Trentin and Marco Bongini
Towards a Novel Probabilistic Graphical Model of Sequential Data: A Solution to the Problem of Structure Learning and an Empirical Evaluation .......... 82
  Marco Bongini and Edmondo Trentin
Statistical Recognition of a Set of Patterns Using Novel Probability Neural Network .......... 93
  Andrey V. Savchenko
On Graph-Associated Matrices and Their Eigenvalues for Optical Character Recognition .......... 104
  Miriam Schmidt, Günther Palm, and Friedhelm Schwenker

Applications

Improving Iris Recognition through New Target Vectors in MLP Artificial Neural Networks .......... 115
  José Ricardo Gonçalves Manzan, Shigueo Nomura, Keiji Yamanaka, Milena Bueno Pereira Carneiro, and Antônio C. Paschoarelli Veiga
Robustness of a CAD System on Digitized Mammograms .......... 127
  Antonio García-Manso, Carlos J. García-Orellana, Ramón Gallardo-Caballero, Nico Lanconelli, Horacio González-Velasco, and Miguel Macías-Macías
Facial Expression Recognition Using Game Theory .......... 139
  Kaushik Roy and Mohamed S. Kamel
Classification of Segmented Objects through a Multi-net Approach .......... 151
  Alessandro Zamberletti, Ignazio Gallo, Simone Albertini, Marco Vanetti, and Angelo Nodari
A Decision Support System for the Prediction of the Trabecular Fracture Zone .......... 163
  Vasileios Korfiatis, Simone Tassani, and George K. Matsopoulos
Teeth/Palate and Interdental Segmentation Using Artificial Neural Networks .......... 175
  Kelwin Fernandez and Carolina Chang
On Instance Selection in Audio Based Emotion Recognition .......... 186
  Sascha Meudt and Friedhelm Schwenker
Traffic Sign Classifier Adaption by Semi-supervised Co-training .......... 193
  Matthias Hillebrand, Ulrich Kreßel, Christian Wöhler, and Franz Kummert
Using Self Organizing Maps to Analyze Demographics and Swing State Voting in the 2008 U.S. Presidential Election .......... 201
  Paul T. Pearson and Cameron I. Cooper
Grayscale Images and RGB Video: Compression by Morphological Neural Network .......... 213
  Osvaldo de Souza, Paulo César Cortez, and Francisco A.T.F. da Silva

Invited Paper

NeuCube EvoSpike Architecture for Spatio-temporal Modelling and Pattern Recognition of Brain Signals .......... 225
  Nikola Kasabov

Author Index .......... 245

How to Quantitatively Compare Data Dissimilarities for Unsupervised Machine Learning?

Bassam Mokbel¹, Sebastian Gross², Markus Lux¹, Niels Pinkwart², and Barbara Hammer¹
¹ CITEC Centre of Excellence, Bielefeld University, Germany
² Computer Science Institute, Clausthal University of Technology, Germany
[email protected]

Abstract. For complex data sets, the pairwise similarity or dissimilarity of data often serves as the interface of the application scenario to the machine learning tool. Hence, the final result of training is severely influenced by the choice of the dissimilarity measure. While dissimilarity measures for supervised settings can eventually be compared by the classification error, the situation is less clear in unsupervised domains, where a clear objective is lacking.
The question thus arises of how to compare dissimilarity measures and their influence on the final result in such cases. In this contribution, we propose to use a recent quantitative measure, introduced in the context of unsupervised dimensionality reduction, to compare whether and on which scale dissimilarities coincide for an unsupervised learning task. Essentially, the measure evaluates in how far neighborhood relations are preserved when evaluated based on rankings, this way achieving robustness of the measure against scaling of the data. Apart from a global comparison, local versions allow us to highlight regions of the data where two dissimilarity measures induce the same results.

N. Mana, F. Schwenker, and E. Trentin (Eds.): ANNPR 2012, LNAI 7477, pp. 1–13, 2012.
© Springer-Verlag Berlin Heidelberg 2012

1 Introduction

In many application areas, data are becoming more and more complex, such that a representation of data as finite-dimensional vectors and their treatment in terms of the Euclidean distance or norm is no longer appropriate. Examples include structured data such as bioinformatics sequences, graphs, or tree structures as they occur in linguistics, time series data, functional data arising in mass spectrometry, relational data stored in relational databases, etc. In consequence, a variety of techniques has been developed to extend powerful statistical machine learning tools towards non-vectorial data, such as kernel methods using structure kernels, recursive and graph networks, functional methods, relational approaches, and similar [9,12,5,27,6,26,10,11]. One very prominent way to extend statistical machine learning tools is offered by the choice of problem-specific measures of data proximity, which can often be used directly in machine learning tools based on similarities, dissimilarities, distances, or kernels. The latter include popular techniques such as the support vector machine, other kernel approaches such as kernel self-organizing maps or kernel linear discriminant analysis, or distance-based approaches such as k-nearest-neighbor techniques or distance-based clustering or visualization; see, e.g., [23]. Here, we are interested in dissimilarity-based approaches in general, treating metric distances as a special case of (non-metric) dissimilarities.

With the emergence of more and more complex data structures, several dedicated structure metrics have become popular. Classical examples include alignment for sequences in bioinformatics [22], shape distances [21], or measures motivated by information theory [4]. Often, there exists more than one generic possibility to encode and compare the given data. In addition, dissimilarity measures often come with parameters, the choice of which is not clear a priori. Hence, the question arises how to choose an appropriate metric in a given setting. More generally, how can we decide whether a change of the metric or its parameters changes the data representation in a way that is relevant for the subsequent machine learning task? Are there possibilities to compare whether and, if so, in which regions two metrics differ if used for machine learning?

Many approaches used in machine learning for structures have been proposed in the supervised domain. Here, a clear objective of the task is given by the classification or regression error. Therefore, it is possible to evaluate the difference of dissimilarities by comparing the classification error obtained when using these different data representations. A few extensive comparisons of how different dissimilarities influence the outcome have been conducted; see, e.g., [18] for the performance of different dissimilarities for content-based image retrieval, [19] for an according study in the symbolic domain, [2] for the comparison of distances for probability measures, or [3] for the performance of classifiers on differently preprocessed dissimilarities to arrive at a valid kernel.

The situation is less clear when dealing with unsupervised domains. Unsupervised learning is essentially ill-posed, and the final objective depends on expert evaluation. The primary mathematical goal is often to cluster or visualize data such that an underlying structure becomes apparent. Quite a few approaches for unsupervised learning for structures based on general dissimilarities have been proposed in the past: kernel clustering techniques such as kernel self-organizing maps (SOM) or kernel neural gas (NG) [34,24], or relational clustering such as proposed for fuzzy-k-means, SOM, NG, or the generative topographic mapping (GTM) [13,7,8]. Further, many state-of-the-art nonlinear visualization techniques, such as t-distributed stochastic neighbor embedding, are based on pairwise dissimilarities rather than vectors [31,15].

In this contribution, we investigate how to compare dissimilarity measures with regard to their influence on unsupervised machine learning tasks, and discuss different possibilities in Sec. 2. Thereafter, we focus on a principled approach independent of the chosen machine learning technique: rather, we propose a framework which compares two dissimilarity measures based on their induced neighborhood structure in Sec. 3. This way, it is possible to decide prior to learning whether and, if so, in which regions two different dissimilarity measures or different choices of parameters lead to different results, which we
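The rank-based idea sketched in the abstract can be illustrated with a small toy example. The following is a minimal sketch, not the authors' actual measure: given two pairwise dissimilarity matrices over the same data, it computes each point's k nearest neighbors under each measure and reports the average overlap of the two neighbor sets. Because only rankings enter the computation, the score is invariant to any monotone rescaling of either dissimilarity.

```python
import numpy as np

def knn_sets(D, k):
    """For each point, return the index set of its k nearest neighbors
    under the pairwise dissimilarity matrix D (the point itself excluded)."""
    n = D.shape[0]
    sets = []
    for i in range(n):
        order = np.argsort(D[i])
        order = order[order != i][:k]  # drop self, keep k closest
        sets.append(set(order))
    return sets

def neighborhood_agreement(D1, D2, k):
    """Mean fraction of shared k-nearest neighbors between two
    dissimilarity matrices; 1.0 means identical k-neighborhoods."""
    s1, s2 = knn_sets(D1, k), knn_sets(D2, k)
    return float(np.mean([len(a & b) / k for a, b in zip(s1, s2)]))

# Toy comparison: Euclidean vs. Manhattan dissimilarities on random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
diff = X[:, None, :] - X[None, :, :]
D_euc = np.sqrt((diff ** 2).sum(axis=-1))
D_man = np.abs(diff).sum(axis=-1)
print(neighborhood_agreement(D_euc, D_man, k=5))
```

The per-point overlaps computed inside `neighborhood_agreement` also give a local view: points with low overlap mark regions where the two measures induce different neighborhood structure.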
