
Solution Manual to accompany Pattern Classification (2nd ed.)


Solution Manual to accompany Pattern Classification (2nd ed.)
by R. O. Duda, P. E. Hart and D. G. Stork

David G. Stork

Copyright 2001. All rights reserved. This Manual is for the sole use of designated educators and must not be distributed to students except in short, isolated portions and in conjunction with the use of Pattern Classification (2nd ed.).

June 18, 2003

Preface

In writing this Solution Manual I have learned a very important lesson. As a student, I thought that the best way to master a subject was to go to a superb university and study with an established expert. Later, I realized instead that the best way was to teach a course on the subject. Yet later, I was convinced that the best way was to write a detailed and extensive textbook. Now I know that all these years I have been wrong: in fact the best way to master a subject is to write the Solution Manual.

In solving the problems for this Manual I have been forced to confront myriad technical details that might have tripped up the unsuspecting student. Students and teachers can thank me for simplifying or screening out problems that required pages of unenlightening calculations. Occasionally I had to go back to the text and delete the word "easily" from problem references that read "it can easily be shown (Problem ...)." Throughout, I have tried to choose data or problem conditions that are particularly instructive. In solving these problems, I have found errors in early drafts of this text (as well as errors in books by other authors and even in classic refereed papers), and thus the accompanying text has been improved for the writing of this Manual.

I have tried to make the problem solutions self-contained and self-explanatory. I have gone to great lengths to ensure that the solutions are correct and clearly presented; many have been reviewed by students in several classes. Surely there are errors and typos in this manuscript, but rather than editing and rechecking these solutions over months or even years, I thought it best to distribute the Manual, however flawed, as early as possible. I accept responsibility for these inevitable errors, and humbly ask anyone finding them to contact me directly. (Please, however, do not ask me to explain a solution or help you solve a problem!) It should be a small matter to change the Manual for future printings, and you should contact the publisher to check that you have the most recent version. Notice, too, that this Manual contains a list of known typos and errata in the text which you might wish to photocopy and distribute to students.

I have tried to be thorough in order to help students, even to the occasional fault of verbosity. You will notice that several problems have the simple "explain your answer in words" and "graph your results." These were added for students to gain intuition and a deeper understanding. Graphing per se is hardly an intellectual challenge, but if the student graphs functions, he or she will develop intuition and remember the problem and its results better. Furthermore, when the student later sees graphs of data from dissertation or research work, the link to the homework problem and the material in the text will be more readily apparent. Note that due to the vagaries of automatic typesetting, figures may appear on pages after their reference in this Manual; be sure to consult the full solution to any problem.

I have also included worked examples and some sample final exams with solutions to cover material in the text.
I distribute a list of important equations (without descriptions) with the exam so students can focus on understanding and using the equations, rather than memorizing them. I also include on every final exam one problem verbatim from a homework assignment, taken from the book. I find this motivates students to review their homework assignments carefully, and allows somewhat more difficult problems to be included. These will be updated and expanded; thus if you have exam questions you find particularly appropriate, and would like to share them, please send a copy (with solutions) to me.

It should be noted, too, that a set of overhead transparency masters of the figures from the text is available to faculty adopters. I have found these to be invaluable for lecturing, and I put a set on reserve in the library for students. The files can be accessed through a standard web browser or an ftp client program at the Wiley STM ftp area at:

ftp://ftp.wiley.com/public/sci_tech_med/pattern/

or from a link on the Wiley Electrical Engineering software supplements page at:

http://www.wiley.com/products/subject/engineering/electrical/software_supplem_elec_eng.html

I have taught from the text (in various stages of completion) at the University of California at Berkeley (Extension Division) and in three Departments at Stanford University: Electrical Engineering, Statistics and Computer Science. Numerous students and colleagues have made suggestions. Especially noteworthy in this regard are Sudeshna Adak, Jian An, Sung-Hyuk Cha, Koichi Ejiri, Rick Guadette, John Heumann, Travis Kopp, Yaxin Liu, Yunqian Ma, Sayan Mukherjee, Hirobumi Nishida, Erhan Oztop, Steven Rogers, Charles Roosen, Sergio Bermejo Sanchez, Godfried Toussaint, Namrata Vaswani, Mohammed Yousuf and Yu Zhong. Thanks too go to Dick Duda, who gave several excellent suggestions.

I would greatly appreciate notices of any errors in this Manual or the text itself. I would be especially grateful for solutions to problems not yet solved. Please send any such information to me at the address below. I will incorporate them into subsequent releases of this Manual.

This Manual is for the use of educators and must not be distributed in bulk to students in any form. Short excerpts may be photocopied and distributed, but only in conjunction with the use of Pattern Classification (2nd ed.).

I wish you all the best of luck in teaching and research.

David G. Stork
Ricoh Innovations, Inc.
2882 Sand Hill Road, Suite 115
Menlo Park, CA 94025-7022 USA
[email protected]

Contents

Preface
1  Introduction
2  Bayesian decision theory
     Problem Solutions
     Computer Exercises
3  Maximum likelihood and Bayesian parameter estimation
     Problem Solutions
     Computer Exercises
4  Nonparametric techniques
     Problem Solutions
     Computer Exercises
5  Linear discriminant functions
     Problem Solutions
     Computer Exercises
6  Multilayer neural networks
     Problem Solutions
     Computer Exercises
7  Stochastic methods
     Problem Solutions
     Computer Exercises
8  Nonmetric methods
     Problem Solutions
     Computer Exercises
9  Algorithm-independent machine learning
     Problem Solutions
     Computer Exercises
10 Unsupervised learning and clustering
     Problem Solutions
     Computer Exercises
Sample final exams and solutions
Worked examples
Errata and amendments in the text
     First and second printings
     Fifth printing

Chapter 1: Introduction

Problem Solutions

There are neither problems nor computer exercises in Chapter 1.

Chapter 2: Bayesian decision theory

Problem Solutions

Section 2.1

1. Equation 7 in the text states

$$P(\text{error}|x) = \min[P(\omega_1|x),\, P(\omega_2|x)].$$

(a) We assume, without loss of generality, that for a given particular $x$ we have $P(\omega_2|x) \ge P(\omega_1|x)$, and thus $P(\text{error}|x) = P(\omega_1|x)$. We have, moreover, the normalization condition $P(\omega_1|x) = 1 - P(\omega_2|x)$. Together these imply $P(\omega_2|x) \ge 1/2$, or $2P(\omega_2|x) \ge 1$, and

$$2P(\omega_2|x)P(\omega_1|x) \ge P(\omega_1|x) = P(\text{error}|x).$$

This is true at every $x$, and hence the integrals obey

$$\int 2P(\omega_2|x)P(\omega_1|x)\,dx \ge \int P(\text{error}|x)\,dx.$$

In short, $2P(\omega_2|x)P(\omega_1|x)$ provides an upper bound for $P(\text{error}|x)$.

(b) From part (a), we have that $P(\omega_2|x) \ge 1/2$, but in the current conditions it need not be greater than $1/\alpha$ for $\alpha < 2$. Take as an example $\alpha = 4/3$ and $P(\omega_1|x) = 0.4$, and hence $P(\omega_2|x) = 0.6$. In this case, $P(\text{error}|x) = 0.4$. Moreover, we have

$$\alpha P(\omega_1|x)P(\omega_2|x) = 4/3 \times 0.6 \times 0.4 = 0.32 < P(\text{error}|x).$$

This does not provide an upper bound for all values of $P(\omega_1|x)$.

(c) Let $P(\text{error}|x) = P(\omega_1|x)$. In that case, for all $x$ we have

$$P(\omega_2|x)P(\omega_1|x) < P(\omega_1|x) = P(\text{error}|x),$$
$$\int P(\omega_2|x)P(\omega_1|x)\,dx < \int P(\text{error}|x)\,dx,$$

and we have a lower bound.

(d) The solution to part (b) also applies here.

Section 2.2

2. We are given that the density is of the form $p(x|\omega_i) = k\,e^{-|x - a_i|/b_i}$.

(a) We seek $k$ so that the function is normalized, as required of a true density. We integrate this function and set the result to 1:

$$k\left[\,\int_{-\infty}^{a_i} e^{(x-a_i)/b_i}\,dx \;+\; \int_{a_i}^{\infty} e^{-(x-a_i)/b_i}\,dx\right] = 1,$$

which yields $2b_i k = 1$, or $k = 1/(2b_i)$. Note that the normalization is independent of $a_i$, which corresponds to a shift along the axis and is hence indeed irrelevant to normalization. The distribution is therefore written

$$p(x|\omega_i) = \frac{1}{2b_i}\,e^{-|x - a_i|/b_i}.$$

(b) The likelihood ratio can be written directly:

$$\frac{p(x|\omega_1)}{p(x|\omega_2)} = \frac{b_2}{b_1}\,\exp\!\left[-\frac{|x - a_1|}{b_1} + \frac{|x - a_2|}{b_2}\right].$$

(c) For the case $a_1 = 0$, $a_2 = 1$, $b_1 = 1$ and $b_2 = 2$, the likelihood ratio is

$$\frac{p(x|\omega_1)}{p(x|\omega_2)} = \begin{cases} 2e^{(x+1)/2} & x \le 0 \\ 2e^{(1-3x)/2} & 0 < x \le 1 \\ 2e^{-(x+1)/2} & x > 1, \end{cases}$$

as shown in the figure.

[Figure: the likelihood ratio $p(x|\omega_1)/p(x|\omega_2)$ for $a_1 = 0$, $a_2 = 1$, $b_1 = 1$, $b_2 = 2$, plotted as a function of $x$.]
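The short script below is a numerical sanity check of the two solutions above; it is a minimal sketch added to these notes, not part of the original Manual, and it assumes only NumPy (the grid sizes and tolerances are arbitrary choices).

```python
import numpy as np

# Problem 1(a): 2 P(w1|x) P(w2|x) upper-bounds P(error|x) = min(P1, P2).
p1 = np.linspace(0.0, 1.0, 1001)        # P(omega_1|x); P(omega_2|x) = 1 - p1
p2 = 1.0 - p1
p_error = np.minimum(p1, p2)
assert np.all(2.0 * p1 * p2 >= p_error)

# Problem 1(b): the counterexample alpha = 4/3 at P(omega_1|x) = 0.4.
assert (4.0 / 3.0) * 0.4 * 0.6 < 0.4    # 0.32 < 0.4, so no upper bound

# Problem 2: p(x|w_i) = exp(-|x - a_i| / b_i) / (2 b_i).
def p(x, a, b):
    return np.exp(-np.abs(x - a) / b) / (2.0 * b)

a1, b1, a2, b2 = 0.0, 1.0, 1.0, 2.0
x = np.linspace(-20.0, 20.0, 80001)
dx = x[1] - x[0]

# Part (a): the density integrates to ~1 (Riemann sum; tails are negligible).
assert abs(p(x, a1, b1).sum() * dx - 1.0) < 1e-3

# Part (c): the likelihood ratio matches its piecewise closed form.
ratio = p(x, a1, b1) / p(x, a2, b2)
piecewise = np.where(x <= 0, 2 * np.exp((x + 1) / 2),
             np.where(x <= 1, 2 * np.exp((1 - 3 * x) / 2),
                              2 * np.exp(-(x + 1) / 2)))
assert np.allclose(ratio, piecewise)
print("all checks passed")
```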
Section 2.3

3. We are to use the standard zero-one classification cost, that is, $\lambda_{11} = \lambda_{22} = 0$ and $\lambda_{12} = \lambda_{21} = 1$.
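As a reminder of why this loss matters, the sketch below (illustrative only, not from the Manual; the posterior values are invented) checks numerically that under zero-one loss the conditional risk is $R(\alpha_i|x) = 1 - P(\omega_i|x)$, so minimizing risk coincides with choosing the class of largest posterior.

```python
import numpy as np

# Under zero-one loss, R(alpha_i|x) = sum_j lambda_ij P(omega_j|x)
#                                   = 1 - P(omega_i|x).
posteriors = np.array([[0.7, 0.3],      # rows: sample points x; cols: P(omega_j|x)
                       [0.4, 0.6],
                       [0.5, 0.5]])
loss = np.array([[0.0, 1.0],            # loss[i, j] = lambda(alpha_i | omega_j)
                 [1.0, 0.0]])

risk = posteriors @ loss.T              # risk[:, i] = R(alpha_i | x)
assert np.allclose(risk, 1.0 - posteriors)

# Minimizing risk picks the same class as maximizing the posterior.
assert np.array_equal(risk.argmin(axis=1), posteriors.argmax(axis=1))
```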
