
Introduction to Languages and the Theory of Computation PDF

Introduction to Languages and The Theory of Computation, Fourth Edition
John C. Martin, North Dakota State University

Published by McGraw-Hill, a business unit of The McGraw-Hill Companies, Inc., 1221 Avenue of the Americas, New York, NY 10020. Copyright © 2011 by The McGraw-Hill Companies, Inc. All rights reserved. Previous editions © 2003, 1997, and 1991. No part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written consent of The McGraw-Hill Companies, Inc., including, but not limited to, in any network or other electronic storage or transmission, or broadcast for distance learning.

Some ancillaries, including electronic and print components, may not be available to customers outside the United States.

This book is printed on acid-free paper.

1 2 3 4 5 6 7 8 9 0 DOC/DOC 1 0 9 8 7 6 5 4 3 2 1 0

ISBN 978-0-07-319146-1
MHID 0-07-319146-9

Vice President & Editor-in-Chief: Marty Lange
Vice President, EDP: Kimberly Meriwether David
Global Publisher: Raghothaman Srinivasan
Director of Development: Kristine Tibbetts
Senior Marketing Manager: Curt Reynolds
Senior Project Manager: Joyce Watters
Senior Production Supervisor: Laura Fuller
Senior Media Project Manager: Tammy Juran
Design Coordinator: Brenda A. Rolwes
Cover Designer: Studio Montage, St. Louis, Missouri
Cover Image: © Getty Images
Compositor: Laserwords Private Limited
Typeface: 10/12 Times Roman
Printer: R. R. Donnelley

All credits appearing on page or at the end of the book are considered to be an extension of the copyright page.

Library of Congress Cataloging-in-Publication Data

Martin, John C.
Introduction to languages and the theory of computation / John C. Martin.—4th ed.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-07-319146-1 (alk. paper)
1. Sequential machine theory. 2. Computable functions. I. Title.
QA267.5.S4 M29 2010
511.3'5–dc22
2009040831

www.mhhe.com

To the memory of
Mary Helen Baldwin Martin, 1918–2008
D. Edna Brown, 1927–2007
and to
John C. Martin
Dennis S. Brown

CONTENTS

Preface
Introduction

CHAPTER 1  Mathematical Tools and Techniques
1.1 Logic and Proofs
1.2 Sets
1.3 Functions and Equivalence Relations
1.4 Languages
1.5 Recursive Definitions
1.6 Structural Induction
Exercises

CHAPTER 2  Finite Automata and the Languages They Accept
2.1 Finite Automata: Examples and Definitions
2.2 Accepting the Union, Intersection, or Difference of Two Languages
2.3 Distinguishing One String from Another
2.4 The Pumping Lemma
2.5 How to Build a Simple Computer Using Equivalence Classes
2.6 Minimizing the Number of States in a Finite Automaton
Exercises

CHAPTER 3  Regular Expressions, Nondeterminism, and Kleene's Theorem
3.1 Regular Languages and Regular Expressions
3.2 Nondeterministic Finite Automata
3.3 The Nondeterminism in an NFA Can Be Eliminated
3.4 Kleene's Theorem, Part 1
3.5 Kleene's Theorem, Part 2
Exercises

CHAPTER 4  Context-Free Languages
4.1 Using Grammar Rules to Define a Language
4.2 Context-Free Grammars: Definitions and More Examples
4.3 Regular Languages and Regular Grammars
4.4 Derivation Trees and Ambiguity
4.5 Simplified Forms and Normal Forms
Exercises

CHAPTER 5  Pushdown Automata
5.1 Definitions and Examples
5.2 Deterministic Pushdown Automata
5.3 A PDA from a Given CFG
5.4 A CFG from a Given PDA
5.5 Parsing
Exercises

CHAPTER 6  Context-Free and Non-Context-Free Languages
6.1 The Pumping Lemma for Context-Free Languages
6.2 Intersections and Complements of CFLs
6.3 Decision Problems Involving Context-Free Languages
Exercises

CHAPTER 7  Turing Machines
7.1 A General Model of Computation
7.2 Turing Machines as Language Acceptors
7.3 Turing Machines That Compute Partial Functions
7.4 Combining Turing Machines
7.5 Multitape Turing Machines
7.6 The Church-Turing Thesis
7.7 Nondeterministic Turing Machines
7.8 Universal Turing Machines
Exercises

CHAPTER 8  Recursively Enumerable Languages
8.1 Recursively Enumerable and Recursive
8.2 Enumerating a Language
8.3 More General Grammars
8.4 Context-Sensitive Languages and the Chomsky Hierarchy
8.5 Not Every Language Is Recursively Enumerable
Exercises

CHAPTER 9  Undecidable Problems
9.1 A Language That Can't Be Accepted, and a Problem That Can't Be Decided
9.2 Reductions and the Halting Problem
9.3 More Decision Problems Involving Turing Machines
9.4 Post's Correspondence Problem
9.5 Undecidable Problems Involving Context-Free Languages
Exercises

CHAPTER 10  Computable Functions
10.1 Primitive Recursive Functions
10.2 Quantification, Minimalization, and μ-Recursive Functions
10.3 Gödel Numbering
10.4 All Computable Functions Are μ-Recursive
10.5 Other Approaches to Computability
Exercises

CHAPTER 11  Introduction to Computational Complexity
11.1 The Time Complexity of a Turing Machine, and the Set P
11.2 The Set NP and Polynomial Verifiability
11.3 Polynomial-Time Reductions and NP-Completeness
11.4 The Cook-Levin Theorem
11.5 Some Other NP-Complete Problems
Exercises

Solutions to Selected Exercises
Selected Bibliography
Index of Notation
Index

PREFACE
This book is an introduction to the theory of computation. After a chapter presenting the mathematical tools that will be used, the book examines models of computation and the associated languages, from the most elementary to the most general: finite automata and regular languages; context-free languages and pushdown automata; and Turing machines and recursively enumerable and recursive languages. There is a chapter on decision problems, reductions, and undecidability, one on the Kleene approach to computability, and a final one that introduces complexity and NP-completeness.

Specific changes from the third edition are described below. Probably the most noticeable difference is that this edition is shorter, with three fewer chapters and fewer pages. Chapters have generally been rewritten and reorganized rather than omitted. The reduction in length is a result not so much of leaving out topics as of trying to write and organize more efficiently. My overall approach continues to be to rely on the clarity and efficiency of appropriate mathematical language and to add informal explanations to ease the way, not to substitute for the mathematical language but to familiarize it and make it more accessible. Writing "more efficiently" has meant (among other things) limiting discussions and technical details to what is necessary for the understanding of an idea, and reorganizing or replacing examples so that each one contributes something not contributed by earlier ones.

In each chapter, there are several exercises or parts of exercises marked with a (†). These are problems for which a careful solution is likely to be less routine or to require a little more thought.

Previous editions of the text have been used at North Dakota State in a two-semester sequence required of undergraduate computer science majors. A one-semester course could cover a few essential topics from Chapter 1 and a substantial portion of the material on finite automata and regular languages, context-free languages and pushdown automata, and Turing machines. A course on Turing machines, computability, and complexity could cover Chapters 7–11.

As I was beginning to work on this edition, reviewers provided a number of thoughtful comments on both the third edition and a sample chapter of the new one. I appreciated the suggestions, which helped me in reorganizing the first few chapters and the last chapter and provided a few general guidelines that I have tried to keep in mind throughout. I believe the book is better as a result. Reviewers to whom I am particularly grateful are Philip Bernhard, Florida Institute of Technology; Albert M. K. Cheng, University of Houston; Vladimir Filkov, University of California-Davis; Mukkai S. Krishnamoorthy, Rensselaer Polytechnic University; Gopalan Nadathur, University of Minnesota; Prakash Panangaden, McGill University; Viera K. Proulx, Northeastern University; Sing-Ho Sze, Texas A&M University; and Shunichi Toida, Old Dominion University.

I have greatly enjoyed working with Melinda Bilecki again, and Raghu Srinivasan at McGraw-Hill has been very helpful and understanding. Many thanks to Michelle Gardner, of Laserwords Maine, for her attention to detail and her unfailing cheerfulness. Finally, one more thank-you to my long-suffering wife, Pippa.

What's New in This Edition

The text has been substantially rewritten, and only occasionally have passages from the third edition been left unchanged. Specific organizational changes include the following.
1. One introductory chapter, "Mathematical Tools and Techniques," replaces Chapters 1 and 2 of the third edition. Topics in discrete mathematics in the first few sections have been limited to those that are used directly in subsequent chapters. Chapter 2 in the third edition, on mathematical induction and recursive definitions, has been shortened and turned into the last two sections of Chapter 1. The discussion of induction emphasizes "structural induction" and is tied more directly to recursive definitions of sets, of which the definition of the set of natural numbers is a notable example. In this way, the overall unity of the various approaches to induction is clarified, and the approach is more consistent with subsequent applications in the text.

2. Three chapters on regular languages and finite automata have been shortened to two. Finite automata are now discussed first; the first of the two chapters begins with the model of computation and collects into one chapter the topics that depend on the devices rather than on features of regular expressions. Those features, along with the nondeterminism that simplifies the proof of Kleene's theorem, make up the other chapter. Real-life examples of both finite automata and regular expressions have been added to these chapters.

3. In the chapter introducing Turing machines, there is slightly less attention to the "programming" details of Turing machines and more emphasis on their role as a general model of computation. One way that Chapters 8 and 9 were shortened was to rely more on the Church-Turing thesis in the presentation of an algorithm rather than to describe in detail the construction of a Turing machine to carry it out.

4. The two chapters on computational complexity in the third edition have become one, the discussion focuses on time complexity, and the emphasis has been placed on polynomial-time decidability, the sets P and NP, and NP-completeness. A section has been added that characterizes NP in terms of polynomial-time verifiability, and an introductory example has been added to illustrate the idea of the proof of the Cook-Levin theorem.

5. In order to make the book more useful to students, a section has been added at the end that contains solutions to selected exercises. In some cases these are exercises representative of a general class of problems; in other cases the solutions may suggest approaches or techniques that have not been discussed in the text. An exercise or part of an exercise for which a solution is provided will have the exercise number highlighted in the chapter.

PowerPoint slides accompanying the book will be available on the McGraw-Hill website at http://mhhe.com/martin, and solutions to most of the exercises will be available to authorized instructors. In addition, the book will be available in e-book format, as described in the paragraph below.

John C. Martin

Electronic Books

If you or your students are ready for an alternative version of the traditional textbook, McGraw-Hill has partnered with CourseSmart to bring you an innovative and inexpensive electronic textbook. Students can save up to 50% off the cost of a print book, reduce their impact on the environment, and gain access to powerful Web tools for learning, including full text search, notes and highlighting, and email tools for sharing notes between classmates. eBooks from McGraw-Hill are smart, interactive, searchable, and portable.

To review comp copies or to purchase an eBook, go to www.CourseSmart.com <http://www.coursesmart.com/>.
Tegrity

Tegrity Campus is a service that makes class time available all the time by automatically capturing every lecture in a searchable format for students to review when they study and complete assignments. With a simple one-click start and stop process, you capture all computer screens and corresponding audio. Students replay any part of any class with easy-to-use browser-based viewing on a PC or Mac.

Educators know that the more students can see, hear, and experience class resources, the better they learn. With Tegrity Campus, students quickly recall key moments by using Tegrity Campus's unique search feature. This search helps students efficiently find what they need, when they need it, across an entire semester of class recordings. Help turn all your students' study time into learning moments immediately supported by your lecture.

To learn more about Tegrity, watch a 2-minute Flash demo at http://tegritycampus.mhhe.com

INTRODUCTION

Computers play such an important part in our lives that formulating a "theory of computation" threatens to be a huge project. To narrow it down, we adopt an approach that seems a little old-fashioned in its simplicity but still allows us to think systematically about what computers do. Here is the way we will think about a computer: It receives some input, in the form of a string of characters; it performs some sort of "computation"; and it gives us some output.

In the first part of this book, it's even simpler than that, because the questions we will be asking the computer can all be answered either yes or no. For example, we might submit an input string and ask, "Is it a legal algebraic expression?" At this point the computer is playing the role of a language acceptor. The language accepted is the set of strings to which the computer answers yes—in our example, the language of legal algebraic expressions. Accepting a language is approximately the same as solving a decision problem, by receiving a string that represents an instance of the problem and answering either yes or no. Many interesting computational problems can be formulated as decision problems, and we will continue to study them even after we get to models of computation that are capable of producing answers more complicated than yes or no.

If we restrict ourselves for the time being, then, to computations that are supposed to solve decision problems, or to accept languages, then we can adjust the level of complexity of our model in one of two ways. The first is to vary the problems we try to solve or the languages we try to accept, and to formulate a model appropriate to the level of the problem. Accepting the language of legal algebraic expressions turns out to be moderately difficult; it can't be done using the first model of computation we discuss, but we will get to it relatively early in the book. The second approach is to look at the computations themselves: to say at the outset how sophisticated the steps carried out by the computer are allowed to be, and to see what sorts of languages can be accepted as a result. Our first model, a finite automaton, is characterized by its lack of any auxiliary memory, and a language accepted by such a device can't require the acceptor to remember very much information during its computation.

A finite automaton proceeds by moving among a finite number of distinct states in response to input symbols. Whenever it reaches an accepting state, we think of it as giving a "yes" answer for the string of input symbols it has received so far.
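To make that picture concrete, here is a minimal sketch of such an acceptor in Python. It is not taken from the text; the state names, the transition table, and the choice of language (binary strings containing an even number of 1s) are all illustrative assumptions for this example.

```python
# A minimal sketch (not from the text) of a deterministic finite automaton.
# It accepts binary strings with an even number of 1s; the states and the
# transition table are illustrative choices, not the book's notation.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}
START_STATE = "even"
ACCEPTING_STATES = {"even"}

def accepts(input_string: str) -> bool:
    """Run the automaton on input_string and report the yes/no answer."""
    state = START_STATE
    for symbol in input_string:              # one state change per input symbol
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING_STATES         # "yes" iff the final state is accepting

print(accepts("1011"))   # False: three 1s
print(accepts("1001"))   # True: two 1s
```

The loop mirrors the informal description above: the only "memory" the device has is which of its finitely many states it is currently in, and the answer depends solely on the state reached when the input ends.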
Languages that can be accepted by finite automata are regular languages; they can be described by either regular expressions or regular grammars, and generated by combining one-element languages using certain simple operations. One step up from a finite automaton is a pushdown automaton, and the languages these devices accept can be generated by more general grammars called context-free grammars. Context-free grammars can describe much of the syntax of high-level programming languages ...
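The preview breaks off at this point. To make the last point concrete, here is a small sketch, again not taken from the text, of a recursive-descent acceptor in Python for a toy grammar of algebraic expressions over the single identifier a. The grammar, the function names, and the parsing strategy are illustrative assumptions; the point is that the nesting allowed by the rule F -> ( E ) is exactly the kind of structure a finite automaton, with no auxiliary memory, cannot keep track of, while a pushdown automaton (or, here, the call stack of the recursion) can.

```python
# A sketch (not from the text) of a recursive-descent acceptor for a toy
# context-free grammar of algebraic expressions over the identifier 'a':
#     E -> T | T + E
#     T -> F | F * T
#     F -> a | ( E )
# The grammar and the function names are illustrative choices for this example.

def accepts(s: str) -> bool:
    pos = 0

    def peek():
        return s[pos] if pos < len(s) else None

    def eat(ch):
        nonlocal pos
        if peek() == ch:
            pos += 1
            return True
        return False

    def expr():     # E -> T ('+' E)?
        if not term():
            return False
        if peek() == "+":
            eat("+")
            return expr()
        return True

    def term():     # T -> F ('*' T)?
        if not factor():
            return False
        if peek() == "*":
            eat("*")
            return term()
        return True

    def factor():   # F -> 'a' | '(' E ')'
        if eat("a"):
            return True
        if eat("("):
            return expr() and eat(")")
        return False

    return expr() and pos == len(s)   # accept only if the whole string is consumed

print(accepts("(a+a)*a"))   # True: a legal expression
print(accepts("(a+a"))      # False: unbalanced parenthesis
```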
