HISTORY OF COMPUTER SCIENCE

Gordana Dodig-Crnkovic, e-mail: [email protected]

INTRODUCTION: MILESTONES IN THE DEVELOPMENT OF COMPUTERS
    Before 1900: First Computing Devices
    1900 – 1939 The Rise of Mathematics
    1940's: First Electronic Digital Computer
    1950's
    1960's
    1970's
    1980's
    1990's and Beyond
1 LEIBNIZ: LOGICAL CALCULUS
    Calculus Ratiocinator
    Language and Mind
2 BOOLE: LOGIC AS ALGEBRA
    Boole's Algebra of Logic
3 FREGE: MATHEMATICS AS LOGIC
    Frege's Advances in Logic
    The Predicate Calculus
    The Analysis of Atomic Sentences and Quantifier Phrases
    Proof
4 CANTOR: INFINITY
5 HILBERT: PROGRAM FOR MATHEMATICS
    Hilbert's Program
6 GÖDEL: END OF HILBERT'S PROGRAM
7 TURING: UNIVERSAL AUTOMATON
    The Turing Machine
8 VON NEUMANN: COMPUTER
APPENDIX 1
    Charles Babbage
    Augusta Ada King, Countess of Lovelace
APPENDIX 2
    Russell's paradox

INTRODUCTION: MILESTONES IN THE DEVELOPMENT OF COMPUTERS

Before 1900: First Computing Devices

A number of early cultures developed mechanical computing devices. For example, the abacus probably existed in Babylonia about 3000 B.C. The Chinese abacus is an excellent example, remaining unsurpassed in speed and accuracy of operation well into the twentieth century. Such a dominance of several thousand years is unequaled for any specific type of computer, and today's computers are unlikely to match it. The classical abacus was called suan pan ("counting board" or "reckoning board") by the Chinese. Small beads of bone or ivory were strung on parallel bamboo runners set into a rectangular frame. Each row of beads corresponded to one column of written numbers, and the device was a remarkably powerful tool for rapidly adding and subtracting large numbers.

The ancient Greeks developed some very sophisticated analog machines. In 1901, an ancient Greek shipwreck was discovered off the island of Antikythera. Inside was a salt-encrusted device (now called the Antikythera mechanism) consisting of corroded metal gears and pointers. When this device, dated to about 80 B.C., was reconstructed, it turned out to be a mechanism for predicting the motions of the stars and planets.

The Romans used their hands to calculate. Because of their extremely cumbersome system of numbers, they evolved very elaborate "finger" arithmetic. The Arabic numbering system, which came originally from India, had a big advantage over Roman numerals because of its concept of place value: one column stands for the ones, the next column for the tens, the next for the hundreds, and so on. Take the problem of multiplying MMMCCCCLVIII (3458) by CCCCLIX (459). Using the Arabic system, pen and paper, we can get the answer in 30 seconds or so; by the Roman method, even waggling our fingers as fast as possible, it takes something like ten minutes (a small modern illustration of this place-value advantage is sketched a few paragraphs below).

After classical antiquity came the Middle Ages, with their focus on completely different matters: there are no traces of significant improvements in science, and no new calculating devices either. It was the Renaissance that brought secular (worldly, non-sacral) themes back into focus. As mathematicians expanded the boundaries of geometry, algebra and number theory, the outcry for computational help grew greater and greater.

John Napier (1550-1617), the Scottish inventor of logarithms, invented Napier's rods (sometimes called "Napier's bones") to simplify the task of multiplication.

The first to really achieve any success with a mechanical calculating machine was Wilhelm Schickard (1592-1635), a graduate of the University of Tübingen (Germany). A brief description of the device (which could add, subtract, multiply, and divide) is contained in two letters to Johannes Kepler. Unfortunately, at least one copy of the machine burned up in a fire, and Schickard himself died of bubonic plague in 1635, during the Thirty Years' War. So this invention was lost.
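To make the place-value advantage mentioned above concrete, here is a small illustrative sketch in Python. It is not part of the historical material, and the helper names roman_to_int and long_multiply are inventions for this illustration; it simply converts the Roman numerals of the example into integers and multiplies them digit by digit, exactly the way the columns of Arabic notation allow.

# Illustrative sketch of the place-value advantage discussed above.
ROMAN_VALUES = {"M": 1000, "D": 500, "C": 100, "L": 50, "X": 10, "V": 5, "I": 1}

def roman_to_int(numeral):
    """Convert a Roman numeral (additive or subtractive form) to an integer."""
    values = [ROMAN_VALUES[ch] for ch in numeral]
    total = 0
    for i, v in enumerate(values):
        # A smaller value written before a larger one is subtracted (e.g. IX = 9).
        if i + 1 < len(values) and v < values[i + 1]:
            total -= v
        else:
            total += v
    return total

def long_multiply(a, b):
    """Multiply using place value: one partial product per decimal digit of b."""
    result = 0
    for position, digit in enumerate(reversed(str(b))):
        result += a * int(digit) * 10 ** position
    return result

a = roman_to_int("MMMCCCCLVIII")   # 3458, written in the additive form used in the text
b = roman_to_int("CCCCLIX")        # 459
print(a, b, long_multiply(a, b))   # prints: 3458 459 1587222

Each loop iteration of long_multiply corresponds to one row of an ordinary pen-and-paper multiplication; no comparable decomposition is available in Roman notation, which is the whole point of the example.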
In 1641 the French mathematician and philosopher Blaise Pascal (1623-1662) built a mechanical adding machine, the "arithmétique", a brass box the size of a loaf of bread with eight dials on its face, operated with a stylus to input numbers. The "arithmétique" was less sophisticated than Schickard's lost machine: it could only add and subtract.

Similar work was done by Gottfried Wilhelm Leibniz (1646-1716), but his device was capable of performing all four basic arithmetic operations. Leibniz also advocated the use of the binary system for doing calculations.

Up until the dawn of the 19th century, attempts to extend the human mind were limited to manually operated devices. The abacus, Pascal's "arithmétique" and Leibniz's wheel all required an operator who performed each step in sequence. Yet the concept of programming was not new: music boxes, clocks and various automata had made use of the programming principle for hundreds of years. Joseph-Marie Jacquard (1752-1834) invented a loom that could automate textile manufacturing and weave complicated patterns described by holes in punched cards.

Charles Babbage (1791-1871) worked on two mechanical devices: the Difference Engine and the far more ambitious Analytical Engine (a predecessor of the modern digital computer), but neither worked satisfactorily. He envisioned a steam-powered machine with two major parts. The first was a "mill" to perform arithmetical operations (a "central processing unit"). The second was a "store" to manage variables and retain the answers to problems solved ("memory"). Babbage intended to use Jacquard's system of punched cards to program the machine.

One of Babbage's friends, the mathematician Augusta Ada Byron, Countess of Lovelace (1815-1852), sometimes called the "first programmer", wrote about Babbage's machine. The programming language Ada was named after her.

William Stanley Jevons (1835-1882), a British economist and logician, built a machine in 1869 to solve logic problems. It was "the first such machine with sufficient power to solve a complicated problem faster than the problem could be solved without the machine's aid" (Gardner). It is now in the Oxford Museum of the History of Science.

Herman Hollerith (1860-1929) invented the modern punched card (inspired by Jacquard's solution) for use in a machine he designed to help tabulate the 1890 American census.

1900 – 1939 The Rise of Mathematics

In 1928, the German mathematician David Hilbert (1862-1943) addressed the International Congress of Mathematicians. He posed, among others, the following three fundamental questions:

• Is mathematics complete, i.e. can every mathematical statement be either proved or disproved?
• Is mathematics consistent, that is, is it true that statements such as "0 = 1" cannot be proved by valid methods?
• Is mathematics decidable, that is, is there a mechanical method that can be applied to any mathematical assertion and (at least in principle) will eventually tell whether that assertion is true or not?

The last question was called the Entscheidungsproblem.

In 1931, Kurt Gödel (1906-1978) answered two of Hilbert's questions. He showed that every sufficiently powerful formal system is either inconsistent or incomplete, and that even if an axiom system is consistent, its consistency cannot be proved within the system itself. The third question remained open, with 'provable' substituted for 'true'.
In 1936, Alan Turing (1912-1954) provided a solution to Hilbert's Entscheidungsproblem by conceiving a formal model of a computer, the Turing machine, and showing that there were problems such a machine could not solve. One such problem is the so-called "halting problem": given a program, does it halt on all inputs? (A short sketch of Turing's argument is given at the end of this overview, just before the concluding remarks.)

1940's: First Electronic Digital Computer

World War II spurred the development of the general-purpose electronic digital computer, initially for ballistics calculations. At Harvard, Howard H. Aiken (1900-1973) built the Mark I electromechanical computer in 1944, with the assistance of IBM.

Military code-breaking also led to computational projects. Alan Turing was involved in breaking the code of the German Enigma machine at Bletchley Park in England. The British built a computing device, the Colossus, to assist with code-breaking.

At Iowa State University in 1939, John Vincent Atanasoff (1904-1995) and Clifford Berry designed and built an electronic computer for solving systems of linear equations, but it was never fully operational.

John William Mauchly (1907-1980), with J. Presper Eckert, Jr. (1919-1995), designed and built the ENIAC, a general-purpose electronic computer originally intended for artillery calculations. The ENIAC was built at the Moore School of the University of Pennsylvania and was finished in 1946.

In 1944, Mauchly, Eckert, and John von Neumann (1903-1957) began designing a stored-program electronic computer, the EDVAC. Von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a description of merge sort. Eckert and Mauchly went on to build UNIVAC.

Maurice Wilkes (b. 1913), working in Cambridge, England, built the EDSAC, a computer based on the EDVAC design. F. C. Williams (1911-1977) and others at Manchester University built the Manchester Mark I, one version of which was working as early as June 1948. This machine is sometimes called the first stored-program digital computer.

The invention of the transistor in 1947 by Bardeen, Brattain, and Shockley transformed the computer and made the microprocessor revolution possible. For this discovery they won the 1956 Nobel Prize in physics. Jay Forrester (b. 1918) invented magnetic core memory in 1949.

1950's

Grace Murray Hopper (1906-1992) conceived the idea of a compiler in 1951.2 She also developed one of the first English-like data-processing languages, FLOW-MATIC, a precursor of COBOL. John Backus and others developed the first FORTRAN compiler in April 1957. LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. Alan Perlis, John Backus, Peter Naur and others developed Algol.

In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1958-59.

Edsger Dijkstra created an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also devised an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer.

1960's

In the 1960's, computer science came into its own as a discipline. In fact, the term was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. Operating systems made major advances: Fred Brooks at IBM designed System/360, and Edsger Dijkstra at Eindhoven designed the THE multiprogramming system. At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.
Many new programming languages were invented, such as BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)). The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky later became well known for his theory that language is "hard-wired" in human brains.

2 Earlier, in 1947, Hopper had found an actual moth in a relay of the Mark II computer, popularizing the term "computer bug".

Proving the correctness of programs using formal methods also began to be more important in this decade. The work of Tony Hoare played an important role here; Hoare also invented Quicksort.

Ted Hoff and Federico Faggin at Intel designed the first microprocessor (a computer on a chip) in 1969-1971.

A rigorous mathematical basis for the analysis of algorithms began with the work of Donald Knuth, author of The Art of Computer Programming.

1970's

The theory of databases saw major advances with the work of Edgar F. Codd on relational databases.

Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson and Dennis Ritchie. Ritchie also developed the programming language C, later documented in the classic book he wrote with Brian Kernighan. Other new programming languages, such as Pascal (invented by Niklaus Wirth) and Ada (developed by a team led by Jean Ichbiah), appeared.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.

The 1970's also brought the rise of the supercomputer. Seymour Cray designed the CRAY-1, which was first shipped in March 1976. It could perform 160 million operations per second. Cray Research was later taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter Richard Karp showed that many natural combinatorial problems were NP-complete. Whitfield Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography.

In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

1980's

This decade saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer. In 1981, the first truly successful portable computer, the Osborne I, was marketed. In 1984, Apple first marketed the Macintosh computer.

1990's and Beyond

Parallel computers continue to be developed. Biological computing, with the recent work of Len Adleman on doing computations via DNA, has great promise. Quantum computing got a boost with the discovery by Peter Shor that integer factorization can be performed efficiently on a (theoretical) quantum computer. The "Information Superhighway" links more and more computers worldwide. Computers get smaller and smaller; nanotechnology is being born.
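Before turning to the concluding remarks, it is worth making the earliest theoretical result mentioned above, Turing's 1936 undecidability proof, a little more concrete. The following is a minimal modern sketch of the diagonal argument, stated for the usual form of the halting problem (given a program and an input, decide whether the program halts on that input); the function names are inventions for this illustration, not anything from Turing's paper.

# Suppose, for contradiction, that someone hands us a total, correct
# halting tester:
#
#     halts(prog, data) == True   exactly when prog(data) eventually stops.
#
# (Such a `halts` is hypothetical; the argument shows it cannot exist.)

def make_troublemaker(halts):
    """Build the self-defeating program used in the diagonal argument."""
    def troublemaker(prog):
        if halts(prog, prog):   # would prog stop when fed its own text?
            while True:         # ...then deliberately loop forever,
                pass
        return "done"           # ...otherwise stop immediately.
    return troublemaker

# Now consider troublemaker applied to itself:
#   - if halts(troublemaker, troublemaker) is True, troublemaker loops forever,
#     so the answer True was wrong;
#   - if it is False, troublemaker returns at once, so False was wrong.
# Either way the supposed tester gives a wrong answer, so no program can
# decide halting for all programs and inputs.

The version quoted in the text, "does a program halt on all inputs?", is also undecidable, by a similar argument.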
To conclude this introductory historical overview, we can note that the development of computing devices follows the needs of human civilization. In the beginning, computers were needed to solve the practical problem of performing long and tedious calculations in a fast and reliable way. The era in which the focus was on the algorithmic aspects of computing coincided with the dominance of mathematicians, logicians and even physicists: they were not only the first users of computers but were also creating new concepts and designing and building new machines.

More recent developments, however, have opened entirely new areas of application for computers. Nowadays computers can be found virtually everywhere, and the majority of users are no longer scientists or technicians but "common people". This has led to an ever-growing demand for improved graphics (stimulated especially by the enormous market for computer games), better audio capabilities and user-friendly interfaces. The enormous expansion of the Internet is one of the motors of progress for personal computers. Modern development is not confined to the home market; it also targets the professional segment, which demands advanced databases, expert systems, distributed and parallel systems, and much more.

1 LEIBNIZ: LOGICAL CALCULUS

Gottfried Wilhelm von Leibniz
Born: 1 July 1646 in Leipzig, Saxony (now Germany)
Died: 14 November 1716 in Hannover, Hanover (now Germany)

G. W. Leibniz was born in Leipzig in 1646. His father, a professor of philosophy at the university of Leipzig, died when the child was six years old, and the boy was brought up by his mother.

In his early teens Leibniz started studying Aristotle's logic. What fascinated him most was the Aristotelian division of concepts into categories. He was inspired to develop his own logical system, based on an alphabet whose elements would represent concepts. Starting from Aristotle, Leibniz conceived of a universal artificial language in which all human knowledge could be formulated as propositions, with "calculational" rules that would expose the logical interrelationships among those propositions. He remained under Aristotle's spell for the rest of his life.

For his bachelor's degree he wrote on Aristotelian metaphysics. His other field of interest was legal studies, and he obtained a second bachelor's degree in law, dealing in his thesis with the use of systematic logic in legal matters.

Leibniz made his first contribution to mathematics in his doctoral dissertation in philosophy. As a step towards the alphabet of concepts, he systematically studied complex arrangements made of basic elements, work he continued to develop in the monograph "Dissertatio de Arte Combinatoria".

Leibniz had a great vision of truly amazing scope. He developed the notation for the differential and integral calculus that is still in use today, notation that made it easy to perform complicated calculations with the help of simple and clear-cut rules. In his vision, something similar could be done for the whole of human knowledge.

He also dreamed of machines capable of carrying out calculations, freeing the mind for creative thought: "For it is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if the machine were used."

Leibniz developed a machine that could not only add and subtract but also multiply, divide, and extract square roots. The Leibniz calculator was gear-operated, and it provided a carry from one order to the next.

Figure 1 Leibniz's calculating machine
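The carry mentioned above is the heart of any such gear-operated calculator: when one digit wheel passes from 9 back to 0, the next wheel must advance by one. The following is only a minimal software analogy of that bookkeeping, offered as an illustration and not as a description of Leibniz's actual stepped-drum mechanism.

def add_with_carry(wheels, amount):
    """Add `amount` to a register of decimal digit wheels.

    `wheels` lists the digits least-significant first, e.g. [9, 9, 3] = 399.
    Whenever a wheel passes 9 it wraps to 0 and advances its neighbour,
    which is exactly the carry "from one order to the next".
    """
    carry = amount
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10      # the wheel shows only its own digit
        carry = total // 10         # the overflow is pushed to the next wheel
    return wheels

register = [9, 9, 3]                # represents 399
print(add_with_carry(register, 5))  # [4, 0, 4] -> 404, the carry rippled twice

In the mechanical machine the same ripple is performed by gears rather than by arithmetic, but the bookkeeping is identical.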
It is the example of algebra that Leibniz cites constantly to show how a system of properly chosen symbols is useful and indeed crucial for deductive thought: "Part of the secret of algebra consists of the characteristic, that is to say of the art of properly using the symbolic expressions. This care for proper use of symbols was to be the 'thread of Ariadne' that would guide the scholar…"

Seventeenth-century mathematics saw two major developments that stimulated mathematical research:

• the systematization of techniques for dealing with algebraic expressions;
• the reduction of geometry to algebra by representing points by pairs of numbers (Descartes, Fermat).

Much work was devoted to limit processes, that is, to finding approximate solutions that approach the exact solution in the limit. The following elegant expression is Leibniz's own result:

    π/4 = 1 − 1/3 + 1/5 − 1/7 + 1/9 − 1/11 + …

It can be interpreted as the area of a circle with radius 1/2, expressed as an infinite series of reciprocals of the odd numbers, alternately added and subtracted.

Finding areas of figures with curved boundaries was one kind of problem solved by limit processes. Another kind was finding rates of change, such as a varying speed. Leibniz recognized that the mathematical operations required for these two types of calculation are inverses of each other (the fundamental theorem of the calculus). Nowadays these operations are called integration (Leibniz's sign ∫ is in fact a modified "S") and differentiation (his "d" suggests "difference").

Calculus Ratiocinator

"I am convinced more and more of the utility and reality of this general science, and I see that very few people have understood its extent… This characteristic consists of a certain script or language… that perfectly represents the relationships between our thoughts. The characters would be quite different from what has been imagined up to now, because one has forgotten the principle that the characters of this script should serve invention and judgement as in algebra or arithmetic. This script will have great advantages; among others, there is one that seems particularly important to me: it will be impossible to write, using these characters, chimerical notions (chimères) such as suggest themselves to us. An ignoramus will not be able to use it, or, in striving to do so, he himself will become erudite."3

Leibniz saw his grand program as consisting of three major steps:

1. Before the appropriate symbols could be selected, it would be necessary to create an information bank encompassing the full extent of human knowledge.
2. The key underlying notions would then be selected and appropriate symbols provided for each of them.
3. Finally, the rules of deduction could be reduced to manipulations of these symbols.

This is what Leibniz called a calculus ratiocinator and what nowadays might be called symbolic logic. For Leibniz nothing, absolutely nothing, about the world was in any way undetermined or accidental; everything followed a plan, clear in the mind of God, by means of which He created the best world that could be created. Hence all aspects of the world were connected by links that one could hope to discover by rational means.

Leibniz really did attempt to produce a calculus ratiocinator, as the following fragment illustrates.

DEFINITION 3. A is in L, or L contains A, is the same as to say that L can be made to coincide with a plurality of terms taken together of which A is one.

B ⊕ N = L signifies that B is in L and that B and N together compose or constitute L. The same thing holds for a larger number of terms.

AXIOM 1. B ⊕ N = N ⊕ B.

POSTULATE. Any plurality of terms, as A and B, can be added to compose A ⊕ B.

AXIOM 2. A ⊕ A = A.

PROPOSITION 5. If A is in B and A = C, then C is in B.
PROPOSITION 6. If C is in B and A = B, then C is in A.

PROPOSITION 7. A is in A. (For A is in A ⊕ A (by Definition 3). Therefore (by Proposition 6) A is in A.)

…

PROPOSITION 20. If A is in M and B is in N, then A ⊕ B is in M ⊕ N.

Figure 2 Sample from one of Leibniz's logical calculi

3 The letter from Leibniz to Jean Galloys, dated December 1678.
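The fragment above (Figure 2) reads surprisingly like a modern axiomatization. One convenient modern reading, offered here only as an illustration and not as Leibniz's own interpretation, takes the "terms" to be sets of elementary concepts, ⊕ to be set union, and "A is in L" to mean that A is a subset of L. A small Python sketch checking the quoted laws on concrete examples:

# An illustrative set-theoretic reading of Leibniz's calculus:
#   a "term" is a set of elementary concepts,
#   A ⊕ B       is modeled by set union,
#   "A is in L"  is modeled by "A is a subset of L".

def combine(a, b):            # plays the role of ⊕
    return a | b

def is_in(a, l):              # plays the role of "A is in L"
    return a <= l             # subset test

A = {"rational"}
B = {"animal"}
L = combine(A, B)             # {"rational", "animal"}

assert combine(A, B) == combine(B, A)        # Axiom 1:  B ⊕ N = N ⊕ B
assert combine(A, A) == A                    # Axiom 2:  A ⊕ A = A
assert is_in(A, L) and is_in(B, L)           # Definition 3: the parts are "in" the whole
assert is_in(A, A)                           # Proposition 7: A is in A
# Proposition 20: if A is in M and B is in N, then A ⊕ B is in M ⊕ N.
M, N = {"rational", "mortal"}, {"animal", "corporeal"}
assert is_in(combine(A, B), combine(M, N))

print("all of the sample laws hold in the set model")

Note that Axiom 2, the idempotent law, is exactly what distinguishes this "addition" from ordinary arithmetic addition; the same law reappears in Boole's algebra of logic, the subject of the next section.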