Before the 19th century
Pioneers in the age of mechanical computers
In Western Europe, the great social changes from the Middle Ages to the Renaissance greatly promoted the development of natural science and technology, and creativity long suppressed by theocracy was released as never before. Building a machine that could help people calculate was one of the brightest of those sparks of thought. From then on, one scientist after another worked tirelessly to turn that spark into a torch that would guide humanity into the realm of freedom. Limited by the overall level of science and technology of their time, however, most of them failed; such is the common fate of pioneers, who seldom see the fruits of their own labor. When later generations enjoy that sweetness, they should be able to taste in it something of the sweat and tears...
1614: The Scotsman John Napier (1550-1617) publishes a paper in which he mentions that he has invented a device that can perform the four arithmetic operations and extract roots.
1623: Wilhelm Schickard (1592-1635) builds a 'calculating clock' that can add and subtract numbers up to six digits, ringing a bell to signal overflow. It is operated by turning gears.
1625: William Oughtred (1575-1660) invents the slide rule.
1642: The French mathematician Blaise Pascal builds the Pascaline, a gear-driven calculating machine capable of eight-digit calculations. It sold in quantity and became a fashionable commodity.
1668: The Englishman Samuel Morland (1625-1695) makes a non-decimal adding machine suited to counting coins.
1671: German mathematician Gottfried Leibniz designs a machine that can multiply, with answers up to 16 digits long.
1775: The Englishman Charles, third Earl Stanhope, builds a calculating machine similar to Leibniz's, but somewhat more advanced.
1776: The German Philipp Matthäus Hahn successfully builds a multiplying calculator.
1801: Joseph-Marie Jacquard develops an automatic loom controlled by punched cards.
1820: The Frenchman Charles Xavier Thomas de Colmar (1785-1870) makes the Arithmometer, the first mass-produced calculating machine, reliable enough to sit on a desktop; it remained on the market for more than 90 years.
1822: The Englishman Charles Babbage (1792-1871) designs his Difference Engine and, later, the Analytical Engine. The designs were far ahead of their time, resembling the electronic computers of a century later, particularly in their use of cards to input programs and data.
1832: Babbage and Joseph Clement produce a working portion of the Difference Engine that can perform six-digit calculations. Later designs grew to 20 and 30 digits and nearly the size of a house, with results to be output on printing plates. But the designs were too difficult to build with the manufacturing technology of the time.
1834: George Scheutz of Stockholm builds a difference engine out of wood.
1834: Babbage conceives a general-purpose Analytical Engine that would store programs and data on punched cards, a kind of read-only memory. He continued the work in later years, increasing the operand size to 40 digits by 1840 and working out, in essence, a control unit (a CPU) and stored programs. The program could even jump on conditions. The machine was to perform an ordinary addition in a few seconds, and a multiplication or division in a few minutes.
1842: Babbage's Difference Engine project is cancelled by the government as too expensive to develop. But he still devoted a great deal of time and energy to his Analytical Engine research.
1843: Scheutz and his son Edvard Scheutz build a difference engine, and the Swedish government agrees to continue supporting their work.
1847: Babbage spends two years designing a simpler, 31-digit difference engine, but no one was interested in building it. When the Science Museum in London later built the machine from his plans using modern technology, it did indeed work accurately.
1848: British mathematician George Boole establishes Boolean algebra, paving the way for the binary computer almost a century in advance.
1853: To Babbage's delight, the Scheutzes complete a full-scale difference engine capable of 15-digit arithmetic, printing its results just as Babbage had envisaged. Brian Donkin of London later built a second, more reliable one.
1858: The first Scheutz tabulating machine is bought by the Dudley Observatory in Albany, and the second by the British government. The observatory never made full use of the machine, and it was later sent to a museum; the second, more fortunate, saw long service.
1871: Babbage builds parts of the Analytical Engine and its printer.
1878: Ramon Verea, a Spaniard living in New York, builds a successful desktop calculator, faster than any before it. He was not interested in bringing it to market; he merely wanted to show that a Spaniard could do better than the Americans.
1879: A commission of inquiry studies the feasibility of completing the Analytical Engine and concludes that it simply cannot work. Babbage was dead by this time. After the inquiry, his Analytical Engine was almost completely forgotten, though not by Howard Aiken.
1885: More calculating machines appear in this period, in the United States, Russia, Sweden, and elsewhere. They began to replace the fault-prone gears with slotted cylinders.
1886: Dorr E. Felt (1862-1930) of Chicago builds the first key-driven calculator; it was so fast that the result appeared the moment the key was released.
1889: Felt introduces the desktop printing calculator.
1890: The United States census. The 1880 census had taken seven years to tabulate, which meant the 1890 count would take more than ten years to complete, so the Census Bureau wanted a machine to make the count more efficient. Herman Hollerith, who later founded the Tabulating Machine Company that eventually became IBM, designed a machine, drawing on Babbage's invention, that stored data on punched cards. It produced an accurate count (62,622,250 people) in just six weeks, and Hollerith made a fortune.
1892: William S. Burroughs (1857-1898) of St. Louis builds a machine more capable than Felt's, truly launching the office automation industry.
1896: Herman Hollerith founds the Tabulating Machine Company, the predecessor of IBM.
1900 ~ 1910
1906: Henry Babbage, Charles Babbage's son, with the support of R. W. Munro, completes part of his father's Analytical Engine, but only to prove that it works; it was never launched as a product.
The first days of electronic computers
Before this point, computers were based on mechanical operation. Although some products began to introduce electrical elements, these remained subordinate to the mechanics, and machines had not yet entered the computer's truly flexible domain: logical computation. Afterward, with the rapid development of electronic technology, the computer began its transition from the mechanical to the electronic era. Electronics increasingly became the body of the computer and mechanics the subordinate part; as their roles reversed, the computer itself began a qualitative change. The main events of this transition period follow:
1906: Lee De Forest of the United States invents the triode vacuum tube. Building a digital electronic computer had been impossible before this; the vacuum tube laid the foundation for the development of electronic computers.
1920 ~ 1930
February 1924: IBM, a landmark company, is founded; the Computing-Tabulating-Recording Company, formed in 1911, is renamed International Business Machines.
1930 ~ 1940
1935: IBM introduces the IBM 601, a punched-card machine that could perform a multiplication in one second. The machine was important both in natural science and commercially. About 1,500 were built.
1937: Alan M. Turing (1912-1954) of the University of Cambridge, England, publishes his paper 'On Computable Numbers' and proposes a mathematical model that came to be known as the Turing machine.
1937: George Stibitz of Bell Laboratories builds a demonstration device that represents binary numbers with relays. Though only a demonstration, it was in effect the first binary electromechanical adder.
1938: Claude E. Shannon publishes a paper on implementing symbolic logic with relays.
1938: Konrad Zuse and his assistants in Berlin complete a mechanical, programmable binary computer based on Boolean algebra, later named the Z1. It was relatively capable, using punched film, something like movie film, as its storage medium. It handled floating-point numbers with a 7-bit exponent and a 16-bit mantissa; numbers were entered on a keyboard and results displayed with small lamps.
January 1, 1939: David Packard and William Hewlett found Hewlett-Packard in a California garage. The company name combines both their surnames; its order was decided by a coin toss.
November 1939: The American John V. Atanasoff and his student Clifford Berry complete a 16-bit adder, the first calculating device to use vacuum tubes.
1939: World War II begins; military needs greatly accelerate the development of computer technology.
1939: Zuse and Schreyer begin building the Z2, based on the Z1 but using relays to improve its storage and arithmetic units. The project was interrupted for a year by Zuse's military service.
1939-1940: Schreyer completes a 10-bit adder using vacuum tubes, with neon lamps for storage.
1940 ~ 1950
January 1940: Samuel Williams and Stibitz of Bell Labs complete a calculator capable of complex-number arithmetic. It used a large number of relays, borrowed from telephone technology, and adopted advanced coding techniques.
Summer 1941: Atanasoff and his student Berry complete a computer capable of solving systems of linear equations, named the ABC (Atanasoff-Berry Computer). It used capacitors as main memory and punched cards as auxiliary memory; the holes were actually 'burned' in. The clock rate was 60 Hz, and an addition took one second.
December 1941: In Germany, Zuse completes the Z3, the first program-controlled electromechanical computer. It handled floating-point numbers with a 7-bit exponent and a 14-bit mantissa, and was built from a large number of relays rather than vacuum tubes. It could perform three to four additions per second; a multiplication took three to five seconds.
1943: Computers built between 1943 and 1959 are often called first-generation computers. They used vacuum tubes, all programs were written in machine code, and punched cards were used for input. A typical machine: UNIVAC.
January 1943: The Mark I automatic sequence-controlled computer is developed in the United States. The whole machine was 51 feet long, weighed 5 tons, and had 750,000 components, using 3,304 relays and 60 switches as mechanical read-only memory. Programs were stored on paper tape, and data could come from paper tape or card readers. It was used to calculate ballistic firing tables for the United States Navy.
April 1943: Max Newman, Wynn-Williams, and their research team complete 'Heath Robinson', a code-breaking machine. It was not strictly a computer, but it used logic elements and vacuum tubes, and its optical reader processed 2,000 characters per second. It, too, was of epoch-making significance.
September 1943: Williams and Stibitz complete the 'Relay Interpolator', later named the 'Model II Relay Calculator', a programmable computer that also used paper tape to enter programs and data. It ran reliably, represented each digit with 7 relays, and could perform floating-point operations.
December 1943: The first programmable vacuum-tube computer, built from 2,400 tubes, is put into service in the United Kingdom. Designed to break German codes, it read about 5,000 characters per second from paper tape, but it was destroyed shortly after use; it is said a mistake made while working on Russian traffic was involved.
1946: ENIAC (Electronic Numerical Integrator And Computer), the first truly digital electronic computer. Development began in 1943 and finished in 1946; the principals were John W. Mauchly and J. Presper Eckert. It weighed 30 tons, contained 18,000 vacuum tubes, and consumed about 150 kilowatts of power. It was used mainly to calculate artillery trajectories and in the development of the hydrogen bomb.
The development of transistor computers
Although the computer of the vacuum-tube era had stepped into the category of the modern computer, its great size, high power consumption, frequent failures, and high price greatly restricted its spread and application. Only when the transistor was invented did the electronic computer find its launching point, and from then on there was no stopping it...
1947: William B. Shockley, John Bardeen, and Walter H. Brattain of Bell Laboratories invent the transistor, ushering in a new era of electronics.
1949: EDSAC: Wilkes and his team at the University of Cambridge build a stored-program computer. Its input and output device was still paper tape.
1949: EDVAC (Electronic Discrete Variable Automatic Computer): the first computer to use magnetic tape. This was a breakthrough; programs could be stored on tape and reloaded many times. The machine was proposed by John von Neumann.
1949: 'Computers in the future will weigh no more than 1.5 tons.' So ran a bold prediction in the scientific journals of the time.
1950 ~ 1960
1950: The floppy disk is invented by Yoshiro Nakamatsu at Tokyo Imperial University; the sales license was acquired by IBM. It opened a new era of storage.
1950: British mathematician and computer pioneer Alan Turing proposes that a computer can possess human intelligence: if a person converses with a machine by question and answer and cannot tell whether the other party is a machine or a person, then the machine has the intelligence of a person.
1951: Grace Murray Hopper completes the first compiler for a high-level language.
1951: Whirlwind: The United States Air Force's first computer-controlled real-time defense system is developed.
1951: UNIVAC-1: the first commercial computer system, designed by J. Presper Eckert and John Mauchly. Its use by the United States Census Bureau marked the beginning of the era of commercial computer applications.
1952: EDVAC (Electronic Discrete Variable Automatic Computer) is completed, built to von Neumann's design.
1953: There are about 100 computers in operation in the world at this time.
1954: John Backus and his research group at IBM begin developing FORTRAN (FORmula TRANslation), completed in 1957. It is a high-level computer language suited to scientific research.
1956: First conference on artificial intelligence held at Dartmouth College.
1957: IBM develops the first dot matrix printer.
1957: The FORTRAN high-level language is successfully completed.
Integrated circuits give the modern computer its wings
September 12, 1958: Jack Kilby of Texas Instruments demonstrates the first integrated circuit; Robert Noyce, later a founder of INTEL, independently arrives at a practical version soon afterward. The microprocessor followed before long. Japan, however, withheld recognition of the integrated-circuit patent for some 30 years, during which Japanese companies paid no royalties; only after it was finally granted did the inventors get a share of the profits, and in 2001 the patent expired.
1959: Computers designed between 1959 and 1964 are generally called second-generation computers. Transistors and printed circuits were used extensively. Computers became smaller and more powerful; they could run FORTRAN and COBOL and accept commands in English characters. Applications appeared in large numbers.
1959: Grace Murray Hopper begins development of COBOL (COmmon Business-Oriented Language), completed in 1961.
1960 ~ 1970
1960: ALGOL: The first structured programming language is introduced.
1961: IBM's Kenneth Iverson introduces the APL programming language.
1963: PDP-8: DEC introduces the first minicomputer.
1964: Computers built from 1964 to 1972 are generally referred to as third-generation computers. Integrated circuits were used in large numbers; the typical model is the IBM 360 series.
1964: IBM releases the PL/1 programming language.
1964: The first of the IBM 360 series of compatible machines is released.
1964: DEC releases the PDP-8 minicomputer.
1965: Moore's Law is published: the number of transistors on a processor doubles every year. The law was later revised to every two years.
1965: Lotfi Zadeh creates fuzzy logic to deal with approximation problems.
1965: Thomas E. Kurtz and John Kemeny complete the BASIC (Beginner's All-purpose Symbolic Instruction Code) language. Especially suitable for computer education and for beginners, it was widely adopted.
1965: Douglas Engelbart proposes the idea of the mouse, though it was not pursued further at the time. Apple Computer adopted it widely in 1983.
1965: The first supercomputer, the CDC 6600, is developed.
1967: Niklaus Wirth begins development of the PASCAL language, completed in 1971.
1968: Robert Noyce and a few of his friends start INTEL Corporation.
1968: Seymour Papert and his research group develop the LOGO language at MIT.
1969: Project ARPANET is launched, the beginning of the modern INTERNET.
April 7, 1969: RFC 1, the first of the Request for Comments network-protocol documents, is published.
1969: The Electronic Industries Association (EIA).
1970 ~ 1980
1970: The first RAM chip is introduced by INTEL, with a capacity of 1K bits.
1970: Ken Thompson and Dennis Ritchie begin developing the UNIX operating system.
1970: The Forth programming language is developed.
1970: ARPAnet (Advanced Research Projects Agency network), the prototype of the INTERNET, is basically completed and opened to non-military sectors; many universities and businesses began to connect.
November 15, 1971: Marcian E. Hoff of INTEL develops the 4004, the first microprocessor: 2,300 transistors on a 4-bit chip, with a clock rate of 108 kHz, executing 60,000 instructions per second.
The main milestones of later processor development can be summarized as follows:

Processor      Clock speed   MIPS (millions of instructions per second)
4004           108 kHz       0.06
8080           2 MHz         0.5
68000          8 MHz         0.7
8086           8 MHz         0.8
68000          16 MHz        1.3
68020          16 MHz        2.6
80286          12 MHz        2.7
68030          16 MHz        3.9
386 SX         20 MHz        -
68030          25 MHz        6.3
68030          40 MHz        10
386 DX         33 MHz        -
486 DX         25 MHz        -
486 DX2-50     50 MHz        35
486 DX4/100    100 MHz       60
Pentium        66 MHz        100
Pentium        133 MHz       240
Pentium MMX    233 MHz       435
Pentium Pro    200 MHz       440
Pentium II     233 MHz       560
Pentium II     333 MHz       770
1971: PASCAL language development completed.
1972: Computers built after 1972 are conventionally called fourth-generation computers, based on large-scale and, later, very-large-scale integrated circuits. Computers became more powerful and smaller still. People began to wonder whether computers could keep shrinking, in particular whether the heat problem could be solved, and discussion of a fifth generation of computers began.
1972: Development of the C language is completed, designed by Dennis Ritchie, one of the developers of the UNIX system. It is a very powerful language for developing system software, and is particularly well loved.
1972: Hewlett-Packard invents the HP-35, the first handheld scientific calculator.
April 1, 1972: INTEL introduces the 8008 microprocessor.
1972: ARPANET goes global and the INTERNET revolution begins.
1973: The arcade game Pong, created by Nolan Bushnell, later the founder of Atari, is released to wide popularity.
1974: CLIP-4, the first computer with a parallel architecture, is introduced.
Computer technology enters its brilliant era
Until this point, computer technology had been concentrated in the development of mainframes and minicomputers, but with the advance of very-large-scale integration and microprocessor technology, the technical barriers keeping computers out of ordinary homes were broken through one by one. Especially after INTEL released the 8080 microprocessor used in personal computers, the wave surged, and pioneers of the information age such as Steve Jobs and Bill Gates emerged, figures who still play pivotal roles in the computer industry today. Internet and multimedia technology also developed as never before, and computers truly began to change people's lives.
April 1, 1974: INTEL releases its 8-bit microprocessor chip, the 8080.
December 1974: MITS releases the Altair 8800, the first commercial personal computer, priced at $397, with 256 bytes of memory.
1975: Bill Gates and Paul Allen complete the first BASIC program to run on the MITS Altair computer.
1975: IBM introduces its laser printer technology; it brought a color laser printer to market in 1988.
1975: Bill Gates and Paul Allen found Microsoft, now the largest and most successful software company. Three years later it had revenues of $500,000 and 15 employees; by 1992, revenues of $2.8 billion and 10,000 employees. Its biggest breakthrough was developing the operating system for IBM's PC in 1981, and it has had a huge influence on the computer industry ever since.
1975: The IBM 5100 is released.
1976: Steve Wozniak and Steve Jobs found Apple Computer and introduce the Apple I computer.
1976: Zilog introduces the Z80, an 8-bit microprocessor. CP/M was the operating system developed for it, and many famous programs, such as WordStar and dBase II, ran on this processor.
1976: The 6502, an 8-bit microprocessor, is released; it was later used in the Apple II computer.
1976: Cray-1, the first commercial supercomputer, is released. It integrated 200,000 transistors and performed 150 million floating-point operations per second.
May 1977: The Apple II computer is released.
1978: The Commodore PET is released, with 8K of RAM, a cassette deck, and a 9-inch display.
June 8, 1978: INTEL releases its 16-bit 8086 microprocessor. Because it was very expensive, the 8-bit 8088 was introduced to meet the market's need for a low-cost processor, and it was adopted by IBM's first-generation PCs. Available clock frequencies were 4.77, 8, and 10 MHz; the chip had about 300 instructions and integrated 29,000 transistors.
1979: The arcade game 'Space Invaders' is released, causing a sensation. It soon made such game consoles so popular that they generated more revenue than the American film industry.
1979: Jean Ichbiah develops the Ada computer language.
June 1, 1979: INTEL releases the 8-bit 8088 microprocessor, designed purely to meet the needs of low-cost computers.
1979: Commodore releases a PET with a 1 MHz 6502 processor, a monochrome display, and 8K of memory, with additional memory available as needed.
1979: The low-density floppy disk is invented.
1979: Motorola releases the 68000 microprocessor, best known for powering the Apple Macintosh; its successor, the 68020, powered the Macintosh II.
1979: Seeing the personal computer market occupied by Apple and other companies, IBM decides to develop its own personal computer. To launch a product as quickly as possible, IBM delegated a great deal of work to third parties, with Microsoft undertaking the development of the operating system. The IBM PC appeared on August 12, 1981, and in the process IBM provided the fertile ground for Microsoft's later rise.
1980 ~ 1990
October 1980: MS-DOS/PC-DOS development begins. Microsoft did not have its own operating system, so it bought one from another company and improved it. When IBM tested the system, it found 300 bugs, so the improvement continued; the original DOS 1.0 comprised 4,000 lines of assembler.
1981: Xerox begins working on graphical user interfaces, icons, menus, and pointing devices such as the mouse. The research results were later drawn on by Apple, which in turn accused Microsoft of stealing the design for its WINDOWS software.
1981: INTEL releases the 80186/80188 chips, which were rarely used because their built-in registers were incompatible with other designs; they did, however, use direct memory access and time-slice time-sharing technology.
August 12, 1981: IBM releases its personal computer, priced at $2,880, with 64K of memory, a monochrome display, and an optional cassette tape drive.
The world's first program was written in 1842-43, a century before the first real machine that could be called a computer existed. The code was written by Ada Augusta, Countess of Lovelace, better known as Ada Lovelace. As the author of the world's first computer program, she is widely regarded as the first programmer ever.
When Ada is called a programmer, it is easy to forget how long ago the world's first code was created: Samuel Morse had just demonstrated the telegraph, the Amistad slave revolt was being argued in the courts, and the Ottoman Empire was fighting Egypt for the Middle East. The word 'computer' would, for another hundred years, still mean a person who did computing work. That was a long time ago.
The first piece of code in the world was written for Charles Babbage's Analytical Engine, which was never actually built, although it could have been. Ada Lovelace saw the potential of Babbage's machine and grasped the idea of a programmable computer. She translated a paper by the Italian mathematician Luigi Menabrea, 'Sketch of the Analytical Engine', for Taylor's Scientific Memoirs, and illuminated it through her own extensive translator's notes.
\"It is necessary to avoid exaggerating the idea that the energy derived from the analytical engine.\"
In truth, Lovelace could hardly have overstated a design that anticipated major parts of the modern computer. Because Babbage published little about the Analytical Engine, Lovelace's notes became an important influence on later development, most notably prompting Alan Turing's idea of a general-purpose stored-program computer. Ada did not live to see this; she died at 36, and the notes remained her only publication. Had she lived and worked a few more years, might the computer look different today?
To return to the question: if Babbage had had the resources to build the Analytical Engine and let Lovelace run programs on it, what was the purpose of the world's first program? It made the Analytical Engine compute the sequence of Bernoulli numbers. Lovelace described how the program would be fed to the Analytical Engine on a large number of punched cards. In her implementation she set the first members of the sequence (B0 = 1, B1 = -1/2), and then computed the rest of the sequence starting from B2, the first nontrivial Bernoulli number, which she labeled B1.
A modern JavaScript version of Ada's stack of massive punched cards might look something like the sketch below. It is not a simulation of Ada's code on Babbage's machine, just another implementation of the algorithm she used.
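Here is a minimal sketch, assuming the standard recurrence for Bernoulli numbers, B_m = -1/(m+1) * sum over j from 0 to m-1 of C(m+1, j) * B_j, with B_0 = 1; Ada's notes derive an equivalent relation, but the function names binomial and bernoulli here are illustrative, not hers:

    // Bernoulli numbers via the standard recurrence
    //   B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j,  with B_0 = 1.
    // Plain floating-point arithmetic is used; the Analytical Engine worked
    // with fixed-point number columns, so this illustrates the algorithm,
    // not the machine.

    // Binomial coefficient C(n, k), built up multiplicatively.
    function binomial(n, k) {
      let c = 1;
      for (let i = 1; i <= k; i++) {
        c = (c * (n - i + 1)) / i;
      }
      return c;
    }

    // Returns the sequence [B_0, B_1, ..., B_n].
    function bernoulli(n) {
      const B = [1]; // B_0 = 1
      for (let m = 1; m <= n; m++) {
        let sum = 0;
        for (let j = 0; j < m; j++) {
          sum += binomial(m + 1, j) * B[j];
        }
        B.push(-sum / (m + 1));
      }
      return B;
    }

    console.log(bernoulli(8));
    // -> [1, -0.5, 0.1667, 0, -0.0333, 0, 0.0238, 0, -0.0333] (approximately)

With this convention every odd-indexed number after B_1 comes out as zero, which is why Lovelace could number only the nonzero ones: her B1 is the modern B2, her B3 the modern B4, and so on.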
By the way, no one has ever found a bug in Ada's Bernoulli code. Although she invented programming, she obviously did not invent bugs.
Ada Lovelace Day is an international celebration of women's achievements in science, technology, engineering and mathematics.