History of computing


Computing history overview

This narrative presents the major developments in the history of computing and tries to put them into perspective. For a detailed timeline of events, see computing timeline.

Earliest Devices

Humanity has used devices to aid in computation for millennia; an example is the abacus. The first machines that could arrive at the answer to an arithmetical question more or less autonomously started to appear in the 1600s, limited to addition and subtraction at first, but later also able to perform multiplication. These devices used techniques such as cogs and gears first developed for clocks. The difference engines of the 1800s could carry out a long sequence of such calculations in order to construct mathematical tables, but were not widely used.

The defining feature of a "universal computer" is programmability, which allows the computer to emulate any other calculating machine by changing a stored sequence of instructions. In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability.
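The principle described above -- that changing the stored instruction sequence, rather than the machine itself, changes the behaviour -- can be sketched in a few lines of Python (all names here are illustrative, not taken from any historical machine):

```python
# A fixed "machine" whose behaviour is determined entirely by a
# replaceable sequence of instructions, much as the Jacquard loom's
# pattern was determined by a replaceable chain of punched cards.

def run(program, value):
    """Interpret a stored sequence of (operation, operand) instructions."""
    for op, operand in program:
        if op == "add":
            value += operand
        elif op == "mul":
            value *= operand
    return value

# Two different "card chains" turn the same machine into two different
# calculators, with no change to the machine's construction.
doubler = [("mul", 2)]
add_then_triple = [("add", 1), ("mul", 3)]

print(run(doubler, 5))          # 10
print(run(add_then_triple, 5))  # 18
```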

In 1835 Babbage described his analytical engine. It was the plan of a general-purpose programmable computer, employing punched cards for input and a steam engine for power. While the plans were sound, a lack of precision in the mechanical parts, disputes with the artisans who were building them, and the end of government funding made it impossible to build. Ada Lovelace, Lord Byron's daughter, wrote the best surviving account of programming the analytical engine, and seems to have actually developed some programs for it. These efforts make her the world's first computer programmer. Babbage's earlier Difference Engine has since been built to his design and is operational at the London Science Museum; it works exactly as he designed it to.

In 1890 the United States Census Bureau used punched-card tabulating and sorting machines designed by Herman Hollerith to handle the flood of data from the decennial census mandated by the Constitution. Hollerith's company eventually became the core of IBM.

In the twentieth century, electricity was first used for calculating and sorting machines. Earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors. Before World War II, mechanical and electrical analog computers were the state of the art, and many thought they were the future of computing. Analog computers use continuously varying physical quantities, such as voltages, currents, or the rotational speed of shafts, to represent the quantities being processed. An ingenious example of such a machine was the water integrator built in 1936. Unlike modern digital computers, analog computers are not very flexible, and must be reconfigured (i.e., reprogrammed) manually to switch from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems while the earliest attempts at digital computers were quite limited. But as digital computers became faster and gained larger memories (e.g., RAM or internal store), they almost entirely displaced analog computers.

First Generation of Computers

The era of modern computing began with a flurry of development before and during World War II, as electronic circuit elements such as vacuum tubes, capacitors, and relays replaced mechanical equivalents and digital calculation replaced analog calculation. The computers designed and constructed then have been called 'first generation' computers. First generation computers were usually built by hand, using circuits containing relays or vacuum valves (tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Temporary, or working, storage was provided by acoustic delay lines (which use the propagation time of sound in a medium such as wire to store data) or by Williams tubes (which use the ability of a television picture tube to store and retrieve data). By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s. An example of a practical WWII-era machine was the Torpedo Data Computer employed in American submarines, which allowed the operator to input a few pieces of data, such as the submarine's speed and heading, and some observed variables about a target vessel. The TDC would then calculate and display the exact aiming point for firing torpedoes. The TDC contributed substantially to American dominance in submarine warfare in the Pacific.

This era saw numerous electromechanical calculating devices of various capabilities which had a limited impact on later designs. In 1938 Konrad Zuse started construction of the first of his Z-series machines, electromechanical calculators featuring memory and limited programmability. Zuse was (inadequately) supported by the German Wehrmacht, which used his proto-computers in the production of guided missiles. The Z-series pioneered many advances, such as the use of binary arithmetic and floating point numbers. In 1940, the Complex Number Calculator, a relay-based calculator for complex arithmetic, was completed. It was the first machine ever used remotely over a phone line. In 1938 John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the Atanasoff Berry Computer (ABC), a special purpose computer for solving systems of linear equations, which used capacitors mounted in a mechanically rotating drum for memory.

During World War II, the British made significant efforts at Bletchley Park to break German military communications. The main German cypher system (the Enigma in several variants) was attacked with the help of purpose-built 'bombes' which helped find possible Enigma keys after other techniques had narrowed down the possibilities. The Germans also developed a series of cypher systems (called Fish cyphers by the British) which were quite different from the Enigma. As part of an attack against these cyphers, Professor Max Newman and his colleagues (including Alan Turing) designed the first programmable electronic digital computer, the Colossus. It used vacuum tubes, had a paper-tape input and allowed some programmability. Several machines were built, in at least two variants, but details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill is said to have personally issued an order for their destruction into pieces no larger than a man's hand. There is an active project to build a copy of one of the Colossus machines.

Turing's pre-war work was a major influence on the design of the modern computer, and after the war he went on to design, build and program some of the earliest computers at the National Physical Laboratory and at the University of Manchester. His 1936 paper "On Computable Numbers" included a description of what is now called the Turing machine, a purely theoretical device invented to formalize the notion of algorithm execution. Modern computers are Turing-complete (i.e., equivalent in algorithm execution capability to a universal Turing machine), except for their finite memory. Turing completeness is a threshold capability separating general-purpose computers from their special-purpose predecessors. It is as good a criterion as any for defining "the first computer", but unfortunately even with this restriction there is no simple answer as to which computer was the first. Babbage's Analytical Engine was the first design of a Turing-complete machine, Zuse's Z3 was the first Turing-complete working machine (though this was unknown to Zuse and was proved only in 1998, after his death), and the electronic ENIAC was the first working Turing-complete computer designed and used as such. The ABC machine was not programmable, though in most other respects it was a complete computer in the modern sense. George Stibitz and colleagues at Bell Labs in New York City produced several relay-based 'computers' in the late '30s and early '40s, but were concerned mostly with problems of telephone system control, not computing. Their efforts were nevertheless a clear antecedent for another electromechanical American machine.
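The Turing machine is simple enough to sketch in code. The following toy simulator in Python (the function names and the sample machine are illustrative, not from Turing's paper) runs the two-state "busy beaver", which writes four 1s and halts:

```python
# A toy Turing machine simulator. The transition table maps
# (state, symbol) -> (symbol_to_write, head_move, next_state).
# That any modern language can express this loop is the essence
# of Turing completeness.
from collections import defaultdict

def simulate(table, state, halt="H", max_steps=1000):
    tape = defaultdict(lambda: "0")   # infinite tape of blanks ("0")
    head = 0
    while state != halt and max_steps > 0:
        write, move, state = table[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        max_steps -= 1
    return "".join(tape[i] for i in sorted(tape))

# The two-state "busy beaver": writes four 1s in six steps, then halts.
bb2 = {
    ("A", "0"): ("1", "R", "B"),
    ("A", "1"): ("1", "L", "B"),
    ("B", "0"): ("1", "L", "A"),
    ("B", "1"): ("1", "R", "H"),
}
print(simulate(bb2, "A"))  # 1111
```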

The Harvard Mark I (officially, the Automatic Sequence Controlled Calculator) was a general purpose electro-mechanical computer built with IBM financing and with assistance from some IBM personnel under the direction of Harvard mathematician Howard Aiken. Its design was influenced by the Analytical Engine. It used storage wheels and rotary switches in addition to electromagnetic relays, was programmable by punched paper tape, and contained several calculators working in parallel. Later models contained several paper tape readers and the machine could switch between readers based on a condition. Nevertheless, this does not quite make the machine Turing-complete. Development began in 1939 at IBM's Endicott laboratories; the Mark I was moved to Harvard University to begin operation in May 1944.

The US-built ENIAC (Electronic Numerical Integrator and Computer), the first large-scale general-purpose electronic computer, publicly validated the use of electronics for large-scale computing. This was crucial for the development of modern computing, initially because of the enormous speed advantage, but ultimately because of the potential for miniaturization. Built under the direction of John Mauchly and J. Presper Eckert, it was 1,000 times faster than its contemporaries.

Its development and construction lasted from 1941 to full operation at the end of 1945. When its design was proposed, many researchers believed that the thousands of delicate valves (ie, vacuum tubes) would burn out often enough that the ENIAC would be so frequently down for repairs as to be useless. It was, however, capable of 5,000 simple calculations a second for hours at a time between valve failures. It was programmable, not only by rewiring as originally designed, but later also with fixed wiring executing stored programs set in function table memory using a scheme named after John von Neumann.

By the time the ENIAC was successfully operational, the plans for the EDVAC were already in place. Insights from experience with ENIAC led to the EDVAC design, which had unrivalled influence in the initial stage of the computer revolution. The design team was led by von Neumann.

The essentials of the EDVAC design have come to be known as the von Neumann architecture: programs are stored in the same memory 'space' as the data. Unlike the ENIAC, which used parallel processing, the EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, and brought increased reliability. The EDVAC design can be seen as the "Eve" from which nearly all current computers derive their architecture.
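The von Neumann idea can be sketched in miniature (all opcodes and addresses below are invented for illustration): instructions and data share one memory, and a single processing unit repeatedly fetches, decodes, and executes.

```python
# A minimal stored-program machine: the program counter walks through
# memory, and instructions and data live in the same address space.

def run(memory):
    pc, acc = 0, 0                  # program counter and accumulator
    while True:
        op, addr = memory[pc]       # fetch and decode
        pc += 1
        if op == "LOAD":            # execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program, addresses 4-6 hold the data --
# one shared memory, the defining trait of the architecture.
mem = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
       4: 2, 5: 3, 6: 0}
print(run(mem)[6])  # 5
```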

The first working von Neumann machine was the Manchester "Baby", built at the University of Manchester in 1948; it was followed in 1949 by the Manchester Mark I computer which functioned as a complete system using the Williams tube for memory. This University machine became the prototype for the Ferranti Mark I, the world's first commercially available computer (although some point out that LEO I was the computer that was used for the world's first regular routine office computer job, in November 1951). The first model was delivered to the University in February 1951 and at least nine others were sold between 1951 and 1957.

Later in 1951, the UNIVAC I (Universal Automatic Computer), delivered to the U.S. Census Bureau, was the first commercial computer to attract U.S. public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as the "IBM UNIVAC". Remington Rand eventually sold 46 machines at more than $1 million each. UNIVAC was the first 'mass produced' computer; all predecessors had been 'one-off' units. It used 5,200 vacuum tubes and consumed 125 kW of power. For memory it used mercury delay lines capable of storing 1,000 72-bit words. Unlike earlier machines it did not use a punched card system but magnetic metal tape for input.

Second Generation

The next major step in the history of computing was the invention of the transistor in 1947. This replaced the fragile and power hungry valves with a much smaller and more reliable component. Transistorised computers are normally referred to as 'Second Generation' and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still large and primarily used by universities, governments, and large corporations. For a sense of the scale of the era's machines, the IBM 650 of 1954 (itself still a vacuum-tube design) weighed over 900 kg, the attached power supply weighed around 1350 kg, and both were held in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000, or could be leased for $3,500 a month.

In 1955, Maurice Wilkes invented microprogramming, now almost universally used in the implementation of CPU designs: the CPU's instruction set is defined by a stored microprogram rather than being wired directly into the hardware.
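Microprogramming can be illustrated with a hypothetical sketch (every name below is invented for illustration): each visible machine instruction expands into a stored sequence of primitive micro-operations, which a fixed micro-engine executes.

```python
# The hardware provides only primitive micro-operations...
MICRO_OPS = {
    "inc": lambda r: r + 1,       # increment register
    "shl": lambda r: r << 1,      # shift register left (double it)
}

# ...while the "microprogram store" defines each visible instruction
# as a sequence of those micro-operations. Changing this table changes
# the instruction set without changing the hardware.
MICROCODE = {
    "DOUBLE": ["shl"],
    "DOUBLE_PLUS_ONE": ["shl", "inc"],
}

def execute(instruction, register):
    """The fixed micro-engine: step through the stored microprogram."""
    for micro_op in MICROCODE[instruction]:
        register = MICRO_OPS[micro_op](register)
    return register

print(execute("DOUBLE_PLUS_ONE", 5))  # 11
```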

In 1956, IBM sold its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control). It used 50 24-inch metal disks, with 100 tracks per side. It could store 5 megabytes of data and cost $10,000 per megabyte.

The first high level general purpose programming language, FORTRAN, was also being developed at IBM around this time.

In 1960 IBM shipped the transistor-based IBM 1401 mainframe, which used punched cards. It proved a popular general purpose computer: 12,000 were shipped, making it the most successful machine in computer history up to that point. It used a magnetic core memory of 4,000 characters. Many aspects of its design were based on the desire to replace punched card machines, which had been in wide use from the 1920s through the early 1970s.

In 1960 IBM also shipped the transistor-based IBM 1620, which likewise used punched cards. It proved a popular scientific computer and about 2,000 were shipped. It used a magnetic core memory of up to 60,000 decimal digits.

In 1964 IBM announced the S/360 series, which was the first family of computers that could run the same software at different combinations of speed, capacity and price. It also pioneered the commercial use of microprograms, and an extended instruction set designed for processing many types of data, not just arithmetic. In addition, it unified IBM's product line, which prior to that time had included both a "commercial" product line and a separate "scientific" line. The software provided with System/360 also included major advances, including commercially available multi-programming, new programming languages, and independence of programs from input/output devices. Over 14,000 System/360 systems were shipped by 1968.

In 1965, DEC launched the PDP-8, a much smaller machine intended for use by technical staff in laboratories and for research.

Third Generation and Beyond

The explosion in the use of computers began with 'Third Generation' computers. These relied on Jack St. Clair Kilby's and Robert Noyce's invention: the integrated circuit, or microchip. The first integrated circuit was produced in September 1958, but computers using them didn't begin to appear until 1963. While large 'mainframes' such as the IBM System/360 increased storage and processing capabilities further, the integrated circuit allowed the development of much smaller computers that began to bring computing into many smaller businesses. These were eventually called minicomputers.

The minicomputer was a significant innovation in the 1960s and 1970s. It brought computing power to more people, not only through more convenient physical size but also through broadening the computer vendor field. Digital Equipment Corporation became the number two computer company behind IBM with their popular PDP and VAX computer systems. Smaller, affordable hardware also brought about the development of important new operating systems like Unix.

Large scale integration of circuits led to the development of very small processing units. An early example is the processor used for analysing flight data in the US Navy's F-14A Tomcat fighter jet, developed by Steve Geller, Ray Holt and a team from AiResearch and American Microsystems.

In 1966 Hewlett-Packard entered the general purpose computer business with its HP-2115, offering a computational power formerly found only in much larger computers. It supported a wide variety of languages, among them BASIC, ALGOL, and FORTRAN.

Data General's Nova, introduced in 1969 at $8,000 each, eventually shipped a total of 50,000 units. The Nova was one of the first 16-bit minicomputers and led the way toward word lengths that were multiples of the 8-bit byte. It was the first to employ medium-scale integration (MSI) circuits from Fairchild Semiconductor, with subsequent models using large-scale integrated (LSI) circuits. Also notable was that the entire central processor was contained on one 15-inch printed circuit board.

In 1973 the TV Typewriter, designed by Don Lancaster, provided the first display of alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of Radio Electronics magazine. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A 90-minute cassette tape provided supplementary storage for about 100 pages of text. His design used a microprocessor, minimalistic hardware and some software to generate the timing of the various signals needed to create the TV signal. Clive Sinclair later used the same approach in his legendary Sinclair ZX80.

On November 15, 1971, Intel released the world's first commercial microprocessor, the 4004, developed for the Japanese calculator company Busicom as an alternative to hardwired circuitry. Fourth generation computers followed, using a microprocessor to place much of the computer's processing logic on a single small chip. Coupled with another Intel product, the RAM chip (kilobits of memory on a single chip), based on an invention by Bob Dennard of IBM, the microprocessor allowed fourth generation computers to be even smaller and faster than ever before. The 4004 was capable of only 60,000 instructions per second, but later processors (such as the 8086 and 8088, on which the IBM PC and its compatibles are based) brought ever-increasing speed and power to computers.

Supercomputers of the era were immensely powerful. In 1976 the Cray-1 was developed by Seymour Cray, who had left Control Data in 1972 to form his own company. This machine was known as much for its horseshoe-shaped design -- an effort to speed processing by shortening circuit paths -- as for being the first supercomputer to make vector processing practical. Vector processing, which uses a single instruction to perform the same operation on many numbers, has been a fundamental supercomputer processing style ever since. The Cray-1 could perform 150 million floating point operations per second. Eighty-five were shipped, at a cost of $5 million each.
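Vector processing can be sketched conceptually (this is an illustration in Python, not Cray hardware): one "instruction" applies the same operation across whole vectors of numbers, rather than looping over scalars one at a time.

```python
# Conceptually a single vector instruction: add two vectors element-wise.
# On a Cray-1, a hardware pipeline would stream the elements through one
# functional unit instead of issuing a separate instruction per element.
def vector_add(a, b):
    return [x + y for x, y in zip(a, b)]

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
print(vector_add(a, b))  # [11.0, 22.0, 33.0, 44.0]
```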

The Home Computer Era

No history of personal computing is complete without mention of the Altair, featured on the cover of Popular Electronics for January 1975. It was the world's first mass-produced personal computer kit, as well as the first computer to use an Intel 8080 processor. It was a huge success: 10,000 Altairs were shipped. The Altair also inspired the software development efforts of Bill Gates and Paul Allen, who developed a full-featured BASIC interpreter for the machine.

The Intel 8080 microprocessor chip (and a follow-on clone, the Zilog Z80) led to the first wave of small business computers in the late 1970s. Many of them used the S-100 bus (first introduced in the Altair) and most ran the CP/M-80 operating system from Digital Research, founded by Gary Kildall. CP/M-80 was the first popular microcomputer operating system to be used by many different hardware vendors, and many ground-breaking software packages were written for it, such as WordStar and dBase II. The commands in CP/M-80 were patterned after operating systems from Digital Equipment Corporation, such as RSTS and RT-11, and in turn CP/M was copied -- down to the file and memory structures -- in Microsoft's MS-DOS.

Many hobbyists of the time tried to design their own systems, with various degrees of success, and sometimes banded together to ease the job. Out of these house meetings the Homebrew Computer Club developed, where hobbyists met to talk about what they had done, exchange schematics and software and show off their systems.

At the same time, those same hobbyists were also interested in something ready-built that the average person could afford. Steve Wozniak, a regular visitor to Homebrew Computer Club meetings, designed the Apple I, a single-board computer built around the 6502 microprocessor. With specifications in hand and an order for 100 machines at $500 each from the Byte Shop, he and Steve Jobs got their start in business. About 200 of the machines sold before the company announced the Apple II as a complete computer. The Apple II was one of three personal computers launched in 1977. Despite its higher price, it quickly pulled away from the TRS-80 and the Commodore PET to lead the pack in the late 70s and to become the symbol of the personal computing phenomenon.

Unlike the TRS-80, the Apple II featured high build quality and a number of technical advances. It had an open architecture, used color graphics, and, most importantly, had an elegantly designed interface to a floppy disk drive, a storage device that until then only mainframes and minis had used.

Another key to success was the software: the Apple II was chosen by entrepreneurs Daniel Bricklin and Bob Frankston to be the desktop platform for the first "killer app" of the business world -- the VisiCalc spreadsheet program. That created a phenomenal business market for the Apple II; and the corporate presence attracted many software and hardware developers to the machine.

The rise of Apple Computer is one of America's great success stories. Based on the business and technical savvy of Steve Jobs and Steve Wozniak, and the marketing expertise of Mike Markkula, Apple dominated the personal computer industry between 1977 and 1983.

More than 2 million Apple IIs were shipped, at a price of $970 for the 4KB model.

The Commodore PET (Personal Electronic Transactor) -- the first of several personal computers released in 1977 -- came fully assembled and was straightforward to operate, with either 4 or 8 kilobytes of memory, a built-in cassette drive, and a membrane "chiclet" keyboard. It was followed by the VIC-20, which had 2.5 KB of usable memory and was cheaper than Apple's offerings.

A popular personal computer designed to be connected to a TV was released by Commodore in 1982: the Commodore 64 (C64). Magazines became available which contained the code for various utilities and games. All of these machines used the MOS Technology 6502 CPU or its derivatives; MOS Technology was owned by Commodore. The C64 was followed in 1985 by the more powerful Amiga 1000, built around the Motorola 68000 CPU.


Amstrad CPC464 computer in 1988. Data storage used standard tape cassettes inserted into the reader on the right of the keyboard.

Many other home computers came onto the market. To name just a few: the Atari 400, the Sinclair ZX Spectrum, the TI-99/4, the BBC Micro, the Amstrad/Schneider CPC464, the Oric Atmos, the Coleco Adam, the SWTPC 6800 and 6809 machines, the Tandy Color Computer and Dragon 32, the Exidy Sorcerer, and the MSX range of home computers.

Years later, when IBM PC machines started to take over the role of the home computer, some of the bigger home computer companies came out with 16-bit machines to compete with the PC. Most famous were the Atari ST and the Commodore Amiga ranges.

In 1981 IBM decided to enter the personal computer market after seeing the success of the Apple II. The first model was the IBM PC. (IBM-compatible computers quickly became known simply as PCs, to the irritation of earlier manufacturers who had thought they were making personal computers but hadn't thought of trademarking the initials.) It was based on an open architecture which allowed third-party cards and peripherals to be used with it. It used the Intel 8088 CPU running at 4.77 MHz, containing 29,000 transistors, and could accommodate up to 640KB of RAM, though few could afford that much in the early '80s. The first model used an audio cassette for external storage, though there was an expensive floppy disk option. Three operating systems were available for those who chose the floppy disk version: CP/M-86 from Digital Research, PC-DOS from IBM, and the UCSD p-System (from the University of California at San Diego). PC-DOS was the IBM-branded version of an operating system from Microsoft, previously best known for supplying BASIC language systems to computer hardware companies. When sold by Microsoft itself, it was called MS-DOS. The UCSD OS was built around the Pascal programming language and was, most probably, too specialized for IBM's customers. CP/M-86 did not survive the competition with Microsoft, and so MS-DOS, under one name or another, swept the field.

The PC Era

About this time, "clone" machines started appearing on the market; these were 'off-brand' machines designed to run the same software as the 'on-brand' ones. Notable were the Franklin 1000 Apple II-compatible and the first IBM PC-compatibles from Compaq and others. Legal battles established the legitimacy of the machines, and the lower prices made them popular. Some introduced new features that the popular brands didn't have--the Franklin, for example, had lowercase display that the Apple II lacked, and Compaq's first machines were portable (or "luggable" in the terminology later developed to distinguish their suitcase-sized machines from laptops).

In 1982 Intel released the 80286 CPU, and in 1984 IBM released the IBM PC/AT built around it. This chip could address up to 16 MB of RAM, but the MS-DOS operating system of the time was not able to take advantage of this capability. Bill Gates of Microsoft has been quoted as saying something like, "Why would anyone want more than 640KB?" Lotus Development Corporation and others created competing and incompatible standards for addressing extra memory for software such as the Lotus 1-2-3 spreadsheet, and for a time this created much confusion in the software business.

Eventually the PC would take over the role of the 8-bit home computers and become the dominant personal computer architecture, especially in the small business market. However, this did not happen overnight. For many years PCs and other home computers competed for the money and attention of the home user. But for business use the IBM-compatible PC quickly became the standard, challenged only by the Macintosh and perhaps, in its heyday, the Atari ST.

In 1983 Apple introduced its Lisa. The first personal computer with a graphical user interface, its development was central in the move to such systems for personal computers. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 megabyte of RAM, a 12-inch black-and-white monitor, dual 5 1/4-inch floppy disk drives and a 5 megabyte ProFile hard drive. The Xerox Star -- which drew on Xerox PARC's Smalltalk system, with its mouse, windows, and pop-up menus -- inspired the Lisa's designers. However, the Lisa's slow operating speed and high price ($10,000) led to its ultimate failure.

Apple Computer also launched the Apple Macintosh, the first successful mouse-driven computer with a graphical user interface, with a single $1.5 million commercial during the 1984 Super Bowl. Based on the Motorola 68000 microprocessor, the Macintosh included many of the Lisa's features at a much more affordable price: $2,500.

Applications that came with the Macintosh included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG word processing.

Although processing power and storage capacities have increased beyond all recognition since the 1970s, the underlying technology of LSI (large scale integration) and VLSI (very large scale integration) microchips has remained basically the same, so most of today's computers are widely regarded as still belonging to the fourth generation.


All Wikipedia text is available under the terms of the GNU Free Documentation License

 