An integrated circuit (IC) is a microelectronic semiconductor device consisting of many interconnected transistors and other components.
[Figure: SEM image of an integrated circuit, 400x magnification]
ICs are constructed ("fabricated") on a small rectangle, called a "die", cut from a wafer of silicon (or, for special applications, sapphire or gallium arsenide); this is known as the "substrate". Photolithography is used to mark out the areas of the substrate to be doped and the areas where polysilicon or aluminium tracks are to be laid down. (See also semiconductor.) The die is then connected into a package using gold or aluminium wires, which are welded to "pads" usually found around the edge of the die.
Integrated circuits can be classified as analog, digital, or hybrid (both analog and digital on the same chip). Digital integrated circuits can contain anything from one to millions of logic gates, flip-flops, multiplexers, and other circuits in a few square millimeters. The small size of these circuits allows high speed, low power dissipation, and reduced manufacturing cost compared with board-level integration.
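To give a concrete, if highly simplified, picture of the building blocks a digital IC integrates, the following Python sketch models a 2-to-1 multiplexer and a D flip-flop at the behavioral level. It is purely illustrative (the function and class names are this example's own); on an actual chip these elements are described in a hardware description language and realized as networks of transistors.

    # Behavioral models of two common digital IC building blocks.
    # Purely illustrative: a real chip implements these as transistor networks.

    def mux2(a, b, select):
        """2-to-1 multiplexer: outputs a when select is 0, b when select is 1."""
        return b if select else a

    class DFlipFlop:
        """D flip-flop: stores one bit, updated on each clock tick."""
        def __init__(self):
            self.q = 0  # stored state

        def tick(self, d):
            self.q = d
            return self.q

    # A toggling 1-bit counter built from the two parts: the mux selects the
    # inverted state, and the flip-flop stores it once per clock tick.
    ff = DFlipFlop()
    for cycle in range(4):
        next_bit = mux2(ff.q, 1 - ff.q, select=1)  # always pick the inverted state
        print(f"cycle {cycle}: q = {ff.tick(next_bit)}")  # q alternates 1, 0, 1, 0

Feeding the inverted state back through the multiplexer makes the flip-flop toggle each cycle, the simplest example of gates and storage combining into sequential logic.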
The concept of the integrated circuit was first published by Geoffrey W. A. Dummer on May 7, 1952.
The first integrated circuits were developed independently by two researchers: Jack Kilby of Texas Instruments, who filed a patent for a "Solid Circuit" on February 6, 1959, and Robert Noyce of Fairchild Semiconductor, who was awarded a patent for a more complex "unitary circuit" on April 25, 1961.
These early integrated circuits contained only a few transistors each. Small-scale integration (SSI) brought circuits containing tens of transistors.
SSI circuits were crucial to early aerospace projects, and those projects were just as crucial to SSI. Both the Minuteman missile and the Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo guidance computer led and motivated integrated-circuit technology, while the Minuteman missile forced it into mass production.
These programs purchased almost all of the integrated circuits available from 1960 through 1963, and almost single-handedly provided the demand that funded the manufacturing improvements which brought production costs down from $1000 per circuit (in 1960 dollars) to a mere $25 per circuit (in 1963 dollars), a fortyfold reduction.
Later, medium-scale integration (MSI) yielded circuits containing hundreds of transistors. Further development led to large-scale integration (LSI), with thousands of transistors, and very-large-scale integration (VLSI), with hundreds of thousands and beyond. In 1986 the first one-megabit RAM chips were introduced, containing more than one million transistors.
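The transistor count follows directly from the capacity: a megabit is 2^20 = 1,048,576 bits, and a dynamic RAM cell stores each bit with a single transistor and capacitor, so the memory array alone requires over a million transistors before any addressing or control logic is counted. A trivial Python check of this arithmetic (assuming the one-transistor-per-bit DRAM cell just described):

    # One megabit = 2**20 bits; a DRAM cell uses one transistor per bit.
    bits = 2 ** 20
    print(bits)              # 1048576
    print(bits > 1_000_000)  # True: the cell array alone tops a million transistors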
LSI circuits began to be produced in large quantities around 1970 for computer main memories and pocket calculators. For the first time it became possible to fabricate an entire CPU, the microprocessor, on a single integrated circuit. The most extreme technique is wafer-scale integration, which uses whole uncut wafers as components.
See also: electronics, electrical engineering, computer engineering, microcontroller, ZIF