MIPS, short for Million Instructions Per Second, is a measure of microprocessor speed.
However, this measure is useful only among processors with the same
instruction set, as different instruction sets often take different numbers of instructions to do the same job.
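As a hypothetical illustration (the instruction counts and MIPS ratings below are invented for the example, not taken from any real machine), a processor with a denser instruction set can finish the same job sooner despite a lower MIPS rating:

```python
# Hypothetical numbers: two machines running the same task.
# Machine A's simple instruction set needs more instructions per task;
# machine B's denser instruction set needs fewer.
instructions_a = 2_000_000   # instructions for the task on A
instructions_b = 1_200_000   # instructions for the same task on B
mips_a = 100                 # A's rated MIPS
mips_b = 70                  # B's rated MIPS

time_a = instructions_a / (mips_a * 1e6)  # execution time in seconds
time_b = instructions_b / (mips_b * 1e6)

print(f"A: {time_a * 1000:.1f} ms at {mips_a} MIPS")
print(f"B: {time_b * 1000:.1f} ms at {mips_b} MIPS")
# B finishes first (about 17.1 ms vs 20.0 ms) despite the lower MIPS rating.
```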
Many reported MIPS values have represented "peak" execution rates on artificial instruction sequences with few branches, whereas realistic workloads consist of a mix of instructions, some of which take longer to execute than others. The behavior of the memory hierarchy also greatly affects processor performance, an issue MIPS calculations do not consider.
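A minimal sketch of why peak and effective rates diverge, assuming a simple single-issue processor model in which MIPS = clock rate / (cycles per instruction × 10^6); the clock rate, instruction mix, and cycle counts here are illustrative assumptions, not measurements:

```python
# Illustrative single-issue model: MIPS = clock_hz / (CPI * 1e6).
clock_hz = 100e6  # hypothetical 100 MHz clock

# Peak rate assumes a stream of only the cheapest (1-cycle) instructions.
peak_mips = clock_hz / (1 * 1e6)

# A realistic mix includes multi-cycle instructions; memory stalls
# (cache misses) inflate the average cycles per instruction further.
mix = {              # instruction class: (fraction of mix, cycles each)
    "alu":    (0.50, 1),
    "branch": (0.20, 2),
    "load":   (0.25, 4),   # includes an average cache-miss penalty
    "store":  (0.05, 3),
}
avg_cpi = sum(frac * cycles for frac, cycles in mix.values())
effective_mips = clock_hz / (avg_cpi * 1e6)

print(f"peak: {peak_mips:.0f} MIPS, effective: {effective_mips:.0f} MIPS")
# With this mix, the "100 MIPS" peak machine delivers roughly 49 MIPS.
```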
Because of these problems, researchers have created standardized tests such as
SPECint to measure the real effective performance in commonly used applications, and raw MIPS has fallen into disuse. It was even pejoratively referred to as "Meaningless Indication of Processor Speed" or "Meaningless Information Provided by Salespeople".
The floating-point arithmetic equivalent of MIPS is FLOPS, to which the same cautions apply.
In the 1970s, minicomputer performance was compared using VAX MIPS, where computers were measured on a task and their performance rated against the VAX 11/780, which performed a nominal 1 MIPS. (The measure was also known as the "VAX Unit of Performance" or VUP.)
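In other words, a VAX MIPS rating is a ratio of runtimes against the 11/780 baseline. The runtimes in this sketch are invented for illustration; only the nominal 1-MIPS baseline comes from the definition above:

```python
# VAX MIPS / VUP: rate a machine by how much faster it completes a task
# than the VAX 11/780, which is defined as 1 MIPS (1 VUP).
vax_780_seconds = 30.0   # hypothetical runtime of the task on the 11/780
machine_seconds = 7.5    # hypothetical runtime on the machine under test

vax_mips = vax_780_seconds / machine_seconds
print(f"{vax_mips:.1f} VAX MIPS")  # 4.0: four times the 11/780's speed
```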
Most 8-bit and early 16-bit microprocessors had performance measured in KIPS (thousand instructions per second); 1 KIPS equals 0.001 MIPS.
The first general-purpose microprocessor, the Intel i8080, ran at 640 KIPS.
The Intel i8086, the first 16-bit microprocessor in the line of Intel processors used in IBM PCs, ran at 800 KIPS.