Computer multitasking

In computing, multitasking is a method by which multiple tasks may execute, apparently simultaneously, on a single computer processor. Multitasking on a single CPU is accomplished by rapidly alternating between several active tasks, also known as processes. These alternations, called "context switches", are initiated either voluntarily by a running process or by an external event such as a hardware interrupt.

The first efforts to create multitasking systems took place in the 1960s, although the technique was then referred to as "time-sharing". The purpose of these early efforts was to allow multiple users to share a single mainframe computer, thereby extending its usefulness. The term "time-sharing", which refers specifically to the sharing of a computer by multiple users, has become essentially obsolete in favor of the more general "multitasking".

Early multitasking systems consisted of suites of related applications that voluntarily ceded time to each other. This approach, which was eventually supported by many computer operating systems, is today known as cooperative multitasking. Although it is rarely used in larger systems, both Microsoft Windows prior to Windows 95 and MacOS prior to OS X used cooperative multitasking to allow multiple applications to run simultaneously.
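A rough illustration of the idea, sketched here in Python: each task runs only until it voluntarily yields control back to a simple scheduler, so no task can be forced to give up the processor. This is an illustrative sketch only, not the API of any particular operating system.

    # A minimal cooperative scheduler: tasks are generators that run until
    # they voluntarily yield; the scheduler merely cycles through whatever
    # tasks remain.
    def task(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                      # voluntarily cede the processor

    def run_cooperatively(tasks):
        ready = list(tasks)
        while ready:
            current = ready.pop(0)
            try:
                next(current)          # run the task up to its next yield
                ready.append(current)  # requeue it behind the others
            except StopIteration:
                pass                   # the task finished; drop it

    run_cooperatively([task("A", 3), task("B", 2)])

The output interleaves the two tasks only because each one yields after every step; remove the yield and the other task never runs, which leads directly to the shortcomings described below.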

Cooperative multitasking has many shortcomings. For one, a cooperatively multitasked system must rely on each process to regularly give time to other processes on the system. A poorly designed program, or a "hung" process, can effectively bring the system to a halt. The design requirements of a cooperatively multitasked program can also be onerous for some purposes, and may result in irregular (or inefficient) use of system resources.
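In terms of the sketch above, a single misbehaving task is enough to stall everything:

    # Under a cooperative scheduler like the one sketched above, a task that
    # never yields keeps the CPU forever, and every other task is starved.
    def hung_task():
        while True:     # no yield: the scheduler never regains control
            pass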

To remedy this situation, most time-sharing systems quickly evolved a more advanced approach known as preemptive multitasking. On such a system, a hardware mechanism (not included on many early machines) can "interrupt" a running process and direct the processor to execute a different piece of code. A system designed to take advantage of this feature need not rely on the voluntary ceding of processor time by individual processes. Instead, the hardware interrupt system can be set to "preempt" a running process and give control back to the operating system software, which can later restore the preempted process at exactly the point where it was interrupted. Programs preempted in this manner need not explicitly give time to other processes; as far as the programmer is concerned, software can be written as though granted uninterrupted access to the CPU.
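The practical effect can be seen with ordinary operating-system threads, which are preempted by a timer interrupt without any cooperation from the code they run. A minimal Python sketch follows; thread scheduling details vary by operating system, and CPython adds its own interpreter-level switching.

    import threading

    def count(name, limit):
        total = 0
        for i in range(limit):
            total += i                 # no explicit yield or sleep anywhere
        print(f"{name} finished, total={total}")

    # Both threads are CPU bound, yet both make progress: the timer interrupt
    # preempts each one and later restores it exactly where it left off.
    workers = [threading.Thread(target=count, args=(n, 1_000_000))
               for n in ("worker-1", "worker-2")]
    for t in workers:
        t.start()
    for t in workers:
        t.join()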

Preemptive multitasking allows the computer system to more reliably guarantee each process a regular "slice" of operating time. It also allows the system to deal rapidly with important external events, such as incoming data, which might require the immediate attention of one process or another.

At any given time, processes can be grouped into two categories: those that are waiting for input or output (called "I/O bound"), and those that are fully utilizing the CPU ("CPU bound"). In early systems, processes would often "poll" or "busy-wait" while waiting for requested input (such as disk, keyboard, or network input). During this time, the process was not performing useful work but still maintained complete control of the CPU. With the advent of interrupts and preemptive multitasking, these I/O-bound processes could be "blocked", or put on hold, pending the arrival of the necessary data, allowing other processes to utilize the CPU. Because the arrival of the requested data would generate an interrupt, blocked processes could be guaranteed a timely return to execution.
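The difference between busy-waiting and blocking can be sketched with a shared event standing in for the awaited I/O; this is a simplification, since real blocking happens inside the operating system.

    import threading, time

    data_ready = threading.Event()     # stands in for "the requested data arrived"

    def busy_wait():
        # Polling: the thread stays runnable and burns CPU checking a flag.
        while not data_ready.is_set():
            pass
        print("poller noticed the data")

    def blocked_wait():
        # Blocking: the thread is descheduled until it is woken by the event.
        data_ready.wait()
        print("blocked waiter was woken with the data")

    for fn in (busy_wait, blocked_wait):
        threading.Thread(target=fn).start()
    time.sleep(0.2)                    # simulate the device taking time to respond
    data_ready.set()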

Although multitasking techniques were originally developed to allow multiple users to share a single machine, it soon became apparent that multitasking was useful regardless of the number of users. Many operating systems, from mainframes down to single-user personal computers, have adopted multitasking support for a variety of reasons. Multitasking makes it possible for a single user to run multiple applications at the same time, or to run "background" processes while retaining control of the computer.

Another reason for multitasking lay in the design of real-time computing systems, where a number of possibly unrelated external activities needed to be controlled by a single processor system. In such systems a hierarchical interrupt system was coupled with process prioritization to ensure that key activities were given a greater share of available process time.
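One way to picture this is a scheduler that always runs the most urgent ready task first. The sketch below uses a priority queue, with lower numbers meaning more urgent; it is a simplification of how real real-time schedulers work.

    import heapq

    def activity(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield

    def run_by_priority(tasks):
        # tasks: (priority, name, generator); a lower number is more urgent.
        heap = list(tasks)
        heapq.heapify(heap)
        while heap:
            prio, name, gen = heapq.heappop(heap)
            try:
                next(gen)                             # run the most urgent task
                heapq.heappush(heap, (prio, name, gen))
            except StopIteration:
                pass                                  # the activity is complete

    run_by_priority([(2, "logging", activity("logging", 2)),
                     (1, "motor-control", activity("motor-control", 3))])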

Over the years, multitasking systems have been refined. Modern operating systems generally include detailed mechanisms for prioritizing processes, while multi-processing has introduced new complexities and capabilities. Further developments in program design have introduced the concept of "threads", which are independent sub-tasks that share a single process's memory space.
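Because threads share their parent process's memory, they can communicate through ordinary variables, but they must coordinate access to that shared state. A minimal Python sketch:

    import threading

    counter = {"value": 0}             # shared state, visible to every thread
    lock = threading.Lock()

    def worker(increments):
        for _ in range(increments):
            with lock:                 # serialize access to the shared memory
                counter["value"] += 1

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter["value"])            # 200000: both threads updated the same object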

