History of Operating Systems - Ayman Moumina
History of Computing - Prof. Tim Bergin
Operating systems are the software that makes the hardware usable. Hardware provides “raw computing power”; the operating system makes that computing power conveniently available to users by managing the hardware carefully to achieve good performance.
Operating systems can also be considered managers of resources. An operating system determines which computer resources will be used to solve which problem and the order in which they will be used. In general, an operating system has three principal types of functions.
- Allocation: This function assigns system resources such as input/output devices, software, the central processing unit, etc.
- Scheduling: This function coordinates resources and jobs according to given priorities.
- Monitoring: This function monitors and keeps track of the activities in the computer system. It maintains logs of job operation and notifies end-users or computer operators of any abnormal terminations or error conditions. This function also includes security monitoring features, such as detecting any unauthorized attempt to access the system, as well as ensuring that all security safeguards are in place (Laudon and Laudon, 1997).
Throughout the history of computers, the operating system has continually evolved as the needs of the users and the capabilities of the computer systems have changed.
As Weizer (1981) has noted, operating systems have evolved since the 1940s through a number of distinct generations, which roughly correspond to the decades. Although this observation was made in 1981, it remains roughly valid two decades later. In this paper, we shall follow a similar approach and discuss the history of operating systems roughly along the decades.
Early History: The 1940s and the 1950s:
In the 1940s, the earliest electronic digital computers had no operating systems. Machines of this time were so primitive compared to those of today that programs were often entered into the computer one bit at a time on rows of mechanical switches. Eventually, machine languages (consisting of strings of the binary digits 0 and 1) were introduced that sped up the programming process (Stern, 1981). The systems of the 1950s generally ran only one job at a time, and allowed only a single person at a time to use the machine. All of the machine’s resources were at the user’s disposal. Billing for the use of the computer was straightforward: because the user had the entire machine, the user was charged for all of the resources whether or not the job used them. In fact, the usual billing mechanisms were based upon wall clock time. A user was given the machine for some time interval and was charged a flat rate.
Originally, each user wrote all of the code necessary to implement a particular application, including the highly detailed machine-level input/output instructions. Very quickly, the input/output coding needed to implement basic functions was consolidated into an input/output control system (IOCS). Users wishing to perform input/output operations no longer had to code the instructions directly; instead, they used IOCS routines to do the real work. This greatly simplified and sped up the coding process. The implementation of the input/output control system may have been the beginning of today’s concept of an operating system. Under this scheme, the user has complete control over all of main storage, and as a result it has been known as the single-user contiguous storage allocation system. Storage is divided into a portion holding the input/output control system (IOCS) routines, a portion holding the user’s program, and an unused portion (Milenkovic, 1987).
Early single-user real storage systems were dedicated to one job for more than the job’s execution time. Jobs generally required considerable setup time, during which the operating system was loaded, tapes and disk packs were mounted, appropriate forms were placed in the printer, time cards were “punched in,” etc. When jobs completed, they required considerable “teardown” time as tapes and disk packs were removed, time cards were “punched out,” etc. During job setup and job teardown, the computer sat idle.
Users soon realized that they could cut down the amount of time wasted between jobs if they could automate the job-to-job transition. The first major such system, considered by many to be the first operating system, was designed by the General Motors Research Laboratories for their IBM 701 mainframe beginning in early 1956 (Grosch, 1977). Its success helped establish batch computing: the grouping of jobs into a single deck of cards, separated by control cards that instructed the computer about the various specifications of each job. The programming language used on the control cards was called job control language (JCL). These job control cards set up the job by telling the computer whether the cards following them contained data or programs, what programming language was used, the approximate execution time, etc. When the current job terminated, the job stream reader automatically read in the control language statements for the next job and performed appropriate housekeeping chores to facilitate the transition to the next job. Batch processing greatly improved the use of computer systems and helped demonstrate the real value of operating systems by managing resources intensively. This type of processing, called single-stream batch processing, became the state of the art in the early 1960s (Orchard-Hays, 1961).
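The control-card mechanism described above can be illustrated with a short sketch of a job deck. The format shown is in the style of IBM's later OS/360 JCL rather than the original General Motors control cards (whose format was simpler and machine-specific); the job, program, and volume names are hypothetical.

```
//PAYROLL  JOB  (ACCT123),'J SMITH',CLASS=A   card identifying the job and its accounting info
//STEP1    EXEC PGM=PAYCALC                   card naming the program this step should run
//INPUT    DD   UNIT=TAPE,VOL=SER=000123      card describing an input tape for the operator to mount
//SYSIN    DD   *                             the cards that follow are in-stream data, not control cards
   ... data cards ...
/*                                            delimiter card marking the end of the data
```

When one job's cards were exhausted, the monitor simply read on until it reached the next `JOB` card and began setup for the following job, which is exactly the automated job-to-job transition the paragraph describes.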