Introduction to Operating Systems

 1. Why have an operating system?

Modern computing systems consist of one or more processors, memory, hard disks, keyboards, mice, monitors, printers, network interfaces, and other input/output devices.

Generally speaking, a modern computer system is a complex system.

First: if every application programmer had to master all the details of the system, writing code would be impossible (it would severely hurt programmers' development efficiency: mastering all these details might take 10,000 years...).

Second: managing these components and optimizing their use is a very challenging job. So computers are equipped with a layer of software (system software) called the operating system, whose task is to present user programs with a better, simpler, clearer model of the computer and to manage all the devices just mentioned.

To summarize:

Programmers cannot grasp all the details of how the hardware operates, and managing the hardware and optimizing its use is very tedious work. The operating system does this tedious work, freeing programmers to concentrate on writing their own applications; an application simply uses the functions the operating system provides, and through them uses the hardware indirectly.

2. What is an operating system

In short, an operating system is a control program that coordinates, manages, and controls the computer's hardware and software resources. The operating system's position is shown in Figure 1.

#The operating system sits between the computer hardware and the application software, and is itself essentially software. It consists of two parts: the kernel (which runs in kernel mode and manages the hardware resources) and the system-call interface (which runs in user mode and provides services to applications). So simply saying "the operating system runs in kernel mode" is not accurate.
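As a concrete illustration, here is a minimal Python sketch (Python's os module is a thin wrapper over the system-call interface on POSIX systems; exactly which calls are available varies by platform). Each call below traps from user mode into the kernel and back:

```python
import os
import sys

# Each of these library calls asks the kernel for a service via a
# system call; the program itself stays in user mode the whole time.
pid = os.getpid()    # getpid(): ask the kernel for our process id
cwd = os.getcwd()    # getcwd(): ask the kernel for the working directory

# write() sends bytes through the kernel to file descriptor 1 (stdout)
# and returns how many bytes the kernel accepted.
n = os.write(sys.stdout.fileno(), b"hello from user mode\n")
```

The application never touches the hardware directly; it only ever talks to the kernel through this interface.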

In more detail, the operating system's work can be divided into two parts:

#1: It hides the ugly hardware call interfaces and gives application programmers a better, simpler, clearer model (the system-call interface) for using hardware resources. With these interfaces, application programmers no longer need to think about the details of operating the hardware and can concentrate on developing their applications. For example, the operating system provides the abstraction of a file: operating on a file is really operating on the disk, but with files we no longer need to worry about the disk's read/write control (spinning the platter, moving the head to read and write data, and so on).

#2: It turns applications' competing requests for hardware resources into orderly ones. Many applications in fact share one set of computer hardware. For example, three programs might all request the printer at the same time: program a grabs the printer and prints, then program b (or perhaps c) grabs it, and the result is disorder; the printer might print part of a's output and then part of c's... One of the operating system's jobs is to turn this disorder into order.
#Role 1: Provide applications with an abstraction of the hardware resources.
For example: the operating system provides the abstraction of a file; operating on a file is operating on the disk, and with files we no longer need to think about the disk's read/write control.
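As a sketch of this abstraction (the filename "notes.txt" is just for illustration): the Python code below never mentions sectors, tracks, or disk heads — the operating system translates open/write/read into the actual disk operations.

```python
# Write through the file abstraction: the OS decides where on disk
# these bytes land and how the head gets there.
with open("notes.txt", "w") as f:
    f.write("hello, disk\n")

# Read the data back: again, no mention of platters, heads, or sectors.
with open("notes.txt") as f:
    content = f.read()
```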

Note:
The abstractions the operating system provides to applications are simple, clear, and elegant. Why provide them?
Hardware manufacturers must supply device drivers for the operating system (which is why, to use a sound card, we must install a sound card driver...). To cut costs or stay compatible with old hardware, their drivers are often complex and ugly.
The operating system is designed to hide this ugliness and present a better interface to the user.
Thus the shells, Gnome, and KDE that users work with look different, but underneath they all use the same set of abstract interfaces provided by the Linux system.


#Function 2: Manage hardware resources. Modern operating systems run multiple programs simultaneously; the operating system's task is to allocate the processors, memory, and other I/O devices among the competing programs in an orderly way.
For example:
If three programs running on the same computer all try to print their results at the same time, the first few lines might be program 1's output, the next few program 2's, then program 3's, and the end result is a mess (the programs compete for the resource).
Instead, the operating system sends each program's printer output to a buffer on disk. Only after a program has completely finished is its temporarily stored file sent to the printer, while the other programs keep generating output (output that is not yet really sent to the printer). In this way the operating system turns the disorder created by competition into order.
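The idea can be sketched in a few lines of Python (the program names and output lines are invented for illustration): each program accumulates its complete output in a buffer, and only finished jobs are handed, whole, to the single printer, so lines from different programs never interleave.

```python
from queue import Queue

print_queue = Queue()   # stands in for the buffer area on disk

def run_program(name, lines):
    # Each program "spools" its entire output before queueing it;
    # nothing reaches the printer until the job is complete.
    job = [f"{name}: {line}" for line in lines]
    print_queue.put(job)

# Three programs run and finish in some order.
run_program("a", ["result 1", "result 2"])
run_program("b", ["total = 42"])
run_program("c", ["done"])

# The "printer" drains whole jobs, one at a time, in order.
printed = []
while not print_queue.empty():
    printed.extend(print_queue.get())
```

However the three programs' execution interleaves, each job's lines come out of the printer contiguous and in order.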

3. The difference between the operating system and ordinary software

1. The main difference: if you don't want to use Baofengyingyin you can choose Thunder Player instead, or even write a player yourself, but you cannot write a program that is part of the operating system (say, the clock interrupt handler); the operating system is protected by the hardware and cannot be modified by the user.

2. The difference between the operating system and user programs is not just where each sits. In particular, an operating system is a large, complex, long-lived piece of software:

  • Large: The source code of Linux or Windows is on the order of five million lines. At 50 lines per page and 1,000 pages per volume, five million lines would fill 100 volumes, taking up an entire bookshelf. And that is only the kernel; counting the GUI, libraries, and basic applications (such as Windows Explorer), the total can easily be 10 or 20 times that.
  • Long-lived: Operating systems are hard to write. With such a huge code base, once one is finished its owner will not lightly throw it away and write another; instead it is improved over time. (Broadly, Windows 95/98/Me can be seen as one operating system and Windows NT/2000/XP/Vista as another, and to users they look very similar. UNIX and its variants and clones have likewise evolved over the years: System V, Solaris, and FreeBSD all descend from the original Unix, while Linux, although closely modeled on UNIX and highly compatible with it, has a completely new code base.)

4. The history of operating system development

The first generation of computers (1940-1955): vacuum tubes and punch cards

The background of the first generation of computers:

Before the first generation, humans sought to replace manual labor with machinery; the first generation of computers marked the transition from the mechanical age to the electronic age. From Babbage's failure until the Second World War there was little progress in building digital computers; World War II then spurred an explosion of computer research.

Professor John Atanasoff of Iowa State University and his student Clifford Berry built what is regarded as the first working digital computer, a machine using 300 vacuum tubes. Around the same time, Konrad Zuse built the relay-based Z3 in Berlin, a group at Bletchley Park in England built the Colossus in 1944, Howard Aiken built the Mark 1 at Harvard, and John Mauchly and his student J. Presper Eckert built the ENIAC at the University of Pennsylvania. Some of these machines were binary, some used vacuum tubes, and some were programmable, but all were very primitive, taking seconds to set up even the simplest calculation.

During this period a single group of engineers designed, built, programmed, operated, and maintained each machine. All programming was done in pure machine language or, worse, by wiring thousands of cables to plugboards to form the circuits that controlled the machine's basic functions. There were no programming languages (not even assembly), and operating systems were unheard of. The procedure for using a machine was even more primitive; see the "Working process" below.

Features:
There was no concept of an operating system; all programming directly controlled the hardware.

Working process:
The programmer signs up for a block of time on the sign-up sheet on the machine-room wall, then carries his plugboard to the machine room and plugs it into the computer. For those few hours he has the entire machine's resources to himself, and the next group has to wait (and with more than 20,000 vacuum tubes, burnouts were frequent).

Later, punched cards appeared: programs could be written on cards and read into the machine, with no plugboard needed.

Advantages:

During his reserved slot the programmer has all of the machine's resources to himself and can debug his program in real time (bugs can be fixed on the spot).

Disadvantages:

Computer resources are wasted: only one person can use the machine at a time.
Note: at any moment there is only one program in memory, called and executed by the CPU, so running, say, 10 programs means running them serially.

The second generation of computers (1955~1965): transistors and batch systems

The background of the second generation computer:

Since computers were extremely expensive at the time, people naturally looked for ways to reduce wasted machine time. The usual solution was the batch system.

Features:
Designers, builders, operators, programmers, and maintenance staff now had a clear division of labor. The computer was locked in a dedicated air-conditioned room and run by professional operators: this was the 'mainframe'.

The concept of an operating system appeared.

Programming languages appeared: the programmer writes a FORTRAN or assembly program on paper, punches it onto cards, carries the card deck to the input room, hands it to an operator, and goes off to drink coffee while waiting for the output.

Working Process: Illustration

How the second generation addressed the first generation's disadvantages:
1. Collect a batch of users' input into one large wave of input.
2. Then compute the jobs sequentially (this is still problematic, but the second generation did not solve it).
3. Collect the batch of users' output into one large wave of output.

The predecessor of the modern operating system: (see picture)

Advantages: batch processing saves time.

Disadvantages:
1. The whole process still requires human participation and control, with tapes carried back and forth (the two little figures in the middle of the diagram).

2. Computation is still sequential: "serial".

3. The programmer who once had the machine to himself for a stretch of time must now have his job planned into a batch, and he must wait for the other jobs in the batch to finish before he sees his results and can resubmit. This badly hurts development efficiency: programs cannot be debugged promptly.

The third generation of computers (1965~1980): integrated circuit chips and multiprogramming

The background of the third generation computer:

In the early 1960s, most computer manufacturers had two completely incompatible product lines.

One line was word-oriented: large scientific computers, such as the IBM 7094 shown above, used mainly for scientific and engineering computation.

The other was character-oriented: commercial computers, such as the IBM 1401 shown above, used mainly by banks and insurance companies for tape filing and printing.

It is expensive to develop and maintain completely different products, and different users use computers for different purposes.

IBM tried to satisfy both scientific and commercial computing at once by introducing the System/360 series. The low-end 360s were comparable to the 1401, while the high-end ones were far more powerful than the 7094; models of different performance sold at different prices.

The 360 was the first mainstream model to use (small-scale) integrated circuits, and compared with the second generation's transistor machines its price/performance was greatly improved. Descendants of these computers are still used in large computer centers; they are the ancestors of today's servers, which handle no less than a thousand requests per second.

How the third generation solved the second generation's problem 1:
Once a card deck reaches the machine room, jobs can be read from the cards onto disk very quickly, so whenever a running job ends the operating system can load a new job from the disk into the freed memory area and run it. This technique, simultaneous peripheral operation on-line, is called SPOOLING, and it is used for output as well. With SPOOLING, the IBM 1401 machines were no longer needed and the tapes no longer had to be carried around (the two little figures in the middle are gone).

How the third generation solved the second generation's problem 2:

Third-generation operating systems made heavy use of a key technology that second-generation operating systems lacked: multiprogramming.

While executing a task, when the CPU needs to operate the hard disk it issues an instruction; once the instruction is issued, the disk's arm moves and data is read into memory. During this time the CPU has to wait. The wait may seem very short, but to the CPU it is very long — long enough to do a great deal of other work. If we let the CPU switch to other tasks during this time, the CPU is kept busy instead of idling. This is the background of multiprogramming.

Multiprogramming:

The "multi" in multiprogramming refers to multiple programs. Multiprogramming exists to schedule, in an orderly way, multiple programs that compete for or share the same resource (such as the CPU). The solution is multiplexing, which comes in two forms: multiplexing in time and multiplexing in space.

Multiplexing in space: divide memory into several parts and load a program into each part, so that multiple programs sit in memory at the same time.
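A toy model of the idea (the partition sizes and program names are made up for illustration): memory is split into fixed partitions and each loaded program owns exactly one, so several programs are resident at once.

```python
# Fixed memory partitions, given as (start, end) addresses in some unit.
partitions = [(0, 16), (16, 32), (32, 64)]
resident = {}   # partition index -> name of the program loaded there

def load(program):
    # Give the program the first free partition; if none is free,
    # it must wait (memory is fully occupied).
    for i in range(len(partitions)):
        if i not in resident:
            resident[i] = program
            return i
    return None

slots = [load(p) for p in ("editor", "compiler", "spooler", "game")]
# "editor", "compiler", and "spooler" each get a partition; "game" must wait.
```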

 

Multiplexing in time: while one program is waiting for I/O, another program can use the CPU. If enough jobs are kept in memory at once, CPU utilization can approach 100%, much like the scheduling problems we solved in primary-school arithmetic. (Once the operating system adopts multiprogramming, it controls process switching: processes compete for the right to execute on the CPU, and a switch happens not only when a process blocks on I/O but also when a process has held the CPU too long, at which point the operating system takes the CPU away from it.)
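A deterministic toy trace of this (the programs and their step lengths are invented): whenever the running program blocks on I/O, the CPU switches to another ready program instead of idling.

```python
# Each program is a list of steps: ("cpu", ticks) or ("io", ticks).
programs = {
    "A": [("cpu", 2), ("io", 3), ("cpu", 1)],
    "B": [("cpu", 4)],
}

def schedule(programs):
    timeline = []              # which program holds the CPU each tick
    ready = list(programs)     # simple ready queue
    while ready:
        name = ready.pop(0)
        steps = programs[name]
        # Run CPU steps until the program blocks on I/O or finishes.
        while steps and steps[0][0] == "cpu":
            _, ticks = steps.pop(0)
            timeline.extend([name] * ticks)
        if steps:              # blocked on I/O: the I/O proceeds "in the
            steps.pop(0)       # background" while the CPU moves on
            ready.append(name)
    return timeline

trace = schedule(programs)
# A runs 2 ticks, blocks on I/O; B runs its 4 ticks; A finishes its last tick.
# The CPU is never idle even though A spent 3 ticks waiting on the disk.
```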

Modern computers or networks are multi-user. Multiple users not only share hardware, but also share information such as files and databases. Sharing means conflict and disorder.

To manage resources, the operating system mainly:

1. Records which program is using which resource.

2. Allocates resources in response to requests.

3. Mediates conflicting resource requests from different programs and users.

We can summarize these operating-system functions as: handling requests from multiple programs for shared (multiplexed) resources — multiplexing for short.

There are two ways to implement multiplexing:

1. Multiplexing in time

When a resource is multiplexed in time, different programs or users take turns using it: the first program acquires the resource and finishes with it, then it is the second one's turn, then the third, and so on.

For example: there is only one CPU but multiple programs need to run on it. The operating system first gives the CPU to one program; when that program has run long enough (how long is decided by the operating system's algorithm) or blocks on I/O, the operating system gives the CPU to the next program, and so on, until the first program is given the CPU again and resumes. Because the CPU switches so fast, users feel that the programs are running in parallel, when it is really concurrency, or pseudo-parallelism. How resources are multiplexed in time — which program runs next, and how long each task gets — is the operating system's job.
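The turn-taking described above can be sketched as a round-robin scheduler (the quantum length and burst times are invented for illustration): each program runs for at most a fixed quantum of ticks, then goes to the back of the queue if it still has work left.

```python
from collections import deque

QUANTUM = 2   # maximum CPU ticks a program gets before being switched out

def round_robin(bursts):
    # bursts: {program name: total CPU ticks it needs}.
    # Returns the sequence of (program, ticks run) CPU slices.
    queue = deque(bursts.items())
    slices = []
    while queue:
        name, left = queue.popleft()
        run = min(QUANTUM, left)
        slices.append((name, run))
        if left - run > 0:               # unfinished: back of the queue
            queue.append((name, left - run))
    return slices

order = round_robin({"P1": 5, "P2": 2, "P3": 3})
```

With a small enough quantum and a fast enough switch, the interleaved slices feel to each user like continuous execution.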

2. Multiplexing in space

Each client gets a small portion of a large resource, reducing the time spent queueing for it.

For example, multiple running programs are placed in memory at the same time. The hardware provides a protection mechanism, under the operating system's control, that keeps their memory regions separate. This is far more efficient than having programs monopolize memory and enter it one at a time from a queue.

Another spatially multiplexed resource is the disk: in many systems a single disk holds files for many users at once. Allocating disk space and tracking who is using which disk blocks are typical operating-system resource-management tasks.

Combining these two methods gives multiprogramming.

The biggest problem with multiplexing memory in space is that programs' memory regions must be kept separate. This separation has to be implemented in hardware and controlled by the operating system. If memory regions were not isolated from one another, one program could access another program's memory.

The first loss would be security: if, for example, your qq program could access the operating system's memory, your qq would effectively hold all of the operating system's privileges.

The second loss would be stability: when one program crashes it could corrupt other programs' memory; if it corrupted the operating system's memory, the operating system itself would crash.

Third-generation operating systems were still batch systems.

Many programmers missed the first-generation days of having the machine to themselves and debugging on the fly. To give programmers quick responses, the time-sharing operating system appeared.

How the third generation solved the second generation's problem 3:

The time-sharing operating system: multiple online terminals + multiprogramming.

Twenty clients are loaded into memory at once; seventeen are thinking and three are running, and the CPU multiplexes among the three programs in memory. Because clients mostly submit short commands that consume little CPU time, the computer can provide fast interactive service to many users, each of whom feels he has exclusive use of the machine.

CTSS: MIT developed the Compatible Time-Sharing System (CTSS) on a modified 7094. Time-sharing only became popular after third-generation computers widely adopted the necessary protection hardware (isolating programs' memory from one another).

After CTSS succeeded, MIT, Bell Labs, and General Electric set out to develop MULTICS, a system meant to support hundreds of terminals at once (its designers aimed at a machine that could meet the computing needs of everyone in the Boston area). It clearly aimed for the sky, and in the end it crashed.

Later, Ken Thompson, a Bell Labs computer scientist who had worked on MULTICS, developed a simplified, single-user version of MULTICS, which became the UNIX system. Many other Unix versions were derived from it; so that programs could run on any version of Unix, IEEE proposed a Unix standard, POSIX (Portable Operating System Interface).

Later still, in 1987, a small UNIX clone, MINIX, appeared for educational use, and the Finnish student Linus Torvalds wrote Linux based on it.

The fourth generation of computers (1980~present): personal computers

(omitted)

 
