What Is an Operating System? How Does It Work?

As the term implies, an operating system is software without which you would be unable to operate or run your computer. Acting as an intermediary or translation layer, it bridges the gap between the computer hardware and the many application programs installed on the machine. In other words, you cannot use computer programs with computer hardware without a link between the two, and the operating system provides that link.

It also serves as a middleman between the computer user and the computer hardware by providing a standard user interface, which appears on your screen when you first turn on your computer. For example, the Windows and Mac OS operating systems both feature a graphical user interface with icons and images that allows users to access various files and apps at the same time, and both are available for personal computers.

Consequently, although the operating system is itself a program, it is what allows users to run other programs and apps on the machine. We might say that it operates your computer in the background.

The following are the primary functions of the operating system:

● Memory management: It manages both primary and secondary memory, such as RAM, ROM, hard disk, pen drive, and so on. It decides on the allocation and de-allocation of memory space to different processes. When a user asks the system to run a program, the operating system determines how much memory to allocate for loading the program's instructions and data into random access memory (RAM). Once the program terminates, that memory space becomes available again and can be allocated by the operating system to other programs.
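The allocation and de-allocation described above can be sketched as a small bookkeeping class. This is a simplified illustration with hypothetical names, not how a real memory manager works (real kernels use page tables and far more state):

```python
# Minimal sketch of OS-style memory bookkeeping: grant memory to a
# process if enough free space remains, and reclaim it on termination.
class MemoryManager:
    def __init__(self, total_kb):
        self.total_kb = total_kb
        self.allocations = {}  # process name -> allocated KB

    def allocate(self, pid, kb):
        """Allocate memory to a process, or fail if too little is free."""
        if kb > self.free_kb():
            raise MemoryError(f"cannot give {kb} KB to process {pid}")
        self.allocations[pid] = self.allocations.get(pid, 0) + kb

    def deallocate(self, pid):
        """Reclaim a terminated process's memory for other programs."""
        return self.allocations.pop(pid, 0)

    def free_kb(self):
        return self.total_kb - sum(self.allocations.values())

mm = MemoryManager(1024)
mm.allocate("editor", 300)   # free_kb() is now 724
mm.deallocate("editor")      # the 300 KB is available again
```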
● Processor management: It determines the order in which processes access the processor and how much processing time each process receives. It also keeps track of the status of running processes, frees the processor when a process completes, and allocates it to a new process.
● Device/hardware management: The operating system includes drivers that control your devices. A driver is translation software that enables the operating system to communicate with a device. Because each device speaks its own distinct language, a different driver is needed for each device.
● Run software programmes: It provides an environment in which to run or utilise software applications that have been built to accomplish certain tasks, such as Microsoft Word, Microsoft Excel, Photoshop, and so on.
● Data management: It assists with data management by organizing and presenting directories. You can view and modify files and folders, for example, you can move, copy, rename, or delete a file or a folder.
● System health monitoring: It assesses the overall health of the system and reports on the functioning of the hardware components, for example, how busy the CPU is and how quickly data is read from the hard disk.
● User interface: It serves as a link between the user and the hardware. This may be a graphical user interface (GUI), in which you view and click components on the screen to perform various tasks. It allows you to interact with a computer even if you are not familiar with the computer's language.
● I/O management: It controls the input and output devices and ensures that the I/O process runs smoothly and efficiently. For example, it accepts input from the user through an input device and stores it in main memory; it then directs the CPU to process the input and delivers the result through an output device, such as a monitor.
● Security: It includes a security module that guards against malware and unauthorised access to the data stored on the computer's memory and storage devices. As a result, it not only manages your data but also helps protect it.
● Time management: It helps the CPU manage its time. The kernel continually monitors how often processes request CPU time. When two or more equally important processes compete for CPU time, time is allocated round-robin, which prevents a single process from monopolising the CPU.
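The round-robin allocation mentioned here can be illustrated with a small simulation. This is a sketch with illustrative process names and burst times, not any particular kernel's scheduler:

```python
from collections import deque

def round_robin(burst_times, quantum):
    """Simulate round-robin CPU scheduling.

    burst_times: dict mapping process name -> CPU time still needed.
    Returns the order in which the processes finish.
    """
    ready = deque(burst_times.items())
    finished = []
    while ready:
        name, remaining = ready.popleft()
        if remaining <= quantum:          # process completes in this slice
            finished.append(name)
        else:                             # time slice expires; requeue it
            ready.append((name, remaining - quantum))
    return finished

# P2 needs the least CPU time, so it finishes first; no process
# monopolises the CPU even though P3 needs the most time.
order = round_robin({"P1": 5, "P2": 2, "P3": 8}, quantum=3)
# order == ["P2", "P1", "P3"]
```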
● Deadlock prevention: A deadlock occurs when two or more processes each hold a resource while waiting for a resource held by another, so that none of them can proceed. The operating system prevents this situation by carefully allocating resources among the various programs running on the system.
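One common prevention strategy (a sketch of the general technique, not how any particular operating system implements it) is to impose a global ordering on resources and require every process to acquire them in that order, so a circular wait can never form:

```python
import threading

# Two shared resources with an assumed global ordering (illustrative).
printer = threading.Lock()
scanner = threading.Lock()
RESOURCE_ORDER = {id(printer): 0, id(scanner): 1}

def acquire_in_order(*locks):
    """Acquire locks sorted by the global ordering, breaking the
    circular-wait condition that deadlock requires."""
    ordered = sorted(locks, key=lambda lock: RESOURCE_ORDER[id(lock)])
    for lock in ordered:
        lock.acquire()
    return ordered

def release(locks):
    for lock in reversed(locks):
        lock.release()

# Even if a process asks for (scanner, printer), it actually takes
# printer first, the same order every other process uses.
held = acquire_in_order(scanner, printer)
release(held)
```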
● Interrupt handling: The operating system also responds to interrupts, which are signals generated by a program or a device to attract the attention of the CPU. The operating system determines the priority of the interrupt; if it is more important than the currently running process, it suspends that process, saves the current state of the CPU, and executes the requested task. Afterwards, the CPU is restored to the state it was in when it was suspended.
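The save-handle-restore sequence can be sketched as follows. All names here are illustrative; a real kernel does this with hardware assistance, not in Python:

```python
# Sketch of priority-based interrupt handling: if the incoming interrupt
# outranks the running task, save CPU state, run the handler, restore.
def handle_interrupt(cpu_state, current_priority, interrupt):
    """Return a log of what the CPU did for this interrupt."""
    log = []
    if interrupt["priority"] > current_priority:
        saved = dict(cpu_state)           # save registers/program counter
        log.append(f"saved state at pc={saved['pc']}")
        log.append(f"ran handler for {interrupt['source']}")
        cpu_state.update(saved)           # restore and resume where we left off
        log.append(f"resumed at pc={cpu_state['pc']}")
    else:
        log.append(f"deferred {interrupt['source']}")
    return log

state = {"pc": 100}
handle_interrupt(state, current_priority=1,
                 interrupt={"priority": 5, "source": "keyboard"})
# The keyboard interrupt outranks the running task, so it is serviced
# and the CPU resumes at the saved program counter.
```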
Operating Systems are classified into the following categories:

1) Batch Processing Operating System:

This type of system allows no interaction between the user and the machine. The user must prepare jobs on punch cards in the form of batches and submit them to the computer operator.

The computer operator sorts the jobs, keeping similar programs or tasks together in the same batch, and runs them as a group to speed up processing. The system carries out a single job at a time. Jobs are handled on a first-come, first-served basis, that is, they are processed in the order in which they were submitted, with no human intervention.

The generation of credit card bills by banks is an example of batch processing. Instead of generating a separate statement for each credit card purchase, batch processing creates a single bill at the end of the month containing all the purchases made during that month.

The bill information is gathered and stored throughout the billing cycle and then processed as a batch to create the bill at the end of it. In a similar fashion, corporate salaries are computed and generated by a batch processing system at the end of each month.
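The credit-card example above can be sketched in a few lines: purchases accumulate all month, then one batch run produces a single bill per card at the end of the cycle. Field names and figures are illustrative:

```python
from collections import defaultdict

def run_billing_batch(purchases):
    """purchases: list of (card_number, amount) collected over the month.
    One batch run produces a single total bill per card."""
    bills = defaultdict(float)
    for card, amount in purchases:
        bills[card] += amount
    return dict(bills)

# A month of stored purchase records, processed in one batch.
month = [("4111", 25.00), ("4222", 10.50), ("4111", 74.25)]
bills = run_billing_batch(month)
# bills == {"4111": 99.25, "4222": 10.5}
```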

The following are the advantages of using a batch processing operating system:

● Repeated tasks may be accomplished quickly and simply without the need for human interaction.
● In batch systems, there is no requirement for hardware or system support to input data.
● It can work offline, which reduces the load on the processor because the system knows which job to run next and how long it will take.
● It may be used by numerous people at the same time.
● Batch jobs can be scheduled to run when the computer is not otherwise occupied, such as at night.

The following are some disadvantages of batch processing operating systems:

● It is necessary to instruct the computer operators on how to operate the batch system.
● Debugging this system is not a simple task.
● If an error occurs in one job, the other jobs may have to wait for an unknown amount of time.

2) Time-Sharing Operating System:

As the name implies, it allows numerous users situated at various terminals to access a computer system at the same time and share the processor's time. Each task is given sufficient time to run, so all tasks complete smoothly.

Each user receives a share of CPU time as though using a dedicated machine. The amount of time allotted to a job is referred to as a quantum or time slice; when a task's allotted time is up, the operating system moves on to the next task.

The following are some advantages of using a time sharing operating system:

● It shortens the amount of time the CPU is idle, allowing it to be more productive.
● Each process is given the opportunity to utilise the CPU.
● It enables many applications to run at the same time.

Disadvantages of a time-sharing operating system include the following:

● It requires a specialised operating system because it consumes more resources.
● Because it serves a large number of users and runs a large number of apps at the same time, switching between tasks may cause the system to become sluggish.
● As a result, it demands hardware with high specifications.
● It is less reliable.

3) Distributed Operating System:

It runs on a large number of independent processors (CPUs) in order to serve many users and many real-time applications. Communication between CPUs takes place over a variety of communication links, such as telephone lines and high-speed buses. The processors may differ from one another in both size and function.

Distributed operating systems were made feasible by the availability of powerful microprocessors and improved communication technology. A distributed operating system is also an extension of the network operating system that supports a high level of communication and integration between the machines on the network.

The following are some advantages of a distributed operating system:

● Due to the fact that resources are shared, its performance is higher than that of a single system.
● If one system fails or malfunctions, the remaining nodes are not affected.
● It is simple to add new resources to an existing system.
● It is possible to set up shared access to resources such as printers.
● Processing time is greatly reduced.
● Data is sent or exchanged extremely quickly, for example via electronic mail.

The following are the disadvantages of a distributed operating system:

● Because of the pooling of resources, there may be a security risk.
● It is possible that a few messages will get lost in the system.
● Higher bandwidth is necessary when dealing with a significant volume of information.
● It is possible that an overloading problem will emerge.
● It is possible that the performance will be poor.
● The languages used to build distributed systems are not yet well defined.
● Because they are quite expensive, they are not widely available.

4) Network Operating System (NOS):

As the name implies, it is an operating system that connects computers and devices to a local area network and manages network resources. The software in a network operating system (NOS) allows the devices in a network to share resources and communicate with one another. It runs on a server and provides shared access to printers, files, applications, and other networking resources and functions over a local area network (LAN). Furthermore, all users in the network are aware of the underlying configuration and of each other's individual connections. Examples include Microsoft Windows Server 2003 and 2008, Linux, UNIX, Novell NetWare, and Mac OS X.

The following are some advantages of using a network operating system:

● These centralised servers may be accessible from many places and systems, even when they are in separate time zones.
● It is simple to include cutting-edge and up-to-date technologies and hardware into this system.

The following are some disadvantages of network operating systems:

● It is possible that the servers employed in the system will be pricey.
● A centralised location is required for the system, as well as for its frequent monitoring and maintenance.

5) Real-Time Operating System:

It is designed for real-time applications in which data must be processed within a fixed, short interval of time. It is used in environments where a large number of events must be accepted and processed quickly. A real-time operating system (RTOS) requires rapid input and prompt response, for example, to avert an explosion in a petroleum refinery.

In that example, if the temperature rises past a threshold value, an immediate response is required to prevent the explosion. RTOSes are also used to operate scientific instruments, missile launch systems, traffic control systems, and aircraft navigation systems, among other applications.

On the basis of the constraints imposed by time, these systems are further divided into two types:

Hard Real-Time Systems:

These are used in applications where timing is critical, where even a delay of a fraction of a second can result in disaster. Airbags and automatic parachutes that must deploy instantly in the event of an accident are examples. In addition, these systems have no virtual memory.

Soft Real-Time Systems:

These are used in applications where timing or response time is less critical. Here, missing a deadline results in degraded performance rather than catastrophic failure, for instance, video surveillance (CCTV), a video player, or virtual reality. Deadlines are not crucial for every task, all of the time.

The following are some of the advantages of a real-time operating system:

● Because devices and system resources are utilised to the maximum, output is greater and faster.
● Task switching is extremely fast, for example, 3 microseconds, so numerous jobs appear to run at the same time.
● It gives precedence to currently executing programs over queued applications.
● It can be used in embedded systems, such as those in transport and other industries.
● It is largely error-free.
● Memory is allocated efficiently.

Disadvantages of a real-time operating system include the following:

● To minimise errors, only a limited number of tasks can run at the same time.
● Designers find it difficult to write the complex algorithms and proficient programs needed to produce the desired results.
● Specific drivers and interrupt signals are needed to respond to interrupts as quickly as possible.
● It can be very expensive owing to the resources required to complete the task.
Generations of the operating system:

The first generation (1945 to 1955):

Before World War II, when the digital computer had not yet been built, calculating engines with mechanical relays were in use. Because mechanical relays were extremely slow, they were eventually replaced by vacuum tubes. Even with vacuum tubes, however, the performance problem remained unsolved, and these machines were excessively bulky and enormous because they were built from tens of thousands of vacuum tubes.

Furthermore, each machine was designed, built, and maintained by a single team of people. Programming languages and operating systems were unknown, and absolute machine language was used for programming.

These systems were created for numerical computations. A programmer would sign up for a block of time and then insert his or her plugboard into the machine. In the early 1950s, punched cards were introduced, which improved matters: programmers could write programs on cards and have them read into the system, although the rest of the procedure remained essentially the same.

The second generation (1955 to 1965):

This generation began with the introduction of the transistor in the mid-1950s. Transistors made computers reliable enough to be sold to customers. These machines were called mainframes, and only large enterprises and government agencies could afford them. To use one, a programmer would write a program on paper, punch it onto cards, and bring the deck to the input room, where it was handed to an operator; the printed output was later delivered to the output room. These steps made it a time-consuming process, so the batch system was devised to overcome the problem.

In a batch system, jobs were collected in a tray in the input room in the form of batches and read onto a magnetic tape, which was then carried to the machine room and mounted on a tape drive. The operator used a special program to read the first job from the tape and run it, and the output was written onto a second tape. Jobs finished one by one as the operating system automatically read the next job from the tape. After the batch was completed, the input and output tapes were removed, the next batch was started, and printouts were made from the output tape. Batch systems were mostly employed for engineering and scientific computations. FMS (the Fortran Monitor System) and IBSYS were operating systems used on computers of this generation, and FORTRAN was used as a high-level programming language.

The third generation (1965 to 1979):

This generation began with the introduction of IBM's System/360 family of computers in 1964. During this generation, transistors were replaced by integrated circuits, and operating systems were enhanced to support multiprogramming; some even supported batch processing, time sharing, and real-time processing at the same time.

The fourth generation (1979 to the present):

This generation began with the introduction of personal computers and workstations. Microprocessor chips containing thousands of transistors made it possible to build personal computers, and these kept pace with the growth of networks, supporting the development of network operating systems and distributed operating systems. Operating systems of this generation include DOS, Linux, and Windows, to name a few.
