In 2003, Microsoft acquired Connectix Corporation, a company that specialized in virtualization software for both the Mac OS and Windows operating systems. Since then, Hewlett-Packard and Sun have each announced that they, too, are working to enhance their virtualization technology.
IBM has been a leader in virtual machines for decades, and virtualization has played an important role in many IBM projects. In recent years there has also been a marked increase in academic research on the subject.
While more people have become familiar with virtualization technology only recently, the concept has actually been around since the 1960s. A good example is IBM's M44/44X project, whose goal was to analyze and evaluate the then-emerging time-sharing system concepts.
The architecture was based on virtual machines: each virtual machine's address space was resident in the M44's memory hierarchy, implemented through virtual memory and multiprogramming. By 1963, IBM had built a large collection of computers and was working closely with MIT to create and implement a much higher-quality time-sharing system.
This research effort led to a number of achievements. Several virtual machine systems were created, all built on IBM hardware; the best-known examples are the CP-40 and the CP-67.
The virtual machines IBM created were exact copies of the underlying hardware. A new component called the Virtual Machine Monitor (VMM) ran directly on top of the physical hardware; through the VMM, numerous virtual machines could be created, each running its own operating system. IBM's virtual machines remain highly regarded to this day.
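The same basic architecture survives in modern systems, where the VMM is often provided by the operating system kernel itself. The following is a minimal sketch, not a description of IBM's historical systems, showing how a user-space program can ask one present-day VMM interface, Linux KVM, to create a virtual machine and a virtual CPU; a real monitor would go on to map guest memory, load a guest operating system, and run the vCPU.

```c
/* Minimal sketch of talking to a modern VMM (Linux KVM).
 * Assumes a Linux host with /dev/kvm available. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/kvm.h>

int main(void) {
    /* The monitor sits in the kernel and is reached through /dev/kvm. */
    int kvm = open("/dev/kvm", O_RDWR | O_CLOEXEC);
    if (kvm < 0) { perror("open /dev/kvm"); return 1; }

    printf("KVM API version: %d\n", ioctl(kvm, KVM_GET_API_VERSION, 0));

    /* Ask the monitor for a new, empty virtual machine... */
    int vm = ioctl(kvm, KVM_CREATE_VM, 0);
    if (vm < 0) { perror("KVM_CREATE_VM"); return 1; }

    /* ...and one virtual CPU inside it.  Each such machine could boot
     * its own operating system once memory and registers are set up. */
    int vcpu = ioctl(vm, KVM_CREATE_VCPU, 0);
    if (vcpu < 0) { perror("KVM_CREATE_VCPU"); return 1; }

    printf("created VM (fd %d) with one vCPU (fd %d)\n", vm, vcpu);
    close(vcpu);
    close(vm);
    close(kvm);
    return 0;
}
```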
Virtualization Challenges
Virtualization has faced a number of challenges over the years, and many of them are still encountered today. To understand today's challenges, it is first necessary to look at where they came from.
When virtual machines were first developed, they were designed to address the many problems present in third-generation operating systems and architectures. These systems were built on a dual-state hardware organization, and it was this organization that made virtual machines necessary. The same issues that caused problems in the 1960s and 70s remain problematic today.
Two terms worth becoming familiar with are privileged mode and non-privileged mode. In privileged mode, every instruction is available to the software; in non-privileged mode, the privileged instructions are not.
The operating system provides a small resident program called the privileged software nucleus, which is analogous to a kernel. User programs execute the non-privileged instructions directly and request services from the nucleus through supervisory calls, as the sketch below illustrates.
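As a concrete illustration of the two modes, the following sketch (assuming a Linux/x86-64 host, not any of the historical systems discussed here) shows a user program requesting a kernel service through a supervisory call and then attempting a privileged instruction, which the hardware refuses to execute in non-privileged mode.

```c
/* Sketch: non-privileged code can make supervisory calls, but may not
 * execute privileged instructions directly.  Assumes Linux on x86-64. */
#define _GNU_SOURCE
#include <signal.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/syscall.h>

static void on_fault(int sig) {
    /* The CPU faulted on the privileged instruction; the kernel
     * delivers that fault to the program as a signal. */
    (void)sig;
    const char msg[] = "privileged instruction trapped\n";
    write(STDOUT_FILENO, msg, sizeof(msg) - 1);
    _exit(0);
}

int main(void) {
    /* A supervisory call: switch to privileged mode, run kernel code,
     * and return with the result. */
    long pid = syscall(SYS_getpid);
    printf("process id obtained via supervisory call: %ld\n", pid);

    /* HLT is a privileged instruction; executing it in non-privileged
     * mode raises a fault rather than halting the processor. */
    signal(SIGSEGV, on_fault);
    __asm__ volatile("hlt");

    return 0; /* never reached */
}
```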
This approach has several problems. Although non-privileged code can execute its own instructions and issue supervisory calls into the privileged software nucleus, only a single bare-machine interface is exposed, which means only one kernel can run at a time.
Anything that must talk directly to the bare machine cannot run alongside the booted kernel, and no activity that might disrupt the running system can be carried out on it. System debugging, upgrades, and migration are good examples of such activities.
Additional Virtualization Limitations
It is also not possible to run untrusted applications securely. Nor is it easy to create the illusion of a hardware configuration the machine does not actually have, such as arbitrary amounts of memory, multiple processors, or particular storage arrangements.
Despite these limitations, virtualization is still needed: it solves a number of real problems and makes new capabilities possible. Virtualization is a methodology in which the resources of a single computer system are divided into multiple execution environments.
A number of different technologies can be used to achieve this, including software and hardware partitioning, machine simulation, emulation, and quality of service, among others. This definition of virtualization is deliberately loose, and it takes in concepts such as quality of service that are separate fields of study in their own right but are often used in conjunction with virtualization. A sketch of one such technique, software partitioning, follows below.
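The sketch shows one very lightweight form of software partitioning on Linux, using namespaces to give a process its own private copy of a system resource (here, the hostname). This is only an assumed, minimal example of partitioning, not a full virtualization system, and it relies on unprivileged user namespaces being enabled on the host.

```c
/* Sketch: software partitioning via Linux namespaces.  The process gets
 * a private hostname that the rest of the system never sees.  Assumes
 * unprivileged user namespaces are enabled. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    char name[64];

    /* Split off private user and UTS (hostname) namespaces. */
    if (unshare(CLONE_NEWUSER | CLONE_NEWUTS) != 0) {
        perror("unshare");
        return 1;
    }

    /* Changes made inside the partition are invisible outside it. */
    sethostname("partitioned-env", strlen("partitioned-env"));

    gethostname(name, sizeof(name));
    printf("hostname inside this execution environment: %s\n", name);
    return 0;
}
```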
In many cases these technologies are combined in clever ways to build very interesting systems, and virtualization is as much a name for the properties that emerge from such combinations as for any single technique. The concept is therefore connected to numerous paradigms.