Virtualization is the creation of a virtual, rather than physical, computing environment. The concept emerged in the 1960s as a means of dividing the system resources of mainframe computers. Today’s virtualization technology relies on a hypervisor, an abstraction layer that separates, or partitions, the operating system from the underlying hardware so that multiple operating system (OS) instances can run concurrently on a single machine without each requiring its own dedicated hardware.
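As a concrete illustration, the sketch below uses the libvirt Python bindings to enumerate the guest OS instances sharing a single physical host. It is a minimal sketch, assuming a Linux host running KVM with libvirt-python installed; the connection URI and the guests it finds are environment-specific.

```python
import libvirt

# Connect to the local hypervisor. The hypervisor, not the guests, owns
# the abstraction layer between the OS instances and the hardware.
conn = libvirt.open("qemu:///system")

print(f"Host: {conn.getHostname()}")
for dom in conn.listAllDomains():
    state = "running" if dom.isActive() else "stopped"
    # Each domain is an independent OS instance partitioned from the
    # same physical hardware.
    print(f"  guest {dom.name()!r} is {state}")

conn.close()
```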
To alleviate read-write inconsistencies, the hypervisor caches changes the user or guest system makes to the virtual hardware or disk and commits those changes to the physical device at a later time. The virtual machine mimics an actual environment and is transparent to users. Virtualization also allows an OS running on one physical machine to be migrated to another when a problem arises or capacity needs to grow.
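The write-caching idea can be made concrete with a toy sketch. The class below is purely illustrative, not how any real hypervisor is implemented: guest writes land in an in-memory overlay keyed by block number, reads check the overlay first, and changes reach the backing disk image only when flush() is called. All names, including CachedVirtualDisk and guest.img, are hypothetical.

```python
class CachedVirtualDisk:
    BLOCK_SIZE = 512

    def __init__(self, backing_path):
        self.backing_path = backing_path
        self.dirty = {}  # block number -> bytes cached but not yet on disk

    def read(self, block):
        # Serve cached (dirty) blocks first so the guest always sees a
        # consistent view of its own writes.
        if block in self.dirty:
            return self.dirty[block]
        with open(self.backing_path, "rb") as f:
            f.seek(block * self.BLOCK_SIZE)
            return f.read(self.BLOCK_SIZE)

    def write(self, block, data):
        # Cache the change instead of touching the backing image now.
        self.dirty[block] = data

    def flush(self):
        # Commit all cached changes to the backing image at a later,
        # convenient time.
        with open(self.backing_path, "r+b") as f:
            for block, data in sorted(self.dirty.items()):
                f.seek(block * self.BLOCK_SIZE)
                f.write(data)
        self.dirty.clear()


# Example: the guest writes a block and immediately reads its own change
# back, but the backing image ("guest.img", assumed to exist) is only
# modified when flush() runs.
disk = CachedVirtualDisk("guest.img")
disk.write(0, b"A" * CachedVirtualDisk.BLOCK_SIZE)
assert disk.read(0) == b"A" * CachedVirtualDisk.BLOCK_SIZE
disk.flush()
```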
Today, three main categories of virtualization exist: full virtualization, paravirtualization, and operating-system-level virtualization.
Virtualization is an effective way to reduce costs while boosting efficiency, flexibility, scalability, and agility. Other benefits include minimized downtime and high availability. It supports business continuity and disaster recovery, enables centralized management, provides an isolated testing environment for new applications or software, and can even run legacy applications without constraining the rest of the system.