Virtualization: An Emerging Technology (A Technology Overview)
October, 2013

Virtualization is one of the hottest trends in information technology today. This is no accident. While a variety of technologies fall under the virtualization umbrella, all of them are changing the IT world in significant ways. This paper introduces Microsoft’s virtualization technologies, focusing on three areas: hardware virtualization, presentation virtualization, and application virtualization.


INTRODUCTION
Virtualization is the latest in a long line of technical innovations designed to increase the level of system abstraction and enable IT users to harness ever-increasing levels of computer performance. At its simplest level, virtualization allows you, virtually and cost-effectively, to have two or more computers, running two or more completely different environments, on one piece of hardware. For example, with virtualization, you can have both a Linux machine and a Windows machine on one system. Alternatively, you could host a Windows 95 desktop and a Windows XP desktop on one workstation.
In slightly more technical terms, virtualization essentially decouples users and applications from the specific hardware characteristics of the systems they use to perform computational tasks. This technology promises to usher in an entirely new wave of hardware and software innovation. For example, and among other benefits, virtualization is designed to simplify system upgrades (and in some cases may eliminate the need for such upgrades), by allowing users to capture the state of a virtual machine (VM), and then transport that state in its entirety from an old to a new host system.
Virtualization is also designed to enable a generation of more energy-efficient computing. Processor, memory, and storage resources that today must be delivered in fixed amounts determined by real hardware system configurations will be delivered with finer granularity via dynamically tuned VMs.

WHY VIRTUALIZATION?
Why is virtualization the sensation of the season? This section goes over four reasons why virtualization is so important.

Trend #1: Underutilized hardware
Today, many data centers have machines running at only 10 or 15 percent of total processing capacity. In other words, 85 or 90 percent of the machine's power is unused. However, a lightly loaded machine still takes up room and draws electricity, so the operating cost of today's underutilized machine can be nearly the same as if it were running flat out. It doesn't take a rocket scientist to recognize that this situation is a waste of computing resources. And, guess what? With the steady improvement in performance characteristics of computer hardware, next year's machine will have twice as much spare capacity as this year's (and so on, for the foreseeable future).
Obviously, there ought to be a better way to match computing capacity with load. And that's what virtualization does, by enabling a single piece of hardware to seamlessly support multiple systems. By applying virtualization, organizations can raise their hardware utilization rates dramatically, thereby making much more efficient use of corporate capital. So, the first trend that is causing virtualization to be a mainstream concern is the unending growth of computing power brought to us by the friendly folks of the chip industry.
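The consolidation arithmetic behind this trend can be sketched in a few lines. The numbers below are purely illustrative (the 10-to-15-percent utilization figure comes from the text; the 80 percent target and the function name are assumptions), and a real capacity plan would also have to weigh memory, I/O, and peak load, not just average CPU use.

```python
import math

def hosts_needed(server_count: int, avg_utilization: float,
                 target_utilization: float = 0.80) -> int:
    """Estimate hosts required after consolidation (CPU only; a real
    capacity plan must also consider memory, I/O, and peak load)."""
    # How many lightly loaded servers fit under the host's target ceiling.
    vms_per_host = math.floor(target_utilization / avg_utilization)
    return math.ceil(server_count / vms_per_host)

# 100 servers idling at ~12% utilization collapse onto far fewer hosts.
print(hosts_needed(100, avg_utilization=0.12))  # -> 17
```

Even with these rough assumptions, the result shows why the economics are compelling: dozens of underutilized machines become a handful of well-used ones.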

Trend #2: Data centers run out of space
The business world has undergone an enormous transformation over the past 20 years. Business process after business process has been captured in software and automated, moving from paper to electrons. The rise of the Internet has greatly amplified this transformation. Companies want to communicate with customers and partners in real time, using the worldwide connectivity of the Internet. Naturally, this has accelerated the move to computerized business processes. The net effect of all this is that huge numbers of servers have been put into use over the past decade, which is causing a real estate problem for companies: They're running out of space in their data centers. And, by the way, that explosion of data calls for new methods of data storage. These methods go by the common moniker of storage virtualization, which, as you may guess, means making it possible for storage to be handled independently of any particular piece of hardware.
Virtualization, by offering the ability to host multiple guest systems on a single physical server, allows organizations to reclaim data center territory, thereby avoiding the expense of building out more data center space. This is an enormous benefit of virtualization, because data centers can cost in the tens of millions of dollars to construct.

Trend #3: Green initiatives demand better energy efficiency
Power costs used to rank somewhere below what brand of soda to keep in the vending machines in most companies' strategic thinking. Companies could assume that electrical power was cheap and endlessly available. The assumption regarding availability of reliable power was challenged during the California power scares of a few years ago. Although later evidence caused re-evaluation about whether there was a true power shortage, the events caused companies to consider whether they should look for ways to be less power dependent.
Furthermore, the impact of the green revolution has meant that companies are increasingly looking for ways to reduce the amount of energy they consume, and one of the places they look first is their data center.
To show the level of concern about the amount of energy being consumed in data centers, consider these facts: A study commissioned by AMD and performed by a scientist from the Lawrence Berkeley National Laboratory showed that the amount of energy consumed by data centers in the U.S. doubled between 2000 and 2005.
Furthermore, energy consumption is expected to increase another 40 percent by the end of the decade. Current energy consumption by data center servers and associated cooling represents 1.2 percent of the total energy consumed in the U.S. Based, in part, on the results of this study, the United States Environmental Protection Agency (EPA) has convened a working group to establish standards for server energy consumption and plans to establish a new "Energy Star" rating for energy-efficient servers.
The cost of running computers, coupled with the fact that many of the machines filling up data centers are running at low utilization rates, means that virtualization's ability to reduce the total number of physical servers can significantly reduce the overall cost of energy for companies.

Trend #4: System administration costs mount
Computers don't operate all on their own. Every server requires care and feeding by system administrators. Common system administration tasks include: monitoring hardware status; replacing defective hardware components; installing operating system (OS) and application software; installing OS and application patches; monitoring critical server resources like memory and disk use; and backing up server data to other storage media for security and redundancy purposes.
As you can imagine, these tasks are pretty labor intensive. System administrators, the people who keep the machines humming, don't come cheap. And, unlike programmers, system administrators are usually co-located with the servers, because they need to access the physical hardware.
As part of an effort to rein in operations cost increases, virtualization offers the opportunity to reduce overall system administration costs by reducing the overall number of machines that need to be taken care of. Although many of the tasks associated with system administration (OS and application patching, doing backups) continue even in a virtualized environment, some of them disappear as physical servers are migrated to virtual instances.
Overall, virtualization can reduce system administration requirements drastically, making virtualization an excellent option to address the increasing cost of operations personnel.

VIRTUALIZATION TECHNOLOGIES
To understand modern virtualization technologies, think first about a system without them. Imagine, for example, an application such as Microsoft Word running on a standalone desktop computer. Figure 1 shows how this looks.

Figure 1: A system without virtualization
The application is installed and runs directly on the operating system, which in turn runs directly on the computer's hardware. The application's user interface is presented via a display that's directly attached to this machine. This simple scenario is familiar to anybody who's ever used Windows. But it's not the only choice. In fact, it's often not the best choice. Rather than locking these various parts together (the operating system to the hardware, the application to the operating system, and the user interface to the local machine), it's possible to loosen the direct reliance these parts have on each other. Doing this means virtualizing aspects of this environment, something that can be done in various ways. The operating system can be decoupled from the physical hardware it runs on using hardware virtualization, for example, while application virtualization allows an analogous decoupling between the operating system and the applications that use it.

Similarly, presentation virtualization allows separating an application's user interface from the physical machine the application runs on. All of these approaches to virtualization help make the links between components less rigid. This lets hardware and software be used in more diverse ways, and it also makes both easier to change.
Given that most IT professionals spend most of their time working with what's already installed rather than rolling out new deployments, making their world more malleable is a good thing.
Each type of virtualization also brings other benefits specific to the problem it addresses. Understanding what these are requires knowing more about the technologies themselves. Accordingly, the next sections take a closer look at each one.

HARDWARE VIRTUALIZATION
For most IT people today, the word "virtualization" conjures up thoughts of running multiple operating systems on a single physical machine. This is hardware virtualization, and while it's not the only important kind of virtualization, it is unquestionably the most visible today. The core idea of hardware virtualization is simple: Use software to create a virtual machine (VM) that emulates a physical computer. By providing multiple VMs at once, this approach allows running several operating systems simultaneously on a single physical machine. Figure 2 shows how this looks.

Figure 2: Illustrating hardware virtualization
When used on client machines, this approach is often called desktop virtualization, while using it on server systems is known as server virtualization. Desktop virtualization can be useful in a variety of situations. One of the most common is to deal with incompatibility between applications and desktop operating systems. For example, suppose a user running Windows Vista needs to use an application that runs only on Windows XP with Service Pack 2. By creating a VM that runs this older operating system, then installing the application in that VM, this problem can be solved.
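The core relationship just described, one physical host presenting several virtual machines, each with its own guest operating system, can be sketched as a toy model. Everything below is illustrative (the class names, the memory figures, the admission check); a real hypervisor enforces isolation with hardware support, not a Python class.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    """An emulated computer: its own guest OS and its own resources."""
    name: str
    guest_os: str
    memory_mb: int

@dataclass
class PhysicalHost:
    """One physical machine hosting many VMs at once."""
    total_memory_mb: int
    vms: list = field(default_factory=list)

    def start_vm(self, vm: VirtualMachine) -> bool:
        """Admit the VM only if unallocated memory remains."""
        used = sum(v.memory_mb for v in self.vms)
        if used + vm.memory_mb > self.total_memory_mb:
            return False
        self.vms.append(vm)
        return True

# Two different operating systems running side by side on one machine,
# mirroring the XP-on-Vista compatibility scenario above.
host = PhysicalHost(total_memory_mb=8192)
host.start_vm(VirtualMachine("legacy-app", "Windows XP SP2", 1024))
host.start_vm(VirtualMachine("build-box", "Linux", 4096))
print([vm.guest_os for vm in host.vms])
```

The admission check is the essential point: the VMs share one pool of physical resources, which is exactly what lets consolidation raise utilization.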

PRESENTATION VIRTUALIZATION
Much of the software people use most is designed to both run and present its user interface on the same machine. The applications in Microsoft Office are one common example, but there are plenty of others. While accepting this default is fine much of the time, it's not without some downside. For example, organizations that manage many desktop machines must make sure that any sensitive data on those desktops is kept secure. They're also obliged to spend significant amounts of time and money managing the applications resident on those machines. Letting an application execute on a remote server, yet display its user interface locally, an approach known as presentation virtualization, can help. Figure 3 shows how this looks. As the figure shows, this approach allows creating virtual sessions, each interacting with a remote desktop system. The applications executing in those sessions rely on presentation virtualization to project their user interfaces remotely. Each session might run only a single application, or it might present its user with a complete desktop offering multiple applications. In either case, several virtual sessions can use the same installed copy of an application.
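The session model above can be sketched in miniature: the application executes once on the server, while each virtual session carries only input in and rendered output back to a remote display. All names here are illustrative, and the "application logic" is a trivial stand-in; real presentation virtualization moves screen updates and keystrokes over a remoting protocol.

```python
class InstalledApplication:
    """One server-side copy of the application, shared by all sessions."""
    def execute(self, user_input: str) -> str:
        return user_input.upper()      # stand-in for real application logic

class VirtualSession:
    """A per-user session that projects the app's UI to a remote display."""
    def __init__(self, user: str, app: InstalledApplication):
        self.user = user
        self.app = app                 # shared, not copied

    def remote_display(self, user_input: str) -> str:
        # Only the rendered output crosses the wire, not the application.
        return f"[{self.user}'s screen] {self.app.execute(user_input)}"

office = InstalledApplication()        # installed once on the server
alice = VirtualSession("alice", office)
bob = VirtualSession("bob", office)
print(alice.remote_display("hello"))   # -> [alice's screen] HELLO
print(bob.remote_display("world"))     # -> [bob's screen] WORLD
```

Note that both sessions hold a reference to the same `InstalledApplication` object: that is the "several virtual sessions can use the same installed copy" point in code form.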

APPLICATION VIRTUALIZATION
Virtualization provides an abstracted view of some computing resource. Rather than run directly on a physical computer, for example, hardware virtualization lets an operating system run on a software abstraction of a machine. Similarly, presentation virtualization lets an application's user interface be abstracted to a remote device. In both cases, virtualization loosens an otherwise tight bond between components.
Another bond that can benefit from more abstraction is the connection between an application and the operating system it runs on. Every application depends on its operating system for a range of services, including memory allocation, device drivers, and much more. Incompatibilities between an application and its operating system can be addressed by either hardware virtualization or presentation virtualization, as described earlier. But what about incompatibilities between two applications installed on the same instance of an operating system? Applications commonly share various things with other applications on their system, yet this sharing can be problematic.
For example, one application might require a specific version of a dynamic link library (DLL) to function, while another application on that system might require a different version of the same DLL. Installing both applications leads to what's commonly known as DLL hell, where one of them overwrites the version required by the other. To avoid this, organizations often perform extensive testing before installing a new application, an approach that's workable but time-consuming and expensive.
Application virtualization solves this problem by creating application-specific copies of all shared resources, as Figure 4 illustrates. The problematic things an application might share with other applications on its system (registry entries, specific DLLs, and more) are instead packaged with it, creating a virtual application. When a virtual application is deployed, it uses its own copy of these shared resources.
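The DLL-hell scenario and the packaging fix can be reduced to a toy example. The dictionary standing in for the system's shared libraries, the DLL name, and the version numbers are all made up for illustration; the point is only the resolution order, where a virtual application consults its own packaged copies before anything machine-wide.

```python
# One machine-wide copy of a shared library: the root of DLL hell.
shared_system_dlls = {"render.dll": "1.0"}

class VirtualApplication:
    """An app packaged with private copies of its shared resources."""
    def __init__(self, name: str, required_dlls: dict):
        self.name = name
        # Package private copies instead of overwriting the system copy.
        self.private_dlls = dict(required_dlls)

    def load(self, dll: str) -> str:
        # Resolve against the app's own package first, then the system.
        return self.private_dlls.get(dll, shared_system_dlls.get(dll))

old_app = VirtualApplication("old-app", {"render.dll": "1.0"})
new_app = VirtualApplication("new-app", {"render.dll": "2.0"})

# Both run side by side; neither install clobbers the other's version.
print(old_app.load("render.dll"))  # -> 1.0
print(new_app.load("render.dll"))  # -> 2.0
```

Without the private copies, installing `new-app` would have replaced the shared 1.0 library and broken `old-app`, which is exactly the conflict the extensive pre-installation testing described above tries to catch.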


Figure 4: Illustrating application virtualization
Application virtualization makes deployment significantly easier. Since applications no longer compete for DLL versions or other shared aspects of their environment, there's no need to test new applications for conflicts with existing applications before they're rolled out. And as Figure 4 suggests, these virtual applications can run alongside ordinary applications; not everything needs to be virtualized.
Microsoft Application Virtualization, called App-V for short, is Microsoft's technology for this area. An App-V administrator can create virtual applications, and then deploy those applications as needed. By providing an abstracted view of key parts of the system, application virtualization reduces the time and expense required to deploy and update applications.

OTHER VIRTUALIZATION TECHNOLOGIES
This overview looks at three kinds of virtualization: hardware, presentation, and application. Similar kinds of abstraction are also used in other contexts, however. Among the most important are network virtualization and storage virtualization.
The term network virtualization is used to describe a number of different things. Perhaps the most common is the idea of a virtual private network (VPN). VPNs abstract the notion of a network connection, allowing a remote user to access an organization's internal network just as if she were physically attached to that network. VPNs are a widely implemented idea, and they can use various technologies. In the Microsoft world, the primary VPN technologies today are Internet Security and Acceleration (ISA) Server 2006 and Intelligent Application Gateway (IAG) 2007.
The term storage virtualization is also used quite broadly. In a general sense, it means providing a logical, abstracted view of physical storage devices, and so anything other than a locally attached disk drive might be viewed in this light. A simple example is folder redirection in Windows, which lets the information in a folder be stored on any network-accessible drive. Much more powerful (and more complex) approaches also fit into this category, including storage area networks (SANs).
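The folder-redirection example boils down to a lookup through a mapping table: a logical folder name resolves to whatever backing store an administrator configured. The sketch below is a minimal illustration of that idea; the paths, the user name, and the function are all hypothetical, and real folder redirection is configured through Windows policy, not application code.

```python
# Logical folder -> physical location; the data could live anywhere.
redirection_table = {
    "Documents": r"\\fileserver\users\alice\Documents",
}

def resolve(logical_folder: str, local_root: str = r"C:\Users\alice") -> str:
    """Return the redirected location if one exists, else the local path."""
    return redirection_table.get(
        logical_folder, f"{local_root}\\{logical_folder}")

print(resolve("Documents"))   # served from a network share
print(resolve("Downloads"))   # still on the local disk
```

The application asking for "Documents" never learns, or cares, that the folder actually lives on a file server, which is the abstraction all storage virtualization shares, from this simple case up through SANs.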
