Storage Peer Incite: Notes from Wikibon’s January 25, 2011 Research Meeting
In the January 25 Peer Incite session, the Wikibon community took a look at Virtual Desktop Infrastructure (VDI), the latest in an evolution of technologies that have attempted to recentralize end-user computing. VDI has found niche markets in several verticals including medical and government, often where security concerns are high (e.g., with medical patient records).
However, users should not make the mistake of thinking that virtual desktops offer the same major automatic advantages as virtualized servers and storage. As the experts who presented at the meeting made clear, virtual desktops are an entirely different animal from virtualized data centers, in both the technical and the financial sense.
End-user computing, in fact, has been evolving away from a centralized model ever since the PC revolution of the early 1980s, when the first primitive desktop computers replaced centralized green-screen systems. Today end-user computing is evolving toward a highly mobile, multi-device model that VDI cannot yet support.
When looking at VDI it is important to consider both its advantages and disadvantages carefully. If your primary problem is ensuring data security in a controlled environment, then VDI, properly designed, can make sense. If, however, your concern is supporting an increasingly mobile and virtual workforce whose members may work from laptops, smartphones, and tablets depending on what is most convenient at the moment, then the better answer is likely to be cloud computing and Software-as-a-Service.
G. Berton Latamore, Editor
Virtual Desktop Infrastructure (VDI) is the modern version of the thin client movement of the 1990s. VDI essentially delivers desktop-as-a-service (DaaS), meaning end-users of desktops, laptops, and notebooks can access a virtual set of servers allocated to providing client-side applications and services. Because these services are delivered over the corporate network, client management is simplified, and, ostensibly, end-user computing costs, support costs, licensing, and software maintenance expenses all decrease.
In addition, proponents of VDI suggest that the approach is inherently more standardized and hence more reliable, available and secure because system images can be locked down, ‘templatized’ and easily replicated with control by an IT administrator.
Despite these appealing attributes, to date, VDI deployments have been confined to relatively narrow niches such as call centers, claims desks, government use cases and other environments where a critical mass of users is performing similar tasks consistently. Beyond these situations, organizations have struggled to realize the benefits of virtualized desktops largely because the end-user experience (from the standpoint of performance, functionality, graphics, etc.) has not lived up to that of the traditional Microsoft client model.
As a result, VDI to date has been seen as largely tactical by IT practitioners. At the January 25th, 2011 Wikibon Peer Incite Research Meeting, a panel of experts agreed that in order to become more of a strategic initiative, the notion of virtual desktop needs to evolve from a device-centric mentality to a data- and application-centric view. In other words, as users begin to access services from more devices (e.g. smartphones, tablets, etc.), VDI needs to allow access to user data and apps from any device, from anywhere at any time. It is the support of the mobile enterprise user that holds the most long-term potential for VDI and the name itself (Virtual Desktop) is increasingly outdated.
Where is VDI Prominent Today?
Much of VDI today is seen in industries such as financial services, education, medical/health care (e.g., hospitals and clinics), and government. In education, for example, VDI is used for training students on how to use technology. In medicine, caregivers use tablets and card-scanning technology at kiosks, allowing them to carry their virtual desktop, data, and applications with them as they move from patient to patient. The U.S. federal government is another popular vertical due to the inherent consistency and standardization qualities of VDI.
A few years ago, VDI-related projects were initiated as very small pilots - e.g. 20-seat deployments. In 2010, the panelists saw more interest in 200- and 500-seat deployments, and today they are seeing much larger installations at the 1,000, 2,000, 5,000, and even 15,000-seat levels. Generally these deployments are occurring in situations where there is a high degree of user commonality from the standpoint of application access, data type, and task performed.
Where's the Starting Point for Desktop Virtualization?
According to Michael Keen of Lakeside Software, a VDI specialist, users need to start by examining and understanding their data so they can make better decisions about how and where to deploy VDI. Keen calls this "decision acuity" and posits that lack of understanding of the data is one of the biggest pitfalls for users thinking about VDI. Keen suggests that the planning phase is crucial.
Glenda Canfield of VMware adds that it's critical to categorize users and understand their needs by groups. Many people think VDI can be deployed ubiquitously across the enterprise, but the fact is it can't. Different users and different 'tiers' of users will require different profiles, data and application access, and performance attributes so planning up front becomes even more critical for success. When it comes to user tiering, the panel suggests that a good way to tier is by user type or role - e.g. knowledge worker, task worker, etc.
The experts stressed that capturing and understanding data about the users was crucial to ensure successful deployments, and it was suggested that the collection of this data should be automated. Specifically, one person can manage an automated data collection task using automated tools over a period of 3-4 weeks versus perhaps as many as four people over a similar time period (or even longer) if attempted manually.
The bottom line is that from a planning perspective, it's all about understanding the user experience and how best to replicate it in a virtual environment. If the user experience degrades, the project will fail. The panel suggests a classic analyze --> design --> build --> manage approach, with an understanding of the reality that desktop virtualization will be a subset of an overall virtualized infrastructure. Specifically, VDI will include a blend of:
- Virtual desktops,
- Physical desktops,
- Server-based computing,
- Application virtualization.
As such, as Jason Langone of MicroTech put forth, a key for users to understand is how to segment the overall virtual infrastructure and build out proper tiers of infrastructure (e.g. storage service levels for different user classes). Ultimately, the panel suggests that VDI becomes a hybridized component part of the overall virtualization strategy and should not be looked at in isolation.
Desktop Virtualization Cold Hard Facts
We performed some secondary research and found some statistics on VDI that are useful in setting context. Specifically, according to numerous surveys (e.g. Informationweek, The Register/Xiotech, ESG and a Falconstor-supplied survey):
- Roughly 68%-77% of organizations are either using, testing or strongly considering VDI deployments.
- Costs per VDI user range from $30-$40 up to well over the traditional desktop average of $70-$90/user, depending on the level of data protection.
- Top five perceived benefits of VDI include: High availability, better utilization, simpler management, lower opex, and better security.
- Major barriers include difficulty in making a business case (whereas server virtualization is an easy business case); elongated time to value (benefits often are not realized until the project is complete); and spotty user performance and functionality.
These trends resonated with the panel members and were clearly consistent with what they see in the field. The bottom line is that unlike server virtualization, desktop virtualization is much more nuanced and often has unclear ROI. Notably, in the military space the focus is on deploying quickly in an operational theater and security in the field. ROI and TCO are secondary concerns.
What are the Storage Constraints of VDI?
Rob Peglar of Xiotech joined the call and made some observations about storage, which he identified as one of the main drags on VDI adoption:
- For every $1 spent on virtual desktop deployment, $3-$10 is spent on storage.
- Most problems tend to be about I/O and storage.
- It's not just about how much storage capacity is needed but increasingly about how fast data needs to be accessed.
- Sizing virtual desktop installations (including remote desktops and mobile) is critical for storage - understanding access and I/O patterns is fundamental.
- Generally, individual desktop streams will be randomized in a VDI environment, which means a poorly behaved storage workload with a mix of reads and writes.
- The key is to test at small scale (e.g. one dozen seats doing random access IO) and scale carefully.
- Boot storms are a particularly heinous barrier to adoption with a high Pareto I/O workload. A clean Windows 7 boot might take 150,000+ I/Os and a Windows 7 boot and login with lots of anti-virus and other background services might take 1M I/Os (with up to half of those writes).
- Questions must be answered including: What's the workload? Are you streaming to the desktop? Are you accessing remotely? What's the graphics workload over the network? Are users allowed to save to desktop? etc.
- To avoid a huge bottleneck effect, users should plan on anywhere between 10 and 30 IOPS per seat as a rule of thumb.
The bottom line, according to Mr. Peglar, is that it's not the bytes that will kill you, it's the IOPS. And the standard of measurement is that users want their VDI experience to be no worse than their existing desktop environment; storage is a key component of that equation.
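The rules of thumb above (10-30 IOPS per seat in steady state, 150,000+ I/Os for a clean Windows 7 boot) lend themselves to a quick back-of-envelope sizing check. The sketch below is illustrative only; the seat count, boot window, and per-seat figures are assumptions for the example, not vendor guidance.

```python
# Back-of-envelope VDI storage sizing using the rules of thumb cited
# above. All figures are illustrative assumptions.

def steady_state_iops(seats, iops_per_seat=20):
    """Aggregate steady-state IOPS, using the 10-30 IOPS/seat rule."""
    return seats * iops_per_seat

def boot_storm_iops(seats, ios_per_boot=150_000, window_seconds=600):
    """Average IOPS if every seat boots inside the same window."""
    return seats * ios_per_boot / window_seconds

seats = 1000
print(steady_state_iops(seats))   # 20000 IOPS in steady state
print(boot_storm_iops(seats))     # 250000.0 IOPS over a 10-minute boot storm
```

Even this crude model shows why boot storms dominate the design: the hypothetical 1,000-seat shop needs more than ten times its steady-state IOPS during a morning boot window.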
What About Mobile?
In anticipation of the VMware View client shipping, Jason Langone migrated to a pure iPad solution. His assessment: the solution, while cool, is extremely limited for mobile today. VDI must be primarily about delivering applications, and there continue to be data access issues. Users are asking about iPads, netbooks, and smartphones, and they expect to be able to access data and apps anywhere, anytime. The industry thus far is not delivering on this vision, but it's clearly the direction of VDI.
The bottom line is that for VDI to break out of its niche, the industry must put forth and deliver on a vision of user-centric computing providing data and application access to mobile devices. We are clearly in the early stages of that vision, with perhaps 2-4 years of innovation required to deliver.
The panelists suggested users focus on 10 key areas:
- Plan, plan, plan!
- Understand user data and workloads.
- Understand end-users, their current experience, and how to replicate that.
- Understand access patterns and I/O requirements.
- Ensure you have the right network and storage infrastructure in place.
- Pilot small, make sure you can scale properly, and ensure the end-user computing team is well-represented in the assessment process.
- Beware of virtual desktop sprawl - different classes of user will require different configurations.
- Understand these are not temporary solutions - plan for sustainability.
- Plan for operational readiness - e.g. how to manage and maintain; how to manage patches, etc.
- Make sure you have end user acceptance testing built into the project as a phase.
Also, the advice is to set SLA expectations conservatively - meeting current service levels at a minimum but not promising to vastly exceed current experiences.
Action item: VDI implementations today are mainly tactical. While certain use cases are showing clear ROI, the vision of supporting myriad mobile devices and providing anytime, anywhere access to data and applications for all users is still years off. However, gaining experience with desktop virtualization is critical to supporting next-generation end-user computing environments and preparing for more transformative user experiences. VDI is becoming more stable and viable, and 2011 is the year for users to ramp up expertise and learn how to leverage the concept for more strategic uses down the road.
CIOs who have been reaping the rewards from server virtualization will be dismayed that desktop virtualization has a very different business case. VDI is compelling for niches inside organizations that have a large number of people doing the same task, such as call centers or hospitals. VDI can also be a method to improve security, which is especially useful for government deployments.
For other environments, the key problem for the CIO is how to meet the needs of various types of workers with a proliferation of access methods including mobile devices and tablets. Today, companies need to go through an extensive process of measuring the current environment, testing with a pilot program, adjusting the solution to meet the performance needs of the users, and repeating the process if a deployment is to be successful.
The pace of innovation on end-user devices is far outpacing the ability of VDI software to deliver infrastructure for those devices. Current VDI deployments work best delivering to a thin client in a local environment, so the promise of providing a desktop image anywhere on any device is more vision than reality.
For the short term, VDI can be deployed in a tactical way to wring out costs in large-scale local implementations of client devices, to provide flexibility to workers within a facility (such as cube-farm call centers and medical or health care workers), or to increase security.
Action item: CIOs should understand their user communities and the use cases where VDI makes sense. Desktop virtualization should be deployed as a tactical solution today where the benefits of management, flexibility or security can justify the cost. VDI can be considered part of larger strategy of virtualization as long as the financial and organizational impacts are examined independently.
Footnotes: Desktop Virtualization Reality Check from the Wikibon Blog
Server virtualization has been a runaway success with obvious reductions in OPEX and CAPEX. Storage virtualization has led to much simpler unified storage products with clear OPEX benefits. Citrix and VMware are the leaders in Virtual Desktop Infrastructure deployments. The infrastructure group should just take VDI on and make it three for three, right?
With server and storage utilization, the end-user is only peripherally involved. As long as the applications based on the virtualized infrastructure perform well, the user sees no difference at all. All changes in process are within the IT department. With VDI the user experience is the single most important key to success.
What we heard from practitioners on the 1/25/2011 Wikibon Peer Incite meeting were clear imperatives for successful VDI implementations:
- VDI implementations should be led by people experienced with PC and desktop deployment – infrastructure staff should be restricted to a support role;
- IT infrastructure staff are NOT typical end-users;
- Segment user groups by the applications run, the locations used and IT experience;
- Pick a homogenous group of users in the same location where the business case is clear as a starting point;
- Collect data automatically on the exact applications run, the infrastructure used (e.g., file & print), and the desktop functionality used;
- Run a small VDI pilot to understand VDI and its impact on the organization;
- Ensure that storage is architected properly and can meet peak IOPS requirements such as boot storms;
- Success can be measured by a group of users actually using the new VDI infrastructure, and realizing the projected savings.
One of the greatest challenges for CIOs is managing access, ensuring security, and integrating the plethora of end-user devices that employees are carrying. End-users find it useful to access a screenshot of a desktop on their end-user device, but that is not integration. Integration is being able to print from an end-user device directly to an IP printer, look up and change meetings and reservations directly, and interact directly with the data required.
The pace of innovation in end-user devices and applications is unrelenting, and VDI cannot possibly innovate fast enough to keep pace. VDI can make a contribution to integration, but selling VDI as a strategic solution for end-user device integration will almost certainly lead to unfulfilled expectations and project failure.
Action item: Keep VDI implementations simple in scope and led by desktop experts, migrate users in small homogeneous groups one group at a time, and focus on great end-user experience and budget savings.
IT-driven virtual desktop infrastructure (VDI) initiatives are often motivated by a desire to improve operational efficiency and security while decreasing desktop capital, maintenance, and management costs. VDI initiatives, however, are at risk of being rejected by desktop users, who are the IT department’s customers, if IT fails to integrate them into acceptance testing and ongoing determination of acceptable performance and availability.
IT professionals involved in VDI initiatives may be tempted to extrapolate from average storage capacity, average CPU, average memory, and average I/O requirements of desktop users to determine the necessary VDI infrastructure. Acceptable performance at the time of need, which is anytime the user wants it, however, is a major determining factor in whether a user will be satisfied with VDI.
Storage infrastructure plays a key role in performance. Storage capacity requirements are relatively predictable and constantly increasing, tempting IT to manage requirements out of one, centralized, homogeneous storage pool. Unlike capacity requirements, however, I/O requirements fluctuate dramatically, both up and down, based upon user activity. Peak I/O requirements are a crucial factor in the user experience, especially during periods of high user activity, such as log-on and log-off times. In order to deliver acceptable performance, VDI managers must pay as much attention to I/O performance of the systems as they do to capacity requirements.
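The danger of sizing to averages can be made concrete with a hypothetical per-user I/O profile. The steady-state and log-on spike figures below are invented for illustration, not measurements:

```python
# Why average-based I/O sizing understates VDI storage requirements.
# A hypothetical user generates modest I/O most of the day, with a
# short, intense spike at log-on/log-off. All numbers are assumed.

steady_iops, steady_hours = 15, 7.5    # typical working activity
spike_iops, spike_hours = 200, 0.5     # log-on/log-off windows

# Average IOPS over an 8-hour day
avg = (steady_iops * steady_hours + spike_iops * spike_hours) / 8
print(avg)         # ~26.6 IOPS on average
print(spike_iops)  # 200 IOPS actually needed at the peak
```

An array sized to the ~27 IOPS-per-user average would be overwhelmed by a nearly 8x higher demand precisely when everyone logs on, which is exactly when the user experience is being judged.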
VDI bears little resemblance to a server virtualization project. Even in a world of corporate standards, desktop users have demanded and often received a highly customized experience, so a one-size-fits-all strategy will not work. Some users will compare the VDI implementation to their own beefy desktops with rich graphics, finely tuned to optimize performance for a specific workload. In addition, each user tends to customize the user interface to fit their style of work, so implementations need to enable and protect that customization.
End-user acceptance testing is a critical part of the VDI implementation plan. IT cannot be solely responsible for acceptance testing and ongoing management of the end-user experience. If end-users perceive performance and usability as sub-par, IT will be dragged into months of escalation meetings and damage control. Dissatisfied VDI users will be difficult to control; they will either demand a return to the desktop or, if stymied, find below-the-radar methods of meeting their needs.
Action item: VDI project managers should analyze workloads and divide desktop users into performance classes, selecting representatives from each class to perform user-acceptance testing of VDI implementations. Users, not IT, should determine acceptable performance. Further, once implemented, maintenance of VDI performance should also be a user-directed initiative. In doing the workload analysis, IT should also recognize at the outset that some users may not fit well within any performance class and may best be left outside of the VDI implementation.
Planning is the most important component of ensuring that VDI is implemented properly. Most organizations approach this step with error-prone manual processes and questionnaires that may or may not be completed and returned. It is imperative that IT organizations embarking on this paradigm-shifting compute environment take the time to arm themselves with all of the intelligence available about their current end-user compute and application environments.
Today’s market is dominated by only a few vendors, whose systems range from monitoring software to fully detailed and in-depth planning and design tools. Here are the three vendors that are most prevalent today:
3. Lanamark – The company’s recently introduced Desktop Transformation Model helps accelerate desktop transformation planning and design. This is based on an agent-less analysis of applications, users, desktops, laptops, and terminal servers. It enables application and desktop delivery specification for multiple user groups and supports multi-phase transformation design across Citrix XenApp, Citrix XenDesktop, Microsoft App-V, VMware ThinApp and VMware View.
2. Liquidware Labs – This vendor came to the analysis space more than three years ago with Stratusphere Fit. This tool fully assesses an organization's current environment — from the desktop, to the network, to the data center, to the SAN. It provides the information needed to decide which users, PCs, applications, and servers are ready for Citrix XenDesktop, VMware View, and/or Windows 7.
1. Lakeside Software – This company has been around the longest in the analysis and on-going management of products in the server-based computing and virtualization space. It has developed the most comprehensive planning/design and management solution for desktop virtualization. Lakeside’s flagship product, SysTrack, constructs models based on the customer’s environment that quantify and predict load. Using a mathematical probability model, SysTrack predicts loading based on real-world user behavior, hypervisor type, desktop model (assigned/persistent/pooled), hardware requirements, storage (both sizing and throughput) and other environmental concerns.
Action item: Organizations looking to investigate whether virtual desktop computing fits into their end-user compute model must implement a tool like those mentioned above to ensure that planning is based on accurate information. Users also must understand what to do with that information after they have it. Users must ensure that their vendor-of-choice has a methodology to guide them from beginning (planning) to end (on-going management) in virtual desktop infrastructures. This will give them the greatest chance of success.
As noted above, server and storage virtualization have been runaway successes, with obvious reductions in OPEX and CAPEX. Desktop virtualization, aka VDI, offers similar benefits – at least to many shops.
Break the Desktop Upgrade and Replacement Cycle
Because VDI can use lightweight endpoints, customers can re-purpose desktops or replace them entirely with thin clients running, for example, Linux, with practically no applications installed or living on the endpoint. The painful desktop replacement cycle can be replaced with a less costly asset management plan.
Another benefit is avoiding hardware upgrades for new operating systems, e.g., Windows Vista. Prior to VDI, more often than not it was necessary to upgrade hardware, memory, disk space, etc. With VDI, the operating system images can be installed on a centralized server with plenty of resources and accessed via a thin client.
Stop Managing Endpoints
In a typical corporate infrastructure, desktops are managed using remote software technology. Managing hundreds or thousands of desktops this way is quite difficult and problematic. VDI allows central management of all the virtual desktops and better control of what is being installed and used on those desktops. Less time can be spent on endpoints -- actual physical PCs -- because they no longer need to be managed as tightly. These endpoints are needed only to provide a remote desktop connection to the virtual desktop.
In addition, deployment of virtual desktops is lightning fast as opposed to using system imaging technology. With virtual desktops, deployment of new images to geographically dispersed locations can be managed from one data center using data center-class technologies.
Rationalize the Application Portfolio
VDI is an opportunity to clean up the application portfolio and identify low-use applications that don’t need to be widely distributed across the enterprise. VDI deployment also provides an opportunity to reduce rogue software licensing and rein in volume-based software that is consuming lots of expensive licensing and maintenance dollars.
No More Desktop Backup
Though many organizations forego desktop backup or use network shares, they still must often re-image the desktop when it fails – a time-consuming and disruptive process. With VDI, data center-class backup technology can be applied to virtual desktop images, thereby providing a high-quality backup of both system and data files. Moreover, with VDI and snapshot point-in-time copies, desktops can be rolled back to different states in time. This is a great feature and provides great flexibility to end users.
Save Power
A thin client VDI session can use less electricity than a desktop computer. Using VDI is a way to reduce carbon footprints and save money in power costs. What’s more, virtual desktops can be powered off when not needed.
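The power-savings claim is simple arithmetic. The wattages, usage hours, and electricity price below are illustrative assumptions (and note that some of the saved desktop power reappears as server-side load in the data center):

```python
# Hedged sketch of per-seat power savings from thin clients.
# All figures are assumed for illustration.

desktop_watts, thin_client_watts = 150, 15
hours_per_year = 2000              # ~8 hours/day, 250 working days
price_per_kwh = 0.12               # assumed USD electricity price

def annual_cost(watts):
    """Yearly electricity cost for a device drawing `watts`."""
    return watts / 1000 * hours_per_year * price_per_kwh

savings_per_seat = annual_cost(desktop_watts) - annual_cost(thin_client_watts)
print(round(savings_per_seat, 2))  # 32.4 (USD per seat per year)
```

Multiplied across thousands of seats, even these modest per-seat figures add up, though the net savings depend on how much extra power the centralized servers consume.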
Caveat - The Pendulum Always Swings
Since the dawn of IT, advances in technology have compelled users to deploy it either centrally or in distributed fashion: thin client or fat client, centralized servers or distributed servers, etc. Over time, we have seen the dominant orientation swing back and forth from consolidated to distributed and back.
This pendulum never stops, and with the IT world abuzz about moving totally away from desktops to mobile devices and mobile applications it is clear that these new endpoints are getting heavier/fatter by the day. And the challenges of managing them grow proportionately. VDI offers clear savings for selected environments, but those environments will change rapidly.
Action item: Consider VDI for the savings it can bring today. Recognize that these savings may be one time only.
Many technologies have aligned in the last five years to create a perfect storm that has begun to enable virtual desktop delivery and what is currently referred to as DaaS (Desktop as a Service). The end goal for this technology is to provide a virtual operating system (primarily Windows-based today) that allows workers to access their “work” applications at any time, in any location, using any access device.
Hypervisors are NOT the only technology that enables desktop virtualization. You must consider hardware, software, storage, network, security, and access device proliferation. Smartphones, tablets, and netbooks are driving a trend of consumers/workers going out and purchasing their own access devices with the expectation that their work IT organization will support the device. The big challenge here is supporting ANY device with ANY operating system. Bandwidth, multi-media feature parity in delivery protocols, and core OS platform support of applications must still be considered.
Some of the above is solved by the “VDI” solution. With clients that work on Linux and Apple devices, users can get their desktops and run apps in a completely different OS (Microsoft Windows) with no conflicts with their company’s platform standard. That is, provided the vendor of the virtual desktop technology has a client and protocol that will work on the endpoint (e.g., Linux, Apple, etc.).
Then we look at the whole “cloud” evolution, private versus public, and the industry agreement that it will ultimately be a hybrid combination: private + public = hybrid. Both consumers and business users will ultimately rent resources that give the end-user a “like local” experience and grant access from “anywhere using any device” to both work and personal resources.
Action item: You have to take into consideration all the dependent technology associated with delivering this solution and understand what will be the most effective solution for your users today, but also what will be required to sustain and scale the technology long term.