Tag Archives: Business Practices

Cloud-Based VDI vs. DaaS – Is There a Difference?

With nearly all new technologies in the IT space comes confusion over terminology. Some of the confusion is simply because the technology is new, and we’re all trying to understand how it works and how – or whether – it fits the needs of our businesses. Unfortunately, some of the confusion is often caused by technology vendors who want to find a way to label their products in a way that associates them with whatever is perceived as new, cool, innovative, cutting-edge, etc. Today, we’re seeing that happen with terms like “cloud,” “DaaS,” and “VDI.”

“VDI” stands for Virtual Desktop Infrastructure. Taken literally, it’s an infrastructure that delivers virtual desktops to users. What is a virtual desktop? It is a (usually Windows) desktop computing environment where the user interface is abstracted and delivered to a remote user over a network using some kind of remote display protocol such as ICA, RDP, or PCoIP. That desktop computing environment is most often virtualized using a platform such as VMware, Hyper-V, or XenServer, but could also be a blade PC or even an ordinary desktop PC. If the virtual desktop is delivered by a service provider (such as VirtualQube) for a monthly subscription fee, it is often referred to as “Desktop as a Service,” or “DaaS.”

There are a number of ways to deliver a virtual desktop to a user:

  • Run multiple, individual instances of a desktop operating system (e.g., Windows 7 or Windows 8) on a virtualization host that’s running a hypervisor such as VMware, Hyper-V, or XenServer. Citrix XenDesktop, VMware View, and Citrix VDI-in-a-Box are all products that enable this model.
  • Run multiple, individual instances of a server operating system (e.g., Windows Server 2008 R2 or 2012 R2) on a virtualization host that’s running a hypervisor such as VMware, Hyper-V, or XenServer. In such a case, a policy pack can be applied that will make the 2008 R2 desktop look like Windows 7, and the 2012 R2 desktop look like Windows 8. In a moment we’ll discuss why you might want to do that.
  • Run multiple, individual desktops on a single, shared server operating system, using Microsoft Remote Desktop Services (with or without added functionality from products such as Citrix XenApp). This “Remote Desktop Session Host,” to use the Microsoft term, can be a virtual server or a physical server. Once again, the desktop can be made to look like a Windows 7 or Windows 8 desktop even though it’s really a server OS.
  • Use a brokering service such as XenDesktop to allow remote users to connect to blade PCs in a data center, or even to connect to their own desktop PCs when they’re out of the office.
  • Use client-side virtualization to deliver a company-managed desktop OS instance that will run inside a virtualized “sandbox” on a client PC, such as is the case with Citrix XenClient, or the Citrix Desktop Player for Macintosh. In this case, the virtual desktop can be cached on the local device’s hard disk so it can continue to be accessed after the client device is disconnected from the network.

Although any of the above approaches could be lumped into the “VDI” category, the common usage that seems to be emerging is to use the term “VDI” to refer specifically to approaches that deliver an individual operating system instance (desktop or server) to each user. From a service provider perspective, we would characterize that as cloud-based VDI. So, to answer the question we posed in the title of this post, cloud-based VDI is one variant of DaaS, but not all DaaS is delivered using cloud-based VDI – and for a good reason.

Microsoft has chosen not to put its desktop operating systems on the Service Provider License Agreement (“SPLA”). That means there is no legal way for a service provider such as VirtualQube to provide a customer with a true Windows 7 or Windows 8 desktop and charge by the month for it. The only way that can be done is for the customer to purchase all the licenses that would be required for their own on-site VDI deployment (and we’ve written extensively about what licenses those are), and provide those licenses to the service provider, which must then provision dedicated hardware for that customer. That hardware cannot be used to provide any services to any other customer. (Anyone who tells you that there’s any other way to do this is either not telling you the truth, or is violating the Microsoft SPLA!)

Unfortunately, the requirement for dedicated hardware will, in many cases, make the solution unaffordable. Citrix recently published the results of a survey of Citrix Service Providers. They received responses from 718 service providers in 25 countries. 70% of them said that their average customer had fewer than 100 employees. 40% said their average customer had fewer than 50 employees. It is simply not cost-effective for a service provider to dedicate hardware to a customer of that size, and unlikely that it could be done at a price the customer would be willing to pay.

On the other hand, both Microsoft and Citrix have clean, easy-to-understand license models for Remote Desktop Services and XenApp, which is the primary reason why nearly all service providers, including VirtualQube, use server-hosted desktops as their primary DaaS delivery method. We all leverage the policy packs that can make a Server 2008 R2 desktop look like a Windows 7 desktop, and a 2012 R2 desktop look like a Windows 8 desktop, but you’re really not getting Windows 7 or Windows 8, and Microsoft is starting to crack down on service providers who fail to make that clear.

Unfortunately, there are still some applications out there that will not run well – or will not run at all – in a remote session hosted environment. There are a number of reasons for this:

  • Some applications check for the OS version as part of their installation routines, and simply abort the installation if you’re trying to install them on a server OS.
  • Some applications will not run on a 64-bit platform – and Server 2008 R2 and 2012 R2 are both exclusively 64-bit platforms.
  • Some applications do not follow proper programming conventions, and insist on doing things like writing temp files to a hard-coded path like C:\temp. If you have multiple users running that application on the same server via Remote Desktop Services, and each instance of the application is trying to write to the same temp file, serious issues will result. Sometimes we can use application isolation techniques to redirect the writes to a user-specific path, but sometimes we can’t. (There’s a small sketch of that redirection idea just after this list.)
  • Some applications are so demanding in terms of processor and RAM requirements that anyone else trying to run applications on the same server will experience degraded performance.
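To make the third issue a bit more concrete, here is a minimal sketch of what that kind of redirection amounts to. The legacy path and file names are hypothetical, and real application isolation is done by the platform (file-system virtualization or per-user redirection rules) rather than by modifying the application – this just illustrates the idea of mapping one hard-coded path to a per-user location.

```python
import getpass
import os
import tempfile

# A legacy app might hard-code a shared scratch file like this, so every
# session on the same Remote Desktop server ends up fighting over one file.
LEGACY_TEMP_PATH = r"C:\temp\app_scratch.dat"   # hypothetical example

def redirected_path(original_path: str) -> str:
    """Rewrite a hard-coded temp path to a per-user location.

    This loosely mimics what application isolation/redirection does: each
    user's writes land in a directory of their own instead of one shared file.
    """
    per_user_dir = os.path.join(tempfile.gettempdir(), getpass.getuser(), "app_redirect")
    os.makedirs(per_user_dir, exist_ok=True)
    return os.path.join(per_user_dir, os.path.basename(original_path))

if __name__ == "__main__":
    print("Hard-coded path:", LEGACY_TEMP_PATH)
    print("Redirected to:  ", redirected_path(LEGACY_TEMP_PATH))
```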

There’s not much that a service provider can do to address the first two of these issues, short of going the dedicated-hardware route (for those customers who are large enough to afford it) and provisioning true Windows 7 or Windows 8 desktops. But there is a creative solution for the third and fourth issues, and that’s to use VDI technology to provision individual instances of Server 2008 R2 or Server 2012 R2 per user. From the licensing perspective, it’s no different than supporting multiple users on a remote session host. Once the service provider has licensed a virtualization host for Windows Datacenter edition, there is no limit to the number of Windows Server instances that can be run on that host – you can keep spinning them up until you don’t like the performance anymore. And the Citrix and Microsoft user licensing is the same whether the user has his/her own private server instance, or is sharing the server OS instance with several other users via Remote Desktop Services.

On the positive side, this allows an individual user to be guaranteed a specified amount of CPU and RAM to handle those resource-intensive applications, avoids “noisy neighbor” issues where a single user impacts the performance of other users who happen to be sharing the same Remote Desktop Server, and allows support of applications that just don’t want to run in a multi-user environment. It’s even possible to give the user the ability to install his/her own applications – this may be risky in that the user could break his/her own virtual server instance, but at least the user can’t affect anyone else.

On the negative side, this is a more expensive alternative simply because it is a less efficient way to use the underlying virtualization host. Our tests indicate that we can probably support an average of 75 individual virtual instances of Server 2008 or Server 2012 for VDI on a dual-processor virtualization host with, say, 320 GB or so of RAM. We can support 200 to 300 concurrent users on the same hardware by running multiple XenApp server instances on it rather than an OS instance per user.
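To put rough numbers on that difference, here’s a quick back-of-the-envelope comparison. The per-host user counts are the figures from our tests above; the monthly host cost is a made-up placeholder, so treat the output as illustrative rather than a price quote.

```python
# Rough per-user hardware economics: one server OS instance per user (VDI)
# versus many users sharing XenApp/RDS instances on the same host.
HOST_COST_PER_MONTH = 2000.0     # hypothetical fully loaded cost of one dual-processor host

vdi_users_per_host = 75          # average individual OS instances per host (from our tests)
session_users_per_host = 250     # midpoint of the 200-300 concurrent session users quoted above

vdi_cost = HOST_COST_PER_MONTH / vdi_users_per_host
session_cost = HOST_COST_PER_MONTH / session_users_per_host

print(f"Cloud-based VDI:         ${vdi_cost:.2f} per user per month")
print(f"Session-hosted desktops: ${session_cost:.2f} per user per month")
print(f"VDI costs roughly {vdi_cost / session_cost:.1f}x as much per user on the same hardware")
```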

That said, we believe there are times when the positives of cloud-based VDI are worth the extra money, which is why we offer both cloud-based VDI and remote session hosted DaaS powered by Remote Desktop Services and XenApp.

Countdown to July 14, 2015

In case you haven’t heard, Microsoft will end support for Windows Server 2003 on July 14, 2015. A quick glance at the calendar will confirm that this is now less than a year away. So this is your friendly reminder that if you are still running 2003 servers in production, and you haven’t yet begun planning how you’re going to replace them, you darn well better start soon. Here are a few questions to get you started:

  • Are those 2003 servers already virtualized, or do you still have physical servers that will need to be retired/replaced? (If you’re not sure exactly what’s still out there, see the quick inventory sketch after this list.)
  • If you have physical 2003 servers, do you have a virtualized infrastructure that you can use for their replacements? (If not, this is a great opportunity to virtualize. If so, do you have enough available capacity on your virtualization hosts? How about storage capacity on your SAN?)
  • Can the application workloads on those 2003 servers be moved to 2008 or 2012 servers? If not, what are your options for upgrading those applications to something that will run on a later server version?
  • What impact will all this have on your 2015 budget? Have you already budgeted for this? If not, do you still have time to get this into your next budget?
  • Would it make more sense from a budget perspective to move those application workloads to the cloud instead of purchasing server upgrades? (Maybe a monthly operating expense will be easier to deal with than the capital expenditure of purchasing the upgrades.)
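If answering that first question means digging through spreadsheets, a quick scripted pass over whatever inventory export you already have can help. This is a minimal sketch, assuming a hypothetical CSV with “hostname” and “os” columns – substitute whatever your asset-management or monitoring tool actually produces.

```python
import csv

INVENTORY_FILE = "server_inventory.csv"   # hypothetical export from your inventory tool

def find_2003_servers(path: str) -> list[str]:
    """Return the hostnames whose OS field mentions Server 2003."""
    stragglers = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if "2003" in row.get("os", ""):
                stragglers.append(row.get("hostname", "<unknown>"))
    return stragglers

if __name__ == "__main__":
    remaining = find_2003_servers(INVENTORY_FILE)
    print(f"{len(remaining)} server(s) still running Windows Server 2003:")
    for name in remaining:
        print(" -", name)
```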

According to Microsoft, there are more than 9 million 2003 servers still in production worldwide…and the clock is ticking. How many of the 9 million are yours?

A Brief Respite from CryptoLocker

A couple of days ago (June 2), the UK’s National Crime Agency announced that law enforcement agencies have effectively disabled key nodes of the GOZeuS network, which provided a key delivery mechanism for CryptoLocker’s ransom malware. They’ve also identified a person believed to be the leader of the criminal enterprise behind GOZeuS, and international officials say that other arrests are “in progress.”

While this is good news, it’s unlikely to be a permanent solution to the ransomware problem, given the distributed nature of Internet-based malware. It does, however, give us some breathing room – perhaps as much as a couple of weeks – to think about how to protect against it.

In case you’re not familiar with what CryptoLocker is, it is a particularly nasty form of malware that first appeared in the fall of 2013, and is typically spread by tricking a user into clicking on a disguised executable. Disguised executables are, in part, enabled by the unfortunate design choice Microsoft made in Windows XP that continued through Windows 7, which was to “Hide extensions for known file types” by default. (Personally, this always annoyed me, and one of the first things I always did when setting up a new PC was to deselect that option. It does appear that it is no longer selected by default in Windows 8 and 8.1.)
[Screenshot: the “Hide extensions for known file types” checkbox in Windows Folder Options]
This meant that, for example, a Word document that was called “My Important Customer Proposal.docx” would display in Windows Explorer (and elsewhere within the OS) as, simply, “My Important Customer Proposal.” That also meant that if someone sent you an email with a file attachment called MalwareDesignedToStealYourMoney.pdf.exe, it would display in Windows as, simply, MalwareDesignedToStealYourMoney.pdf. An unsophisticated or careless user – or someone who perhaps was just exhausted from a long day and not thinking clearly – might look at the file name and think it was an ordinary Adobe PDF file, and double-click on it to open it up…not realizing that the “.exe” that was hidden from them meant that it was really an executable that was designed to install malware on their system.
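If you’re curious whether any files like that are already sitting in a downloads folder or on a file share, a simple name check will flag the obvious cases. This is only an illustrative sketch – the extension lists below are examples, not an exhaustive set – and it obviously can’t catch malware that doesn’t use the double-extension trick.

```python
from pathlib import Path

# Extensions that actually execute code on Windows (examples, not exhaustive).
EXECUTABLE_SUFFIXES = {".exe", ".scr", ".com", ".pif", ".cpl"}
# Document/image extensions that attackers like to impersonate.
DECOY_SUFFIXES = {".pdf", ".doc", ".docx", ".xls", ".xlsx", ".jpg", ".txt"}

def looks_disguised(path: Path) -> bool:
    """True for names like 'invoice.pdf.exe': a decoy document extension
    immediately followed by an executable extension."""
    suffixes = [s.lower() for s in path.suffixes]
    return (len(suffixes) >= 2
            and suffixes[-1] in EXECUTABLE_SUFFIXES
            and suffixes[-2] in DECOY_SUFFIXES)

def scan(folder: str) -> list[Path]:
    """Walk a folder tree and return files whose names look disguised."""
    return [p for p in Path(folder).rglob("*") if p.is_file() and looks_disguised(p)]

if __name__ == "__main__":
    for suspect in scan("."):   # e.g., point this at a user's Downloads folder
        print("Suspicious file name:", suspect)
```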

“But why,” you might ask, “wouldn’t my anti-virus software protect me against this?” The answer is that some anti-virus products might protect you, depending on how the options are set. But many, if not most, users have local administrator rights to their PCs. (Yes, arguably they shouldn’t, but every IT admin that’s ever tried to take those rights away has had to deal with the howls of protest when users – often top executives – suddenly can’t install iTunes or some other equally essential utility on their PCs.) So unless your AV product is set to scan files whenever they are accessed – a setting that often isn’t enabled even on products that are capable of doing it because it can slow your system down – you won’t know that you’re installing something bad until it’s too late. Local administrators, by definition, have the authority to install software. You launched the installation program, you’re a local administrator, so it’s going to get installed.

[Screenshot: a CryptoLocker ransom demand screen]
Once installed, CryptoLocker checks in with a server on the Internet that assigns a public/private key pair to that PC, and CryptoLocker then happily goes to work using the public key to encrypt all the documents, spreadsheets, pictures, etc., on your system. The latest variants will even encrypt files on network drives if they’re mapped using a drive letter. (So far, it doesn’t appear that CryptoLocker knows how to navigate across UNC paths.) There is even some evidence that the latest variants may wait up to two weeks before locking you out of your files, in the hopes that you will move through a full cycle of backups during that time, meaning that all your backups will also be encrypted and therefore useless to you. Once it’s done its dirty work, you will suddenly be unable to access any of your files, and will be presented with a screen that tells you that you have, typically, 72 hours to submit payment – typically via untraceable money cards or bitcoin – in order to obtain the private key that will decrypt your files. Otherwise, the private key will be automatically destroyed, and your files will be forever out of your reach.

If the thought of having to cough up the equivalent of $300 US or lose all your data leaves you with cold chills (as it does me), what can/should you do?

  • First and foremost, educate your users. One of the most basic rules of computer safety is that you simply don’t open email attachments from people you don’t know – and, for that matter, don’t open them from people you do know unless you were expecting them and know what they are. Remember that it’s not that tough to impersonate someone’s email address. At the moment, most CryptoLocker payloads are disguised as invoices from financial institutions, messages from shipping companies, notices from law enforcement agencies, etc., often with scary messages about account closures, final notices, and amounts due. Also beware of zip file attachments. Make sure your users are aware of these common tricks, so they don’t reflexively click to see what a file attachment is.
  • If you’re still running Windows 7 or earlier, deselect the “Hide extensions for known file types” option. This will at least make it slightly more likely that someone will notice that there’s something not quite right about the file they’re about to click on.
  • Keep your anti-virus products up to date.
  • Restrict permissions on shared folders.
  • Consider removing local admin rights from users.
  • Consider using a prevention tool like “CryptoPrevent” from the folks at Foolish IT, LLC. This is a tool that is free for both private and commercial use – although there is a paid version that will automatically update itself and offers additional features like email alerts when applications are blocked. When installed, it will, silently and automatically, lock down a Windows system by, among other things, preventing executables with double extensions (like “something.pdf.exe”) from running, and preventing executables from running if they’re located in folders where you wouldn’t expect legitimate programs to be located. It implements over 200 rules that will help protect you from other forms of malware as well as CryptoLocker.

    It should be noted that, if you’re running a Professional version of Windows that is joined to a Windows domain, all of these rules could be set via group policies, and there are even pre-packaged prevention kits, such as CryptolockerPreventionKit.zip, available at www.thirdtier.net/downloads that will make it easier to set those group policies. But if you’re not comfortable with the whole concept of group policies and/or you’re not in a Windows domain or you’re running a home version of Windows, CryptoPrevent is a fast and easy way to deal with the issue.
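There’s nothing magical about those rules, by the way – most of them boil down to “don’t let executables run from places where legitimate software doesn’t normally live, and don’t run things with disguised names.” Here’s a rough sketch of that logic; the blocked folder list and the checks are illustrative examples, not CryptoPrevent’s actual rule set, and in practice the enforcement is done by Software Restriction Policies or the tool itself rather than by a script.

```python
import os

# Folders malware commonly launches from but where legitimate installers rarely
# leave executables. Illustrative examples only - not CryptoPrevent's rule list.
BLOCKED_LOCATIONS = [
    os.path.expandvars(r"%APPDATA%"),
    os.path.expandvars(r"%LOCALAPPDATA%\Temp"),
    os.path.expandvars(r"%TEMP%"),
]

def should_block(exe_path: str) -> bool:
    """Apply two CryptoPrevent-style checks to a path that's about to run:
    1) block double extensions such as 'something.pdf.exe'
    2) block executables launching from temp/AppData locations."""
    lower = exe_path.lower()
    name = os.path.basename(lower)
    double_extension = name.count(".") >= 2 and name.endswith(".exe")
    in_blocked_folder = any(lower.startswith(loc.lower()) for loc in BLOCKED_LOCATIONS)
    return double_extension or in_blocked_folder

if __name__ == "__main__":
    for candidate in [r"C:\Users\kim\AppData\Roaming\payload.exe",
                      r"C:\Users\kim\Downloads\statement.pdf.exe",
                      r"C:\Program Files\Vendor\app.exe"]:
        print(candidate, "->", "BLOCK" if should_block(candidate) else "allow")
```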

Please do not assume that the latest law enforcement announcements mean that we don’t have to worry about CryptoLocker anymore. It’s estimated that CryptoLocker raked in as much as $30 million just in the first 100 days after it appeared in the wild. With that much money in play, it – or something else like it – will inevitably reappear sooner or later.

Why Desktop as a Service?

This morning, I ran across an interesting article over on techtarget.com talking about the advantages of the cloud-hosted desktop model. Among other things, it listed some of the reasons why businesses are deploying DaaS, which align quite well with what we’ve experienced:

  • IaaS – Businesses are finding that as they move their data and server applications into the cloud, the user experience can degrade, because they’re moving farther and farther away from the clients and users who access them. That’s reminiscent of our post a few months ago about the concept of “Data Gravity.” In that post, we made reference to the research by Jim Gray of Microsoft, who concluded that, compared to the cost of moving bytes around, everything else is essentially free. Our contention is that your application execution platform should be wherever your data is. If your data is in the cloud, it just makes sense to have a cloud-hosted desktop to run the applications that access that data.
  • Seasonality – Businesses whose employee count varies significantly over the course of the year may find that the pay-as-you-go model of DaaS makes more sense than building an on-site infrastructure that will handle the seasonal peak.
  • DR/BC – This can be addressed two ways: First, simply having your data and applications in a state-of-the-art data center gives you protection against localized disasters at your office location. If your cloud hosting provider offers data replication to geo-redundant data centers, that’s even better, because you’re also protected against a catastrophic failure of the data center itself. Second, you can replicate the data (and, optionally, even replicate server images) from your on-site infrastructure to a cloud storage repository, and have your hosting provider provision servers and desktops on demand in the event of a disaster – or, although this would cost a bit more, have them already provisioned so they could simply be turned on.
  • Cost – techtarget.com points out that DaaS allows businesses to gain the benefits of virtual desktops without having to acquire the in-house knowledge and skills necessary to deploy VDI themselves. While this is a true statement, it may be difficult to build a reliable ROI justification around it. We’ve found that it often is possible to see a positive ROI if you compare the cost of doing a “forklift upgrade” of servers and server software to the cost of simply moving everything to the cloud and never buying servers or server software again.

It’s worth taking a few minutes to read the entire article on techtarget.com (note – registration may be required to access some content). And, of course, it’s always nice to know we’re not the only ones who think there are some compelling advantages to cloud-hosted desktops!

Windows XP – Waiting for the Other Shoe to Drop

[Image: “It’s dead, Jim”]

As nearly everyone knows, Microsoft ended all support for Windows XP on April 8. To Microsoft’s credit, they chose to include Windows XP in the emergency patch that they pushed out last night for the “zero day” IE/Flash vulnerability, even though they didn’t have to, and had initially indicated that they wouldn’t. (Of course, the bad press that would have ensued had they not done so would have been brutal. Still, kudos to them for doing it. Given that so many of us criticize them when they do something wrong, it’s only fair that we recognize them when they do something right.)

But what about next time?

The fact is that if you are still running Windows XP on any PC that has access to the Internet, your business is at risk – and that risk will increase as time goes on. The IE/Flash issue should be a huge wake-up call to that effect.

Windows XP was a great operating system, and met the needs of most businesses for many, many years. However, Windows 7 and Windows 8 really are inherently more secure than Windows XP. Moreover, the realities of the software business are such that no vendor, including Microsoft, can continue to innovate and create new and better products while simultaneously supporting old products indefinitely. The “End of Life” (EOL) date for WinXP was, in fact, postponed multiple times by Microsoft, but at some point they had to establish a firm date, and April 8 was that date. The patch that was pushed out last night may be the last one we see for WinXP. When the next major vulnerability is discovered – and it’s “when,” not “if” – you may find that you’re on your own.

Moving forward, it’s clear that you need to get Windows XP out of your production environment. The only exception to this would be a system that’s isolated from the Internet and used for a specific purpose such as running a particular manufacturing program or controlling a piece of equipment. Unfortunately, a lot of the Windows XP hardware out there simply will not support Windows 7 or Windows 8 – either because it’s underpowered, or because drivers are not available for some of the hardware components. So some organizations are faced with the prospect of writing a big check that they weren’t prepared to write for new hardware if they want to get off of Windows XP altogether – and telling them that they had plenty of warning and should have seen this coming may be true, but it isn’t very helpful. Gartner estimates that between 20 and 25 percent of enterprise systems are still running XP, so we’re talking about a lot of systems that need to be dealt with.

Toby Wolpe has a pretty good article over on zdnet.com about 10 steps organizations can take to cut security risks while completing the migration to a later operating system. The most sobering one is #9 – “Plan for an XP breach,” because if you keep running XP, you will eventually be compromised…so you may as well plan now for how you’re going to react to contain the damage and bring things back to a known-good state.

One suggestion we would add to Toby’s list of 10 is to consider moving to the cloud. Many of the actions on Toby’s list are intended to lock the system down by restricting apps, removing admin rights, disabling ports and drives, etc., which may make the system safer, but will also impact usability. However, a tightly locked-down XP system might make an acceptable client device for accessing a cloud hosted desktop. Alternately, you could wipe the XP operating system and install specialized software (generally Linux-based) that essentially turns the hardware into a thin client device.

But the one thing you cannot do is nothing. In the words of Gartner fellow Neil MacDonald (quoted in Toby’s article), “we do not believe that most organizations – or their auditors – will find this level of risk acceptable.”