Over the past seven years or so I’ve seen machine virtualization grow from a neat trick for server consolidation to a platform for agile data center management. Throughout that time there has never been doubt about who is number one in this game. But I’ve also been impressed that VMware has never been complacent about their leader status.

The interesting story has not been so much VMware's leadership as whether any other player would ever be a serious and legitimate alternative. Five years ago it was really no contest. Today there is a contest, particularly from Microsoft. But in tracking the progress of competitors we shouldn't take VMware's leadership position for granted. VMware doesn't.

In short, let’s give VMware their due.

Areas of Leadership

In focusing on a competitive landscape we often look at feature parity. Back in the day there were things that VMware did that nobody else did: moving a running virtual machine from one host to another, and increasing the number of VMs that could comfortably share a host machine through memory sharing.

But VMware can rightly claim that, while the competition can add "me too" features, that doesn't mean they do them as well. In memory sharing and "overcommit," for example, the competition can claim progress, but VMware has a larger slate of capabilities, including memory compression and transparent page sharing. In memory management, VMware is clearly ahead.
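
To illustrate the general idea behind transparent page sharing (a conceptual sketch only, not VMware's actual implementation), the hypervisor hashes guest memory pages, backs identical pages with a single physical copy, and breaks the share with copy-on-write when a VM writes:

```python
# Minimal sketch of transparent page sharing (illustrative only, not VMware's code).
# Identical guest pages are detected by hash and backed by one physical copy;
# a write "breaks" the share by giving that VM its own private copy (copy-on-write).
import hashlib

class PageSharingPool:
    def __init__(self):
        self.shared = {}    # content hash -> single backing copy
        self.refcount = {}  # content hash -> number of guest pages sharing it

    def map_page(self, content: bytes) -> str:
        key = hashlib.sha1(content).hexdigest()
        if key not in self.shared:
            self.shared[key] = content
            self.refcount[key] = 0
        self.refcount[key] += 1
        return key  # the guest page now points at the shared copy

    def write_page(self, key: str, new_content: bytes) -> bytes:
        # Copy-on-write: the writing VM gets a private copy; others keep sharing.
        self.refcount[key] -= 1
        if self.refcount[key] == 0:
            del self.shared[key], self.refcount[key]
        return new_content

pool = PageSharingPool()
zero_page = bytes(4096)
keys = [pool.map_page(zero_page) for _ in range(100)]  # 100 identical guest pages
print(len(pool.shared), "physical copy backs", len(keys), "guest pages")
```

The point of the technique is visible in the last line: many identical guest pages consume the memory of one, which is what makes higher VM density per host possible.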

Another example is storage management. In a shared environment, storage management is critical. VMware is not the only vendor that has storage management in their portfolio, but VMware is the only one that has built APIs (vSphere APIs for Array Integration or VAAI) to integrate virtual management with the native management of storage arrays. The degree to which storage vendors support VAAI is a differentiating feature in our storage vendor landscapes.

VMware can, and does, point to other areas where they continue to show leadership. In securing virtual infrastructure, for example, VMware has vShield application, data, network, and endpoint security. These have recently been amalgamated under the banner of vCloud Networking and Security 5.1.

Good Enough Might Be Good Enough

Does all this mean that we think VMware is the only and obvious choice for virtualization in your infrastructure? Of course not. You don’t always need to go with best in class. Sometimes good enough is good enough. As noted above, an interesting story has been about whether the competition has been good enough.

A year ago I blogged on how Microsoft is VMware's only real competitive threat. I still hold to this position. Microsoft has continued to gain traction for Hyper-V. The main reason it has not been a Champion in our Vendor Landscapes has been the slow general availability release of Hyper-V 3.0 and System Center 2012. Microsoft has a tendency to talk about a product as if it were in general use a year or more before the fact. Only now is it all coming together as actual product.

Citrix XenServer has always scored well in our feature-by-feature comparisons with VMware, but it has struggled for market share. Citrix no longer positions XenServer as a general replacement for VMware, instead targeting it at areas where Citrix has existing strength, such as application and desktop virtualization and service provider clouds.

In the meantime, VMware continues to do what it has always done: focus on where virtualization is going next and innovate to remain the market leader. This includes cloud, of course, as well as the fully software-defined data center (servers, networks, and storage).


When faced with the task of building a "system," the only way to go is to build the most appropriate solution for the situation. In some cases, that may be a fat architecture; in some cases, it may be a thin architecture; or it may be a little "chubby" client (a hybrid of both), but the main point is to build the most appropriate solution.

Fat clients will not automatically be replaced by thin clients. Either approach has its share of positive and negative attributes. For a fuller discussion, please see the article, What are the pros and cons of fat and thin architectures, and will thin replace fat in the future?

The trend across many businesses regardless of industry is a move towards thin client systems, primarily because thin client systems can support on-demand and other Internet-based applications with relatively little administrative or technical support. If you want to operate in a thin client environment, you’ll need to make sure that your network resources are extremely robust and you have some form of guaranteed uptime since thin clients can’t do a lot of work when the network is down.

Many of the advantages of taking a thin client approach revolve around cost savings. Workstations running the thin client do not need to meet the heavy system requirements of the application itself. Because of this, it is possible to outfit the workplace with low-cost computers that do not need the fastest, newest processors, large amounts of memory, or lots of storage space. Only the machines actually running the application need to be expensive, state-of-the-art hardware. The flip side is the cost of the computers needed to run a fat client: in fat client situations the desktops do need to be state-of-the-art, multi-processor, high-RAM machines, which in large enterprise situations can have massive cost implications. This cannot be ignored when making a decision to go fat or thin.

There are also cost savings in license fees. Not every user of the application needs to connect to it at the same time, so instead of paying a license fee to have the application installed on every computer (where it may sit idle), you pay a license fee for every simultaneous connection to the application. Keep in mind that not all software vendors offer this option, so you will need to confirm that this licensing model is available when choosing software.
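
As a back-of-the-envelope illustration (all figures below are invented for the example, not real vendor pricing), the saving from concurrent-connection licensing depends on how far peak concurrent usage falls below the total user count:

```python
# Hypothetical numbers to compare per-seat vs. concurrent-connection licensing.
total_users = 500          # everyone who occasionally uses the application
peak_concurrent = 120      # the most users ever connected at the same time
per_seat_fee = 400         # assumed licence cost per installed seat (fat client model)
per_connection_fee = 600   # assumed cost per simultaneous connection (thin client model)

per_seat_total = total_users * per_seat_fee
concurrent_total = peak_concurrent * per_connection_fee

print(f"Per-seat licensing:   ${per_seat_total:,}")
print(f"Concurrent licensing: ${concurrent_total:,}")
print(f"Difference:           ${per_seat_total - concurrent_total:,}")
```

Even with a higher per-connection fee, the concurrent model comes out well ahead in this scenario; the arithmetic flips if nearly everyone is connected at once.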

Additionally, there are savings to be had with respect to time, which also lead to cost savings and increased productivity. When a new version of an application is released, or when there is a maintenance upgrade, there is no need to install the fix, patch, update, or upgrade on every workstation. Only the computers running the application need to have the software installed. The thin clients on the workstations connect just as easily to the new version of the system as they did to the old. In a large organization this greatly reduces installation and deployment time, which can save hundreds of hours. Downtime is also reduced, since multiple thin clients can access the one upgraded version and get back to work as soon as the upgrade is complete.

Thin clients can run just as easily on laptops, tablets, desktops, smartphones, and a host of other devices such as smartboards, all with virtually no dependence on the actual OS of the device. This means that key personnel can access the application while out of the office, from various locations (whether on the other side of the facility, or the other side of the world), which can be especially useful in the case of EHR systems or simply for emergencies where your staff needs to be connected.

Not everything about thin clients is perfect; there are some disadvantages that must be weighed when deciding which direction to take. As mentioned above, thin clients do require a stable network connection, whether that is the local network or the Internet. If a router fails or the connection is disrupted for any reason, work can come to a grinding halt. Responsiveness is also sometimes an issue: even the fastest connections are not faster than a local machine. Internet lag time and network transmission speed affect the thin client application. There is always some delay as information is transmitted over the network, and this delay grows as the distance to the servers increases (particularly for Internet traffic).
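
A rough way to see why distance matters is to look at propagation delay alone (the figures and fibre-speed assumption below are illustrative, and real latency is higher once queuing and processing are added):

```python
# Illustrative estimate of round-trip propagation delay for a thin client session.
# Assumes signals travel ~200,000 km/s in fibre and ignores queuing and processing time.
def round_trip_ms(distance_km, fibre_speed_km_per_s=200_000):
    return (2 * distance_km / fibre_speed_km_per_s) * 1000

for label, km in [("same campus", 1), ("same country", 800), ("other side of the world", 15_000)]:
    print(f"{label:25s} ~{round_trip_ms(km):6.1f} ms per round trip (propagation only)")
```

A tenth of a second per round trip adds up quickly for interactive work, which is why server placement matters for a distributed thin client deployment.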

If you are a globally distributed organization and the servers (and thus the application) are located in a different country, then besides the lag time due to distance, you may be faced with local laws and regulations that apply to the location of the application but not to that of the client. You may end up in a situation where certain aspects of the application's operation are dictated to you rather than under your control.

Thin client architectures also tend to create a single point of failure at the back end: unless the servers are properly load balanced with redundant failover, an outage can be catastrophic for a business that does not have proper contingencies in place.
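
A simple way to see the single-point-of-failure arithmetic, and why redundancy helps, is the standard availability calculation (the per-server availability figure below is an assumption for illustration, and it treats failures as independent, which real deployments only approximate):

```python
# Illustrative availability math: one back-end server vs. N redundant, load-balanced servers.
def combined_availability(per_server_availability, redundant_servers):
    # Probability that at least one of the redundant servers is up.
    return 1 - (1 - per_server_availability) ** redundant_servers

single = 0.99  # assumed availability of one server (~3.7 days of downtime a year)
for n in (1, 2, 3):
    a = combined_availability(single, n)
    print(f"{n} server(s): {a:.4%} available, ~{(1 - a) * 365 * 24:.1f} hours down per year")
```

With only one server behind the thin clients, every user shares that ~87 hours a year of downtime; adding redundancy shrinks it dramatically, which is the contingency the paragraph above is pointing at.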

It is not unreasonable to predict that thin client computing is the future for business, especially as thin-client technology continues to advance at a pace that removes, or at least waters down, the disadvantages. For now, as I mentioned before, every organization needs to weigh the advantages and disadvantages in terms of its own needs before taking a step in either direction.


As organizations increase the number of VMs they run per host and move virtualization into production workloads, they require greater management capabilities. Vendors included in Info-Tech Research Group’s Vendor Landscape all provide solutions to ease management and have moved their development focus into utility infrastructure.

Citrix and VMware, both ranked as Champions in the report, also receive Info-Tech’s two Vendor Landscape Awards. VMware receives the Trend Setter award for its addition of innovative features that enable increased control over network and storage resources and capabilities.  Citrix wins the Best Overall Value award for its robust solution offered at a much lower cost than many competitors.

Microsoft’s recent development on Hyper-V and Windows Server 2012 has greatly improved Microsoft’s offering, making it a closer competitor to market leader VMware. While the initial cost of licensing Windows Server is high, Hyper-V comes at no additional cost to customers, whereas other solutions require licensing the server software plus the cost of licensing the virtualization product.

Red Hat and Oracle, while not offering as strong a feature set as competitors, provide a cost effective solution to organizations looking at implementing a virtual environment without the extraneous features.

For all the details, see Info-Tech’s Vendor Landscape: Server Virtualization.


There has been a lot of buzz about a new concept emerging in the networking community: software-defined networking (SDN). SDN is glamorized as the network’s latest push toward a more streamlined and cost-efficient alternative to the physical infrastructure currently dominating the floors of IT departments. Promoters are trumpeting this advancement as an innovation marvel, much like virtualization was for servers. In fact, a key component of SDN is bringing networks into a virtual environment. Despite the hype giving SDN plenty of visibility, many are still confused about the underlying concept, the possible complications, and the business value of an SDN network. Visit Info-Tech’s solution set Prepare for Software Defined Networking (SDN) to guide you through fact and fiction.

SDN is essentially a network architecture in which the management, mapping, and control of traffic flow are removed from individual network devices and centralized in the network. This separation is said to increase performance, network visibility, and simplicity, provided it is constructed correctly. However, given SDN’s infancy, a sufficient number of use cases and proofs of concept have yet to emerge, leaving organizations wondering whether there are any revenue-generating or cost-saving opportunities. How can they make a sound decision on SDN? It may be too early to make a final decision, but they can start crafting the case and investigating the early movers in the SDN space.
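
To make that separation concrete, here is a toy sketch (hypothetical classes for illustration only, not any real controller or the OpenFlow API): the forwarding logic lives in a central controller, and the switches do nothing but match packets against flow rules the controller has installed.

```python
# Toy illustration of SDN's split between a central control plane and simple switches.
# Hypothetical classes for illustration only -- not a real controller or the OpenFlow API.
class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # (src, dst) -> output port, installed by the controller

    def handle_packet(self, src, dst, controller):
        if (src, dst) not in self.flow_table:
            # No matching rule: punt the decision to the controller.
            port = controller.decide(self, src, dst)
            self.flow_table[(src, dst)] = port
        return self.flow_table[(src, dst)]

class Controller:
    """Centralized control plane: the topology view and policy live here."""
    def __init__(self, topology):
        self.topology = topology  # (switch name, destination) -> output port

    def decide(self, switch, src, dst):
        return self.topology[(switch.name, dst)]

ctrl = Controller({("edge-1", "10.0.0.2"): 3})
edge = Switch("edge-1")
print("forward out port", edge.handle_packet("10.0.0.1", "10.0.0.2", ctrl))
```

The switch stays dumb and cheap (commodity hardware), while policy changes happen in one place, which is the core of the streamlining argument made for SDN.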

Be prepared to see a shift in networking paradigms because of SDN: hardware to software, physical to virtual, proprietary to commodity. Naturally, this will throw traditional networking staff off their game. But do not worry: current SDN solutions are still in “Version 1,” and future versions may see solutions become friendlier to traditional network practices and concepts. With the attention it is getting from the media and established network leaders, SDN technologies will likely (and hopefully) evolve to mainstream deployment states.

Realize SDN is here. Understand where it came from and how it can help your business. Remember to wait for the SDN space to settle and mature before implementing SDN in your organization. After all, you wouldn’t want your child driving your multi-million dollar car.


The theme of VMworld a week or so ago in San Francisco was “Right here. Right now.” Unfortunately, the message for one of the most interesting announcements out of the conference – VMware Horizon Suite – was “Not here, but will be soon.” I, for one, can hardly wait.

You may have heard the term “Post PC Era”. Horizon Suite is how VMware intends to stay relevant and important in the Post PC Era. It is the dawn of that era that is upon us right here, right now.

VMware’s forte in end user or workforce computing right now is VDI – Virtual Desktop Infrastructure – virtualizing the (typically Windows) PC and hosting it on centralized servers, where it can be accessed by a range of network-connected devices (thin client, PC, laptop, even tablet or smartphone). To their credit, the folks at VMware realize that VDI is not an end but a beginning. For many, VDI is the first step in the journey to the Post PC Era.

It is a first step because it accomplishes an important transition in workforce computing. It separates the experience of using a Windows desktop and applications from possessing a specific PC device. Managing workforce computing becomes less about managing end point devices and more about managing delivery of services to any device.

It’s important also to recognize that the Post PC Era isn’t about PCs disappearing entirely. In the PC era, which began in earnest about 25 years ago, individual or “personal” computing was synonymous with the personal microcomputer (and that included Macs). The Post PC Era is about the end of that synonymous relationship.

Desktop PCs will go on but they will become just one of several means of accessing personal computing services. Those services will likely be hosted somewhere else. PC will go from meaning personal computer to meaning personal cloud.

This brings us back to Horizon Suite. VDI will remain important but as we move into the Post PC Era managing the delivery of applications and data to a range of devices, including smartphones and tablets, will become increasingly important. Increasingly those devices will not be owned and managed by the enterprise.

Horizon Suite brings together a number of products that VMware has been developing for enterprise-class management of these kinds of services. This includes Project Octopus (corporate Dropbox-style functionality), Project AppBlast, ThinApp, and Horizon Application Manager. The goal is to provide the enterprise with the ability to create an AppStore-like web portal from which it can deliver Windows, Android, iOS, and SaaS applications.

At VMworld we saw a cool demonstration of a Horizon-delivered corporate iPad app. Using ThinApp technology the app was resident on a personal iPad device but was wrapped in its own container of corporate managed security policies.

With Horizon Suite, plus View VDI and another recent acquisition, Wanova Mirage, VMware has a full suite of tools for the enterprise to securely deliver workforce computing in the Post PC, bring-your-own-device, personal cloud era.

Just not yet.

The beta for Horizon Suite is expected before Christmas ’12, with the rollout following next year. Meanwhile, the march to the Post PC era continues, and VMware isn’t the only company developing a product suite. Of course, Citrix comes to mind.

Horizon Suite can’t come soon enough.
