Jay Butler, Senior Technical Consultant
Some of you were probably around when new accounts were opened on dumb terminals and teller transactions were processed on a glorified calculator. You know the faded, manila-colored monitor/keyboards of the mean, green-screen machines, and the fancy calculators that could (sometimes) print receipts. Commonly used before the introduction of actual desktop computers, these machines must have been the first step up from hand writing everything. I don't know; I wasn't around before then. I started my IT career with community financial institutions during the technical revolution of Windows 3.1 desktops and Novell NetWare 3 file servers. The centralized mainframe environment with its connected dumb terminals had been mostly replaced by a distributed network of servers and desktop computers.
Not long after dumb terminals and calculators were completely wiped out by a network of ever-evolving Windows desktops, Novell NetWare servers disappeared seemingly overnight with the advent of Microsoft's Active Directory. At about the same time, modems were being replaced by high-speed, always-on Internet connectivity with sophisticated firewalls. There have certainly been many other innovations over the last decade, but that was the last major wide-scale technical overhaul that stands out in my mind. Things have been mostly status quo when it comes to the core server and workstation infrastructure, but I sense virtual technology is about to change that dramatically.
Virtual technology is nothing new, but only recently has it become a viable option for most small businesses. Like other major cutting-edge technical innovations, exorbitant cost has been the barrier to early adoption for community financial institutions. Inflated prices are typical in the early stages of new technology as leading vendors race to recoup development investments before competitive forces take hold. At the same time, the broader IT service industry starts its own race to build the expertise necessary to provide cost-effective implementation and support. As real-world implementations spur further innovation, products emerge more refined, with greater stability and rich, finely tuned features. These timelines vary, but eventually most worthwhile technology becomes a viable solution. Virtual technology is most certainly worthwhile, and for many it has been a viable solution for a while now. Not all mainstream technology warrants application in your environment, but I believe virtualization deserves serious consideration as your next major upgrade. The costs of virtualization are down, and many of you have aging server hardware that needs to be replaced. If you're considering server upgrades, it might be time to take advantage of virtual technology.
Virtual technology separates the operating system (OS) from the hardware. Before virtual machines, a single computer could run only a single OS because the OS required exclusive access to the computer hardware. A virtualized computer differs in that it can run many operating systems at once via specialized software. This software, commonly called a hypervisor, is actually an OS in its own right that controls the hardware and makes it available to the virtual servers transparently. Think of the computer hardware as the first layer, the hypervisor as the second layer, and the actual servers as the third layer. Hypervisor software such as VMware is compatible with almost any hardware and is installed first. Traditional OS software such as Windows Server 2003 goes on next as a virtual machine. The transparency of the hypervisor means each virtual machine "believes" it has exclusive access to the hardware even though the hardware is really being shared. The server does not "know" it's virtual, so it operates just as it would without the hypervisor layer. Different operating system versions are allowed at layer three, up to the hardware's capacity. Each virtual server runs independently, so a problem with one does not affect the others.
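The three-layer model above can be sketched in a few lines of code. This is a toy illustration only; the class and method names are invented for this sketch and do not correspond to any real hypervisor's API.

```python
# Toy model of the three layers: hardware (layer 1), hypervisor (layer 2),
# and independent guest operating systems (layer 3). Illustrative names only.

class Hardware:
    """Layer one: the physical machine's finite resources."""
    def __init__(self, total_ram_gb):
        self.total_ram_gb = total_ram_gb

class Hypervisor:
    """Layer two: owns the hardware and shares it transparently."""
    def __init__(self, hardware):
        self.hardware = hardware
        self.guests = []

    def create_vm(self, name, ram_gb):
        # Guests can be added only up to the hardware's capacity.
        used = sum(vm.ram_gb for vm in self.guests)
        if used + ram_gb > self.hardware.total_ram_gb:
            raise RuntimeError("insufficient physical RAM")
        vm = GuestOS(name, ram_gb)
        self.guests.append(vm)
        return vm

class GuestOS:
    """Layer three: 'believes' it has exclusive access to its resources."""
    def __init__(self, name, ram_gb):
        self.name = name
        self.ram_gb = ram_gb
        self.running = True

    def crash(self):
        # A failure in one guest does not touch its neighbors.
        self.running = False

host = Hypervisor(Hardware(total_ram_gb=32))
file_server = host.create_vm("FILESRV", ram_gb=8)
mail_server = host.create_vm("MAILSRV", ram_gb=8)
file_server.crash()
print(mail_server.running)  # prints True: the other virtual server keeps running
```

The key property the sketch captures is isolation: one guest crashing leaves the others untouched, while the hypervisor alone enforces the hardware's limits.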
You can relate this to your personal PC in that its OS allows multiple programs to run at the same time. If one program has a problem, it often does not affect the other programs in use; however, as with your PC, if the computer hardware fails, all the virtual servers on it fail. If the server hardware fails, it is possible to configure instant failover so that all the virtual servers on the failed hardware recover automatically to other viable hardware. This is typically accomplished through use of a Storage Area Network (SAN). If a SAN is beyond your budget, the next best thing is the strategic deployment and backup of multiple virtual machines.
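The failover idea can be sketched as follows, under the SAN assumption described above: because every virtual machine's disk lives on shared storage rather than on any one host, a VM from a failed host can simply be restarted on a surviving one. The host and server names here are hypothetical.

```python
# Toy failover sketch: when a physical host dies, its virtual machines
# are restarted on surviving hardware. This works because (by assumption)
# the VM disks live on a shared SAN, not on the failed host itself.

class Host:
    def __init__(self, name):
        self.name = name
        self.alive = True
        self.vms = []

def failover(hosts):
    """Move VMs off dead hosts onto the first surviving host."""
    survivors = [h for h in hosts if h.alive]
    if not survivors:
        raise RuntimeError("no viable hardware left")
    for h in hosts:
        if not h.alive:
            survivors[0].vms.extend(h.vms)  # disks are on the SAN
            h.vms = []
    return survivors[0]

host_a, host_b = Host("HOST-A"), Host("HOST-B")
host_a.vms = ["FILESRV", "MAILSRV"]
host_a.alive = False       # hardware failure on host A
failover([host_a, host_b])
print(host_b.vms)          # prints ['FILESRV', 'MAILSRV']: both servers recover
```

Real high-availability products automate this detection and restart; the sketch only shows why shared storage is the enabling piece.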
Server problems like these benefit from something commonly referred to as snapshots. Once a virtual server has been created, it can be saved and resaved in order to preserve it in a working, trouble-free state. With snapshots you can quickly revert to the server's state prior to any change, essentially eliminating the costs associated with downtime and lengthy troubleshooting. Virtual server snapshots can be loaded on any hardware that runs the same underlying hypervisor software. Application software changes are easily tested offline by saving a production virtual server and bringing it up on separate, non-production hardware where an upgrade can be performed and tested. This kind of portability also has major implications for reducing the cost and complexity involved with large-scale disaster recovery.
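A minimal sketch of the snapshot idea: save a server's state before a change, then revert if the change goes wrong. Real hypervisors capture disk and memory state; this toy model just copies a configuration dictionary, and all names are illustrative.

```python
import copy

class VirtualServer:
    """Toy server whose whole 'state' is one configuration dict."""
    def __init__(self, config):
        self.config = config
        self._snapshots = {}

    def snapshot(self, name):
        # Preserve a known-good state under a label.
        self._snapshots[name] = copy.deepcopy(self.config)

    def revert(self, name):
        # Roll the server back to the saved state in one step.
        self.config = copy.deepcopy(self._snapshots[name])

server = VirtualServer({"app_version": "1.0", "patched": False})
server.snapshot("pre-upgrade")

# An upgrade is applied and turns out to be broken...
server.config.update({"app_version": "2.0", "patched": True})

# ...so instead of lengthy troubleshooting, revert to the snapshot.
server.revert("pre-upgrade")
print(server.config["app_version"])  # prints 1.0
```

The design point is that reverting is a single cheap operation regardless of how complicated the failed change was, which is exactly what makes snapshots so valuable before upgrades.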
Disaster recovery (DR) lives in relative obscurity, and testing has always been a major burden for community financial institutions. Traditional solutions that provide quick failover recovery are fairly expensive and can be plagued with complications. Most DR plans specify a hot site where duplicated standby equipment sits around waiting for a disaster that may never happen. Testing involves complicated mock recovery procedures that often fail to meet expectations and rarely prove with any certainty that the plan will actually work if ever invoked.
Virtual technology is going to transform this mass of confusion into a model that is easily understood and a plan that can be easily proven. Safe Systems has already taken steps in this direction by offering a new service called Continuum. With Continuum you don't even need virtual servers at your site: we host virtual servers at our facility and house continually updated virtual server images of your standard production servers. In the event of a disaster, we can immediately bring up your servers in our virtual environment and provide seamless remote access wherever your business needs to be. Continuum can significantly reduce your DR costs and provide simplified testing that easily validates your plan. Continuum of course does not preclude your own virtual server deployment plans. Indeed, Continuum could well complement your plan by serving as a more cost-effective disaster recovery solution.
Difficult economic times, among other factors, have slowed what was once about a three-year turnover rate for business-critical servers. As servers age, the risk of hardware failure increases. There has not been a compelling reason to migrate to a new server or desktop infrastructure, as there was when Microsoft introduced Active Directory and later Windows XP. I believe these dynamics explain why I am starting to see a trend in hardware malfunctions encountered by our network operations center. These failures may result in significant downtime, as recovery procedures are often lengthy and problematic.
Replacing the hardware with new virtual servers not only reduces the risk of failure but also improves recoverability, as I have explained. As you make plans to replace risky server hardware, virtual machine technology such as VMware warrants top consideration. Windows Server 2003 software and client access license investments could then be extended by simply moving those installations over to new virtual servers. The 2003 OS will be viable until at least 2015, its currently scheduled end of life. When it comes time to upgrade to the latest Windows Server version, virtual servers will ease that transition. By then, we may be planning a totally virtualized infrastructure where all hardware is shared across the enterprise. Virtual technology is revolutionizing information technology the same way desktop/server networks did decades ago. For community financial institutions, the time has come to realize the advantages of virtualization.
For additional information about virtual technology, see Curt's previous Emerging Technology Series article on server virtualization. As mentioned, SANs are often combined with virtual technology; for more information, please see Brian Brannon's article "What is a SAN?"