
Technology breaks in waves, some bigger than others. Virtualization is a big wave. It changes both how we think about technology and how we practice it. Virtualization is at once a great advancement and an immense disruption.
The shift from counting on fingers and toes to using the abacus meant a new way to think about counting. Fast-forward a few millennia to the wave of digital computers and mainframes that completely changed the face of science, government, and business. After the mainframe (still with us, despite predictions to the contrary) came the waves of minicomputers, personal computers, laptops, tablets, and smartphones, each with many iterations that changed our thinking along the way.
Wavelets.
Parallel developments created advanced networks and operating systems, databases, and the programming languages we use to build complex applications. Then, on the eve of Y2K, VMware introduced x86 server virtualization, and it was met with a counter-wave of skepticism.
Modeled on the mainframe virtualization IBM developed in the 1960s, x86 server virtualization still seemed exotic, mysterious, and a bit implausible. Then it took hold. CIOs and CFOs recognized the major benefits of reducing physical server count while maintaining or even increasing server capacity. Add non-disruptive live migration of workloads from one physical host to another. Bring in cloud computing and broad high-availability benefits. Save energy, floor space, and capital costs, and it’s clear why server virtualization has been so enthusiastically pursued.
Then, much as VMware virtualized physical servers, companies like Cisco and Arista brought us network virtualization. Their goal was the same: decouple the logical from the physical, creating a new level of flexibility and mobility. It meant removing boundaries and deploying network resources as logical rather than physical infrastructure. All of these virtualization advances showed how resources could be deployed more efficiently to reduce capital and operating costs. Virtualization greatly enhanced scalability, manageability, and availability while smoothing the path toward mainstream cloud adoption. The concept spread further, to storage hardware, data center design, and application development. Virtualization became mainstream IT, the first option for any new deployment. Not exotic anymore. A very big wave.
Gartner’s definition:
“Virtualization is the abstraction of IT resources that masks the physical nature and boundaries of those resources from users. An IT resource can be a server, a client, storage, networks, applications or operating systems.”
But this definition is incomplete. To be truly comprehensive, and to achieve the greatest possible impact, we need to include data. Data Virtualization. Think of it as VMware for data. Just as server virtualization eliminated multiple underutilized, redundant machines, data virtualization eliminates multiple duplicate data sets. Data is freed from the restrictions of physical infrastructure and instead linked directly to applications. One golden master copy simultaneously serves multiple uses. Data is no longer dependent on a particular storage system, operating system, or server. Like a virtual machine, it can easily be moved and shared across the user base. Data itself becomes the infrastructure.
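To make that concrete, here is a minimal sketch of the copy-on-write idea that typically underpins the "one golden master, many virtual copies" model. The class names are hypothetical, chosen for illustration only, and not any vendor's API:

```python
# Illustrative sketch of copy-on-write data virtualization.
# GoldenMaster and VirtualCopy are hypothetical names, not a real product API.

class GoldenMaster:
    """The single physical copy of the data, stored once."""
    def __init__(self, blocks):
        self.blocks = list(blocks)

class VirtualCopy:
    """A lightweight view of the master: reads are shared,
    writes are kept privately, so the master is never duplicated."""
    def __init__(self, master):
        self.master = master
        self.overrides = {}  # block index -> locally written block

    def read(self, i):
        # Serve this copy's local write if one exists, else the shared master block.
        return self.overrides.get(i, self.master.blocks[i])

    def write(self, i, data):
        # Copy-on-write: only changed blocks consume new space.
        self.overrides[i] = data

master = GoldenMaster(["alpha", "beta", "gamma"])
dev = VirtualCopy(master)   # a copy for developers
qa = VirtualCopy(master)    # a copy for QA

dev.write(1, "beta-patched")
print(dev.read(1))  # -> beta-patched (dev's private change)
print(qa.read(1))   # -> beta (QA still sees the golden master)
```

The arithmetic is the point: a dozen virtual copies of a one-terabyte master consume roughly one terabyte plus the changed blocks, not a dozen terabytes.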
When a single technology approach can resolve several challenges at once, you know you’re on the right track. Data virtualization does that. First, it eliminates the need for a sprawl of separate systems that exist only to create copies of data: no more separate licenses and hardware to produce data copies for backup, archive, analytics, or application test and development. Then, following the course of server and network virtualization, eliminating redundant copies means that storage footprint, capacity, cost, and management overhead can all be reduced.
Cascading benefits follow. Data security tightens because there’s a smaller attack surface. Application development and test accelerate because virtual copies of production data are immediately available to developers, testers, and QA. Virtualized data can be moved quickly and easily to the cloud and back. The abstraction of data, data virtualization, is the missing puzzle piece. It elevates the combined resources of IT above proprietary physical infrastructure. Data is no longer tied to a box. It’s tied to a purpose.
Some of us may think we know what the next wave will be, but the future is hard to call. Remember IBM chairman Thomas Watson’s famous quote from 1943: “I think there is a world market for maybe five computers.” Clearly he missed how the compute wave would break, and he helped to invent it. But there’s another, less famous quote from Watson, one that advocates a mindset of continually searching for, and contributing to, whatever the next wave will be: “Whenever an individual or a business decides that success has been attained, progress stops.”
So, here’s to progress.