Virtualization and cloud computing are closely related technologies, but they have notable differences that business decision makers need to understand in order to choose the right option for their company.
Virtualization allows multiple simulated environments to be created from a single system, effectively turning it into many systems. It is this technology that powers cloud computing, in which multiple departments can access (and usually edit) a shared pool of resources. Cloud computing depends on virtualization to abstract the underlying hardware; without the latter, the former would not exist.
One of the main characteristics of cloud computing is that it allows a single set of resources to be accessed by multiple parties from any location with Internet connectivity. Virtualization is not as readily accessible: a user located outside the network can only gain access once they obtain the requisite credentials from the network administrator.
In the event of a disaster affecting your network, the consequences can be far more severe with virtualization, since many virtual machines typically run on a single physical host; the failure of that one machine could cause chaos across the entire network. If only one machine in a cloud were affected, the others would remain operational, so disaster recovery is much more straightforward.
Cloud computing is often seen as the best solution for businesses spread across a wide geographic area where multiple staff require access to a single cloud, or where in-house IT resources are comparatively scarce. Virtualization, however, is the better option for businesses that require full control over integration and security.
This infographic from The Missing Link goes into further detail about the key distinctions that define virtualization and cloud computing.