The news these days in security is mostly “doom and gloom”. Just consulting a site like DatalossDB.org is enough to depress even the most hardened security professional. However, there are technology advances happening all around us - some of which may lead to new security issues, and others that may help security teams out enormously.
I’m willing to argue that virtualization technology falls into the latter category on most counts. Sure, there are flaws in virtualization software, and new attack vectors (the hypervisor, management tools, etc.).
However, there’s also a new angle to this story - virtualization brings a theme of centralization and unification to the data center that offers myriad benefits for security and operations teams, particularly in certain areas. In this post, we’ll explore some of the most fundamental benefits of virtualization, and in the next we’ll delve into some of the less obvious areas.
The SANS Institute (full disclosure, I teach and author courses for them) has long been promoting a program called the Twenty Critical Security Controls. This program has gone through a number of revisions, but the top two items on the list since it began have been asset inventory and software inventory.
A huge number of issues have arisen from poor inventory management. When the Heartbleed bug was first announced early in 2014, security teams were scrambling due to the sheer pervasiveness of the issue - it was everywhere. The security community quickly realized that many of our organizations’ commercial products had embedded open source components such as the vulnerable OpenSSL library. We raced to build a picture of our system and software inventories while pushing vendors for patches as quickly as possible; the same scenario played out later in the year when the Shellshock (Bash) bug was announced.
Virtualization may be one answer to this problem. Given that hypervisors force a more centralized computing environment where virtual machines share hardware assets, it stands to reason that focusing on hypervisors and all of the virtual and physical resources connected to them would help wrangle a thorny and pervasive problem - namely, keeping up with all of our assets. Every running virtual machine is known to the hypervisor, which makes the hypervisor platform a central point of control and a natural query target.
Numerous methods to script and automate these queries exist - shell scripts in Xen and KVM environments, PowerCLI and vSphere CLI scripts for VMware, and automation and orchestration frameworks like Chef and Puppet. These tools can easily populate an inventory database of virtual machines, whether desktops or servers. That database can then be reconciled with a master list of systems, with anything “new” triggering an alert to operations and security response teams. In fact, automation and orchestration tools could be used to temporarily isolate or quarantine unrecognized systems until they’re validated, if desired.
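As a rough illustration, the sketch below uses the libvirt Python bindings to pull the list of virtual machines straight from a KVM or Xen host and compare it against a master inventory. The connection URI and the known_vms.txt master list are assumptions for this example; a PowerCLI script would play the same role in a VMware environment.

```python
# Minimal sketch: reconcile the VMs a libvirt hypervisor knows about
# against a master inventory list. Assumes the libvirt-python bindings
# and a hypothetical known_vms.txt file with one VM name per line.
import libvirt

MASTER_LIST = "known_vms.txt"  # hypothetical master inventory file


def load_master_inventory(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}


def main():
    known = load_master_inventory(MASTER_LIST)
    conn = libvirt.open("qemu:///system")  # adjust the URI for your host
    try:
        # listAllDomains() returns both running and defined-but-stopped VMs
        for dom in conn.listAllDomains():
            name = dom.name()
            state = "running" if dom.isActive() else "dormant"
            if name not in known:
                # In practice this would open a ticket or trigger quarantine
                print(f"ALERT: unrecognized VM '{name}' ({state})")
            else:
                print(f"OK: {name} ({state})")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```

Run on a schedule, output like this feeds the “anything new gets flagged” workflow described above; the quarantine step itself would be handled by the orchestration tooling.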
The second major way virtualization facilitates security efforts around inventory and software is analysis of the installed applications and components within a virtual machine. There are many different ways to accomplish this. First, defined virtual machine template files can be monitored for changes (using hashes and other file integrity monitoring mechanisms), ideally to prevent out-of-band or malicious modifications to templates from being provisioned out to new systems in the environment. As these template files are hosted and managed by the hypervisor, using the hypervisor as a central point of access to and control over the templates makes sense.
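To make the template-integrity idea concrete, here is a minimal sketch that hashes template files and compares them against a stored SHA-256 baseline. The template directory and the baseline file format are assumptions for illustration; in production this role is usually filled by a dedicated file integrity monitoring tool or the virtualization platform’s own checks.

```python
# Minimal sketch: verify VM template files against a SHA-256 baseline.
# The template directory and baseline format ("relative-path,sha256"
# per line) are illustrative assumptions.
import hashlib
import os

TEMPLATE_DIR = "/vm/templates"         # hypothetical template location
BASELINE_FILE = "template_hashes.txt"  # hypothetical baseline file


def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_baseline(path):
    baseline = {}
    with open(path) as f:
        for line in f:
            rel_path, expected = line.strip().rsplit(",", 1)
            baseline[rel_path] = expected
    return baseline


def main():
    for rel_path, expected in load_baseline(BASELINE_FILE).items():
        actual = sha256_of(os.path.join(TEMPLATE_DIR, rel_path))
        status = "OK" if actual == expected else "MODIFIED - investigate"
        print(f"{rel_path}: {status}")


if __name__ == "__main__":
    main()
```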
Within virtual machines, we can also assess software inventories using virtualization-compatible host security tools that can take advantage of APIs and the hypervisor kernel itself to more rapidly and accurately “scan” the OS and file system to look for unauthorized software, or prevent it from being installed in the first place. This approach brings much more efficiency to the process of assessing individual VM software inventories, and takes advantage of the hypervisor’s integration with VMs and resource management capabilities, as well.
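However the in-guest package list is gathered - by an agent, a hypervisor-integrated tool, or a plain script inside the VM - the comparison against an approved-software baseline is simple. A minimal sketch, assuming both lists have already been exported to text files (one package name per line):

```python
# Minimal sketch: flag software installed in a VM that isn't on an
# approved baseline. Both input files are illustrative assumptions,
# each containing one package name per line.
APPROVED_FILE = "approved_software.txt"
INSTALLED_FILE = "vm_installed_packages.txt"


def read_package_set(path):
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


unauthorized = read_package_set(INSTALLED_FILE) - read_package_set(APPROVED_FILE)
for package in sorted(unauthorized):
    print(f"UNAUTHORIZED: {package}")
```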
Virtualization isn’t a panacea, of course. In fact, it introduces some new inventory management problems of its own, namely virtual machine sprawl (or VM sprawl). Dormant virtual machines (those that are suspended or turned off) won’t be visible to scripts and monitoring tools that only look at running systems, although the files associated with those VMs can still be located within the storage environment. In addition, the presence of snapshot files and backups makes the total inventory equation more challenging, as those files likely contain a variety of sensitive data.
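One partial mitigation is to sweep the storage environment itself: walk the datastore, collect disk and snapshot files, and flag anything that doesn’t map back to a VM the inventory knows about. The sketch below assumes a locally mounted datastore path, a set of known VM names, and a naming convention where files start with the VM name - all illustrative assumptions rather than a standard.

```python
# Minimal sketch: find VM disk/snapshot files on a datastore that don't
# correspond to any known VM. The mount point, extensions, and the
# "files start with the VM name" convention are illustrative assumptions.
import os

DATASTORE_PATH = "/mnt/datastore1"        # hypothetical mount point
KNOWN_VMS = {"web01", "db01", "jumpbox"}  # would come from the inventory
VM_FILE_EXTS = (".vmdk", ".qcow2", ".vmsn", ".vmem")

for root, _dirs, files in os.walk(DATASTORE_PATH):
    for fname in files:
        if not fname.lower().endswith(VM_FILE_EXTS):
            continue
        # Assume files are named <vmname>-something.<ext> or <vmname>.<ext>
        owner = fname.split("-")[0].split(".")[0]
        if owner not in KNOWN_VMS:
            print(f"ORPHANED FILE: {os.path.join(root, fname)}")
```

Orphaned disk, snapshot, and memory files found this way are exactly the dormant assets - and the sensitive data - that sprawl tends to leave behind.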
The idea of virtualization helping to simplify change and configuration management also aligns with inventory discovery and reporting. Existing configuration databases could be updated more readily as inventory discovery completes across the virtual environment. Automated lists of systems to patch or configure in specific ways could be created with the same processes and tools. Patches may also be easier to install on virtual systems, as the VMs are somewhat co-located and can be updated with minimal impact on bandwidth or environment conditions. Change management could be more streamlined, but may require additional analysis of potential impacts due to the tight coupling of components at all layers of the stack.
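As a simple example of what those automated work lists might look like, the sketch below groups discovered VMs by guest operating system to produce per-OS patch lists. The inventory records shown are placeholder sample data; in practice they would come from the hypervisor queries or configuration database described above.

```python
# Minimal sketch: turn discovered inventory records into per-OS patch
# work lists. The sample records are placeholders; real data would come
# from the hypervisor or the configuration database.
from collections import defaultdict

inventory = [
    ("web01", "rhel-6"),
    ("db01", "windows-server-2012"),
    ("jumpbox", "rhel-6"),
]

patch_groups = defaultdict(list)
for vm_name, guest_os in inventory:
    patch_groups[guest_os].append(vm_name)

for guest_os, vms in sorted(patch_groups.items()):
    print(f"{guest_os}: {', '.join(sorted(vms))}")
```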
This has particular merit and promise for virtual desktops. Many organizations use a “linked clone” configuration to create new virtual desktop instances from a central template. By updating that template, administrators ensure that every new virtual desktop meets configuration and patching requirements by default. Keeping the template protected with anti-malware and other OS-level protection tools is paramount, naturally, but the benefits of a single point of protection are enormous. Managing desktop security, from client application vulnerabilities to OS patches and high volumes of malware attacks, could be greatly streamlined in a virtual environment instead of chasing down individual laptops and desktops spread across a large network environment.
In the next post, I’ll explore several other areas where virtualization may play a role in helping to simplify fundamental security processes and streamline controls.
Dave Shackleford is the owner and principal consultant of Voodoo Security and a SANS analyst, senior instructor, and course author. He has consulted with hundreds of organizations in the areas of security, regulatory compliance, and network architecture and engineering, and is a VMware vExpert with extensive experience designing and configuring secure virtualized infrastructures. He has previously worked as CSO for Configuresoft, CTO for the Center for Internet Security, and as a security architect, analyst, and manager for several Fortune 500 companies. Dave is the author of the Sybex book "Virtualization Security: Protecting Virtualized Environments", as well as the coauthor of "Hands-On Information Security" from Course Technology. Recently Dave coauthored the first published course on virtualization security for the SANS Institute. Dave currently serves on the board of directors at the SANS Technology Institute and helps lead the Atlanta chapter of the Cloud Security Alliance.