By Slavik Markovich
The move to cloud computing brings with it a number of attributes that require special consideration when it comes to securing data. And since nearly every organization stores its most sensitive data either directly in a relational database, or ultimately in a relational database by way of an application, it is worth considering how these new risks affect database security in particular. As users move applications involving sensitive data to the cloud, they need to be concerned with three key issues that affect database security:
- Privileged User Access: Sensitive data processed outside the enterprise brings with it an inherent level of risk, because outsourced services bypass the physical, logical and personnel controls IT departments exert over in-house programs. Put simply, outsiders are now insiders.
- Server Elasticity: One of the major benefits of cloud computing is flexibility, so aside from the fact that you may not know (or may have little control over) exactly where your data is hosted, the servers hosting this data may also be provisioned and de-provisioned frequently to reflect current capacity requirements. This changing topology can be an obstacle to some technologies you rely on today, or a management nightmare if configurations must be updated with every change.
- Regulatory Compliance: Organizations are ultimately responsible for the security and integrity of their own data, even when it is held by a service provider. The ability to demonstrate to auditors that this data is secure despite a lack of physical control over systems hinges in part on educating them, and in part on providing them with the necessary visibility into all activity.
Access control and monitoring of cloud administrators is critical to ensuring sensitive data remains secure. You likely perform background checks on your own privileged users, and may have significant physical monitoring in place as well (card keys for entry to the datacenter, cameras, and even monitoring by security personnel). Your cloud provider may do all of this too, but it is still not your own process, and that means giving up some element of control. Yet these individuals may have nearly unlimited access to your infrastructure, something they need in many cases to ensure the performance and availability of the cloud resources for all customers.
So, it is reasonable to ask the cloud provider what kinds of controls exist on the physical infrastructure; most will have this well under control (run away, do not walk, if this is not the case). The same is likely true for background checks on administrators. However, you’ll also want to know whether a malicious administrator at the cloud provider has made an unauthorized copy of your database, or has simply connected directly to the database and changed records in your customer accounts. You can’t trust simple auditing solutions here, as they are easily bypassed by DBAs, and audit files can be doctored or deleted by system administrators.
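One way to defend the audit trail itself against privileged users is to make it tamper-evident, for example by hash-chaining the entries and shipping the digests somewhere the DBA and system administrator cannot reach. The following is only a minimal sketch of that idea in Python; the record fields and seed value are invented for illustration.

```python
import hashlib
import json

def chain_records(records, seed="genesis"):
    """Link audit records into a hash chain: each digest covers the
    record's content plus the previous digest, so editing or deleting
    any record invalidates every digest that follows it."""
    prev = hashlib.sha256(seed.encode("utf-8")).hexdigest()
    chained = []
    for rec in records:
        payload = json.dumps(rec, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode("utf-8")).hexdigest()
        chained.append({"record": rec, "digest": digest})
        prev = digest
    return chained

def verify_chain(chained, seed="genesis"):
    """Recompute the chain from the seed; return the index of the first
    tampered entry, or None if the log is intact."""
    prev = hashlib.sha256(seed.encode("utf-8")).hexdigest()
    for i, entry in enumerate(chained):
        payload = json.dumps(entry["record"], sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode("utf-8")).hexdigest()
        if digest != entry["digest"]:
            return i
        prev = digest
    return None

# A doctored record is detectable, provided the digests are anchored off-host.
log = chain_records([
    {"user": "dba1", "action": "SELECT * FROM customers"},
    {"user": "dba1", "action": "UPDATE accounts SET balance = 0"},
])
log[1]["record"]["user"] = "someone_else"  # tampering
assert verify_chain(log) == 1
```

Note that the chain only proves tampering if the digests (or periodic anchor values) are held by a party the provider’s administrators cannot touch; on its own, a local hash chain can simply be rebuilt by whoever edits the log.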
You have a number of ways to address this (encryption, tokenization, masking, auditing and monitoring), but in all cases you need to make sure the solution you deploy cannot be easily defeated, even by privileged users, and will also work well in the distributed environment of the cloud; the sketch below illustrates the first of these options.
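To make the encryption option concrete, here is a minimal sketch of application-level encryption, assuming the key is fetched from a key store you control (for example, an on-premises key management service) rather than stored alongside the data. It uses the third-party Python cryptography package; the field value is invented for the example.

```python
# Application-level encryption: only ciphertext ever reaches the cloud
# database, so a provider DBA who copies the table sees opaque tokens.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# Assumption: in production this key comes from your own key management
# service and is never stored with the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_field(value: str) -> bytes:
    """Encrypt a sensitive column value before writing it to the cloud DB."""
    return cipher.encrypt(value.encode("utf-8"))

def decrypt_field(token: bytes) -> str:
    """Decrypt a value read back from the database; only key holders can."""
    return cipher.decrypt(token).decode("utf-8")

stored = encrypt_field("4111-1111-1111-1111")  # what lands in the database
print(decrypt_field(stored))                   # what your application sees
```

Whichever control you choose, the keys, policies, and audit trails must stay outside the provider’s administrative reach, and the mechanism must keep working as the underlying infrastructure shifts. This brings us to our next point.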
Much has been written about how the location of your data assets in the cloud can impact security, but a potentially greater challenge is that the servers hosting this data are often reconfigured over the course of a day or week, in some cases without your prior knowledge. In order to provide high availability and disaster recovery capabilities, cloud providers typically have data centers in multiple locations. And to provide the elastic element of cloud computing, where capacity can expand in near real time, additional resources may be provisioned as needed wherever capacity is available. The result is an environment that is simply not static, and unless you are hosting your own private cloud, you may have limited visibility into these physical infrastructure changes.
How does this impact security? Many of the traditional methods used to protect sensitive data rely on an understanding of the network topology, including perimeter protection, proxy filtering and network sniffing. Others may rely on physical devices or connections to the server, such as some types of encryption or hardware-assisted SSL. In all of these cases, the dynamic nature of the cloud renders these models untenable, as they would require constant configuration changes to stay current. Some approaches become impossible outright, as you cannot ensure hardware is installed in the servers hosting your VMs, or on the specific network segments those servers occupy.
To work in this model, you need to rethink database security and take a distributed approach: look for components that run efficiently wherever your data assets are located (locally on your cloud VMs), and that require minimal (if any) configuration as VMs are provisioned, de-provisioned, and moved, as in the sketch that follows.
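As an illustration of that distributed approach, here is a hypothetical sketch of a lightweight sensor that runs on each cloud VM and announces itself to a central console when the instance boots, so nothing needs to be reconfigured as instances come and go. The console URL, the /register endpoint, and the discovered database port are all invented for the example.

```python
# Hypothetical self-registering sensor: invoked from the VM's startup
# script so newly provisioned instances report in with zero manual setup.
import json
import socket
import urllib.request

CONSOLE_URL = "https://security-console.example.com/register"  # invented endpoint

def register_self() -> bool:
    """Announce this VM (and the database it hosts) to the central console."""
    payload = json.dumps({
        "hostname": socket.gethostname(),
        "db_port": 5432,  # assumption: local discovery found a PostgreSQL listener
    }).encode("utf-8")
    req = urllib.request.Request(
        CONSOLE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200

if __name__ == "__main__":
    register_self()
```

Because the sensor travels with the VM image, the monitoring follows the data wherever the provider happens to place it, which is precisely what static, topology-dependent tools cannot do.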
Lastly, you will likely face a somewhat more challenging regulatory audit as you move data subject to these provisions to the cloud. It’s not that the cloud is inherently less secure, but rather that it will be something unfamiliar to most auditors. And to the majority of auditors, different is not usually a good thing (apologies up front to all those very flexible auditors who are reading this; why is it we never have you on our customer audits?). So, if the data you need for an application hosted in the cloud is subject to Sarbanes-Oxley, HIPAA/HITECH, PCI DSS, or many other regulations, you need to make sure the controls necessary to meet compliance are in place, AND that you can demonstrate this to your auditor.
We’re seeing many cloud providers trying to allay these concerns by obtaining their own SAS 70 attestations, or even general PCI DSS certifications, for their environments. However, while this is a nice touch and can even be helpful in your own audit, you are ultimately responsible for your own data and the processes related to it, and YOUR auditor will audit YOUR environment, including any cloud services. So, you will need to be able to run reports on all access to the database in question and prove that in no case could an insider have gained access undetected (assuming your auditor is doing his or her job well, of course). The key here is to look for strong segregation of duties, including the ability for you (or a separate third party, NOT the cloud provider) to monitor all activity on your databases. So, if a privileged user touches your data, alerts go off, and if they turn off the monitoring altogether, you are notified in real time, as in the sketch below.
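That last requirement, being notified the moment monitoring stops, can be modeled as a dead-man switch: the monitoring agent emits heartbeats, and an independent watchdog (run by you or a third party, not the cloud provider) alarms when they cease. The sketch below is a minimal illustration under those assumptions; alert_ops() is a placeholder for whatever paging or incident system you actually use.

```python
# Dead-man switch for the monitoring agent itself: if a privileged user
# disables monitoring, the resulting silence triggers an alert on
# infrastructure the provider's administrators cannot reach.
import time

HEARTBEAT_TIMEOUT = 60  # seconds of silence before raising the alarm

last_heartbeat = time.monotonic()

def record_heartbeat() -> None:
    """Called each time the database monitoring agent checks in."""
    global last_heartbeat
    last_heartbeat = time.monotonic()

def alert_ops(message: str) -> None:
    """Placeholder: wire this to email, SMS, or your incident system."""
    print(f"ALERT: {message}")

def watchdog_tick() -> None:
    """Run periodically on a host outside the cloud provider's control."""
    if time.monotonic() - last_heartbeat > HEARTBEAT_TIMEOUT:
        alert_ops("Monitoring heartbeat lost: agent disabled or host unreachable")
```

The point is the separation of duties: the party that can switch the monitoring off is not the party that decides whether anyone hears about it.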
It is certainly possible to address these issues and implement database security that is not easily defeated, that operates smoothly in the dynamic environment of the cloud, and that provides auditors with demonstrable proof that regulatory compliance requirements have been satisfied. But it may very well mean looking at a whole new set of security products, developed with the specific needs of cloud deployments in mind.
Slavik Markovich is CTO and co-founder of Sentrigo, a provider of database security for on-premises and cloud computing environments and a corporate member of the Cloud Security Alliance (CSA). Previously, Markovich served as VP of R&D and chief architect at DB@net, a leading IT architecture consultancy, where he led projects for clients including Orange, Comverse, Actimize and Oracle. Markovich is a recognized authority on Oracle and Java/Java EE technologies, has contributed to open source projects, and is a regular speaker at industry conferences. He holds a BSc in computer science.