When it comes to data loss, threats from inside the organization have become as worrisome as, if not more worrisome than, external threats. The loss of trade secrets, intellectual property or organizational reputation from a single breach can be devastating, and the threat of regulatory punishment has magnified concerns. Recently, many of the largest and most trusted companies have been hit with insider-driven leaks of sensitive data, and internal technology teams are now tasked with finding and implementing processes and technologies that protect against the insider threat.
At the same time, increased collaboration both within the organization and with partners and clients means that security requirements are now more complex and fluid than ever before – more access control decisions involving more people need to be made more quickly, and more access activity must be monitored and analyzed. Unfortunately, IT teams haven't been able to keep up with access control maintenance or access auditing for many years; without a new approach, they will fall even further behind. We now face both the risk of greater damage from a single data breach and an increased probability of a breach occurring.
In risk management terms, annualized loss expectancy (ALE) – the amount of money you can expect to lose each year due to an insider breach – is increasing rapidly. When a threat's ALE is significant, it's time to implement controls to reduce its likelihood and impact. A key place to start is preventing insider breaches, especially those involving unstructured and semi-structured data stores.
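The ALE calculation itself is simple arithmetic: single loss expectancy (SLE, the cost of one breach) multiplied by annualized rate of occurrence (ARO, the number of breaches expected per year). A minimal sketch, with purely illustrative figures:

```python
# ALE = SLE x ARO
# SLE (single loss expectancy): estimated cost of one breach.
# ARO (annualized rate of occurrence): expected breaches per year.
# Both figures below are invented for illustration only.
sle = 250_000   # dollars lost in a single insider breach
aro = 0.4       # roughly one breach every two and a half years
ale = sle * aro
print(f"ALE: ${ale:,.0f} per year")  # ALE: $100,000 per year
```

If a proposed control costs less per year than the ALE reduction it delivers, it is worth implementing on these terms.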
Unstructured and semi-structured data stores – such as file shares and email systems – not only hold the most data (80% of all organizational data falls into these categories), but are also the least controlled. Access controls are often stale or excessive, there is rarely an audit trail, and IT has lost track of who owns and can use the data.
In contrast to databases, which have traditionally had tightly controlled access rights and auditing, the data on file servers, NAS devices, SharePoint and Exchange is growing faster and becoming increasingly complicated to protect. These are the platforms where IT can quickly mitigate a lot of risk by leveraging new technology designed to enable secure collaboration.
Secure Collaboration
In most organizations, almost every employee collaborates digitally, creating and sharing a large amount of data on several platforms. Data has value to the organization because it’s shared; data without collaboration is a frozen asset.
What organizations are beginning to realize is that collaboration, while completely essential, has grown beyond their control. Organizations don't know who owns their data, what's being accessed, and who should have access. When uncontrolled, these valuable assets can become a liability. The data on these platforms is at risk for loss, theft or misuse.
The Five Imperatives
When considering technologies and processes to realize data governance objectives, organizations must gauge effectiveness against five imperatives of data protection. Any system for controlling access to unstructured data must also eliminate ineffective manual processes and provide enough automation to keep governance continuous and accurate in the future.
First, organizations must understand their data – who has access to it, who is accessing it, and which data is sensitive. By gathering and processing relevant data about the data – that is, the metadata – organizations will be able to answer higher-order questions like: Who (outside of IT) owns the data, who shouldn’t have access, and who might be abusing access?
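As a toy illustration of why this metadata matters, consider answering just the first question – who has access – when permissions are granted through nested directory groups. The group names, users and folder below are invented for the example; a real deployment would pull this from the directory service and file-server ACLs:

```python
# Hypothetical directory data: group memberships may be nested, so answering
# "who has access?" means expanding groups down to the users they contain.
groups = {
    "Finance-All": {"user:alice", "group:Payroll-Team"},
    "Payroll-Team": {"user:bob", "user:carol"},
}

def expand(member):
    """Recursively resolve a 'user:' or 'group:' member to a flat set of users."""
    kind, name = member.split(":", 1)
    if kind == "user":
        return {name}
    users = set()
    for m in groups.get(name, ()):
        users |= expand(m)
    return users

# A folder ACL that grants access to a single group:
folder_acl = {"group:Finance-All"}
effective_users = set().union(*(expand(m) for m in folder_acl))
print(sorted(effective_users))  # ['alice', 'bob', 'carol']
```

Even in this tiny example, the ACL alone names no users at all; only the expanded metadata reveals who can actually touch the data.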
Second, organizations must reduce exposure to at-risk, sensitive data. Sensitive data that's exposed to too many users poses the greatest risk. IT can reduce exposure by identifying who actually needs access. By intelligently analyzing audit information, access can be reduced quickly without disrupting business activity.
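A sketch of the underlying idea, using invented ACLs and audit events: access that was granted but never exercised is a candidate for safe removal, because revoking it cannot disrupt activity that never occurred.

```python
# Hypothetical granted access per folder (from ACLs):
acls = {
    "/finance/payroll": {"alice", "bob", "carol", "dave"},
}
# Hypothetical (user, folder) events from an access audit trail:
audit_events = [
    ("alice", "/finance/payroll"),
    ("bob",   "/finance/payroll"),
    ("alice", "/finance/payroll"),
]

# Collect who actually used each folder.
active = {}
for user, folder in audit_events:
    active.setdefault(folder, set()).add(user)

# Granted-but-unused access is the low-disruption removal candidate.
for folder, granted in acls.items():
    unused = granted - active.get(folder, set())
    print(f"{folder}: candidates for removal -> {sorted(unused)}")
```

In practice the audit window and usage thresholds matter, but the set difference between "granted" and "observed" is the core of the analysis.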
Third, organizations should identify and assign data owners. Data is an organizational asset, not a technology asset. Once the areas of highest risk are uncovered and fixed, IT needs to make sure that those with the right knowledge and authority to make access control decisions – the data owners – are identified and assigned to data sets. This alignment of data with an owner is critical not only to fixing current access control problems, but also to maintaining a secure access control model in the future.
Fourth, once assigned, data owners need to be appropriately involved. Data owners need information about their data and the ability to control access independently of IT. New technologies can provide recommendations to data owners on where access rights can be safely removed, flag changed access rights, and provide real-time intelligence about users who may be abusing their access rights. When data owners enact access control decisions, the probability of data loss to insiders is reduced.
And finally, organizations should leverage intelligent automation. A single terabyte of data typically contains tens of thousands of folders, many of them with unique permissions, and millions of files. Effective governance of even ten terabytes, much less the hundreds of terabytes found at large organizations, is impossible without efficient, effective automation.
IT needs to leverage technologies designed to gather, process, and analyze metadata, present relevant actionable intelligence, and then commit appropriate changes back to their environment. Intelligent automation will provide IT and data owners with the ability to reduce risk and maintain correct access throughout the organization, and practice secure collaboration.
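At its simplest, the gather-process-analyze loop described above might look like the following sketch, where the folder records, the global "Everyone" group and the sensitivity flags are all invented for illustration:

```python
# Gathered metadata: one record per folder (a real system would crawl
# file servers for ACLs and run content classification for sensitivity).
folders = [
    {"path": "/legal/contracts", "acl": {"Everyone"}, "sensitive": True},
    {"path": "/marketing/logos", "acl": {"Everyone"}, "sensitive": False},
    {"path": "/finance/audit",   "acl": {"Finance"},  "sensitive": True},
]

# Analyze: sensitive folders exposed to a global group are the highest risk,
# and become the actionable recommendations surfaced to IT and data owners.
flagged = [f["path"] for f in folders
           if f["sensitive"] and "Everyone" in f["acl"]]
print(flagged)  # ['/legal/contracts']
```

The value of automation is that this same pass runs continuously across millions of folders, not once across three.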
Reduce the Insider Threat
IT now has the ability to answer questions about unstructured and semi-structured data that were impossible to answer a few years ago. By leveraging new technologies, IT can finally understand how these data sets are being used and to whom they belong. Through automation, access control can be locked down to a secure level that reduces the risk of exposure and lowers the ALE of critical business assets.
Brian Vecci is a technical product marketing manager at Varonis Systems, a leading provider of comprehensive data governance software. He is a skilled client-facing technology evangelist, having served in similar roles at Microsoft, UBS and Compuware. Vecci has experience across a variety of industries, including financial services, legal, publishing, and information technology.