People have leveraged machines throughout history to make tasks easier: water and wind power made physical labor less taxing and made people more productive; John Henry died in an effort to prove that he could outwork a steam-powered hammer.
Today, machines are far more intelligent, capable of much more than simply augmenting and then eclipsing physical capability. Muscle has given way to machine.
Data science has arrived, bringing with it new buzzwords: algorithms, machine learning, deep learning, deep machine learning algorithms. Your everyday life is full of examples of machine learning, a type of artificial intelligence (AI) that gives computers the ability to learn without being explicitly programmed and to adapt when exposed to new data.
In the security industry, machine learning is critical to detecting and preventing insider threats. For example, user and entity behavior analytics (UEBA), an emerging cybersecurity technology that employs machine learning, can detect shifts in employees’ “normal” behaviors and psycholinguistic changes that may indicate a potential threat. When such changes are detected, the UEBA system can raise an alert so that action can be taken.
Consider the following example of how UEBA works: Susan, who works in accounting, moves three sensitive spreadsheets a day to a corporate Dropbox account shared by the finance group. That’s part of Susan’s job, and the UEBA software, driven by machine learning, has been learning her normal routine. After several months, however, Susan starts loading the spreadsheets onto a USB drive as well. The software detects this shift in behavior, raises her employee risk score, and delivers an alert to the security team.
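To make the pattern concrete, here is a minimal sketch, in Python, of the kind of baseline-and-deviation logic a UEBA tool might apply to Susan’s file transfers. The class, thresholds, and score increments are illustrative assumptions, not any vendor’s actual implementation.

```python
# Hypothetical sketch of baseline learning and deviation scoring.
# Names, thresholds, and score increments are illustrative only.
from collections import defaultdict
from statistics import mean, pstdev

class BehaviorBaseline:
    def __init__(self):
        # Daily counts of file-transfer events per (user, destination) pair.
        self.history = defaultdict(list)

    def record_day(self, user, destination, count):
        self.history[(user, destination)].append(count)

    def risk_delta(self, user, destination, todays_count):
        past = self.history[(user, destination)]
        if not past:
            # A destination never seen before (e.g., a USB drive) is itself a signal.
            return 50
        avg, sd = mean(past), pstdev(past)
        # Flag counts well above the established routine.
        if todays_count > avg + 3 * max(sd, 1):
            return 25
        return 0

baseline = BehaviorBaseline()
for _ in range(90):  # roughly three months of "normal" routine
    baseline.record_day("susan", "dropbox:finance", 3)

risk_increase = baseline.risk_delta("susan", "usb:removable", 3)  # new channel
if risk_increase > 0:
    print(f"Alert security team: Susan's risk score increased by {risk_increase}")
```

In this toy version, the software does nothing more than compare today’s activity against the routine it has already observed; a new destination or an unusual volume simply nudges a risk score and produces an alert for a human to review.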
Is this shift in behavior out of the ordinary but harmless? Or does it indicate something more nefarious? Could she be sharing sensitive information with someone outside the company?
Susan may be copying these files to a USB drive because she was asked to provide them to someone in the company that way. If a machine were solely in charge of evaluating the situation, it might suspect intentional wrongdoing and automatically lock her out of her systems after detecting this irregular behavior. However, automatically locking down her systems would disrupt workflow and reduce productivity.
Determining the best way to protect the company while enabling vital work to continue is best left to humans who can assess context and make their best judgment call. Machine learning did its job here: it detected suspicious behavior and alerted administrators to take a closer look at Susan’s actions.
Consider this key point: the behaviors of large groups of people can be used to predict future actions because machines can view and analyze vast quantities of data. However, the past behavior of an individual is far less reliable for predicting future actions because people are dynamic. People change their minds. Sometimes they need to react to different situations and make different choices.
Machine learning does a fantastic job of highlighting areas where conditions are ripe for an insider attack. It is an invaluable tool in the ongoing fight against breaches, leaks, theft, and sabotage, but it should remain a tool that people leverage to improve their decision-making.
Unlike the steam hammer, which ultimately replaced human efforts, machine learning can’t replace the human here. No John Henrys in cybersecurity, please.