Social engineering has long been the preferred route for hackers, whether through the front door or using social media and email. So what better way to protect against the threat than with a bit of ‘social pen-testing’? Davey Winder reports.
Phishing remains a very real threat to organizations of any size. Symantec research showing a 91% increase in spear-phishing attacks from 2012 to 2013 tells us that much. But forget thinking of the threat in terms of the old Nigerian 419 or Canadian National Lottery scams. The bad guys have moved on and so must you if you’re to avoid falling victim to increasingly sophisticated and sector-specific targeted phishing attacks.
You need to start thinking about implementing a social engineering vulnerability evaluation strategy. But when is the right time?
Time for Action
Rodney Joffe, who heads the security department at Neustar and is regularly called upon as an expert to brief the US government on cyber threats, is in no doubt that organizations should start now by developing training programs. These should not only educate employees to recognize, report and resist such attacks, but also help them understand the potential impact on themselves, their jobs and the future of the company.
“The most common vector for corporation compromise over the last three to five years has been through the use of social engineering of some kind,” says Joffe. “Some of this will be automated [phishing emails] and some will be manual [targeted phone calls and personal emails], with personal information being shared to gain trust.”
If a company has not already been targeted, it’s most likely because that company simply isn’t visible online yet, according to Joffe.
Kevin Foster, testing services manager at MTI, agrees that it's no longer a matter of if you will be socially engineered but when, as all organizations hold information that is valuable to somebody.
“The size of your organization is also a factor,” Foster told Infosecurity. “With growth and multiple locations come opportunities for anonymity and social engineering attacks, with staff leaving and the risk of impersonation of ex-staff members becoming a threat.”
“[Phishing] can shock complacent staff into realizing how vulnerable to social engineering they really are” – Joe Ferrara, president and CEO, Wombat Security Technologies
OK, that's all pretty much a given, but are there any clear-cut advantages to this type of social pen-testing?
Joe Ferrara, president and CEO of security education services provider Wombat Security Technologies, sees the benefit as twofold. “Firstly, it can shock complacent staff into realizing how vulnerable to social engineering they really are, and through that keep them on their toes and improve overall security,” he says. “Secondly, it opens a valuable communications channel between users and security staff.”
In other words, it can help people understand that they can report phishing and other malicious attacks to their IT department, and creating a security conversation is never a bad thing anyway. Of course, all such testing needs to be tied to further training efforts.
Without that further training, “mock phishing can be a negative experience for the end user,” Ferrara insists. “However, when they see the teachable moment, a gentle teaching point of what they missed, they are humbled and realize they can be a victim, and thus are more receptive to more in-depth training later.”
In-house or Contractor?
Once a decision is made to implement such an evaluation, should these mock phishing tests be carried out in-house, or via an external contractor?
Ken Munro, founder of ethical hacking firm Pen Test Partners, sees pros and cons to each. As far as an internal approach goes, he cites cost effectiveness and the “highly informed” perspective from which it can be applied as advantages.
“Useful information known only to internal staff can help in fooling particular people, and if done correctly, it’s possible to fool all the people, all of the time,” he explains.
On the downside, emotional involvement by the phisher may mean the approach is lacking in its use of key language or emotional triggers. “The results of an internal test may also be skewed by personality issues or just the fact that the person running it may not be able to put on their ‘evil hat’ and try to take advantage of their colleagues,” Munro warns.
As for external testing, the pros include a detached view and the option of ‘ticking clock’ techniques known only within this niche pen-test sector. Profiling targets by key traits such as ‘amiable’ or ‘analytical’ can also make certain approaches more appealing and raise success rates. And if staff are upset at being duped, Munro says, management can always shift the blame to the pen-testing company, an independence that is hard to achieve when the testers are colleagues.
As for the cons, well, a good external company may be costly. “If you buy an off-the-shelf approach you may get the equivalent of a phishing Happy Meal,” says Munro.
So the internal approach is good for those with significant budget restrictions but shouldn't be relied upon to be wholly indicative of security posture. Still, it’s a decent place to start.
Engage Not Enrage
Whichever approach is taken, the fundamental aspects of an effective social engineering evaluation and training scheme will be much the same. But how can information security professionals and management present these in a way that engages, rather than insults, the intelligence of staff?
Scott Greaux, vice president of product management at PhishMe, reckons the most important part of any behavioral risk management program is to be open and up-front about the program goals and objectives. “This allows a dialog to occur and concerns to be addressed before any simulated phishing training takes place,” he says.
“We always steer the debrief conversation in the direction of remediation and education, rather than blame and sanctions” – Gavin Watson, Random Storm
The second most important program attribute, according to Greaux, is the inclusion of an “in-line education experience that is brief, non-punitive and focuses on a few items that the email consumer can control.”
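To make that concrete, here is a minimal sketch of the kind of ‘teachable moment’ landing page a simulated phishing link could resolve to. It is purely a hypothetical illustration, not PhishMe’s product; the log file name, port and tracking parameter are all assumptions. The click is logged for the final report, and the user sees a brief, non-punitive teaching point rather than a reprimand.

    import csv
    from datetime import datetime, timezone
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs, urlparse

    CLICK_LOG = "clicks.csv"  # assumed location of the campaign's click log

    LESSON = b"""<html><body>
    <h2>This was a simulated phishing exercise</h2>
    <p>No harm done, but a real attacker would now control your mailbox.</p>
    <ul>
      <li>Check the sender's address, not just the display name.</li>
      <li>Hover over links before clicking them.</li>
      <li>Report suspicious email to the IT security team.</li>
    </ul>
    </body></html>"""

    class TeachableMomentHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The 't' query parameter carries the per-recipient tracking token.
            token = parse_qs(urlparse(self.path).query).get("t", ["unknown"])[0]
            with open(CLICK_LOG, "a", newline="") as f:
                csv.writer(f).writerow([token, datetime.now(timezone.utc).isoformat()])
            # Serve the brief, non-punitive lesson instead of a reprimand.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(LESSON)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), TeachableMomentHandler).serve_forever()

The education is immediate, short and focused on behaviors the recipient can actually control, which is precisely the attribute Greaux describes.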
When it comes to attack specifics, MTI’s Foster offers the following environmental elements that must be maintained for an effective simulation (a minimal sketch of the supporting book-keeping follows the list):
- Information Gathering Secrecy: Information about the assessment must remain at board level to prevent details of the audit leaking.
- Business As Usual: Maintain all functions within the organization as business-as-usual (BAU). Only operating at BAU will provide a true picture of the organizational exposure and allow vulnerable points to be identified prior to the attack phase.
- Keep it Real: Security policies and traditional training courses can seem rather dry and isolated from the organization, with users feeling it will never happen to them. To counter this, it can be useful to cite real examples of successful attacks aimed at their business, or similar ones, to show what actually happened and how the compromise was achieved. In other cases, it’s more appropriate to engage the audience in a demonstration or test exercise to see whether they can detect a phishing/scam email and whether they would recognize a malicious website or attachment carrying an exploit.
- Reporting: Testers must provide a full audit of activities conducted, the methods used to achieve the aim, and the information gained at each stage of the attack.
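As a purely illustrative sketch of that reporting point (not MTI’s methodology; the target list, file names and the send_simulated_phish() stub are invented for the example), the book-keeping behind such an exercise can be as simple as issuing a unique token per lure and later pairing who was targeted with who clicked:

    import csv
    import secrets
    from datetime import datetime, timezone

    RECIPIENTS = ["alice@example.com", "bob@example.com"]  # assumed target list
    SENT_LOG, CLICK_LOG = "sent.csv", "clicks.csv"         # assumed log files

    def send_simulated_phish(address: str, token: str) -> None:
        """Stub: a real exercise would hand the lure to an approved mail relay."""
        print(f"[dry run] would send lure to {address} with link ?t={token}")

    def run_campaign() -> None:
        # Record every lure sent, with its unique tracking token and a timestamp.
        with open(SENT_LOG, "w", newline="") as f:
            writer = csv.writer(f)
            for address in RECIPIENTS:
                token = secrets.token_urlsafe(8)
                send_simulated_phish(address, token)
                writer.writerow([token, address, datetime.now(timezone.utc).isoformat()])

    def audit_report() -> None:
        # Pair targeted addresses with any recorded clicks for the final write-up.
        with open(SENT_LOG, newline="") as f:
            sent = {row[0]: row[1] for row in csv.reader(f)}
        try:
            with open(CLICK_LOG, newline="") as f:
                clicked = {row[0] for row in csv.reader(f)}
        except FileNotFoundError:
            clicked = set()
        for token, address in sent.items():
            print(f"{address}: {'CLICKED' if token in clicked else 'no response'}")

    if __name__ == "__main__":
        run_campaign()
        audit_report()

The clicks.csv file here is the same log the landing-page sketch earlier appends to; together they provide the raw material for the full audit of activities that Foster calls for.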
Risk Mitigation
Of course, there are potential risks associated with socially engineering your staff, and these require mitigation. Scott Greaux admits that with any program involving large scale communications, there are many opportunities to make poor decisions. In six years of working with hundreds of global organizations, PhishMe found that failing to inform potentially affected parties of the program – including email consumers, internal departments, and senior management – could lead to problems.
Abusing copyrighted logos and brand material in simulated exercises can also cause confusion and anger, and possibly lead to legal problems. Additionally, punishing users who fall victim – instead of providing encouragement – can fail to spur the desired behavior changes. Including only certain parts of the organization in the program can also lead to problems, says Greaux.
Tact and Training
So where does this leave us on balance? There can be no doubt that while simulated social engineering attacks are a relatively new and contentious approach, they are cost-effective and hugely beneficial if effectively managed. The last word goes to Gavin Watson, head of the Social Engineering Penetration Testing Team at Random Storm and author of Social Engineering Penetration Testing, published by Elsevier.
“Rather than simply training staff to stop people from blagging their way past reception, we get employees to work with us and, starting with the assets that they are responsible for protecting, we get them to brainstorm how they might try to get to that asset. We teach them the basics of how to socially engineer,” he tells Infosecurity.
“Then we get them involved in role-play to see how they would challenge a real attacker. This then empowers them to think in terms of the asset protection and consequences of a breach, and provides them with the skills to spot and thwart attacks, rather than just banging them over the head with a rule book and pointing the finger of blame at individuals that ‘fail’ the pen test.”
Pen testers must be aware of the emotional state of the staff being debriefed and final reports should be delivered with tact, he adds.
“We always steer the debrief conversation in the direction of remediation and education, rather than blame and sanctions,” says Watson. “It is not uncommon for clients to be shocked and angry that they were breached during the evaluation, and to react by blamestorming.”