In 2009, Air France flight 447 boarded its 228 passengers and crew, received pushback clearance from Air Traffic Control and took off from Rio de Janeiro bound for Paris. A few hours into the flight, while over the Atlantic Ocean, the aircraft disappeared. Following a two-year search the black box was recovered, and it revealed the factors that led to this fatal crash.
An investigation revealed that the aircraft had flown into a storm and ice had disabled some of its critical systems, including the airspeed sensors. The pilots took manual control of the aircraft and, owing to inaccurate speed and altitude readings combined with pilot error, the plane stalled and plummeted into the ocean.
Why am I starting an information security blog with the tragic tale of a historical plane crash? Well, both Air France and Airbus were just cleared of involuntary manslaughter in a case brought by the families of the victims. The families wept in court as the judge ruled there was insufficient evidence to hold the companies accountable.
Tackling the Blame Game
The striking thing about the aviation industry, almost uniquely among industries, is the open and transparent safety investigations that follow both crashes and near misses. The industry employs something often referred to as a "just culture". This is similar to a "no blame culture", except that accountability still applies where wilful misconduct or gross negligence is found.
The result of this is twofold. Firstly, they produce transparent and thorough post-incident reports following detailed investigations. Secondly, these reports are shared across the industry, leading to improvements and learning among operators and manufacturers around the world.
You can probably see where I am going with this. If we look at the cyber attacks observed in the last year alone, of which there have been many, I can think of only two organisations that have made any real "let's all learn from our mistakes" type of endeavour.
When I run cyber crisis exercises for clients there is one thing that strikes me time and time again. If we weren’t all reinventing the exact same wheel, we would be much further along in our security journey.
Aviation safety in particular has a very public need to adopt this "just culture" and open, industry-wide reporting.
A cyber breach isn't likely to put hundreds of human lives at risk, there won't be photos of debris gracing the front pages, and yes, breaches are much easier to cover up. But that tempting option to conceal all or most of what happened, especially our own mistakes, is significantly hindering our progress.
We really are all in the same boat and are constantly reminded that "it's not if but when", so why are we so reluctant to share more openly what we have learned? The sad reality of this veil of secrecy is that cyber attackers haven't had to innovate much over the years: we are still discussing the same issues on conference panels, and every week we see another big company get hit.
The lessons we should be learning from Air France flight 447 and other aviation safety incidents are:
- Don’t punish innocent mistakes; address the culture that led to them, and learn from them.
- Do a full, detailed post-incident report.
- Share those lessons with your fellow CISOs and security professionals. Not the edited, curated lessons, but the ACTUAL lessons. Warts and all.
Cyber is a unique challenge, but I for one am a firm believer that we should be cherry-picking the best parts of other industries and adapting them to our own battleground.