On the third and final day of Infosecurity Europe 2022, Ian Hill, director of cyber security at BGL Insurance, hosted a roundtable discussion on disinformation warfare, exploring the relationship between subjective truth and objective fact in the context of fake news. Participants discussed the role that psychology, social media and culture play in how information is contextualized and interpreted, as well as the growing need for AI-based cybersecurity techniques and technologies to detect and mitigate such risks.
Hill opened the discussion by telling the room that we’re living in a content-heavy world: the combination of social media and email means we must navigate an information-dense digital space unprecedented in human history. He used a number of case studies to illustrate the scale of the disinformation problem and how increasingly difficult it has become to distinguish fact from fiction, particularly with the advent of deepfakes and fake news.
One such case study was the recent UK “fuel shortage.” Hill noted that there was, in fact, no petrol shortage; the overwhelming demand at the pumps was driven by social media and disinformation, underlining the real-world consequences fake news can have. Hill then presented a framework for “information pollution” that divides it into three categories: misinformation, disinformation and mal-information.
Misinformation is false information spread without intent to cause harm; disinformation is false information designed to manipulate and damage individuals or organizations; and mal-information carries substantial “malicious intent,” referring to information that stems from the truth but is exaggerated or framed in a way that misleads and causes harm. Hill also addressed why people engage in “information pollution” and the spreading of disinformation, suggesting a variety of motivations, including seeking approval and admiration, elevating one’s importance and gaining or exercising power or wealth.
The roundtable then turned to an often-overlooked form of information pollution: information that is out of date or has been superseded, a problem compounded by the internet’s tendency to archive everything. Hill also examined the psychology behind the spread of misinformation, giving an overview of cognitive biases common to humans, including implicit bias, omission bias and confirmation bias. He placed these biases in the context of “astroturfing,” a deceptive practice in which a powerful actor makes a message or viewpoint appear to have organic, grassroots support.
Participants from the audience contributed nuanced and counterbalancing views, noting that while disinformation is a significant problem, it is not new. They argued that the internet and social media are merely tools and that it is imperative we keep a sense of proportionality. They also urged caution against over-regulation, weighing the need for freedom against the need for security, and stressed that the “guardians” of information dissemination must themselves be guarded and held in check.
The session closed by reasserting the need for an online information environment that is neither too open nor too closed, and by reiterating that disinformation is a weapon on the new battleground that is the internet.