The growth and weaponisation of disinformation across the globe have the potential to destabilise the global economy and topple institutions that serve as the framework for society.  

Disinformation was once the near-exclusive province of nation-states – both the Brexit campaign and the 2016 US elections were plagued with accusations of misinformation and fake news. However, disinformation campaigns are escalating as a problem for corporations, driving new forms of cybersecurity threats – those that aren’t just technical, but psychological.  

There’s no shortage of case studies to point to, from the false accusations of child trafficking levelled at Wayfair, to the allegations that Victoria’s Secret marketed lingerie to pre-teens, to the death of a BT Group employee linked to conspiracy theories alleging that 5G wireless networks were being used to spread COVID-19. For businesses, this is a clarion call to take the lead – not simply to protect their own reputations, but to steward and uphold an increasingly threatened information environment. 

There are two statistics that help demonstrate the scale of the problem. It takes fact-based information about six times longer to reach 1,500 people than it takes false information. And Edelman’s recent Trust Barometer showed that 46% of adults in the UK have poor information hygiene – they neither verify the sources of their news nor check information before sharing it. Social media also allows disinformation to spread at an exponential rate. Just last month, Belgian visual artist Chris Ume revealed how his deepfakes of Tom Cruise, which went viral on TikTok, were able to evade disinformation-detection tools. 

It’s therefore not surprising that the same report found that seven in ten global executives at large organisations worry about false information being used as a weapon, and are concerned that new technologies will make it impossible for the average person to know whether what they are seeing or hearing in the media is real. Their concerns are valid.   

Six in 10 people don’t trust journalists, government leaders, or business leaders, worrying that they are purposely trying to mislead people and contaminate the media. This global “infodemic” has driven trust in all news sources to record lows, with social media (35%) and owned media (41%) the least trusted, and traditional media (53%) seeing the largest year-over-year drop in trust.  

Disinformation breeds distrust and has harmful effects on society as a whole. When catastrophes strike, society’s faith in traditional institutions is what encourages us to come together for the public good and take collective action. Disinformation undermines that public trust, leaving a dangerous mistrust of science amid a global pandemic and an escalating climate emergency.  

Clearly this issue is larger than any one nation or private company. But as Edelman’s Trust Barometer tells us, business is viewed globally as the most competent institution to solve societal problems, and it’s increasingly viewed as having an ethical responsibility to do so. It’s now time for the private sector to acknowledge and take on the responsibility of combating disinformation, not just as a reputational risk for corporations, but also as a social harm.   

So, what must business do to become part of the solution? To recognise where the private sector can be effective in combating disinformation, it’s important to understand that disinformation campaigns are often coordinated by sophisticated actors. Like any organised structure, their behaviours often fall into repeatable patterns that provide us with an opportunity to disrupt their efforts. But to accomplish this task, the private sector has to leverage tools and strategies that are as flexible and inventive as those they seek to undo: 

Understand the threat. To succeed in combating disinformation, corporations must first identify and unmask the operations and motives behind these campaigns, plotting out what they seek to achieve and how they are likely to spread. Bad actors convene in the dark corners of the web, on fringe and encrypted platforms, to workshop potential targets and disinformation narratives. These activities are currently undetectable by the vast majority of traditional online and social media monitoring tools. Once a target and preferred narrative are set, disinformation operatives exploit the behavioural science phenomenon known as “confirmation bias”, targeting communities already predisposed to believe elements of the campaign. They seed content in communities on fringe platforms like 8kun and 4chan, whose members often add their own disinformation spin before sharing it on more mainstream platforms, such as Twitter and Facebook. Once the disinformation reaches mainstream newsfeeds, bad actors know that the behavioural science phenomenon known as the “illusory truth effect” will kick in: as people see a claim repeated and shared by multiple users on social media, they become more likely to believe there is at least some truth to the conspiracy.  

Avoid “debunking” tools. Many tools promise to “debunk” disinformation, but their effect is limited. Academic studies show that once a user has been exposed to disinformation, it is virtually impossible to dislodge from their mind. The current tools widely marketed to the private sector therefore don’t work.  

Inoculate. Emerging experts in the field of behavioural science have built an impressive body of evidence supporting the theory that users can actually be “inoculated” against disinformation. Whereas medical inoculation introduces a weakened virus to trigger protective responses and build antibodies, disinformation inoculation presents users with intervention messaging before disinformation hits their newsfeeds. Leading minds in the field, like Dr. Sander Van der Linden of Cambridge University, have produced results showing this method can trigger protective responses, such as enhanced critical thinking, which may even persist over the longer term. To achieve this, companies must deploy intervention messaging that empowers their audience to make informed decisions about what information is real or manipulated. 

However, to truly succeed, this effort must be the beginning of a larger collaboration. The threat is too great to operate in silos. It’s imperative that tools and strategies evolve across the globe, that key learnings are shared, and that business works closely with the academic community and public sector to confront this crisis. The private sector has proven its ability to meet new challenges, like developing a vaccine for COVID-19 and landing a rover on Mars. The business case for defending against disinformation attacks is clear, but the moral imperative must be our north star.  

Watch the full recording from Disinformation: A Behavioural Science Approach below: