Parallels are often drawn between the existential threat posed by nuclear weapons and the arrival of Artificial Intelligence (AI). On the release of Oppenheimer, director Christopher Nolan himself noted how recent calls for restraint on AI development mirrored those from the father of the atomic bomb.
In the post-WWII era, institutional attempts to control the development of nuclear weapons stalled, leading to a race between the great powers of the day.
We’ve witnessed the emergence of AI over many years, but it is only after its recent rapid advance that many of us have started to comprehend its immense potential power.
To harness that power, regulate effectively, and direct it as a force for good, strong institutions are critical.
The picture today is troubling. The 2024 Edelman Trust Barometer shows that the public has little confidence in the current institutional response to AI or other key areas of innovation.
In nearly all countries in our 28-market study, people are more likely to believe innovation is mismanaged rather than well managed. Globally, they think that by a margin of two to one.
This is driving a profound sense of alienation and dislocation too.
Institutional mismanagement of innovation is making many feel left behind by technology. On a deeper level it is feeding societal divisions and, in a year when nearly half the world will go to the polls, it is getting sucked into partisan party politics too. Institutions have a major communications challenge on their hands.
So, what can be done? The data points to five key actions:
First, people are more willing to embrace innovation if it is vetted by scientists and ethicists, and well regulated. Institutions need to ensure they have these independent third-party experts giving their stamp of approval where there are emerging ethical concerns.
Second, while three quarters of people (77%) believe that scientists should be leading the charge on the implementation of innovation, these experts are currently falling short in the public’s eyes. Fifty-three percent believe science has become too politicised and nearly half (45%) believe scientists don’t know how to communicate with them.
Effective storytelling is vital: it must engage and explain the science in relatable ways, tread carefully to avoid stepping into political culture-war territory, and mobilise peer-to-peer voices to advocate on scientists’ behalf.
Third, there needs to be a more sophisticated understanding of how people are now seeking out and consuming information. People are most likely to turn to digital channels for information about new technologies and innovation.
This demands that institutions are alive to the threats of mis- and disinformation and educate people on how to practise good information hygiene. More than that, institutions need to speak to audiences on the platforms people are turning to, not just through the channels that feel most convenient or where they have muscle memory.
Fourth, people want institutions to hear their concerns and let them ask questions. They need engagement and dialogue, not silence or one-way broadcast. Honesty about the pluses and minuses, and the inherent trade-offs, is vital to unlocking trust and an embrace of innovation.
Greater institutional engagement on the promise and peril of AI and other technologies can help set the pace of change. To increase acceptance, people want to be shown how innovation will bring a better future. Credibility is key.
Fifth, business cannot go it alone. Sixty percent of people say that if business partners with government, they will trust it more with technology-led changes. This is up by fifteen percentage points since 2015. Public-private partnership involves two institutions that sometimes talk different languages and feel miles apart, but they need to get much better at finding a common tongue and moving at the same pace.
The UK AI Safety Summit was a good example of how this can take place across local and global domains. But even here representation for the startups driving the next wave of innovation was limited and civil society groups also noted the absence of a voice for affected workers.
“Now I am become Death, the destroyer of worlds” is a quote often attributed to Oppenheimer after he witnessed the first detonation of the bomb. It is in fact a line from the Bhagavad Gita. As menacing as it sounds, it was a mistranslation of the Hindu scripture and of the words of Vishnu, the God of Preservation.
The true meaning is far more complex, symbolising humankind’s need to put our faith in the divine. To restore trust in the promise of innovation, we need institutions that can powerfully articulate a vision of the future we can all believe in.
As world leaders debate the progress of Artificial Intelligence, Sat Dayal and Nick Hope share their commentary on the 2024 Edelman Trust Barometer and the Perils of Innovation.