Research from the Oxford Internet Institute is revealing the ways in which government agencies and political parties around the world use digital media to manipulate public opinion.
There was a time when the communication strategies employed by democratic governments and those used by authoritarian regimes were easy to distinguish from one another. Today, though, things aren't quite so straightforward. 'I started out studying democracies, then I studied autocracies and now I'm back at democracies, and that's in part because a surprising number of communicative tricks jump from regime type to regime type,' says Professor Phil Howard, Director of the Oxford Internet Institute (OII).
'In a growing number of countries around the world, very deliberate efforts are being made to mislead and throw public conversations off the rails.' – Professor Phil Howard
Around the world, government agencies and political parties are using the Internet to spread misinformation, exercise censorship and undermine trust in democracy. According to a recent OII study, the problem is growing rapidly, and now constitutes a critical threat to public life. But although the role of digital media as a tool for political manipulation has recently come under increased scrutiny, it is by no means a new phenomenon.
It was in the run-up to the US presidential election of 2000 that technology experts first began using the Internet to influence public opinion, says Professor Howard. These experts, often industry lobbyists, would set up bogus campaigns in order to create the illusion of widespread public support. 'Fake lobby groups had been around for many years, but it was really in the late 1990s that they started taking people's trust in their own social networks online and abusing it for political means,' he notes.
This practice became all the more powerful following the creation of Facebook and Twitter in the mid-2000s. 'Some of the accounts that we catch here at the OII were started only a few months after these companies went live,' says Professor Howard. 'If political campaign consultants don't have guidance about what constitutes ethical behaviour, they will very quickly turn new technologies and platforms to the service of a lobbyist or political interest.'
Following the Brexit referendum and the 2016 US presidential election, technology companies have come under increased pressure to combat the spread of junk news, with efforts focusing on the identification of fake accounts and the removal of the very worst of the content from their platforms. For countries facing upcoming elections, such as South Africa, Guatemala and India, these solutions cannot be implemented quickly enough.
'These elections are going to be very important for how democracy grows globally,' says Professor Howard, who will soon begin investigating the way in which tech companies are addressing these problems in the Global South. 'India has 40 or 50 different functioning languages that people vote in. Will Facebook really have the capacity – either in software or personnel – to catch an organised misinformation campaign there? I doubt it.'
So, how do you begin to tackle a problem on this scale? According to Professor Howard, the answer lies in more social media, not less. 'There are lots of political science theories that tell us that democracy works well in small groups, and that the larger the group gets, the harder it becomes to signal preferences and clearly express opinions,' he explains. 'Social media platforms already allow small groups to deliberate, share evidence and sometimes come to a consensus on a particular issue, so there may be ways to build that into policymaking.'
'These days, elections are all about data.' – Professor Phil Howard
Professor Howard points to instances of single cities, small states and high-trust organisations using groups of citizens to produce voters' guides. Such groups come together to fact-check the claims being made on both sides of the debate, producing a balanced document that can then be used to inform decision-making. It's an approach he feels could work well in a number of countries around the world.
Governments, of course, cannot be forced to adopt strategies like this, but by working closely with policymakers, practitioners and business leaders, the OII is uniquely positioned to help shape the debate.
Looking ahead, Professor Howard says that innovations in artificial intelligence, big data and blockchain are going to have a significant impact on our political, economic and cultural institutions. 'At the OII, we investigate how to get the best out of such innovations, and how to do so ethically and equitably.'
The OII was founded in 2001 following a gift of £10 million from Dame Stephanie Shirley, and is currently fundraising to endow its programmes and secure a dedicated building.