Australians could be targeted by online persuasion or misinformation campaigns at the upcoming federal election, according to a new report commissioned by the Department of Defence.
An interdisciplinary team of researchers from five universities found that operations similar to Steve Bannon’s Cambridge Analytica and Russia’s Internet Research Agency, also known as the “troll factory”, could readily influence Australians’ political opinions through social media.
Online marketing expert Dr Stephanie Meek from Edith Cowan University (ECU) contributed to the report.
She said these kinds of techniques had long been used by advertisers, but were now also being deployed by well-funded political machines with sometimes nefarious goals.
“It’s getting the most relevant message to the target without them knowing about it that is so harmful,” Dr Meek told The New Daily.
Australia is vulnerable because of how much data people share about themselves online, and on social media in particular.
Even after this information was used for political gain in the US and elsewhere, Australians haven’t changed their behaviour, and the government hasn’t stepped in.
“All of this stuff is still available – it can still be used,” Dr Meek said.
What was Cambridge Analytica?
Cambridge Analytica was a British political consultancy firm run by former Trump aide Steve Bannon, among others, and backed by right-wing billionaire Robert Mercer.
The Australian researchers found that the company combined traditional and “quasi-experimental” digital techniques with information obtained through “illegal data harvesting”.
In 2018, the company made international headlines when it was exposed for accessing the data of more than 80 million Facebook users via a third-party app.
During the 2016 US presidential election, Cambridge Analytica backed the Trump campaign by delivering bespoke messages to potential voters based on their social media activity.
“When you think about the fact that Donald Trump lost the popular vote by three million votes but won the electoral college vote, that’s down to the data and the research,” the company’s head of data Alexander Taylor said at the time.
In the Caribbean nation of Trinidad and Tobago, Cambridge Analytica encouraged swathes of young people not to vote in order to hand the election to its client.
The company worked on more than 100 election campaigns in more than 30 countries on almost every continent – except Australia.
Although Cambridge Analytica no longer exists under that name, Dr Meek said plenty of companies still specialise in those kinds of political campaigns and are willing to work with whoever will pay.
“I don’t think that the general public in Australia are as aware as they need to be of the capabilities of some organisations to manipulate our behaviour,” she said.
“I don’t think the general public are aware of what’s going on around the world and how that could affect us here in Australia.”
The internet is an integral part of everyday life and should not be dismissed as a medium for political influence or even radicalisation.
“Online radicalisation is nothing new, but COVID-19 sent it into overdrive. Isolated individuals spent more time online, exposed to extremist messaging, misinformation and conspiracy theories,” ASIO director-general Mike Burgess said in a speech last week.
“Social media platforms, chat rooms and algorithms are designed to join up people who share the same views, and push them material they will ‘like’.
“It’s like being in an echo chamber where the echo gets louder and louder, generating cycles of exposure and reinforcement.”
Tackling the problem
ECU’s Securing Digital Futures director Tony Marceddo said the report’s findings “are a timely reminder of the importance of influence and the need for Australian governments to put in place strategies to counter these threats”.
The team of researchers suggested several ways Australia can fight back against targeted persuasion or misinformation campaigns at upcoming elections.
One way is to regulate the use of these persuasive technologies with a code of practice.
Another is to implement stringent data-collecting regulations to prevent users’ information from being illegally harvested by political machines.
Finally, the researchers called on the Department of Defence to devise a set of metrics to monitor the information (and misinformation) environment online.