
Can we eliminate political disinformation?

– According to a recent Oxford University report, political parties in at least 61 countries have used disinformation to influence public opinion.

– Unless we eliminate the incentives that push political actors to use disinformation, it will not disappear and will more often than not outsmart any defensive strategy to tame its effects.

– Codes of Conduct can help parties unite against disinformation and point fingers, in unison, at those using it.

– Awareness campaigns should also inform the population about the risks of disinformation and explain exactly how parties are using it and with what intention.

– The EU's upcoming Digital Services Act will be a paradigm shift by clarifying who is responsible for spotting and taking down disinformation from social networks.

Addressing the effects of disinformation on our democracies shares many similarities with how pandemics are tackled: you can impose measures to contain it, you can calculate the social cost the population can bear, you can increase your knowledge about it and you can develop better treatments. Yet nothing beats eliminating the virus. While most of the world is hunkered down, Taiwan, which has so far eliminated the COVID-19 virus, enjoys life as usual. When it comes to disinformation, we have taken all the classic defensive approaches: we have developed better methods to understand and track disinformation, advanced legislative and regulatory solutions to attempt to curb it, and improved our ability to debunk it. We have even made our democracies a bit more resilient to disinformation through public awareness and increased transparency, writes Alberto Fernandez Gibaja.

At the EU level, the upcoming Digital Services Act will be a paradigm shift by clarifying who is responsible for spotting and taking down disinformation from social networks and by increasing the transparency and collaboration requirements for social media platforms. But still, we need to face the grim reality: unless we work towards eliminating the incentives and windows of opportunity that push political actors to use disinformation, it will not disappear and will more often than not outsmart any defensive strategy to tame its effects.

Manipulating public opinion with falsehoods and other opinion-influence techniques, often referred to with the overarching label of disinformation, is nothing new. During the Cold War it was used widely by the Soviet Union to cover up disasters, such as Chernobyl, or to try to discredit democratic movements. What is new(-ish) is its capacity to go viral and spread like wildfire. No matter how good defensive measures are, if the incentives and opportunities to use disinformation are there, political actors will keep trying to use diverse techniques to manipulate public opinion. This is why we need to root out disinformation at its source.

Why would a democratic political party, candidate or interest group get its hands dirty spreading disinformation? Because they can, the incentives are great and the consequences of getting caught seem minimal. In the 2019 elections in the UK, disinformation of all types was spread directly from political parties' accounts, including a doctored video shared by the Conservatives from official accounts. In Spain, both Twitter and Facebook took down hundreds of social media accounts linked to political parties on two occasions. According to the University of Oxford, political parties in at least 61 countries have used disinformation, including during the US presidential campaign and the Tunisian elections.

Therefore, it is these incentives that democratic institutions, political parties and platform owners need to focus on to ensure disinformation does not infect our democracies. To do so, a carrot-and-stick strategy should be used, combining legislative and non-legislative approaches. We need to assemble trust-building measures. Initiatives like the one taken by the Ministry of the Interior and Kingdom Relations in the Netherlands, together with International IDEA, to develop with political parties a Code of Conduct on online advertising point in the right direction. Codes of Conduct can help parties unite against disinformation and point fingers, in unison, at those using it. Platform owners can support these Codes of Conduct by highlighting when a party has breached them and also when parties are adhering to them.

There is a wide array of possibilities to support political parties in developing clean campaigns online, from deploying features that improve parties' ability to reach their audiences, to providing candidates with subsidized space on their platforms. Codes of Conduct can also be expanded to society as a whole, as was the case in Panama during the last elections, seeking a society-wide effort to root out disinformation.

Awareness campaigns should also inform the population about the risks of disinformation and explain exactly how parties are using it and with what intention. The role of journalism and the mass media is paramount. Newspapers and news agencies should incentivize healthy debate among candidates and avoid amplifying disinformation, even unintentionally.

Using disinformation should also become more difficult for parties and candidates. Two measures can help. First, adapting political finance regulations to include what parties spend online, and giving regulatory agencies the capacity and mandate to audit those expenditures. Parties spend lavishly on disinformation, yet money spent online is extremely difficult to track and opens several opportunities to bypass political finance legislation. Parties can make payments from abroad to a foreign agency and still run a national campaign in the country. Parties and candidates may disclose a small fraction of the money spent on advertisements, but they do not disclose how much they have spent on spreading disinformation, maintaining troll farms, buying bots or pushing astroturfing online.

A second measure relates to the industry that has flourished thanks to disinformation. Parties and candidates tend to use digital marketing agencies to mount their operations, sometimes even in other countries. Increased oversight measures should be put in place to make sure these digital marketing agencies stick to the rules. Many steps could be taken in this direction, ranging from requirements to disclose which agencies work with each party to subjecting these agencies to digital audits.

Like viruses, disinformation adapts, mutates and becomes more capable every day of evading any measure created inside the host organism, in this case social media networks, to get rid of it. The massive push against disinformation that has taken place since 2016 has been effective to a certain extent. Yet, even if we are more capable of detecting disinformation and fighting it, it keeps appearing in our newsfeeds, private WhatsApp groups or Facebook groups. If we want to root out disinformation from democratic political debate, we need to look at its source and reduce as much as possible the incentives for parties and candidates, and other interest groups, to use it.

Source: EU reporter
