When social media and other online tools were introduced to society, they were hailed as technologies that would connect people, enable freedom of speech and improve access to information. Although some of that promise has been realised, we are also witnessing the spread of misinformation. Many will remember Bell Pottinger and how it used social media to spread disinformation. Currently, we are witnessing the spread of misinformation around vaccines, and we are likely to see more as South Africa enters the election season.

These factors have given rise to online misinformation fighters. Earlier this month Phumzile Van Damme, a former member of Parliament and of the Democratic Alliance, together with other organisations kickstarted a Local Government Election Anti-Disinformation Project. The organisers describe the effort as a collaborative project against disinformation and misinformation. The project's partners are Phumzile Van Damme, Right2Know, Code For Africa, Superlinear, Dr David Rosenstein and WITNESS. The project will focus on: disinformation monitoring and combatting, with an emphasis on online political discourse and messaging emanating from political parties and government; advocacy focused on Big Tech and PR firms; the use of video technology to expose human rights abuses and combat disinformation; and behavioural science aimed at understanding the believability of disinformation in South Africa.
Phumzile Van Damme is no stranger to the fight against misinformation: her efforts contributed to bringing down the world-renowned Bell Pottinger. She is not alone in this fight. Another significant player is the Centre for Analytics and Behavioural Change (CABC), an entity incubated within the University of Cape Town's Allan Gray Centre for Values-Based Leadership. The alarming increase in fake news, hate speech, mis- and disinformation campaigns, trolling and other negative influences designed to confuse, distract and divide society led to the founding of the CABC. According to its co-founder Stuart Jones, the centre has about 40 team members, including journalists, academics, psychologists, criminologists and sustainability experts. The centre is backed and funded by the Millennium Trust, which also funds many organisations involved in South African societal issues, among them the Ahmed Kathrada Foundation, the AmaBhungane Centre for Investigative Journalism, Daily Maverick, the Centre for the Advancement of the Constitution, Corruption Watch and Freedom Under Law.
The Millennium Trust was established in 2010 by the Capitec founder, Dr Michiel Le Roux, with the aim of addressing some of South Africa's pressing issues. The trust has done this by supporting the organisations mentioned above, including the Centre for Analytics and Behavioural Change.
The centre seeks to use technological and social-science means to address the misinformation challenge. The picture that emerges is that the misinformation challenge in South Africa is attracting very influential people and organisations. Phumzile Van Damme left her parliamentary seat to focus on this issue, and the significant money and resources deployed by the Centre for Analytics and Behavioural Change suggest that the problem is receiving the serious attention it deserves.
The fight against misinformation will not be an easy one. According to Phumzile Van Damme, when she tried to get Facebook to account to the South African Parliament for misinformation on its platform, she was blocked both by her former political party's leaders and by an organisation that lobbies for Facebook. It is also a fight that will require a significant amount of money, which is probably part of the reason a trust with the backing of one of the richest South Africans, Michiel Le Roux, is involved.
When one considers the effects of misinformation, such as vaccine hesitancy and the recent unrest, there is no doubt that it must be fought with every tool available. One of those tools is fact-checking. Global experience has shown that fact-checking can be a powerful weapon against online falsehoods. It is also important to understand that it can become a means of censorship if not only facts but also opinions and narratives are checked.
Used correctly, fact-checking contests falsehoods in ways that complement free speech. But free speech is about letting people be wrong as well as right. The checking must therefore be limited to facts, which is tricky enough, and not extended to opinions that the checkers happen to dislike.
When it comes to partisan fact-checking of complex issues, which describes much of the fact-checking that takes place around political news, the "truth" as stated is often the subjective opinion of people with shared political views. This is something South African misinformation fighters will have to guard against. It is very easy to assemble people with the same worldview, and to attract backers who share it. The result is bias in how the fact-checking process is undertaken, which can raise questions about the real motives of misinformation projects.
One path to a solution is "adversarial fact-checking". Fact-checking is often done by teams of two or more journalists rather than by a single person. Political, health and other claims should continue to be aggressively fact-checked, but by teams of individuals with diverse sociopolitical views; for example, by pairing fact-checkers from entities with different worldviews. This would add little, if any, cost. The misinformation-fighting community should abandon fact-checkers' pretence of objectivity and political disinterest and instead acknowledge and declare their sociopolitical leanings. This will be especially important now that part of the focus will be on fact-checking election information. Research underscores that fact-checkers' personal biases influence both their choice of which statements to analyse and their determination of accuracy. Let diverse fact-checkers work as members of an adversarial team, much like two sides in arbitration. Fact-checkers are human beings who live in the real world, not in a sociopolitical monastery. There is a need to abandon the pretence of objectivity and design a system of adversarial fact-checking that places the evidence for competing claims front and centre.
When adversarial fact-checking leads to unresolvable disagreements among team members, readers will be better able to judge how persuasive each side's argument is and to arrive at a more informed conclusion than if only one side's evidence were presented. The misinformation fighters are on the right track; to win, they will need more resources and support, not just from entities that share their worldview but from a diverse pool of supporters and neutral parties. In the long run, society will need an artificial-intelligence fact-checking tool, built by a diverse community, to check the pool of information circulating in society.