The right to be wrong

Does the right to freedom of speech include the right to be wrong?

There is a touch of irony in one of the defining characteristics of the Information Age being #FakeNews, and in recent years the impacts and origins of #FakeNews have become the subject of much research. A Council of Europe report describes the term as “woefully inadequate to describe the complex phenomena of information pollution”, noting that because it has been “appropriated by politicians around the world to describe news organisations whose coverage they find disagreeable”, it is “becoming a mechanism by which the powerful can clamp down upon, restrict, undermine and circumvent the free press”. The report opts instead for the term Information Disorder, and uses falseness and harm to divide information into three sub-categories:

  • Mis-information: When false information is shared, but no harm is meant
  • Dis-information: When false information is shared with the intention to cause harm
  • Mal-information: When genuine information is shared with the intention to cause harm, typically by publicly sharing information meant to stay private

Tackling information disorders is a complex and multifaceted problem for institutions across the scientific establishment. These disorders range from fringe movements such as the “9/11 truthers”, who reject the analyses of engineers, materials scientists, and demolition experts on the collapse of the World Trade Centre towers (#JetFuelCantMeltSteelBeams), to better-known anti-science movements such as the Flat Earthers, anti-vaxxers, and the anti-GMO movement. For the most part I don’t think we as a society take these movements, and information disorders as a whole, as seriously as we should. However, as the COVID-19 pandemic grips the world, the narrative around information disorders has shifted. Multiple countries, including South Africa, have criminalised the spreading of false information about the disease, and this has raised the question of whether our right to freedom of speech includes the right to be factually incorrect.

As scientists we can get lulled into thinking those who perpetuate information disorders are an isolated group with whom we have little contact, but this is not the case. Recently a family member of mine posted a lengthy status on Facebook relating to the COVID-19 pandemic. The post claimed that the virus was a plot to destabilise the capitalist economies of the Global North, that communist countries had not been affected by the virus, and that the Chinese government already had a cure that was being hidden from the world but used to treat their own citizens. This post was one of the many I had seen discussed online, dripping with racist, sinophobic, and anti-science rhetoric. The only difference was that this was not an abstract example on someone else’s timeline, but a very real post by a healthcare professional I knew personally. I wholeheartedly believe that we all have an obligation to tackle information disorders wherever possible, particularly when they are perpetuated by those close to us.

In our exchange, the poster (Person X) justified their sinophobia by referring to the reported brutalities of the Chinese government and the cultural differences in animal consumption between Western and Eastern societies. However, this is an example of both othering and whataboutism that only serves to divert attention from the racism and sinophobia under question. As Jonathan Kolby states in “Coronavirus, pangolins and racism: Why conservationism and prejudice shouldn’t mix”, “environmentalism and conservationism are noble and vital pursuits” but “dialogues about coronavirus should not allow the topic of wildlife conservation to provide a smokescreen for prejudice”. Gerald Roche gives a superb discussion of the wider societal effects of this in “The Epidemiology of Sinophobia”, but this is not what I want to focus on for this post.

On top of trying to justify their prejudice, Person X posted follow-up comments with information that was more scientifically sound but in direct contradiction to the original post. Person X justified this by saying that they were not an expert in this field, that the original post was copied and pasted from an unknown source, and that their intention was to present information from both sides. The interaction between Person X and myself ended with the following comment, after I questioned why, in the midst of a pandemic, a medical professional would choose to share false information that could easily have been verified before posting:

Person Y came to the defence of Person X stating that we are all entitled to our own opinions regarding the virus, and that they are “not really phased what that is” but “when [I] sit behind [my] phone or laptop and comment away while people are putting their lives at risks and in the trenches fighting all over the world, just be careful what [I] say. If [I] know everything about the “claims” then [they] would recommend going to assist with fighting this virus, fruit salts.”

The sentiments of both these people raise three questions:

  1. Does sharing information from multiple sources in an effort to present all sides of a story make one guilty of contributing to an information disorder if the information is factually incorrect?
  2. Is everyone entitled to an opinion, no matter how false it is?
  3. By virtue of their work, are frontline workers above criticism for their opinions?

To answer these questions we need to look with both a scientific and a legal mind, which is a unique opportunity the COVID-19 pandemic provides us. At the time of writing, at least eight people had been arrested for violating COVID-19 Disaster Management Regulation 11(5). This regulation states that “any person who publishes any statement through any medium, including social media, with the intention to deceive any other person about a) COVID-19, b) COVID-19 infection status of any person, or c) any measure taken by the Government to address COVID-19 commits an offence and is liable on conviction to a fine or imprisonment for a period not exceeding six months, or both such fine and imprisonment”. What is important here is how intent is defined. In this instance the legal definition of intent covers not just a person meaning to deceive others by sharing false information (dolus directus), but also a person seeing the possibility of others being deceived before sharing the false information (dolus eventualis), or a person genuinely believing the shared false information to be true themselves (dolus indirectus). These three legal definitions of intent map closely onto the three categories of information disorder.

This legislation places a responsibility on all of us to check the validity of the information we are spreading, and I hope that going forward this experience places this responsibility at the forefront of our collective conscience. If we do not know enough, or are not willing to critically evaluate information that is presented to us, a safer option is to not perpetuate the information at all. In The Salmon of Doubt Douglas Adams penned, “All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others”. A central tenet of a democracy is the right to hold an opinion, but these opinions should not be shielded from criticism and debate. In fact, one of the health indicators of a democracy is the quality of its debates. In doing so, however, we must critically assess which opinions are worth debating, discarding those not founded upon evidence instead of debating for the sake of debate.

I believe the desire to share all sides of a story is a result of how the media has approached presenting complex stories in the past. All too often we see panels consisting of experts such as medical virologists and meteorologists seated alongside non-experts such as anti-vaxxers and climate-change denialists, for the sake of “balance”. This gives a false legitimacy to the side whose opinion is not supported by scientific evidence. I advocate for deplatforming people who hold opinions and beliefs that go against established scientific theory, such as the safety of vaccines, the effectiveness of genetic modification as a breeding tool, or the impacts of anthropogenic climate change. This is not to say that I don’t believe these complex topics have nuance that needs to be unpacked and debated, but rather that we would make better use of our time debating that nuance amongst experts than with those who reject reality. This includes “front-line” workers of all types.

I believe the responsibility of ensuring your opinion is backed by facts is heightened when you are in a position of perceived authority as a front-line worker. In this instance I regard anyone who works directly with the public as a “front-line” worker, as these professions have the largest influence on public opinion. In the case of the COVID-19 pandemic, when front-line workers such as nurses and doctors share false information, they undermine the work and credibility of the entire industry that supports them. This includes everyone from the virologists working on understanding the virus, to the research groups working on developing a vaccine, to the medical researchers developing treatment protocols, to the government agencies trying to coordinate disaster relief efforts and reduce the spread. In perpetuating false information, these front-line workers reduce public support for these highly coordinated efforts, eroding the public’s trust in the scientific establishment, increasing tensions both locally and globally, and ultimately costing us lives as the public becomes less likely to follow guidelines aimed at reducing the spread of the virus.

Too many lives have already been lost to information disorders surrounding life-saving technologies such as vaccines and biofortified GM crops. If there is only a single positive thing to come out of the COVID pandemic, I truly hope that it is us as a society taking the threat of information disorders more seriously.

Richard Hay


Fallacies and freedom of expression in Science

Can ignorance be cured?

The Coronavirus (COVID-19) pandemic has caused global panic, and unfortunately, during these trying times, many people have suddenly become ‘medical experts’. It was alarming when I received a WhatsApp message claiming that the virus ‘can be cured by a bowl of freshly boiled garlic water’. The panic set in: how many people received this message? How many people believed it? How many lives are at risk because of it?

The results of spreading fake news can be catastrophic. They can lead to deaths that could have been prevented had people received scientifically correct information. The World Health Organisation (WHO) reported that the vast majority of coronavirus information shared across social media comes from fake news sites. This fake news ranges from conspiracy theories about where the virus originated to healing remedies. It was unsettling to see prominent leaders also prematurely announce treatments that had not been approved by medical specialists. The repercussions of this misinformation have even sparked racial discrimination and led to shortages of Plaquenil, which is used to treat malaria.

I hope that as a community we can work together to spread scientifically correct news and save lives! NewsGuard has created the Coronavirus Misinformation Tracking Center, which keeps a record of fake news websites. In South Africa, it is best to read updates and health-related advice directly from the government’s website.

In the age of the fourth industrial revolution (4IR), people have become more susceptible to fake news. Old wives’ tales are no longer just the beliefs of small groups but are spread widely through the internet. Unfortunately, science is not exempt from fake news. An article posted by Psychology Today mentions that people fall for fake science news based on their individual ability to recognise misinformation, group beliefs, and societal factors. Simply confronting these individual and societal beliefs with facts does not help much; rather, research shows that evidence-based arguments are most likely to curb these beliefs.

The most popular myth in astronomy has to be the belief in astrology; even print media cashes in on this one. People strongly believe that star signs directly affect their moods, personality, finances, and love life. A study carried out at the University of Arizona showed that 78% of 10,000 students believed that astrology is ‘sort of science’. A shocking 48% of students from the science faculty also believed that astrology is science-based. Contrary to popular belief, star signs are simply groups of stars that appear in the sky at a particular time of the year; they have no effect on any individual. They were initially used by farmers to mark the time of year and navigate seasonal changes.

Right after astrology come the ‘flat earthers’, a group of people who still choose to base their belief on an experiment performed in 1985. What is alarming is that the number of people who believe this myth seems to be on the rise. Experiments carried out by science enthusiasts clearly show that the earth is spherical, not flat.

The above examples are less critical; however, the results of believing fake science news can be life-threatening. For example, a study carried out in the US showed that a third of the public disagrees that climate change is due to human behaviour. These individuals are less likely to take precautions when using things that cause pollution or directly impact climate change. The truth is, we only have ~10 years to curb the climate-change catastrophe, and this is only achievable if we work in unity. It is promising to see young individuals boldly advocating for this cause, because it is the younger generation that will suffer the consequences of our ignorance.

Another hazardous myth is that vaccines are harmful to babies. This myth stemmed from a fraudulent study that linked autism to the measles-mumps-rubella (MMR) vaccine. The resulting hesitancy to vaccinate has caused a global increase in vaccine-preventable diseases and sometimes results in fatalities that could have been prevented. The truth is that research has shown vaccines save lives! They not only protect the vaccinated individual but also provide community protection by reducing the spread of disease within a population.

It is promising to see that, as much as 4IR may have accelerated the spread of fake news, it can also be part of the solution. A lot of research has gone into using machine learning and artificial intelligence to this end. These 4IR tools can be used to detect fake news based on its text. Other studies involve adding warning labels to articles that emanate from untrusted websites; they reveal that people are less likely to believe articles tagged as fake.
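To give a feel for how text-based detection works, here is a minimal sketch of my own, a toy naive Bayes classifier trained on a handful of made-up headlines. It is not any real fake-news detection system; the labels, headlines, and class names are all invented for illustration, and production systems use far richer features and much larger training sets:

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

class TinyFakeNewsClassifier:
    """Toy naive Bayes text classifier with Laplace smoothing."""

    def __init__(self):
        self.words = {"fake": Counter(), "real": Counter()}
        self.docs = {"fake": 0, "real": 0}

    def train(self, text, label):
        self.docs[label] += 1
        self.words[label].update(tokenize(text))

    def predict(self, text):
        vocab = len(set(self.words["fake"]) | set(self.words["real"]))
        scores = {}
        for label in ("fake", "real"):
            total = sum(self.words[label].values())
            # Log prior for the class...
            score = math.log(self.docs[label] / sum(self.docs.values()))
            # ...plus a smoothed log likelihood for each word in the text.
            for word in tokenize(text):
                score += math.log((self.words[label][word] + 1) / (total + vocab))
            scores[label] = score
        return max(scores, key=scores.get)

clf = TinyFakeNewsClassifier()
clf.train("garlic water cures the virus overnight", "fake")
clf.train("miracle home remedy cures the virus", "fake")
clf.train("health officials recommend washing your hands", "real")
clf.train("government publishes updated lockdown regulations", "real")

print(clf.predict("boiled garlic remedy cures virus"))  # → fake
```

The point is that wording alone carries a surprisingly strong signal: headlines sharing vocabulary with known misinformation score higher under the “fake” class, which is the same intuition, scaled up enormously, behind the research mentioned above.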

I feel that, as scientists, it is our duty to educate the public on matters we are well informed about. Ideally, it should be mandatory for all science postgraduate students to be enrolled in a science communication module. This would enable us to communicate our science effectively to a range of audiences, allowing us to engage with the public at a level which is not condescending but equally informative.

The majority of postgraduate funding comes from the government, either directly from the NRF or through SARChI chairs; hence, science communication should be a public service rendered by the recipients of that funding. The government already has science engagement avenues such as SAASTA and could scale up public engagement by working with more postgraduate students.

Finally, as scientists we should be equally visible on social media, presenting evidence-based facts to combat the spread of fake science news. If we are not thrilled to do this as a public service, then let us consider it a mission to save mankind (Science Avengers, maybe?).