
Source: The Conversation (Au and NZ) – By Katie Attwell, Associate Professor, The University of Western Australia

Thought Catalogue/Unsplash

For decades, anti-vaccine movements have generated and spread rumours that vaccines cause serious health problems. The rollout of COVID vaccines has provided new opportunities to spread misinformation.

At the start of the pandemic, people were already worried about the virus and the impact of other public health measures, such as lockdowns, on their physical and social well-being. As COVID vaccines were rolled out, concerns mounted about the small but serious risk of blood clots linked to the AstraZeneca vaccine.

Alongside this, there has been a degree of panic around unsubstantiated rumours of adverse events – extremely rare medical problems after being vaccinated – circulating on social media.

But contrary to the popular belief that social media creates these rumours, our new research suggests it generally only helps them spread.




Read more:
The 9 psychological barriers that lead to COVID-19 vaccine hesitancy and refusal


What ‘vaccine harms’ are people sharing on social media?

We have been studying community attitudes to COVID vaccinations, including the flow of information on social media, the kinds of information being shared, and by whom.

In our latest study, we tracked emerging concerns about alleged adverse events globally. We used Google Trends and CrowdTangle – a research platform for studying Facebook’s public-facing data. We focused on the most commonly searched and discussed events to track where they were coming from.

We dug into the five most frequently searched adverse events: clotting, fainting, Bell’s palsy, premature death and infertility.
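For readers curious how this kind of search interest can be tracked, the sketch below uses the third-party pytrends package (an unofficial Google Trends client) to pull relative search interest for adverse-event queries. The query wording, timeframe and package choice are illustrative assumptions only, not the actual pipeline used in our study.

```python
# A minimal sketch of tracking search interest in alleged adverse events,
# assuming the third-party pytrends package; the query terms and timeframe
# below are illustrative, not the study's actual parameters.
from pytrends.request import TrendReq

# Illustrative queries for the five adverse events discussed in the article.
QUERIES = ["vaccine blood clot", "vaccine fainting", "vaccine Bell's palsy",
           "vaccine death", "vaccine infertility"]

def fetch_interest(timeframe="2020-12-01 2021-06-30"):
    """Return worldwide relative search interest (0-100) for each query."""
    pytrends = TrendReq(hl="en-US", tz=0)
    # Google Trends compares at most five terms per request, which fits here.
    pytrends.build_payload(kw_list=QUERIES, timeframe=timeframe, geo="")
    return pytrends.interest_over_time()

if __name__ == "__main__":
    interest = fetch_interest()
    # The peak week for each term hints at when a rumour began circulating widely.
    print(interest.drop(columns=["isPartial"]).idxmax())
```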

Clotting

Clotting was associated with the AstraZeneca vaccine and the rare instances of thrombosis with thrombocytopenia syndrome (TTS). This led many countries to suspend the vaccine or place age restrictions on who could receive it.

News reporting on clotting was generally reasonable and in line with the threat the condition posed. Because the issue was newsworthy on its own, it did not require sensationalist reporting. Social media spread these reports globally, so the first reports of clotting, emerging from Austria, spread on Facebook pages as far as Ghana, the Philippines and Mexico within eight hours.

A nurse puts a bandaid on a woman's arm after a vaccination.
The risk of clots caused concern.
CDC/Unsplash

Fainting, Bell’s palsy and premature death

There was no scientific basis for the other four rumours we investigated. However, three of them drew directly from “traditional” (television and newspaper) news reporting of specific incidents.

For example, a Tennessee nurse fainted on television shortly after receiving the Pfizer vaccine. Traditional media reports included the nurse’s own disclosure of a history of fainting and cautioned against attributing it to the vaccine.

Likewise, elderly baseball legend Hank Aaron died from natural causes two weeks after receiving a COVID vaccine on camera. He had hoped to inspire other Black Americans to be vaccinated.

These two incidents were widely reported in traditional media and soon flowed into social media posts attributing them to the vaccine.

The Bell’s palsy rumour came out of news reports in Bangladesh, which were then picked up by a UK outlet that attributed the rare condition to the Pfizer vaccine.

Infertility

The rumours of COVID vaccines causing infertility were the only ones we could not trace to an original “traditional media” source. Instead, two internet stories misrepresenting the work and words of scientists were shared widely on social media. Traditional media only picked up the story to report on the misinformation itself.

We describe this as an example of vaccine sceptics “theory crafting” online. This is when a group of people on the internet use their collective resources to analyse information to develop plausible explanations for events.

In the case of infertility, a willing community misused two scientific sources to construct what they represented as compelling evidence of a cover-up. This theory then led to a persistent internet rumour that COVID vaccines caused fertility problems.

In the other four cases above, we found traditional media still played an important role in determining people’s awareness of alleged adverse events.

Pregnant woman holds her dress.
Traditional media reported on vaccine myths about fertility.
Ömürden Cengiz/Unsplash

What did mainstream news outlets do?

Traditional media outlets were important to those sharing the social media posts, who treated mainstream media reports as markers of credibility.

Vaccine-sceptical communities used international media sources to build “evidence” for adverse events. They then redistributed this “evidence” among their international networks.

Disreputable outlets chased “clickbait”, accelerating the spread of misinformation. For instance, when 86-year-old Aaron died, one site led with the headline “Hank Aaron Death: MLB Legend Shockingly Passes Away Weeks After Taking COVID-19 Vaccine”. This headline spread much faster and further on social media than the majority of reports that explained Aaron’s death was not a result of his vaccination.

Inaccurate and sensationalist headlines in mainstream media went on to drive significant searches and shares. The rumours flowed globally, unfettered by national boundaries.




Read more:
Media reports about vaccine hesitancy could contribute to the problem


Although most of the rumours we investigated gained traction because of media reporting, journalists also played an important role in suppressing or debunking illegitimate claims.

The disruption of earlier media models clearly poses a challenge for the accuracy of information shared on the internet. The imperative for news sources to generate clicks can outweigh the imperative to provide accurate and reliable information.

So what’s the solution?

We can see no easy answers for stemming the flow of misinformation online.

However, the use of credibility markers for both authors and stories on social media is one possible solution. A system where publicly recognised topic experts can “upvote” and “downvote” news stories to produce a “credibility score” would help readers judge the perceived credibility of particular stories and information.
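As a rough illustration of how such a score might work, the sketch below weights each expert’s upvote or downvote by how closely their field matches the story’s topic. The data structure, weighting scheme and example values are hypothetical, assumed only to make the idea concrete; they do not describe any existing platform.

```python
# A minimal sketch of a "credibility score" built from expert votes.
# All names and the weighting scheme are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class ExpertVote:
    expert_id: str
    vote: int           # +1 for an upvote, -1 for a downvote
    topic_match: float  # 0.0-1.0: how relevant the expert's field is to the story

def credibility_score(votes: list[ExpertVote]) -> float:
    """Return a score in [-1, 1]; positive means experts broadly endorse the story."""
    if not votes:
        return 0.0  # no expert input yet, so no signal either way
    weighted = sum(v.vote * v.topic_match for v in votes)
    total_weight = sum(v.topic_match for v in votes)
    return weighted / total_weight if total_weight else 0.0

# Example: two immunologists downvote a clotting story, one health reporter upvotes it.
votes = [ExpertVote("immunologist_a", -1, 1.0),
         ExpertVote("immunologist_b", -1, 1.0),
         ExpertVote("health_reporter", +1, 0.5)]
print(round(credibility_score(votes), 2))  # -0.6: flagged as low credibility
```

Readers would still make their own judgments; the score simply surfaces expert consensus alongside a story rather than censoring it.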

In the meantime, we recommend scientists and health professionals, where possible, promote their own perspectives when a story about alleged adverse events needs clarifying. Doing so can potentially change the trajectory and spread of a story.

Scientists and health professionals speaking out can’t prevent the stories from being shared within online communities of vaccine-refusers. These people are invested in sharing such information regardless of its veracity. However, professionals can limit the damaging spread of rumours once media outlets begin to report their debunking.




Read more:
We can’t trust big tech or the government to weed out fake news, but a public-led approach just might work


The Conversation

Katie Attwell receives funding from the Australian Research Council and the WA Department of Health. She is funded by ARC Discovery Early Career Researcher Award DE1901000158. She is a specialist advisor to the Australian Technical Advisory Group on Immunisation (ATAGI) COVID-19. All views presented in this article are her own and not representative of any other organisation.

Tauel Harper does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Social media spreads rumours about COVID vaccine harms … but it doesn’t always start them – https://theconversation.com/social-media-spreads-rumours-about-covid-vaccine-harms-but-it-doesnt-always-start-them-184169
