Katie Attwell’s research is funded by the ARC and WA Health. Her research is supported by ARC DE1901000158. She assists the COVID-19 work of the Australian Technical Advisory Group on Immunisation (ATAGI). This article reflects her views alone, not those of any other organisation.

For decades, anti-vaccine groups have spread the notion that vaccines cause health problems. Now, misinformation about COVID vaccines is spreading.

At the onset of the pandemic, many people were worried about the virus and about public health measures such as lockdowns. As COVID vaccines were rolled out, concern grew about the small but real risk of blood clots.

Unsubstantiated rumors of adverse events – rare medical problems occurring after vaccination – have spread on social media, sparking concern.

Our recent research reveals, however, that social media is not the only place these rumors spread.

Social media posts regarding ‘vaccine dangers’

We’ve been studying community sentiment toward COVID vaccines, along with social media traffic: what kinds of information are shared, and who shares them.

In our latest study, we tracked discussion of suspected adverse events worldwide. Using Google Trends and CrowdTangle, we analyzed Facebook’s public data. We examined the most-searched and most-discussed events to trace their origins.

Blood clotting, fainting, Bell’s palsy, premature death, and infertility were the most searched for and discussed.


The AstraZeneca vaccine was linked to thrombosis with thrombocytopenia syndrome (TTS). In several countries, the vaccine’s rollout was suspended or age restrictions were imposed.

The ensuing news coverage was proportional to the threat. Since the situation was genuinely newsworthy, sensationalist reporting was unnecessary. Within eight hours, reports of clots in Austria had reached Ghana, the Philippines, and Mexico.

The other four rumors we investigated lacked scientific backing. Three of them were sparked by “conventional” news coverage of incidents (television and newspapers).

A Tennessee nurse fainted after getting the Pfizer vaccine. Traditional media reports noted the nurse’s history of fainting, yet the incident was used to warn people against immunization.

Hank Aaron died of natural causes two weeks after receiving a COVID vaccine. He had hoped his story would motivate African Americans to get vaccinated.

Social media messages blaming the vaccine for these two incidents circulated swiftly.

Bangladeshi news reports blamed the Pfizer vaccine for cases of Bell’s palsy. A UK publication followed up on the allegation.


Only the rumor that COVID vaccines cause infertility did not start with a traditional media source. Two online stories misinterpreted the research, and social media spread the distorted version of the scientists’ words. Traditional media then amplified the misinformation.

Vaccine skeptics “theory-craft” online in ways like this. Theory-crafting happens when internet users pool their resources to analyze data and explain events.

Two scientific sources were distorted into seemingly persuasive proof of an infertility cover-up. This idea fueled the online rumor that COVID vaccines affect fertility.

In the other four cases, traditional media continued to shape people’s understanding of the purported adverse events.

How did the media react?

People who shared social media posts treated traditional media reports as a trusted source.

International news outlets supplied “proof” of harmful vaccine reactions, and that “proof” then spread globally.

“Clickbait” media sites also disseminated falsehoods. One website’s headline announced that Hank Aaron had died weeks after taking the COVID-19 vaccine. That headline traveled faster and further on social media than most of the reporting clarifying that the vaccine did not cause his death.

Inaccurate and dramatic headlines boosted searches and shares, and the rumors crossed borders freely.

Although the media spread most of the rumors we researched, journalists also helped refute erroneous claims.

The disruption of older media formats threatens the quality of online information. For news outlets, clicks may trump accuracy and reliability.

What now?

In our view, there are no simple remedies for online disinformation.

Credibility markers for authors and stories on social media could help. A system that allows subject-matter experts to “upvote” and “downvote” news stories would help readers judge the legitimacy of particular stories and information.

When purported adverse events demand clarification, scientists and health professionals should put forward their own perspectives. This can shape how a story develops.

Scientists and doctors who speak out can’t stop online vaccine-refusing networks from sharing their narratives. Some of these actors are financially motivated to spread misleading information, regardless of its accuracy. But once media outlets debunk damaging rumors, professionals can help limit their spread.
