Opinion: The consequences will be deadly if we don't fight vaccine misinformation


According to Pew Research Center, about 30% of Americans said they are not likely to get a vaccine, though this figure has steadily declined since September 2020. Misinformation and disinformation about vaccines and their effectiveness, particularly on social media, are among the major drivers of this hesitancy, and governments have had limited means to rein it in. That's why social media companies must do more to combat vaccine misinformation and disinformation. If they don't, the consequences could literally be deadly, as vaccinations are key to stemming the pandemic. What's more, Covid-19 misinformation could spill over into other public health issues in the future.
For anti-vaccine forces, social media platforms are invaluable. The platforms have made it possible to create an information ecosystem and a networked environment that drives anti-vaccine sentiment across the globe. And governments and public health authorities have little ability to regulate this misinformation, given how quickly it spreads. By lowering the barriers to creating, posting and forwarding "information," the platforms turn each user into a mass medium. The good, in this case, clearly comes with the bad.
Facebook, which owns Instagram and WhatsApp, recently announced that users who are in states that have opened appointments to all adults will get notifications about their eligibility at the top of their news feeds. The company has also unveiled a set of tools to help people in the United States and other countries locate places where they can get vaccinated, and it claims it is making data available to governments to address vaccine hesitancy and using WhatsApp chatbots to help in registration. Other platforms, too, such as TikTok and Twitter, have policies to monitor Covid-19 vaccine-related misinformation and encourage viewers to access information from reliable sources, such as the World Health Organization. But these are small, limited efforts that fail to deal with a more fundamental problem: Messages aggressively arguing against vaccinations are still too easy to find.
Here are some more steps social media companies should be taking:

Create tools for public health authorities

The onus for identifying and removing disinformation lies with the platforms, which have the technical capacity, and not with users. Social media companies should create tools and products to help public health authorities counter anti-vaccine propaganda, and should train them to use these tools. The tools could monitor misinformation, alert health authorities when it gains traction and then actively aid in countering it. And there's no reason this can't be a two-way street. Public health authorities should develop their own surveillance systems that allow them to track public health misinformation and disinformation in the information environment, including on social media. Such tools should be made available to them at no cost, with full access to all their features, in the interest of public health.

Tweak the algorithms if necessary

The absence of "gatekeeping" (the series of editorial judgments a typical news story passes through before it gets on the air or into print at a traditional news outlet) means there is nothing to check the baseless claims and distortions of fact that gain a veneer of legitimacy simply because they appear on the Internet. Unlike the more traditional news media, such as television and radio, which are subject to laws governing free expression, social media platforms, at least in the United States, are protected under Section 230 of the Communications Decency Act, which absolves them of responsibility for almost all content produced by a third party, such as a person or organization advocating against vaccines.
Several people, including members of Congress, are suggesting that the protections offered to social media under Section 230 should be revisited or even rescinded. Social media companies say they are monitoring misleading Covid-19 content and doing their best to flag or even remove it. But is this sufficient? If their efforts were truly effective, vaccine misinformation wouldn't continue to show up on their platforms. While social media companies' efforts so far are a step in the right direction, more can be done to tweak their algorithms to more aggressively monitor, identify and downrank misinformation. The question is whether they're willing to take a closer look at how their algorithms are driving misinformation and disinformation and use their technical prowess to stop it. If not, calls for regulation of social media will continue to grow.
For those who hold free expression as a core value, regulation of the platforms may or may not be desirable; after all, they are not making widgets, they are dealing in the spread of information and ideas. Any government regulation of ideas is a slippery slope, but if it comes about as a result of the platforms' reluctance to crack down on vaccine misinformation, then social media companies will have only themselves to blame.