
How Platforms Are Addressing Covid Misinformation

By Ben Spaeth

Photo: The New York Times

Covid-19 misinformation has been the biggest challenge of the pandemic. It is also perhaps the main reason the pandemic has lasted so long. Misinformation tends to spread rapidly over social media — so rapidly, in fact, that it often overwhelms the algorithms social media sites use to identify Covid-19 misinformation.

When people think of Covid-19 misinformation, they often picture that one crazy uncle who posts a little too much on Facebook. But the vast majority of effective misinformation comes from people with large audiences, like a certain former president or one Joe Rogan. People genuinely listen to what these two have to say. When people hear Donald Trump saying you don’t need to wear a mask, or Joe Rogan saying you don’t need to take the vaccine, even if they aren’t initially convinced, it can still lead them down a Covid-19 misinformation rabbit hole under the guise of “doing your own research.”

So let’s examine how some of the biggest media platforms handle misinformation. Spotify has been under fire lately for comments Joe Rogan has made and guests he’s had on his podcast. The outrage got so bad that several artists, from Neil Young to Joni Mitchell, have now pulled their music from Spotify. This, coupled with an open letter from 270 physicians calling for action against Covid-19 misinformation, forced Spotify’s hand to draft platform rules. The rules Spotify put in place are fairly weak, especially when compared to other platforms. Spotify’s rules surrounding Covid-19 are as follows:

Content that promotes dangerous false or dangerous deceptive medical information that may cause offline harm or poses a direct threat to public health includes, but may not be limited to:

  • Asserting that AIDS, COVID-19, cancer or other serious life threatening diseases are a hoax or not real

  • Encouraging the consumption of bleach products to cure various illnesses and diseases

  • Promoting or suggesting that vaccines approved by local health authorities are designed to cause death

  • Encouraging people to purposely get infected with COVID-19 in order to build immunity to it (e.g. promoting or hosting “coronavirus parties”)

There is certainly some wiggle room in there for misinformation. The phrase “includes, but may not be limited to” is doing a lot of work for this policy. Most of the misinformation Joe Rogan has spread, such as the claim that young, healthy adults don’t need the vaccine, is not explicitly covered by it. While the policy addresses some of the more dangerous forms of misinformation, it does little to regulate someone declaring that people don’t need the vaccine.

Moving on to Facebook. Despite being known for its ability to spread misinformation, Facebook actually has a fairly robust policy regarding Covid-19 misinformation. It’s too long to paste here, but if you’d like to read it in its entirety, click here. Unlike Spotify, Facebook lays out a plethora of known debunked claims and expressly forbids them in its community guidelines. For instance, Facebook does not allow anyone to advocate or promote that other people not take the Covid-19 vaccine. Spotify makes no mention of vaccine misinformation aside from claims that the vaccine may be deadly. Admittedly, Spotify is a smaller platform than Facebook and sees fewer uploads over the course of a day, and most of Spotify’s uploads are songs that have nothing to do with Covid-19. Thus it might not need regulations as extensive as Facebook’s. However, misinformation manifests itself within the cracks of our systems, and the longer we let it sit there, the more it spreads.

YouTube, like Facebook, also has extensive guidelines related to Covid-19 misinformation. Facebook and Google have done a lot to ensure that Covid-19 misinformation doesn’t proliferate on their platforms. They are not perfect by any means — their algorithms still have difficulty picking up non-English misinformation — but they are at least trying to flag Covid-19-related content. It’s difficult for these companies to build effective detection for certain languages because the developers often primarily speak English, so the software built to recognize misinformation in other languages is frequently subpar. One could then argue that maybe a company shouldn’t operate in a country if it can’t effectively enforce its own community guidelines there.

While there are holes in all of the previously mentioned platforms’ guidelines, the only thing worse than having ineffective guidelines is having none at all — which is exactly the case for Apple Podcasts. While Spotify has been taking fire from the various Joe Rogan controversies, it’s important to point out that Spotify is only the second-biggest podcasting app. The first is Apple Podcasts, which does not officially have a Covid-19 misinformation policy. Certain elements of its content guidelines could perhaps be stretched to cover Covid misinformation, and uploads do have to go through a review process. However, the guidelines do not explicitly state anything related to Covid-19.

Misinformation is a serious problem in our society. At this point in the pandemic, there is no reason why only 65% of the United States population is vaccinated given the United States’ abundant vaccine supply. According to the Kaiser Family Foundation, 78% of Americans believe in at least one false statement about Covid-19.

People, now more than ever, are looking for alternative news sources because of the clearly present biases in mainstream media networks. People are turning to Facebook, Twitter, and YouTube for fresh takes on the world. It’s the duty of these alternative media companies to ensure that the content their algorithms spread isn’t misinforming the masses.
