WASHINGTON – U.S. Senators Mark Warner (D-Va.), Bob Menendez (D-N.J.), and Mazie Hirono (D-Hawaii) today slammed Facebook for failing to remove vaccine misinformation from its platforms. The rapid spread of dangerous misinformation across social media could hamper the efforts of public health officials as they work to vaccinate hard-to-reach communities and hesitant individuals, representing a serious concern for public safety. Studies show that roughly 275,000 Facebook users belong to anti-vaccine groups on the platform.
“As public health experts struggle to reach individuals who are vaccine hesitant, epidemiologists warn that low vaccination rates coupled with the relaxing of mask mandates could result in new COVID-19 outbreaks,” the senators wrote in a letter to Facebook CEO Mark Zuckerberg. “Moreover, most public health officials agree that because herd immunity in the U.S. is now unlikely, ‘continued immunizations, especially for people at highest risk because of age, exposure or health status, will be crucial to limiting the severity of outbreaks, if not their frequency.’ In short, ‘vaccinations remain the key to transforming the virus into a controllable threat.’”
A recent report from Markup.org’s “Citizen Browser project” identified 117 active anti-vaccine groups on Facebook. Combined, the groups had roughly 275,000 members. The report also found that Facebook was recommending health groups to its users, including anti-vaccine groups and pages that spread COVID-19 misinformation and propaganda.
The lawmakers asked Zuckerberg a series of questions, including why users were recommended vaccine misinformation; how long anti-vaccine groups and pages remained on the platform before being taken down; and what specific steps the company is taking to ensure its platforms do not recommend vaccine misinformation to its users.
A copy of the letter can be found here and below:
Dear Mr. Zuckerberg,
We write to express our concern over recent reporting alleging that Facebook failed to remove vaccine misinformation from its platforms. As the U.S. struggles to reach vaccine hesitant individuals and the world grapples with new variants, it is more important than ever that social media companies such as Facebook ensure that their platforms are free from disinformation.
In a February 2021 blog post, Facebook promised to expand “the list of false claims [it] will remove to include additional debunked claims about the coronavirus and vaccines. This includes claims such as: COVID-19 is man-made or manufactured; Vaccines are not effective at preventing the disease they are meant to protect against; It’s safer to get the disease than to get the vaccine; [and] Vaccines are toxic, dangerous or cause autism.” According to data from Markup.org’s “Citizen Browser project,” misinformation regarding COVID-19 and vaccines is readily available on Facebook. Madelyn Webb, a senior researcher at Media Matters, found that as late as April 2021 there were 117 active anti-vaccine groups on Facebook. Combined, those groups had roughly 275,000 members. Even more troubling is the finding that Facebook “continued to recommend health groups to its users, including blatantly anti-vaccine groups and pages explicitly founded to propagate lies about the pandemic.” As public health experts struggle to reach individuals who are vaccine hesitant, epidemiologists warn that low vaccination rates coupled with the relaxing of mask mandates could result in new COVID-19 outbreaks. Moreover, most public health officials agree that because herd immunity in the U.S. is now unlikely, “[c]ontinued immunizations, especially for people at highest risk because of age, exposure or health status, will be crucial to limiting the severity of outbreaks, if not their frequency.” In short, “vaccinations remain the key to transforming the virus into a controllable threat.”
In March 2021, Senator Warner wrote to you expressing these same concerns. Your April 2021 response failed to directly answer the questions posed in his letter. Specifically, you failed to respond to a question as to why posts with content warnings about health misinformation were promoted into Instagram feeds. Given Facebook’s continued failure to remove vaccine misinformation from its platforms, we seek answers to the following questions no later than July 5, 2021.
1. In calendar year 2021, how many users viewed vaccine-related misinformation?
2. In calendar year 2021, how many users were recommended anti-vaccine information or vaccine-related misinformation?
a. Why were these users recommended such information?
3. In calendar year 2021, how many vaccine-related posts has Facebook removed due to violations of its vaccine misinformation policy? How many pages were removed? How many accounts were removed? How many groups were removed?
a. On average, how long did these pages or posts remain on the platform before Facebook removed them?
4. What steps is Facebook taking to ensure that its platforms do not recommend vaccine-related misinformation to its users? Please be specific.
5. What steps is Facebook taking to ensure that individuals who search out anti-vaccine content are not subsequently shown additional misinformation?
6. In March 2019, Facebook said it would stop recommending groups that contained vaccine-related misinformation content. It wasn’t until February 2021 that the company announced it would remove such content across the platform. Why did it take Facebook nearly two years to make this decision?
Thank you in advance for your prompt response to the above questions.