Facebook and Instagram Remove Robert Kennedy Jr.’s Nonprofit for Misinformation
Facebook and Instagram on Thursday removed the accounts of Children’s Health Defense, an organization led by Robert F. Kennedy Jr. that is one of the largest U.S. anti-vaccine groups, for spreading medical misinformation.
In an emailed newsletter, Children’s Health Defense said Facebook and Instagram had taken down its accounts after a 30-day ban by the social networks. The nonprofit, which Mr. Kennedy has run since 2018, accused the apps of censorship.
“Removing CHD accounts is evidence of a clearly orchestrated attempt to stop the impact we have during a time of heightened criticism of our public health institutions,” the group said.
In a statement, Mr. Kennedy said, “Facebook is acting here as a surrogate for the Federal government’s crusade to silence all criticism of draconian government policies.”
Children’s Health Defense is widely regarded as a symbol of the vaccine resistance movement. Last year, Mr. Kennedy was named one of the “Disinformation Dozen,” the Center for Countering Digital Hate’s list of the top 12 superspreaders of misinformation about Covid-19 on the internet.
Meta, which owns Facebook and Instagram, said it removed the main accounts of Children’s Health Defense because the group had “repeatedly” violated the company’s policies on medical misinformation during the coronavirus pandemic. Children’s Health Defense said that in total, it had more than half a million followers on its main Facebook and Instagram pages.
Facebook’s and Instagram’s actions are a blow to Mr. Kennedy, who is the son of the former senator and U.S. Attorney General Robert F. Kennedy. But the account removals do not completely block him from speaking online. While Mr. Kennedy was personally barred from Instagram in February 2021, his personal Facebook page — with nearly 247,000 followers — is still up.
Other Facebook pages dedicated to Children’s Health Defense, including those of its California, Florida and Arizona chapters, also remain online and have thousands of followers, according to a review by The New York Times.
Over the course of the pandemic, Children’s Health Defense has repeatedly questioned the safety of Covid-19 vaccines, falsely saying that the vaccines cause organ damage and harm pregnant women. The organization has also sought to sow doubt about other kinds of vaccines. Over the last two months, it has claimed that tetanus vaccines cause infertility and that polio vaccines are responsible for a global rise in polio cases.
Meta has policies forbidding medical misinformation but has struggled with how to enforce them. The company has had over a dozen discussions about removing the accounts of Children’s Health Defense from Facebook and Instagram over the last year, said two people with knowledge of the conversations, who asked to remain anonymous because they were not authorized to speak publicly.
Last month, Nick Clegg, Meta’s president of global affairs, wrote an open letter about the company’s medical misinformation policies. He said its strict policies about Covid-19 misinformation, which were put in place at the start of the pandemic, might need to be reconsidered as many parts of the world returned to normal.
Before the pandemic, Meta removed only posts that could “contribute to a risk of imminent physical harm,” Mr. Clegg said. During the pandemic, the company broadened that to remove false claims about “masking, social distancing and the transmissibility of the virus.” Those latter measures may no longer need to stay in place, he suggested.
He added that over 25 million pieces of content related to Covid-19 misinformation had been removed since the start of the pandemic.
Misinformation experts have said that conspiracy theories and falsehoods about Covid-19 remain prominent on Facebook and Instagram and have continued to attract attention.