Facebook bans vaccine misinformation targeted at children
Just as the FDA officially approved Pfizer's COVID-19 vaccine for children ages five to 11, Facebook's parent company under its brand-new identity, Meta, announced that it is introducing tougher policies on vaccine misinformation targeted at children (via Engadget). The platform previously banned COVID-19 vaccine misinformation in late 2020, but did not have specific policies for children.
Meta said in a new blog post that it is partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to remove harmful content related to children and the COVID-19 vaccine. This includes any posts claiming that a COVID-19 vaccine for children is unsafe, untested, or ineffective. Additionally, Meta will provide in-feed reminders in English and Spanish that the vaccine is approved for children, along with information on where it is available.
Meta notes that it has removed a total of 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram combined since the start of the pandemic. These numbers are at odds with what we've seen in internal documents leaked from Facebook: the Facebook Papers made clear how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it could have launched a campaign to combat pandemic misinformation for both children and adults earlier, potentially resulting in the removal of even more false content.