Just as the FDA formally authorized Pfizer's COVID-19 vaccine for kids between the ages of 5 and 11, Meta, Facebook's new identity, announced that it's rolling out stricter policies for vaccine misinformation targeted at children (via Engadget). The platform previously put restrictions on COVID-19 vaccine misinformation in late 2020, but didn't have policies specific to kids.
Meta says in a new blog post that it's partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to take down harmful content related to children and the COVID-19 vaccine. This includes any posts that imply the COVID-19 vaccine is unsafe, untested, or ineffective for kids. Additionally, Meta will show in-feed reminders in English and Spanish that the vaccine has been authorized for children, and will also provide information about where it's available.
Image: Facebook
Meta notes that it has taken down a total of 20 million pieces of COVID-19 and vaccine misinformation from both Facebook and Instagram since the beginning of the pandemic. Those numbers are at odds with what we've seen in Facebook's leaked internal documents: the Facebook Papers made clear just how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it might have rolled out campaigns to combat misinformation earlier in the pandemic, for both kids and adults, potentially removing more false content as a result.