Facebook announced Thursday it will start warning users if they have liked, reacted to or commented on harmful Covid-19 posts that the company has identified as misinformation and removed.
After the WHO declared Covid-19 a global health emergency in January, Facebook started removing misinformation about the outbreak from its platforms. The company said Thursday it has removed hundreds of thousands of pieces of misinformation that could lead to physical harm, such as false claims that physical distancing is ineffective or that drinking bleach cures the virus.
“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” said Guy Rosen, Facebook’s vice president of integrity.
Facebook, which has been criticized for its handling of health issues, has made several coronavirus-related adjustments to its platform over the past few months.
For example, it has increased the number of partners working on fact-checking to limit the spread of false claims. It also started showing pop-ups on Facebook and Instagram that link to official health resources; these have directed more than 2 billion people to information from health authorities.