In the coming weeks, users who have liked, shared, or commented on false content will receive a pop-up warning in their News Feed. The message will direct them to a World Health Organization (WHO) webpage where coronavirus myths are debunked.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” Guy Rosen, Facebook’s vice president for integrity, wrote in a blog post.

The system was announced the day after campaign group Avaaz released a scathing report on Facebook’s “fake news” problem. Avaaz investigated 104 Facebook posts and videos about coronavirus that independent fact-checkers had flagged as false. The researchers estimated that the content had been shared almost 1.7 million times and racked up 117 million views. They also found that Facebook can take up to 22 days to issue warning labels for misinformation. Nonetheless, the group praised Facebook for becoming the first social media platform to alert every user exposed to coronavirus misinformation.