Facebook says it will penalize users who repeatedly share misinformation. The company introduced new warnings notifying users that repeatedly sharing false claims could result in “their posts moved lower down in News Feed so other people are less likely to see them.”
Until now, the company’s policy has been to down-rank individual posts that are debunked by fact checkers. But posts can go viral long before fact checkers review them, and users had little incentive not to share them in the first place. With the change, Facebook says it will warn users about the consequences of repeatedly sharing misinformation. Pages that are considered repeat offenders will display pop-up warnings when users try to follow them, and individuals who consistently share misinformation will receive notifications that their posts may be less visible in News Feed as a result.
Researchers who study misinformation have pointed out that the most viral false claims often trace back to the same individuals. For example, a recent report from the Center for Countering Digital Hate found that the majority of anti-vaccine misinformation was linked to just 12 individuals.