Facing criticism that it spreads extremism and misinformation, Facebook is taking new steps to respond. It will more closely monitor groups that spread false information and limit the reach of links that are far more prominent on Facebook than across the rest of the web. It is also adding more professional fact-checkers from outside the company.
Facebook says it is rolling out a wide range of updates aimed at combating the spread of false and harmful information on the social media site. The updates will limit the visibility of links found to be significantly more prominent on Facebook than across the web as a whole. Facebook is calling this a “click-gap” signal.
The company is also expanding its fact-checking program with outside professional sources, including The Associated Press, to vet videos and other material posted on the social network. Facebook groups will be more closely monitored to prevent the spread of false information, including “reducing the reach of Facebook groups that repeatedly share misinformation.”
The company has been facing criticism over the spread of extremism and misinformation on its flagship site and on Instagram. Members of Congress questioned a company representative Tuesday about how Facebook prevents violent material from being uploaded and shared on the site.
In a post on the Facebook Newsroom page, Guy Rosen, VP of integrity, and Tessa Lyons, head of News Feed integrity, wrote: “Since 2016, we have used a strategy called ‘remove, reduce, and inform’ to manage problematic content across the Facebook family of apps. This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can decide what to click, read or share. This strategy applies not only during critical times like elections, but year-round.”