Facebook
The new policy applies to both manipulated images and text posts. Previously, Facebook removed only content that directly spurred violence, but after facing criticism it will now remove posts that do so indirectly as well. Tessa Lyons, a Facebook product manager, commented, "We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline...We have a broader responsibility to not just reduce that type of content but remove it." The new rules also apply to Instagram but not to WhatsApp, where fake news has likewise circulated.
This is possibly one of the most difficult issues for Facebook to tackle, because there is no clear line for determining whether something incites violence. The platform has also insisted that it will not act as an arbiter of truth or of free speech. Expanding the new rules to countries such as the US, where speech is strongly protected by law, will be particularly difficult, and Facebook will have to navigate that landscape carefully. The company has also faced pressure and criticism from conservative groups, which argue that it unfairly targets posts with conservative viewpoints.
The new rules are already in effect in Sri Lanka and will soon be introduced in Myanmar; from there, they will eventually be expanded to other countries. Facebook has said it will form partnerships with local civil society groups in order to identify and target misinformation effectively.
- https://www.nytimes.com/2018/07/18/technology/facebook-to-remove-misinformation-that-leads-to-violence.html
- https://www.businessinsider.com/facebook-removing-misinformation-stirring-violence-2018-7
- http://thehill.com/business-a-lobbying/397939-facebook-to-start-removing-misleading-posts-that-incite-violence