Nobody ever imagined that someone would live-stream a heinous crime like the New Zealand mosque shooting, which took more than 49 lives, including that of Atta Elayyan, developer of the MetroTube app and CEO of Lazyworm Apps. New Zealand took prompt action and held talks with social media giant Facebook about restricting who can go Live, and Facebook promised it would introduce such restrictions soon.
Facebook yesterday announced new rules governing who can go Live on its platform.
Following the horrific terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate. As a direct result, starting today, people who have broken certain rules on Facebook — including our Dangerous Organizations and Individuals policy — will be restricted from using Facebook Live […]
Today we are tightening the rules that apply specifically to Live. We will now apply a ‘one strike’ policy to Live in connection with a broader range of offenses. From now on, anyone who violates our most serious policies will be restricted from using Live for set periods of time – for example 30 days – starting on their first offense. For instance, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time.
One interesting thing to note is that the social media giant cites 30 days only as an example of a restriction period; it has not said it intends to bar anyone from going Live for longer than that.
The company also discussed the challenges it faces in automatically detecting and blocking edited versions of violent videos.
One of the challenges we faced in the days after the Christchurch attack was a proliferation of many different variants of the video of the attack. People — not always intentionally — shared edited versions of the video, which made it hard for our systems to detect.
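Facebook has not published the details of its matching systems, but the problem it describes can be illustrated with a toy example. An exact (cryptographic) hash changes completely after even a trivial edit to a video frame, whereas a perceptual "difference hash" (dHash), which records only the relative brightness of neighbouring pixels, survives small re-encoding changes. The frames below are hypothetical 9x8 grayscale grids, not real video data, and this sketch is in no way Facebook's actual pipeline.

```python
import hashlib

def dhash(frame):
    """Compute a 64-bit difference hash from a 9x8 grayscale grid:
    each bit records whether a pixel is brighter than its right neighbour."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Toy "frame" and a re-encoded copy with every pixel slightly brighter.
original = [[(r * 9 + c) % 17 for c in range(9)] for r in range(8)]
reencoded = [[p + 1 for p in row] for row in original]

# Exact hashes differ completely after the tiny edit...
exact_orig = hashlib.sha256(str(original).encode()).hexdigest()
exact_edit = hashlib.sha256(str(reencoded).encode()).hexdigest()
print(exact_orig == exact_edit)  # False

# ...but the perceptual hash is unchanged, because the brightness
# relationship between neighbouring pixels is preserved.
print(hamming(dhash(original), dhash(reencoded)))  # 0
```

Deliberate edits such as cropping, re-framing, or overlaying text shift these pixel relationships far more, which is one reason variants of the same video are so much harder to catch automatically.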
Microsoft’s Brad Smith earlier suggested that companies should work together to create a “major event” protocol to identify and catch edited versions of the same content.