What is the severity of Facebook Live’s violence problem?

Facebook Live can be an engaging way to share memories with your friends and grow your audience. However, the combination of real-world violence and Facebook's massive user base has also turned it into a deadly platform, one used to broadcast live acts of torture, rape, and other forms of violence.

With live videos such as that of a 49-year-old man committing suicide on camera, it is painfully clear that Facebook Live is being misused as a stage for violent acts, performed in front of anyone who clicks on the videos, many of whom are unaware of the nature of the content they are about to witness.

According to a BuzzFeed analysis, at least 45 instances of violence have been broadcast via Facebook Live since its launch in December 2015, an average of roughly two videos per month. That number may well continue to rise, given recent terror attacks in the UK and the unstable political climate in the Middle East.

An obvious solution seems to be to publish a code of conduct and warn users that their video will be taken down if it does not comply with Facebook's policies. Unfortunately, this is easier said than done. Facebook is already aware of its violence problem and is hiring an additional 3,000 moderators this year, on top of its existing 4,500 staff, to help review questionable content on its platform.

But considering that Facebook receives millions of reports each week, will that be enough? Even with the 4,500 moderators currently in place, two videos of a Thai man killing his 11-month-old daughter were broadcast on Facebook Live and were not taken down until roughly 24 hours later.

The additional 3,000 staff will certainly help speed up the removal of extreme videos, but a more efficient way to solve the problem would be to change how Facebook Live itself detects and handles such content.

For instance, artificial intelligence could be used to help determine whether a questionable video complies with Facebook's terms of service. This would reduce the need for additional human moderators and speed up the review of reported videos. Alas, this too is easier said than done.

According to Sarah Roberts, an information studies professor at UCLA, “Despite industry claims to the contrary, I don’t know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We’re just not there yet technologically.”

For AI to accurately 'investigate' live videos, it needs to be able to distinguish between a video that glorifies violence and one that brings attention to it. In the latter case, Facebook Live has actually become a helpful tool for drawing attention to situations that require police or law enforcement. The Myrtle Beach shooting, for example, was streamed on Facebook Live on Sunday, and the footage is being used as evidence to catch the suspect.
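
To make the idea concrete, here is a minimal, purely illustrative Python sketch of how such a hybrid system might work. The classifier (`score_frame`), the thresholds, and the routing labels are all assumptions made up for illustration, not anything Facebook actually uses: the point is that the model only prioritizes streams for review, while the glorification-versus-awareness judgment stays with human moderators.

```python
# Illustrative human-in-the-loop triage sketch, NOT Facebook's actual pipeline.
# A hypothetical model scores live-stream frames for graphic content, and anything
# uncertain or severe is routed to a human moderator for the final judgment call.

from dataclasses import dataclass, field
from typing import List

# Made-up thresholds for illustration only.
REVIEW_THRESHOLD = 0.60    # likely graphic content: queue for a human moderator
ESCALATE_THRESHOLD = 0.95  # almost certainly graphic: urgent review

@dataclass
class LiveStream:
    stream_id: str
    frame_scores: List[float] = field(default_factory=list)  # per-frame violence probabilities


def score_frame(frame: bytes) -> float:
    """Placeholder for a trained classifier (e.g. a CNN over sampled video frames).

    It would return a probability in [0, 1] that a frame depicts graphic violence.
    """
    raise NotImplementedError("plug a real model in here")


def triage(stream: LiveStream) -> str:
    """Route a stream based on its worst frame seen so far.

    The model only prioritizes the review queue; a person still decides whether
    the footage glorifies violence or documents it (e.g. as evidence for police).
    """
    if not stream.frame_scores:
        return "allow"
    worst = max(stream.frame_scores)
    if worst >= ESCALATE_THRESHOLD:
        return "escalate_to_human_now"
    if worst >= REVIEW_THRESHOLD:
        return "queue_for_human_review"
    return "allow"


if __name__ == "__main__":
    # Example: a stream whose sampled frames have already been scored by some model.
    stream = LiveStream("live-0001", frame_scores=[0.04, 0.12, 0.71])
    print(triage(stream))  # -> queue_for_human_review
```

The value of a design like this is the division of labor: automation narrows millions of weekly reports down to a manageable review queue, but the final call, as Roberts notes, still requires a person.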

Without a doubt, Facebook Live is an excellent platform for users to interact with their followers while creating engaging content. But it is not without flaws, and the company needs to develop a comprehensive solution to its violence problem quickly and efficiently, before the service devolves into an unmoderated platform for barbarity.

Canada-based Ayushi Patel, through her writing, wants to help people overcome and fight the injustices occurring in their lives.
