By Shannon Connellan and Johnny Location
On Tuesday, the company announced it will implement a "one strike" policy that will bar anyone who violates the social network's community standards from using Facebook Live.
Users who violate the network's most serious rules will be prohibited from using Live for a set period of time, starting from their first offense. One example of such an offense is a user who "shares a link to a statement from a terrorist group with no context."
Guy Rosen, Facebook's vice president of integrity, said in a blog post that the company's goal is to "minimize the risk of abuse on Live while enabling people to use Live in a positive way every day."
Rosen said these restrictions will be extended to other areas of the platform over the coming weeks, starting with barring the same users from creating ads.
Previously, Facebook simply removed content that violated its community standards. If a person continued posting violating content, they would be blocked from the platform for a period of time; some were banned altogether.
The restrictions follow Facebook's updated policies on dangerous individuals, under which the network has banned a number of controversial figures, including Alex Jones, Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, and others.
In addition to these new livestreaming restrictions, Facebook also announced it is investing in research to prevent incidents like the rapid spread of the Christchurch shooter's video, which was edited to evade detection and allow re-uploading.
The company will invest in a $7.5 million partnership with three universities: the University of Maryland, Cornell University, and the University of California, Berkeley.
The money will go toward research into better detection of manipulated images, video, and audio, which could also help address problems such as deepfakes.
The British parliamentary committee hearing was prompted by the livestream of the graphic Christchurch shooting video, which the platforms struggled to contain. The gunman, who killed 50 people and wounded 50 others at two mosques in New Zealand, broadcast his crime live on Facebook.
Politicians from both Labour and the Conservatives criticized the companies for allowing hateful content to proliferate and, in the case of YouTube, for promoting it.
"What are you doing on Earth!" You are props of radicalization, accessories of crime, "said MP Stephen Doughty, according to BuzzFeed.
"You make these crimes possible, you facilitate these crimes," said the president, Yvette Cooper. said "It's surely a serious problem."
Facebook's Neil Potts said he could not rule out that versions of the Christchurch video were still circulating on the platform. And Marco Pancini, YouTube's director of public policy, acknowledged that the platform's recommendation algorithms push people toward more extremist content, even when it is not what they "wanted."
Potts said that Facebook's automation may have a hard time detecting banned content when it is deliberately manipulated, edited, and modified to "subvert the system"
– Matthew Thompson (@mattuthompson) April 24, 2019
Stuart Macdonald, MP: "If you're directing innocent users to extremist and hateful content … I do not really see why these algorithms stay in place"
Pancini: "It's a very good question, Mr. McDonald"
– Mark Di Stefano 🤙🏻 (@MarkDiStef) April 24, 2019
Committee chair Cooper was particularly upset that Facebook said it did not report all crimes to the police. Potts said Facebook reported them when there was a threat to life; Twitter and YouTube said they have similar policies.
"There are different scales of crime," said Potts. To which Cooper replied. "A crime is a crime … who are you to decide which crime should be reported and which crime should not be reported?"
"You make these crimes possible, you facilitate these crimes."
MPs had also tested for themselves how YouTube's algorithm promotes extremist content. Prior to the hearing, they searched for terms such as "British news" and, in each case, were directed to inflammatory far-right content by the recommendation engine.
"You may be played by extremists, you actually offer a platform for extremists, you activate extremism on your platforms," Cooper said. "Yet, you continue to provide platforms for this extremism, you continue to show that you do not follow it and, frankly, in the case of YouTube, you continue to promote it to the lives of families and communities in the world. across the country ".
In addition to removing the original livestreamed video, Facebook said it removed 1.5 million instances of the video within 24 hours of the attack, with 1.2 million of those blocked at upload.
The Christchurch video showed how extremists exploit social platforms to spread their message and radicalize users. Facebook, YouTube, and Twitter are all working to increase both automated and human moderation, building new tools and hiring thousands of employees. But lawmakers argued these are band-aids for systemic problems, and that extremists use the services exactly as they were designed to be used: to broadcast and share content, ignite passions, and give everyone a platform.
In an open letter published in the New Zealand Herald, Sheryl Sandberg, Facebook's chief operating officer, explained how the company is changing its policies following criticism over the Christchurch terrorist attack. Among the changes: new rules on who can use Facebook Live.
"We are exploring restrictions on who can go live based on factors such as past violations of community standards," wrote Sandberg.
The proposal was light on details. Sandberg did not specify exactly how Facebook might restrict the ability to broadcast live, beyond suggesting that those who broke company rules could be affected.
Nevertheless, the fact that Facebook is even exploring restrictions on Facebook Live suggests that global criticism is having an effect. Many said the company needed to rein in its live broadcast feature after the Christchurch shooting, in which the gunman broadcast a 17-minute live video of his attack.
Facebook later said it did not receive its first user report until 12 minutes after the live broadcast ended, and that the original live stream had been viewed more than 4,000 times before it was removed.
The video was also shared on other websites and then re-uploaded to the social network more than 1.5 million times, according to Facebook. Although the company managed to block many of these copies, parts of the video circulated widely online immediately after the attacks, despite pleas from police that the footage not be shared.
But until we know what changes Facebook will actually implement, and how, it is impossible to say how effective the policy change could be.
Sandberg provided details of other ways the company is making changes. She noted that Facebook recently announced it will ban white nationalism and white separatism and will use its AI technology to help enforce the ban. The change will apply to several New Zealand-specific groups, Sandberg said, including the Lads Society, United Front Patriots, Antipodean Resistance, and the National Front New Zealand.
Facebook will also "provide support" to local mental health organizations in the country, Sandberg said.