
Tuesday, March 19, 2019

Facebook and Other Social Media Platforms Face Pressure to Monitor Content after New Zealand Attack

Pressure is building on Facebook and other social media platforms to stop hosting extremist propaganda, including footage of terrorist attacks, after Friday’s deadly attacks on two mosques in New Zealand were live-streamed.

Australia’s prime minister has urged the Group of 20 nations to use a meeting in June to discuss a crackdown, while New Zealand media reported the nation’s biggest banks have pulled their advertising from Facebook and Google.

“We cannot simply sit back and accept that these platforms just exist and what is said is not the responsibility of the place where they are published,” New Zealand Prime Minister Jacinda Ardern told parliament on Tuesday. “They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”

Facebook said it had been working directly with New Zealand police and across the technology industry to “help counter hate speech and the threat of terrorism”.
The lone shooter accused of killing 50 people in the New Zealand city of Christchurch live-streamed the murders, with the video continuing to be widely available on a range of platforms hours after the attack. The suspect, an Australian, uploaded his hate-filled manifesto online shortly before launching his assault.

Offensive content
It’s the latest example of social media companies struggling to keep offensive content from sites that generate billions of dollars in revenue from advertisers — a problem that’s seen Facebook founder Mark Zuckerberg grilled by the US Congress.
The shooting video was viewed fewer than 200 times during its live broadcast, and no users reported the video during that time, Facebook vice-president and deputy general counsel Chris Sonderby said. It was reported to the company 29 minutes after the video started and viewed 4,000 times before being removed, he said.
The Group of 20 nations should discuss the issue at its Osaka summit in June, Australian Prime Minister Scott Morrison said in an open letter to this year’s host, his Japanese counterpart Shinzo Abe. The group should work to ensure technology firms implement appropriate filtering, remove terrorist-linked content and show transparency in meeting those requirements, he said.

“It is unacceptable to treat the internet as an ungoverned space,” Morrison said. “It is imperative that the global community works together to ensure that technology firms meet their moral obligation to protect the communities which they serve and from which they profit.”

Ardern’s government will look at the role social media played and what steps it can take, including on the international stage. She has previously vowed to seek talks with Facebook, which said it blocked the upload of 1.2-million video clips and removed another 300,000 within 24 hours.
The New Zealand business community is becoming increasingly vocal that the social media companies should be held to account by hitting their bottom line.

The Association of New Zealand Advertisers is encouraging advertisers to recognise they have a choice where their advertising dollars are spent and to carefully consider where ads appear.
“We challenge Facebook and other platform owners to immediately take steps to effectively moderate hate content before another tragedy can be streamed online,” the association said.
Meanwhile, New Zealand’s three biggest broadband providers called on Facebook, Twitter and Google to join an urgent discussion at an industry and government level to find a solution to the live-streaming and hosting of video footage such as that produced in Christchurch.

“The discussion must start somewhere,” the CEOs of the companies said in an open letter on their websites Tuesday. “Social media companies and hosting platforms that enable the sharing of user-generated content with the public have a legal duty of care to protect their users and wider society by preventing the uploading and sharing of content such as this video.”
Artificial intelligence techniques could be deployed and, for the most serious types of content, more onerous requirements should apply, including removal of the material within a specified period, proactive monitoring measures and fines for failure to comply, they said.

“Now is the time for this conversation to be had, and we call on all of you to join us at the table and be part of the solution.”

  • Bloomberg
