By Charlotte Graham-McLay – The New York Times
WELLINGTON, New Zealand — New Zealand’s prime minister said on Wednesday that she would meet with French leaders next month in hopes of forging an agreement between governments and technology companies aimed at eliminating violent extremist content on the internet.
After a gunman carried out a massacre at two mosques in Christchurch, New Zealand, last month that he broadcast live on the internet, the prime minister, Jacinda Ardern, promised international collaboration on preventing the use of social media to spread militant ideology.
“We all need to act, and that includes social media providers taking more responsibility for the content that is on their platforms,” Ms. Ardern told reporters on Wednesday.
Her meeting on May 15 with the French president, Emmanuel Macron — whose country has also been scarred by terrorist attacks — will come as digital ministers from the Group of 7 industrialized nations gather in Paris. World leaders and tech executives will be invited to the talks on blocking extremist content, though it was unclear how many would attend.
Ms. Ardern acknowledged that her task would be “incredibly difficult,” and she left it unclear exactly what she and Mr. Macron planned to ask the social networks to do. She said that while the “Christchurch call to action” — her name for the pledge she is preparing with Mr. Macron — would include “specific expectations on governments and internet companies,” it would not include new regulations.
Analysts cautioned that any agreement that did not outline specific consequences for failing to halt extremist content would be unlikely to significantly alter tech companies’ behavior.
“It does need to be determined what the prime minister really wants,” said Robyn Caplan, a researcher at Data & Society, a research institute in New York, and a doctoral candidate at Rutgers University. “Without some sort of incentives or disincentives, I’m not quite certain what change will happen.”
Sri Lanka blocked Facebook and other social media platforms after the bombings there that killed more than 350 people at churches and elsewhere on Easter Sunday. Officials said they feared that misinformation and hate speech on the platforms could provoke more violence.
Since the Christchurch attacks five weeks ago, Ms. Ardern has repeatedly told reporters that she would not rush to a solution because any compact with social media companies needed to be global.
New Zealand, which has a population of 4.8 million people, suffers from a “small market problem,” Ms. Caplan said, comparing it to other countries like Canada that have also tried to bolster regulation of social media platforms.
“The companies, depending on what the regulation is, might just pull out rather than comply,” she said, referring to Google’s decision to ban political advertising ahead of the Canadian elections after new transparency laws were introduced.
But France, a much bigger market, has already taken action on its own. It announced in November that it would embed regulators at Facebook for the first six months of 2019 to determine whether its processes for removing hate-fueled content could be improved.
In May, French lawmakers will debate an update to the country’s online hate speech law in an attempt to force social media platforms to take more responsibility for taking down heinous content. Under the legislation, the companies could be fined up to 4 percent of their global revenues if they fail to withdraw extremist content within 24 hours.
Ahead of this debate, government officials have called on platforms like Facebook, Twitter, YouTube and Instagram to act against extremist content. “You are too slow,” France’s junior minister for gender equality, Marlène Schiappa, wrote in a tweet in November. “Your responsibility is to delete content! Stop being accomplices.”
Australia, which historically has been more open to limiting speech than the United States, has also taken strong steps, recently approving regulations that will impose fines on social media platforms that fail to swiftly remove violent content. That change was vehemently opposed by the tech industry, and there has been no suggestion from Ms. Ardern that she plans to enact similar regulations.
She also appears sensitive to worries about infringing freedom of speech, saying on Wednesday that the pledge she and Mr. Macron were preparing would refer specifically to terrorist activity.
“This isn’t about freedom of expression; this is about preventing violent extremism and terrorism online,” Ms. Ardern said. “I don’t think anyone would argue that the terrorist had a right to livestream the murder of 50 people.”
But Ms. Caplan said that simply asking tech companies to remove violent content had not worked for some other countries that had tried.
“Each country will have its own definitions of what constitutes hate speech and what constitutes harassment,” she said, adding that there was often a problem of resources as countries asked tech companies to hire more staff members with specific cultural and linguistic knowledge.
Still, Ms. Ardern recounted “positive” interactions with major tech companies ahead of the summit meeting, saying she had spoken to the head of Facebook, Mark Zuckerberg, although she would not say what specific commitments he had made.
Ms. Caplan pointed out that extremists were often radicalized — as the man accused of the Christchurch shootings claimed he had been — outside the largest social media platforms, including in WhatsApp messaging groups and on message boards like 8Chan.
Other analysts cast doubt on Ms. Ardern’s belief that the tech industry was prepared to make major changes. Tech leaders have started to express more openness to government intervention, with Mr. Zuckerberg last month calling for a more active role for governments in setting basic rules.
“I don’t think they operate in good faith,” said Eric Feinberg of the Global Intellectual Property Enforcement Center, who tracks content linked to terrorist organizations.
Mr. Feinberg said that, using his own algorithm, he had found copies of the Christchurch gunman’s video — which is illegal to possess or distribute in New Zealand — on Facebook, Instagram and YouTube, some with hundreds of thousands of views. He said some had been online since shortly after the attack.
“I don’t know what they’re relying on, that there are this many out there and we have to tell them to take it down,” he said of Facebook’s content removal process. “It’s just mind-boggling.”
Elian Peltier contributed reporting from Paris.