JUDY WOODRUFF: Now: how the U.S. government is trying to stem the tide of terrorism messages online, and the role played by social media.
The story is part of our coverage of the 15th anniversary of the 9/11 attacks. It was produced in partnership with tonight’s PBS “NOVA” special, “15 Years of Terror,” reported by Miles O’Brien, and it’s our weekly segment about the Leading Edge of technology.
MILES O’BRIEN: They called it Think Again, Turn Away. The concept? Use sarcasm as a way to turn Islamic State images into an argument against their grim techniques of terror.
The creator and producer? The U.S. State Department. Today, everyone agrees the message was worse than ineffective. It played right into the hands of the terrorists.
RICHARD STENGEL, Under Secretary of State: You know, part of it is that I don’t think the government should do snark or sarcasm. I don’t think we’re good at it.
MILES O’BRIEN: Richard Stengel is the undersecretary of state for public diplomacy and public affairs. He realized he was the one who needed to think again.
RICHARD STENGEL: So, one of the things that we realized is that we’re not the best messenger for the message that we want to get out there. In fact, when we are a messenger at all, they use that against us.
MILES O’BRIEN: So they stopped trying to do it themselves, and hired some marketing pros.
TONY SGRO, Edventure Partners: I think that they just — it was out of their league.
MILES O’BRIEN: Tony Sgro is an advertising and marketing veteran who is spearheading a novel competition for college students to create a counter or alternative narrative to the Islamic State propaganda campaign.
TONY SGRO: The government is not the most capable person to develop a counternarrative for a 21-year-old. There’s no credibility factor there.
MILES O’BRIEN: The competition is called Peer to Peer. Undergraduates at two dozen schools all over the world have participated.
In Afghanistan, they created a campaign that included a talk show that focused on understanding the real message of the Koran against extremism. They reached more than five million people. Lahore University produced a plea from students to students to turn apathy into empathy.
The winner of the competition was the Rochester Institute of Technology. Students there came up with this multiplatform campaign.
MAN: Join the campaign.
WOMAN: Because it’s time.
WOMAN: To ex-app.
TONY SGRO: It’s beyond bombs, bullets, and drones. It really is. I mean, we need that stuff. But we really — how are you going to win the hearts and minds? That’s communications. It’s a marketing communications issue.
MILES O’BRIEN: Countering the messages is one thing. Trying to stop them from spreading in the first place is another. For years, the social networking platforms took a laissez-faire approach to this problem. And it only got worse.
But in 2013, things started to change. During a horrifying assault on a shopping mall in Nairobi, Kenya, attackers with the al-Qaida affiliate Al-Shabaab live-tweeted for hours as they shot more than 175 people, killing 67.
JEFF WEYERS, Ibrabo: That was the first time where Twitter was actively removing the content that they were posting.
MILES O’BRIEN: Jeff Weyers is a police officer and terrorism analyst based in Ontario, Canada.
JEFF WEYERS: They were actively tweeting their attack online, and it was the first time we really saw that. ISIS completely blew that out of the water. Right? They took that concept and magnified it by a million.
MILES O’BRIEN: Today, Twitter claims it aggressively goes after accounts linked to terrorists. The company says it closed 360,000 of them in the past year.
JEFF WEYERS: They were very much pushed into it, as opposed to wanting to go down that road. And now I have no doubt that they’re spending millions of dollars just countering that message.
MILES O’BRIEN: Facebook is the largest social networking platform on the planet. It says it has a zero tolerance policy for extremists, but it must contend with a tsunami of content.
Facebook has more than a billion users actively posting every day. The company says about one-half of 1 percent of flagged items are linked to terrorism. But that is still a lot of material.
Monika Bickert is Facebook’s head of global policy management.
MONIKA BICKERT, Facebook: We use photo-matching technology to identify when somebody is trying to upload to Facebook an image that we have already removed for violating our policies.
Of course, the image may or may not violate our policies when it’s uploaded again, because it could be somebody who is sharing a terrorist image as part of a news story or to condemn violence. So we use automation to flag content that we will then have our teams review.
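The photo-matching Bickert describes can be thought of as a hash lookup: images already removed for policy violations are fingerprinted, and new uploads are compared against that set, with matches routed to human review rather than removed automatically. The sketch below is a toy illustration of that idea only; Facebook’s actual system uses robust perceptual hashes (such as Microsoft’s PhotoDNA), not the simple average hash shown here, and the function names are invented for this example.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel of a grayscale image
    (2-D list of 0-255 values), set if brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return tuple(1 if p > avg else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits; small distance means near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

# Hashes of images previously removed for violating policy.
banned_hashes = set()

def flag_for_review(pixels, threshold=1):
    """Flag uploads that nearly match a banned hash; humans then review,
    since the same image may be legitimate in a news or condemnation post."""
    h = average_hash(pixels)
    return any(hamming(h, b) <= threshold for b in banned_hashes)

banned_image = [[200, 200], [10, 10]]
banned_hashes.add(average_hash(banned_image))

flag_for_review([[201, 199], [12, 9]])    # near-duplicate: flagged
flag_for_review([[10, 200], [200, 10]])   # unrelated image: not flagged
```

The threshold makes matching tolerant of recompression and small edits, which is why review by a human team, as Bickert describes, remains the final step.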
MILES O’BRIEN: But are there more advanced ways to stop the extremist messages from spreading? Is there a better technological solution?
HANY FARID, Dartmouth College: We have the technology to disrupt, not eliminate, but to disrupt the global transmission of extremism-related content.
MILES O’BRIEN: Hany Farid is a computer scientist at Dartmouth College. He has developed a technique to permanently attach unique digital signatures to images, making it possible for the social networking companies to identify and stop the spread of videos made by, of and for terror.
HANY FARID: So, here is the actual raw frame that you’re seeing processing one frame at a time. And in the frame, we actually analyze multiple blocks within it. The yellow crosshairs that you’re seeing are enumerating the various blocks of the video that we’re analyzing.
This yellow histogram is a distribution of the measurements that we’re making from each individual block, and then that gets translated into an actual digital signature, which I visualize here with a stemplot.
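The pipeline Farid walks through, splitting each frame into blocks, measuring a statistic per block, and assembling the measurements into a signature that can be compared across videos, can be sketched as below. This is a hedged illustration under the assumption that block-mean brightness stands in for his actual, more robust per-block features; the function names are invented for this example.

```python
def frame_signature(frame, block=2):
    """Split a grayscale frame (2-D list of pixel values) into block x block
    tiles and return the tuple of per-block mean brightnesses: a crude
    stand-in for the per-block measurements Farid describes."""
    h, w = len(frame), len(frame[0])
    sig = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            vals = [frame[y][x]
                    for y in range(i, min(i + block, h))
                    for x in range(j, min(j + block, w))]
            sig.append(sum(vals) / len(vals))
    return tuple(sig)

def signature_distance(s1, s2):
    """L1 distance between two signatures; a small value suggests the
    frames are copies of each other (e.g. a re-encoded upload)."""
    return sum(abs(a - b) for a, b in zip(s1, s2))

original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [50, 50, 90, 90],
            [50, 50, 90, 90]]
recompressed = [[12, 9, 198, 201],
                [11, 10, 202, 199],
                [49, 52, 91, 88],
                [51, 50, 89, 92]]

# Per-block averaging absorbs small pixel-level changes, so the
# re-encoded frame still matches the original's signature closely.
signature_distance(frame_signature(original), frame_signature(recompressed))
```

Averaging over blocks, rather than hashing raw pixels, is what lets the signature survive recompression, resizing, and minor edits, which is essential when the same video is re-uploaded in many slightly different copies.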
MILES O’BRIEN: The sheer volume of the problem is daunting, billions of uploads a day, each of them with millions of pixels. Can a computer possibly be trained to sort through it all and find the images that inspire new recruits, incite new violence and terrify us all?
Farid has already proven the technology works. He got the idea 10 years ago. The Internet had become a platform for child pornographers. Applying digital signatures to those images has greatly reduced child pornography on the big social networking sites.
HANY FARID: So, if there’s just one image in an upload of yours that has child pornography, we — the account can be frozen. The contents of that account can be assessed, and new content can be discovered.
MILES O’BRIEN: But extremist content is much harder to clearly define. The director of the Free Expression Project at the Center for Democracy and Technology is Emma Llanso.
EMMA LLANSO, Center for Democracy and Technology: The challenge with having a hard-and-fast rule against any kind of content is that it really does shut down the opportunity for discussion around that sort of material.
MILES O’BRIEN: In June 2016, David Thomson, a French journalist who reports on Islamic extremism, found his Facebook page suspended for three days. The offense? A photo he had posted in 2013, as part of a story he was doing, included a partial depiction of the Islamic State flag, a cautionary tale of the unintended consequences of targeting terrorism online.
EMMA LLANSO: It seems appealing to say, oh, just have the major social media companies take a hard-line approach to anything having to do with ISIS. But the fact is, that will end up blocking a lot of speech. That will end up deactivating the accounts of many users.
I mean, some of the platforms have had issues with just deactivating the accounts of women whose first name is Isis. This is a difficult kind of censorship to enact.
MILES O’BRIEN: It’s no longer just a war of bullets, drones and bombs. Technology has created a new battlefield online, and civil society is still grasping for strategies to engage in a virtual battle.
Miles O’Brien, the “PBS NewsHour,” Washington.
JUDY WOODRUFF: And you can watch Miles’ full report, “15 Years of Terror,” tonight on “NOVA” on most PBS stations.