Yasmin Green is talking about the considerable social media following of a 14-year-old Kurdish girl. Green, the director of research and development for Jigsaw, which, along with Google, is part of the umbrella entity called Alphabet, confesses she was a bit envious.
This girl was “super smart, and she had a social media following of like 30,000 people at 14,” says Green. “I feel like I’ve been on social media a long time, and I haven’t managed to get any traction, so I was kind of curious.”
This large cadre of devotees looked to the teenager for updates about Justin Bieber’s music and concerts, which explains how an obscure kid in a remote corner of the world could amass such a following. But Green says the innocent fan-girling eventually gave way to content about ISIS, the Islamic State of Iraq and Syria.
A pubescent girl who rose to influencer status among the “Beliebers” had turned to messaging about the terrorist group in her social feeds. The reason, Green says, was that this girl gave her account away to a foreign fighter she thought was going to marry her. “Think about social media accounts as assets,” Green explains. “That was something valuable she had, and that she then traded.” Her account was suspended when she went to Raqqa, Syria.
Green has countless stories like this from her experience leading projects in Iran, Syria, the UAE, and Nigeria. Sitting in a small conference room at Jigsaw’s offices, located behind an unmarked door in New York’s Chelsea Market, she offers a brief rundown of the work to date. In 2012, Green led a multi-partner coalition to launch Against Violent Extremism, the world’s first online network of former violent extremists and survivors of terrorism. The walls outside the communal kitchen are hung with 15 large black-and-white photographs: portraits of former violent extremists who have renounced brutality, attended the Summit Against Violent Extremism in 2011, and now work to help young people leave terrorist groups.
Based on her own interviews with ISIS defectors like this 14-year-old girl, as well as jailed recruits, Green launched the Redirect Method, a new deployment of targeted advertising and video aimed at confronting online radicalization and stopping kids like her from joining the ranks of terrorist groups.
Although the program is rooted in Google’s AdWords technology and curated YouTube video content, Green insists that algorithms aren’t the whole picture. Considering ISIS’s online rise, she says, the group’s most impressive accomplishment wasn’t its tech savviness or innovation. “It was the insight into what makes humans tick, and how to use readily available online tools and social media to exploit people based on their insecurities, prejudices, and fears,” she observes.
By 2014, ISIS had gained control of more than 34,000 square miles in Syria and Iraq, from the Mediterranean coast to south of Baghdad. (By the end of 2016, that territory had diminished to about 23,320 square miles.) The group has claimed responsibility for attacks in multiple countries, including Syria, Iraq, Saudi Arabia, Turkey, Australia, Germany, the U.K., and the U.S., killing thousands and injuring countless more.
It was these human stories, Green maintains, that helped shape the Redirect Method as a potential solution. Over the course of multiple trips to interview young people held in custody by the Kurdish regional government, Green saw a massive disconnect between our perception of radical terrorists and the reality she encountered. “You see these images on the news of executions and beheadings and such a violent, gory activity that’s committed by ISIS, and then you see the people that they’re attracting,” she says. Most were in their early 20s, wearing sportswear like Adidas pants and flip-flops, “pretty scrawny, really,” she says.
“They want to belong”
Yet these young people had left home, gone to Raqqa (the capital of the so-called caliphate), undergone military training and religious indoctrination, and served in ISIS’s army. “Some had trained as suicide bombers, some were night watchmen, [others] were in technical roles,” she says. Back then, she says, ISIS accepted everybody who was interested in joining its ranks. “But that’s not true on the flip side: You can’t leave,” says Green.
One young man used an image from a Tom and Jerry cartoon to illustrate the finality of the decision: “When Jerry wants to escape and Tom locks the door and swallows the key,” Green recalls. “And you see the key going down his throat, and you have the sinking feeling in your stomach that you’re never going to be able to get out.” This is not the image most people would conjure up when trying to picture what someone who had trained as a suicide bomber would look like, much less talk about, says Green.
Another girl was just 13 years old when she tried to travel to Syria. “You wonder what could possibly make [her] (she doesn’t want to behead anyone) want to do that,” says Green, shaking her head. When Green asked her to explain the thinking behind her decision, the girl confessed to looking at pictures online of what life was like in Syria and said, “I thought I was going to go and live in the Islamic Disney World.” Says Green, “That’s somebody that, hopefully, we can spare from being radicalized. She thought she was going to go hang out in the mall with her friends, meet some jihadi Brad Pitt, and get married and live happily ever after.”
It sounds like a simple proposition, especially when you think of how an impressionable kid could be swayed by glamorous images and the promise of security. But Green says recruits all have different motives. Some were on a quest they thought was their religious duty; others wanted to fight for the rights of Muslims and live in a gender-segregated utopia. Still others were targeted by local recruiters who played on grievances specific to their village or town. Ultimately, says Green, “They want to belong.”
The quest to “do the right thing”
Their personal stories, along with the role technology played in their recruitment, held powerful lessons for Green and the team at Jigsaw. One was that once these kids got deep enough into ISIS, they became disillusioned, and the myth was exposed. “We asked the Tom and Jerry guy, ‘If you were to know everything that you know now, like the brutality and the corruption, and the starvation, the day you left home, would you still have gone?’” He said yes, which Green says was “kind of devastating to hear, because I thought, What can we possibly do then?”
Of course, by that time, the boy had been so brainwashed that he felt compelled to go. But then he said that if someone had intervened even six months before, he might have changed his mind. “You need to reach them when they’re still researching, and they go online and search for answers,” Green says, “which means that there’s a mechanism for us to reach them.”
The other lesson posed an interesting challenge. Green says that there’s a tendency, both within the tech sector and outside of it, to think of counterterrorism as binary. “That there’s good and there’s bad, and we should remove the bad from the internet, even when the bad is people and content,” Green says.
Of course, Green says, there are definitely “evil people” who are part of ISIS. “But the people we spoke to seem like people who could have been saved from going in. They didn’t seem like they were irredeemably bad,” she says.
At Jigsaw, Green says, “We try to spend a bit more time engaging with the issue to really understand the emotional aspect of it and whether it is a threat.” Indeed, with this project, Jigsaw is a living illustration of the evolution of Google/Alphabet’s motto. Once famous for “Don’t be evil,” Google traded that line for “Do the right thing” with the creation of Alphabet.
The right thing in this case, according to Green, was to take Google’s “pretty effective algorithms” for targeted advertising (technology that generated nearly $95.4 billion in revenue in 2017) and serve up relevant content to counteract what was coming from ISIS.
But they couldn’t just rely on the algorithms. According to Green, what really required human oversight were the targeting keywords and the videos. The Jigsaw team worked with the external groups Moonshot CVE and Quantum Communications, which were responsible for vetting and updating keywords in English and Arabic. The lists evolved constantly, as new propaganda material, news agencies, and battlefield developments changed what people were searching for. The videos themselves were reviewed by a council of theologians, law enforcement officials, and other experts.
Not just anyone gets served this type of content. Someone might be searching for ISIS simply to find out how many attacks have been carried out, for example. Green says that through conversations with the former radicals, the Jigsaw team learned how these individuals conducted their searches. One woman told her that she purposely avoided BBC News “because they hate Muslims and you can’t believe anything they say.” So she’d search for content coming directly from ISIS channels. Someone with a steady diet of mainstream media wouldn’t be a target and wouldn’t get served that counterterrorist content.
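In broad strokes, that targeting logic can be pictured as a thin, human-curated rules layer on top of an ad platform: analysts maintain lists of high-risk search phrases, and a query matching one of them triggers an ad that points to a curated playlist of counter-narrative videos. The sketch below is purely illustrative; the keyword list, function name, and playlist URL are hypothetical stand-ins, not Jigsaw’s actual keyword sets or code.

```python
# Illustrative sketch only: a keyword-triggered "redirect" check.
# The phrases and playlist URL are hypothetical placeholders; in the real
# program, English and Arabic keyword lists were curated and continually
# updated by human analysts at partner organizations.

from typing import Optional

# Hypothetical, human-curated list of high-risk search phrases.
HIGH_RISK_KEYWORDS = {
    "how to join isis",
    "travel to raqqa",
    "isis official video",
}

# Hypothetical curated playlist of counter-narrative videos.
COUNTER_NARRATIVE_PLAYLIST = "https://www.youtube.com/playlist?list=EXAMPLE"


def redirect_target(search_query: str) -> Optional[str]:
    """Return the counter-narrative playlist URL if the query matches a
    curated high-risk phrase; otherwise return None (no ad is shown)."""
    normalized = search_query.strip().lower()
    if any(phrase in normalized for phrase in HIGH_RISK_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return None


# A query seeking ISIS's own channels triggers the redirect ad...
assert redirect_target("where can I watch the latest ISIS official video")
# ...while a general news-style query does not.
assert redirect_target("how many attacks has ISIS carried out this year") is None
```

In practice the matching happened inside the ad platform’s own keyword targeting rather than in standalone code like this, but the core idea is the same: narrow, curated triggers rather than blanket targeting.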
A pilot program was launched in 2015. Over the course of eight weeks, 320,000 individuals watched over half a million minutes of the 116 videos that were selected to refute ISIS’s recruiting themes. Green admits that Jigsaw doesn’t currently have a mechanism to track how many potential recruits have changed their minds about joining ISIS.
However, she does say that Moonshot CVE has taken the methodology and deployed it for other ideologies in other regions. She says the group just received some $1.5 million to target violent Islamists and the violent far right in Canada. “They’re also deploying in the U.S. with Gen Next Foundation,” she says. That the program is funded by venture philanthropists is a point of pride for Green, as it proves that Redirect doesn’t require Google or Jigsaw funding. The open-source nature of the technology means it can be adopted by others who want to defeat radicalization by getting in before it’s too late.
For Green, who is about to give birth to her second child, the implications of this human approach to counterterrorism are part and parcel of the work Jigsaw is doing with its other programs.
For example, Protect Your Election spanned Ukraine, Hungary, Mexico, Zimbabwe, Brazil, Latvia, Sweden, and India in 2018. Working with news and election organizations, the Jigsaw team held meetings that reached an estimated 10,000 people, trained hundreds of election officials in the U.S., and distributed about 5,000 security keys for two-factor authentication. Another program, Perspective, is aimed at flagging online abuse on media properties. Perspective’s API has already scored more than 15 billion comments to train its machine-learning models, and the Perspective research team has created the largest dataset of abusive comments.
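For a sense of how media properties plug into Perspective, here is a minimal sketch of requesting a toxicity score for a single comment. The endpoint and field names follow Perspective’s publicly documented v1alpha1 API; the API key and sample comment are placeholders, and the details may differ from what publishers actually deploy.

```python
# Minimal sketch: requesting a TOXICITY score from the Perspective API.
# Endpoint and field names follow the public v1alpha1 documentation;
# the API key and sample text are placeholders.
import requests

API_KEY = "YOUR_PERSPECTIVE_API_KEY"  # placeholder; issued via Google Cloud
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")


def toxicity_score(text: str) -> float:
    """Return the summary TOXICITY probability (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    scores = response.json()["attributeScores"]
    return scores["TOXICITY"]["summaryScore"]["value"]


if __name__ == "__main__":
    # A moderation pipeline might hold comments above some threshold
    # (say 0.8) for human review rather than removing them outright.
    print(toxicity_score("You are a thoughtful and decent person."))
```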
“I would never form a hypothesis about how to address a digital threat without firsthand conversations with victims and people who were former perpetrators,” Green says. “It’s just incredible how often things like kindness and fairness and self-esteem come up as factors in either why somebody committed a threat, or a threat was effective in creating harm. And those aren’t really factors that you often hear technologists, or even policy makers, talk about.” But, she maintains, “If you accept that that’s a major driver of both the creation of harm and effective harm, then you end up generating different technology prescriptions.”
Green has been with Google for 13 years, and she can recall when everything was about mobile. Now that everything is about AI, Green says she’s optimistic, but she also underscores how important the human role will continue to be in the technology’s development. Although it’s still too early to see the results of these pilot projects and new tools, Green says she’s glad the team is staffed with people who have anthropology and ethnography backgrounds.
The question weighing on Green now, both personally and professionally, illustrates how far Jigsaw and others have to go. “What can we achieve if we aspire to something greater than computer intelligence?” she asks. “Because it looks like that’s what the bad guys are doing, and they’re really on to something.”