Facebook ups fight against terrorists, hate-mongers

Dr. Erin Saltman tells ‘Post’ how 7,500 reviewers deal with questionable comments from 2 billion users.

Counter-terrorism expert Dr. Erin Saltman, who helps oversee a team of 7,500 people who review questionable Facebook pages and posts (photo credit: AVRAHAM ASCAF)
These are not easy times at Facebook, as it faces criticism over the Cambridge Analytica saga and over Russia's use of the social-media giant to sway public opinion during the 2016 US elections.
But in at least one area where Facebook faced heavy criticism in the past – from Israel and much of the organized Jewish community – it seems to have turned the tide.
While there is always room for improvement, Facebook has been getting higher marks from the Israeli government for addressing posts that promote hate and for removing terrorist-related posts.
Ironically, critics of Israel see part of that success as the result of the Jewish state pressing Facebook to police what they consider free speech.
However, the truth, while definitely involving Israel, is much broader than just an Israel story.
The story is one of numbers and also people, perhaps most prominently counterterrorism expert Dr. Erin Saltman, who recently gave a rare interview to The Jerusalem Post, providing a look inside Facebook’s major shift to combating abuse of its platform by terrorists.
Saltman leads a team of 180 experts globally. She more broadly helps oversee a team of 7,500 people who review questionable Facebook pages and posts.

She has a daunting task: balancing the fight against abuse of the platform by terrorists with protecting free speech, and confronting hate speech even when it cannot be removed.
Saltman and Facebook must also balance machine learning and algorithms that detect problems with human reviewers who sort out nuances of language and culture.
Between 18 and 24 months ago, after years of criticism and lawsuits for being slow to address these issues, Facebook launched a major shift, which included hiring Saltman.
Facebook’s efforts over that time are impressive.
Some 85% of Facebook’s two billion users are located outside the US and Canada. Accordingly, Saltman is based in London. She has a high-ranking colleague in Singapore and there are teams in several locations around the globe. Their physical footprint allows them to address issues in various places without a significant time delay.
Saltman utilizes experts who speak local languages and understand the nuances of local culture so that they can help differentiate between free speech, humor and violations of Facebook’s usage rules.
With a PhD from University College London, experience studying Islamic terrorist groups and time in the field undercover studying neo-Nazis, Saltman speaks fluently about the terrorism arena. She consults the US list of designated terrorist organizations and UN guidelines, and relies on a holistic human process behind the scenes to evaluate posts that have been flagged as potentially problematic.
HER TEAM includes others with similar depth on the policy and academic side, as well as former law-enforcement personnel and engineers who help figure out how best to apply their ideas to Facebook's systems.
“We have lots of moving parts to help us to try to work effectively to scale. But we keep localized and in-house knowledge also and we do not solve everything overnight,” said Saltman.
She cautioned against treating "terrorism and extremism interchangeably. We have to be careful. We have real guidelines... You can look at the amazing movements which could be defined as violent – women's suffrage [in the early 20th century]. There was some violence there and that is something we would not want to counter."
“Then there is extremism we want to challenge and pivot away from – that is hate-based extremism. Then there are people targeting and inciting violence against protected categories of people, which go beyond race, gender, creed and religion.”
Facebook has “a broader approach to protected categories,” said Saltman, which include “gender identity, severe disabilities and illnesses.”
Explaining the breadth of Facebook’s protected categories, she said, “We have eight protected categories; the UN only has four to five.”
How does the process work for removing content that violates Facebook's rules? Saltman said that Facebook now gets a sizable number of referrals not just from individual users, but also from governments, which have a greater capacity for tracking such activity.
She said Facebook is "getting better and better at things, reviewing quickly and with expertise. We have recently almost doubled the number of human reviewers and soon will be hiring another 10,000."
Reviewing posts is “not just about doing things quickly, but doing it well. We are not just looking to take things down instantly. Sometimes users flag things that are not against the guidelines. Sometimes they flag cat videos and you have no idea why.”
“Machines are most useful to help cue things up for human review,” she said. “They can look at images, child exploitation, do photo matching to help in making some decisions more quickly... but often for judging incitement and violent rhetoric, we really do need a lot of human capacity.”
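As an illustration of that division of labor, the sketch below shows one common photo-matching approach: compare a perceptual hash of a new upload against hashes of previously removed images, auto-handling near-duplicates and routing everything else to human review. It is a minimal sketch of the general technique only, not Facebook's actual system; the Python libraries used (Pillow, imagehash), the placeholder hash and the distance threshold are all assumptions for illustration.

# Illustrative only: perceptual-hash "photo matching" to cue content for review.
# Assumptions: Pillow + the imagehash library; the stored hash and the threshold
# below are placeholders, not values from any production system.
from PIL import Image
import imagehash

# Hypothetical hashes of images that reviewers previously removed.
KNOWN_BAD_HASHES = [imagehash.hex_to_hash("e0f0e8d8c8c0f0e0")]  # placeholder value

MATCH_THRESHOLD = 5  # max Hamming distance counted as a near-duplicate (assumed)

def triage_image(path: str) -> str:
    """Route an uploaded image: auto-remove confident matches, else human review."""
    h = imagehash.phash(Image.open(path))  # perceptual hash survives re-encoding and resizing
    if any(h - bad <= MATCH_THRESHOLD for bad in KNOWN_BAD_HASHES):
        return "auto_remove"    # machine-decidable: near-duplicate of known content
    return "human_review"       # context, incitement and rhetoric are left to people

The design point matches Saltman's description: duplicates of known material are machine-decidable, while judgment calls about language and context stay with human reviewers.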
Another challenge Facebook says it is addressing is that of singular, sudden major events. Saltman said Facebook has the manpower to throw a large number of human reviewers at an explosion of activity on one issue in a specific location, while still maintaining oversight of its more typical global issues.
“It is also a good thing seeing other tech companies take on that approach. Google hired a counterterror expert on its policy team, so Facebook is not operating in a silo alone,” she said.
Further, she said that Facebook is “opening doors to smaller platforms as well. Facebook has the capacity to build a team with specialties. Oftentimes small companies might have only five people. There is not a priority for the sixth person to be hired to be a counterterror specialist.”
“We can help with a basic understanding of lots of types of platforms and bring their level of expertise up a notch. We can explain to them what to do with government requests for data – how to use both human and machine learning to advance the efforts for that small team,” she added.
"We need to work as a tech community to bring everyone" to a higher level in dealing with abuse of the Internet, she said, noting, "None of us today use just one app. We see things shifting between platforms." Moreover, she said that to address cross-platform problems, Facebook became a founding member of the Global Internet Forum to Counter Terrorism.
On another front, Saltman said that Facebook is performing threat assessments and championing Facebook users’ ideas.
Facebook is also part of a global coalition in which it works with the international ICT4Peace Foundation. That organization has combated cyberterrorism and "promoted cybersecurity and a peaceful cyberspace through cooperation" with governments and international organizations in recent years. She explained that these efforts might take longer to bear fruit, but could have a major twofold impact – both internally within Facebook and in confronting external issues.
Saltman said she is increasingly working on Global Internet Forum to Counter Terrorism issues, with that work now taking up to 30% of her time. She qualified that by adding that the split between time spent on internal posting issues and on external cooperation varies.
“This year, in 2018, I am looking to engage the global Internet forum with a second initiative. GIFCT [the Global Internet Forum to Counter Terrorism] will conduct four to five different workshops on four different continents. One engagement will hopefully be in Israel this year.”
The idea is "to bring a bunch of different tech companies to the table to work on the basics and share ideas. Sometimes large companies like Facebook can help troubleshoot some of the same issues. Sometimes we can learn the most from others' innovative ways" of managing abuses of their platforms.
ASKED FOR an example in which Facebook made a broader change due to her intervention, she said, "Facebook is now very thoughtful about things before it acts. Now people say, 'Let's think through how bad actors can use our tools' before they are put out there."
“Facebook has supported the one-to-one initiative. It was an amazing thing I was part of in my previous organization. Former extremists and survivors of extremism reach out online through Facebook Messenger to a current extremist in a one-to-one personal dialogue,” she said.
"In phase one of the initiative, for one dollar, you could ensure that the message would reach the extremist's inbox instead of their spam folder. By phase two, that tool wasn't available anymore... Bad actors and spam entities were also using the capacity. So we pulled the tool down and changed the methodology," she continued.
Regarding Israel, Saltman said, “We are cross-functional. It is not just me on the sidelines. We have a local office based in Israel and international teams in five different countries with language expertise. On public policy, we have Jordana Cutler and she and others can feed into the policy perspective, including local language knowledge. If something is happening in Hebrew or Arabic, they can give context behind the scenes.”
Cutler told the Post that in the last 18 months, “I have been focused on trying to bring Israel to Facebook and Facebook to Israel. I invest a great deal in making sure that the company is aware of the challenges Israel faces on a range of topics – from highlighting nuances regarding hate speech [to] working closely with the cybercrime unit of the State’s Attorney’s Office on counterterrorism and illegal content.”
In terms of addressing the high intensity of anti-Israel and antisemitism issues, Saltman said, “We recognize continued issues and sometimes in the world there is growing antisemitism. We also recognize that Israel is one of the most difficult markets regarding violent extremism and terrorism. It is not only important that we have a local office based in Israel to handle things quickly, effectively and locally, but also to champion counter-speech efforts in Israel.”
“We look at other and different parts of the world for championing counter-speech around very specific events in your market. We recognize the diversity within Israel itself. We do not only look at global conflicts. We also look at one specific market... We want to channel things positively locally in Israel,” she added.
She discussed supporting the World Jewish Congress "with the hashtag 'We Remember' campaign. This sets a tone and gives off a message of positivity that we want to champion. That is besides just countering all the negativity."
How did her position get created and what led her out of academia and undercover studies of radicalized extremist and terrorist groups to join Facebook?
“I admit... it is not necessarily a normal thing for social media to have in-house expertise in counterterrorism and violent extremism. But I had done partnerships with NGOs and academic experts... with in-house and outside efforts. Facebook was by far the first social media company to have an impact and to formalize an in-house structure,” she recounted.
Saltman was previously involved with the Institute for Strategic Dialogue. "I ran programs in different parts of the world," she said. "I developed youth programs to avoid violent extremism. A lot of my work was being supported by Facebook."
She is a big believer in engaging people who are potentially on the borderline of a problematic path, but who have “not necessarily fully embarked on a total hatred and radicalization trend. You can see negative... trends and engage with people. Censoring does not get rid of hate-based ideology. You need to work with civil society and work on developing localized counter-speech efforts.”
She said she was impressed with Facebook's thoughtful approach to the relevant issues and its assembling of experts in human rights, law enforcement and engineering to use innovative methods to combat hate-based ideology. With the scale of her potential impact on Facebook's two billion users, she said she simply couldn't say no.
Saltman reports to Brian Fishman and Monika Bickert, whom she complimented for their expertise. Regarding Mark Zuckerberg and Sheryl Sandberg, she said they "do not put up barriers for countering terror and violent extremism. It has been a top priority the last few years and will continue to be."
Addressing the boomerang effect of pleasing some who wanted Facebook to police its platform more, but disappointing some free-speech advocates, she said, “We are always going to be trying to strike the right balance between taking down posts and freedom of speech. What looks like incitement could be political speech and is different in different parts of the world.”
Facebook has been hit particularly hard with accusations of censoring free speech in Indonesia. She referred to her Singapore colleague for more specific Indonesia-related questions, but said, "We are looking constantly to involve all parties and to have nuance. That being said, we will never find a policy which makes all world governments and populations perfectly happy. The best we can do is keep evolving and learning internally, as well as hiring people with diverse backgrounds to get that cultural nuance and understanding."
That is easier said than done, and Russia's and Cambridge Analytica's abuse of the platform still has Facebook on the defensive. But after years of criticism for not combating hate speech and terrorist uses of its platform, Facebook now appears to be confronting these trends in a serious, comprehensive and determined manner.