What would Peres say if asked: Does Facebook support terror?

Facebook has taken a few steps in the right direction. More needs to happen.

Shimon Peres (photo credit: REUTERS)
“I love Hamas.”
Sound like incitement? Support for a terrorist organization? You bet.
Now imagine that a social media company like Facebook decides to ban those three words from its platform, in English, Arabic, French – or any language for that matter.
What happens, though, if the BBC, or even this paper, decides to post a story about a terrorist who yelled those three words right before an attack? Would that story also have to be removed?
Not such a black-and-white question anymore.
On the other hand, take the case of Richard Lakin, the 76-year-old retired Connecticut school principal, who was murdered by Palestinian terrorists during an attack on a Jerusalem bus in October.
After the murder, authorities discovered that one of the attackers had posted on Facebook his desire to become a martyr and kill Jews. A day after the attack, there was already a new page on Facebook glorifying the terrorist, and urging others to join the resistance against Israel.
Today, Lakin’s family is part of a billion-dollar lawsuit against Facebook, and the Israeli government is pushing new legislation that would allow it to take Facebook and other social media platforms to court every time they fail to comply with a government demand to remove content deemed incitement.
I mention this since a high-ranking delegation of Facebook executives arrived in Israel this week, the first visit of its kind. Leading the delegation was Joel Kaplan, Facebook’s vice president for public policy, who previously served as deputy chief of staff under president George W. Bush. He was joined by Monika Bickert, head of product policy and counter-terrorism.
Bickert is basically the company’s counter-terrorism czar, responsible for preventing the platform from being used by criminals and terrorists. A graduate of Harvard Law School, Bickert spent several years as an assistant US attorney in the Chicago area, where she handled a number of high-profile criminal cases.

That experience serves her today at Facebook, where she plays a key role in setting the company’s “community standards” – the basic guidelines on how people can use the platform.
The categories vary, but some are straightforward – child pornography, for example, is immediately removed. On the other hand, there is also content that the company views as “contextually complicated.” Take, for example, someone who posts Hamas’s logo. If it’s posted by a terrorist then it should be removed, but if it’s posted by an academic writing about the symbolism in the logo, then it should be allowed to remain.
Due to the grayness surrounding this issue, Facebook cannot automate the process. Instead, it has hundreds of monitors – real people – who sit in centers around the world, receive reports of abuse from users, evaluate them, and decide whether the content needs to be removed.
Justice Minister Ayelet Shaked, for example, revealed this week that Facebook recently took offline 95% of the posts it was asked to remove by Israel. But she forgot to mention that most of the pages removed belonged to actual terrorists, people who could easily be identified.
In a previous list sent to Facebook several months ago, the content Israel asked to have removed included posts with text and photos. That time, the percentage removed was much lower. Why? Because those posts were contextually complicated.
It is exactly because of cases like these that it is impractical to expect Facebook, Twitter and others to use automated systems to remove content like Hamas logos. Such legislation – like the bill proposed recently by Zionist Union MK Revital Swid, which would require social media companies to proactively search for and automatically take down incitement – is simply unrealistic.
There is no such thing as automatic in these cases. Posts need to be reported. They need to be evaluated. And then they need to be removed. While some cases are black-and-white, many others are gray. That is why enforcement of community standards continuously shifts, and the definitions of banned material are regularly updated. The hundreds of monitors who sift through the millions of reports receive new guidelines on a weekly basis, advising them to look out for new slurs, terrorist groups and coded language.
None of this means Israel is wrong to demand that social media companies do more to remove incitement from their platforms.
Public Security Minister Gilad Erdan deserves credit for getting Facebook to take Israel’s concerns more seriously. Kaplan and Bickert’s visit this week was important, as was the recent appointment of Jordana Cutler – who until this summer served as chief of staff to Israel’s ambassador to the US, Ron Dermer – as Facebook’s new head of public policy in Tel Aviv.
Cutler’s job will not be simple. She will need to navigate between the needs of the world’s largest social media company – with its desire to maintain a free and open platform – and Israeli demands that it do more to remove incitement. Her experience navigating the Israeli-US relationship – especially over the last few years of tense ties between Washington and Jerusalem – will undoubtedly serve her well.
But with a billion-dollar lawsuit hanging over the company’s head, as well as pending legislation in the Knesset, Facebook’s top executives will need to do more than just visit Israel to reassure this country – as well as others in Europe facing a growing terrorist threat – that the company is doing all it can to stem online incitement to violence.
While Erdan is determined to keep the pressure on Facebook, he is also being reasonable in his demands. If the follow-up to this week’s visit is a new, determined and concerted effort by Facebook to redefine standards and remove incitement, he will likely modify the proposed legislation, though he will still insist on a minimal bill that allows Israel to turn to the courts to demand that content be removed.
What is happening now with Facebook in Israel is an opportunity to take a step back and think about what Facebook and social media in general are meant to do. Facebook, Twitter and the rest were created to break down borders, to give people the opportunity to be heard by the world, and to share their stories with others.
Some, like terrorists, take advantage of this openness. Others, like Israelis, are sometimes hypersensitive because of the dangerous reality they face on a daily basis. That doesn’t mean Facebook is the enemy. Israelis, for example, need to cultivate a greater appreciation for freedom of speech. A country that to this day maintains a military censor will naturally always put security first. But the same is not true of every other country around the world, and Facebook needs to take that into consideration when setting its guidelines and global community standards.
Don’t misunderstand me. Social media platforms do bear responsibility for the hateful and violent incitement that sometimes fills their pages. They need to take that responsibility seriously, and take concrete action to cut down on incitement and ensure that their platforms are a safe place for their users.
Facebook has taken a few steps in the right direction. More needs to happen.
***
IF THERE is one Israeli leader who has a fond appreciation for Facebook, it is Shimon Peres.
In March 2012, Israel’s ninth president visited Facebook headquarters in California, where he met with Mark Zuckerberg.
In the years since, Peres continued to refer to Zuckerberg – even as the Facebook founder moved into his 30s – as “the 27-year-old Jewish kid.”
For Peres, a self-described dreamer, Facebook was a partial solution to a lot of the world’s problems.
“Here was a 27-year-old kid without a political party and without a single piece of land who revolutionized the world,” Peres told me when we last met five months ago.
As I write this, Peres is still hospitalized after suffering a stroke earlier this week. And while it is not the time to sum up Peres’s career and life, it is an opportunity to reflect on his love for science, innovation and the Jewish genius.
The French, he told me, had a revolution, invented the guillotine, and then cut off the heads of 200,000 people. The Soviets came along, spoke about freedom and equality, but then gave up freedom and killed those who disagreed. But Zuckerberg, he said, did something that made everyone else irrelevant: he broke down borders, shortened distances, and made the world smaller and more intimate.
But Peres also recognized the threats that are hidden within all these platforms.
“Science is not neutral,” he observed during our conversation at his seaside apartment in north Tel Aviv. “It can be used by anyone – good people, and evil people too.”
When I went to see Peres that day in April, I wasn’t sure what to expect. What I found surprised me: a man in his 90s speaking about the future as if he still had decades ahead of him.
Peres’s greatness has always been his chutzpah. From his early days as an aide to David Ben-Gurion until today, he fought for what he believed to be right even when others thought he was completely wrong. A short list of his struggles includes the founding of Israel’s nuclear program, the development of Israel’s missile systems, the establishment of Israel Aerospace Industries, the investment in science and technology, and the Oslo Accords.
In each of these cases, Peres pushed while others pushed back. But he never gave up, and until his stroke this week he still spoke about Israel’s greatness and potential. He remained the incurable optimist.
“There is a great contradiction in this world,” he told me. “There have been so many wars that man is intuitively negative. But the world goes on, it advances, and it turns a gloomy history into something that could be positive.”
Shabbat Shalom.