One way to get attention in the Jewish world is to be anti-Israel. Another is to publicize a survey with frightening results. The latest example of this was the much-ballyhooed Ipsos finding that 33 percent of Americans agree that boycotting Israel is justified and 24 percent would support a boycott.
The Chicken Littles immediately jumped on the result to suggest the anti-Semitic boycott, divestment, and sanctions (BDS) movement is having a serious impact on American support for Israel. If one-third of Americans support BDS, we must be in serious trouble, and wouldn’t this at least partly explain the hostility toward Israel on college campuses?
These findings should be taken with a hefty helping of salt because they probably do not accurately reflect American views. They spread because the media rarely bother to examine the methodology behind a survey before publishing its headline-grabbing numbers.
The Ipsos poll, for example, has several methodological problems.
1) This was an online survey, so it did not represent a random sample of Americans.
2) Some of the people who do participate are likely to be anti-Israel partisans who welcome an outlet for their vitriol.
3) People who participate in online polls are more comfortable with technology and skew younger, and other polls have suggested that younger Americans are less supportive of Israel.
4) Most polls show that a large number of Americans either are not sure or don’t have an opinion on Middle East issues. In this poll, respondents were given only a binary choice of agreeing or disagreeing. When forced to choose, they will pick a side, but they might have chosen another option if one had been presented.
5) In any poll, question wording and order can affect results. In this case, for example, respondents were asked, “Do you agree or disagree with the following statement: ‘Boycotting Israel is justified’?” There is no explanation of what the boycott entails, and the answers might change if, for example, the question said, “Do you support the boycotting of Israeli universities or artists refusing to perform in Israel?”
The Ipsos poll provides its own example of a question that is more explanatory, which, depending on how it’s written, can also influence the respondent. This survey asked, “Some people claim that: ‘The movement to boycott Israel is a modern form of anti-Semitism because it seeks to harm the national homeland of the Jewish people.’ Do you agree or disagree with that claim?”
What would you expect the answer to be?
Given that one-third thought a boycott was justified and one-fourth were willing to support one, you might expect those same people to disagree with the claim. After all, if you agree a boycott is anti-Semitic and then say you support one, isn’t that admitting you are an anti-Semite? In this case, 38 percent disagreed, which suggests that people may be less willing to boycott Israel if they view the tactic as anti-Semitic.
To show how question order matters, consider the likely responses if the anti-Semitism question had been asked first. Once respondents read that a boycott seeks to harm Israel, they have a definition they did not have before. I suspect that if this question had been asked before the boycott question, a smaller percentage would have supported a boycott. We can only speculate because the pollsters did not rotate the questions.
If polls like this are going to be cited, then those citing them cannot cherry-pick the results they like or use them to reinforce preconceived ideas. If we are going to accept some of the results as valid, then we can’t ignore responses to other questions, which were actually more newsworthy. For example, 81 percent of the respondents said they were pro-Israel, which is 16 points above the all-time high recorded in Gallup polls. Given that level of support, some of Israel’s supporters must have also said they favor a boycott. Would this still be true, however, if the boycott were explained? I doubt it.
Another headline should have been that 62 percent see the BDS movement as anti-Semitic. I would like to see this tested in a more serious poll; nevertheless, this finding suggests that making the case that BDS is anti-Semitic does resonate and that people who understand this are less likely to support boycotts. This strengthens my argument that we should not shy away from calling BDS exactly what it is – an anti-Semitic campaign that denies the Jewish people the right to self-determination in their homeland.
Many Jews are afraid to tell it like it is because they feel the BDSers will try to discredit them by calling them McCarthyites or arguing that Jews shout anti-Semitism to silence criticism of Israel. Let them make these specious accusations. If Jews say BDS is anti-Semitic, who are they to tell us otherwise? It’s no different than a racist telling an African-American they can’t be accused of bigotry because they’re just expressing their legitimate opinion about n****rs.
The BDSers should be constantly put on the defensive to try to prove they are not anti-Semitic, something they cannot do without disavowing the raison d’être of the movement.
Despite its flaws, could the survey still be accurate? Possibly, but if you look at the backlash against the BDS movement in Congress and state legislatures, the refusal of most artists to be intimidated into canceling performances in Israel, and the failure of BDS to take hold on more than a handful of campuses, it is clear the American people are not as anti-Semitic or gullible as the BDS proponents believe.
The point of this little methodological exercise was not to teach the ins and outs of survey research; rather, it was to caution the media and its consumers not to accept every poll as accurate or meaningful. Do not fall for selective or misleading reporting of results.
Jews should be especially wary of jumping to conclusions every time a survey is published that offers some sky-is-falling statistic.
Dr. Mitchell Bard is the author/editor of 24 books including The Arab Lobby, Death to the Infidels: Radical Islam’s War Against the Jews and the novel After Anatevka: Tevye in Palestine.