Two well-known Jewish organizations with contrasting attitudes toward Israel have recently claimed to have plumbed American Jewish attitudes on this subject. Each group has claimed that its own political stance is the one actually favored by the Jewish community as a whole. But since neither of these groups — JStreet on the one hand, the Committee for Accuracy in Middle East Reporting in America (CAMERA) on the other — has used scientific methods of public opinion research, neither’s claim can be said to be supported.
I have recently written an article in which I summarized my objections to JStreet’s methods, including its polling, so I will not repeat this material here. My objections to CAMERA’s polling materials will become clearer presently.
Some thirty or forty years ago my colleague Tony and I were sipping a little something in the Faculty Club, and this is the amusing tale he told:
It seems that a couple of decades before this, a man who later became quite important as “an intellectual” — let’s call him X — crossed the US-Canada border from Detroit to Windsor to spend a half hour of “observation” in Ontario. He carefully took note of the automobiles that passed him in the streets of Windsor, noting the manufacturer of each. Upon returning to Michigan, he penned a report to his nephew. Canadians, X averred, favor the Ford automobile over any other make, by a margin of about ten to one. That “observation,” I believe, later became enshrined in X’s published oeuvre.
But snake oil sold as social-science wisdom is not always so charmingly harmless. During the presidential election campaign of 1936, the Literary Digest polled ten million Americans (of whom about 2.5 million responded) and concluded that Alf Landon, the Republican, would be an easy winner. In November, as we all know, it was the Democrat Franklin D. Roosevelt who won, overwhelmingly, carrying 46 out of 48 states.
What went wrong? And what went wrong with the current polling of American Jews that I am so concerned about here?
When properly done, the science of public opinion polling can accomplish remarkable feats of understanding. By consulting about two thousand people — an appropriate random sample of that size — it is possible to gain insight into the opinions and attitudes of millions. The theory behind this sampling (i.e. probability theory) has been understood by mathematicians for hundreds of years, but it was the social science of the twentieth century that developed the techniques needed for adequate public opinion polling. But recent times have also brought to the fore a host of charlatans in this area. How can we tell the genuine from the specious? The genuine from the grey-area operator?
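The arithmetic behind that claim can be sketched briefly. Using the standard margin-of-error formula for a simple random sample (an illustration of the general principle, not a description of any particular pollster’s method), a sample of about two thousand already pins down a proportion to within roughly two percentage points:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 2,000 yields roughly a 2.2 percentage-point margin of error,
# which is why so small a sample can speak for millions.
print(round(margin_of_error(2000) * 100, 1))
```

Note that the margin shrinks only with the square root of the sample size; quadrupling the sample merely halves the error, which is why responsible pollsters stop around a few thousand respondents.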
The principles are clear enough. On the one hand there is a “population” or “universe,” too large or otherwise impractical to study directly; on the other hand there is the random sample which, to a known degree of accuracy, “represents” this population. How can this sample be obtained? The most basic requirement is that each member of the population have an equal chance to be drawn for the sample. So, in principle, we must have a complete listing of the members of the population, and then a mechanism, such as a lottery cylinder, to draw individuals by strict random methods.
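The lottery-cylinder idea can be sketched in a few lines, assuming (purely for illustration) that a complete enumeration of the population exists:

```python
import random

# Hypothetical complete enumeration of a population of 10,000 members.
population = [f"member_{i}" for i in range(10_000)]

# random.sample draws without replacement; every member has an equal
# chance of appearing in the sample -- the software analogue of the
# lottery cylinder.
sample = random.sample(population, k=200)
```

Everything that follows in this essay turns on whether that first line — the complete enumeration — is actually available, which for American Jewry it is not.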
In practice, strict adherence to random principles is generally impossible, not least because a complete enumeration of the underlying population does not exist. If American Jewry is postulated as the population, there is also the additional problem of definition: who is a Jew, exactly? Is synagogue affiliation either a necessary or a sufficient attribute? Jewish parents? If so, how many? And so forth. Also, as I have shown elsewhere, there are inherent problems with a sample of American Jews if it is based on a random sample of all Americans, primarily because American Jews are not distributed randomly in the American population, so that such samples systematically under-sample areas of Jewish concentration. All such problems have reasonable solutions, but these are scientifically complex, and also generally more expensive than certain “pollsters” will want to consider. The National Jewish Population Survey, on the other hand, furnishes an example of responsible scientific work.
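One way to see the expense involved: assuming, roughly, that Jews make up about two percent of the American population (an illustrative figure), a random sample of all Americans leaves very few Jewish respondents to work with, quite apart from the geographic-concentration problem just noted. A small simulation makes the point:

```python
import random

random.seed(1)

# Rough illustrative assumption: Jews are about 2% of the US population.
JEWISH_SHARE = 0.02
NATIONAL_SAMPLE = 1000

# Simulate many national samples of 1,000 and count Jewish respondents
# in each one.
counts = [sum(random.random() < JEWISH_SHARE for _ in range(NATIONAL_SAMPLE))
          for _ in range(500)]
average = sum(counts) / len(counts)

# On average only about 20 of 1,000 respondents would be Jewish -- far
# too few to estimate Jewish opinion with any precision.
print(average)
```

To obtain, say, a thousand Jewish respondents this way, one would have to screen on the order of fifty thousand Americans, which is precisely why a valid sample of American Jews is so expensive.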
For the use of public opinion polls in general, the New York Times has published its own very sensible standards. What can the reader do when faced with reported “public opinion data” of unknown quality? Responsible, high-quality social science in this area is not always easy to verify, since there are so many variables: the selection of a scientific sample (obviously the first necessity), the formulation of the questions (sometimes inadequate, sometimes biased), the overall scientific quality of the various steps in the research process. On the other hand, there is one telltale sign of absolutely unacceptable work: failure of the researcher to disclose the details of his work. When, as is the case with both JStreet and CAMERA here, the researcher fails to specify how his sample was obtained, the research, if for no other reason, is unacceptable.
As it happens, I have in the past corresponded with the executives of CAMERA, and so felt free, especially in view of my overall support of the work of that group, to express my suggestions in regard to their use of polling data. I wrote to two of these people, three times in all, without ever once receiving a reply. Here is the text of one of my messages:
It would appear that the Luntz poll, which CAMERA sent around in its latest Alert, is not a scientific poll. If I am right on this, it should be labelled non-scientific, to be accepted, if at all, with caution.
I am particularly interested in this problem because I recently had to criticize the polling practices of JStreet….It would appear that my methodological points here apply to Luntz as much as to Gerstein (JS’s pollster). The problem is the following: it is very difficult (read expensive) to have a valid sample of the American Jewish population. As I point out in my blog, the National Jewish Population Survey does a very good scientific job of surveying the Jewish population, but, as far as I can tell, nobody else does. I wrote to JS’s Gerstein to voice these concerns, but never received an answer.
Yesterday I wrote to Luntz, as follows:
Would it be possible to get details on how your sample was selected?
My interest in the matter is detailed here:
thanks for your help
to which I received the following reply:
Thank you for contacting us. We appreciate your thoughts, suggestions and time it took you to write us.
You MUST register ON OUR WEBSITE to be eligible for one of our focus groups or nationwide surveys. You can sign-up on our website at http://www.theworddoctors.com/ Sorry, but requesting to sign you up by emailing us will not work.
Due to the high volume of emails we receive, we cannot guarantee a response to your email.
Remember: it’s not what you say, it’s what people hear.
Dr. Frank Luntz & The Word Doctors Team
*Become a fan on Facebook* http://www.facebook.com/pages/Dr-Frank-Luntz/249263279310
The report of the Luntz survey, to which CAMERA links, contains no information on how the sample was selected. When this information is missing, no knowledgeable reader can accept the results as scientific. I think that you should press Luntz to explain his methodology publicly. If he does not provide this information, and/or if, as I suspect, his methods prove to be less than scientific, there needs to be a disclaimer on your website, IMHO.
No doubt you will appreciate the position of CAMERA supporters like myself when we criticize JStreet’s various obfuscations. If, as I hope it will, CAMERA comes out for truth in polling, our criticisms of JStreet can gain significant additional force.
IN MEMORIAM: John Gray Peatman (1904-1997), my first statistics professor at CCNY, ca. 1949
UPDATE, MARCH 2013
The organization Workmen’s Circle has an old and proud history in the American Jewish community. Formed by Eastern European immigrants in the early 20th century, it had connections with the anti-Stalinist Jewish socialist movement. It gained many members through its “fraternal benefits,” i.e. funeral arrangements. I myself belonged to it for a short while.
But lately, partly through its emphasis on its Yiddish-speaking heritage, it has largely fallen prey to a new type of membership: militantly secularist, allied to anti-Israel causes. Its old-time members, people in their eighties, seem bewildered and outgunned.
Now this latter-day WC published what it calls a poll of American Jewish opinion, arriving at conclusions that purport to show that American Jews actually care little about Israel. And how did the pollsters of the WC learn all this? Here is their description of their sampling method:
The poll was commissioned by the Workmen’s Circle / Arbeter Ring. For more information on the organization, go to: www.circle.org.
Principal investigators were Professor Steven M. Cohen of the Hebrew Union College-Jewish Institute of Religion (HUC-JIR) and Professor Samuel J. Abrams of Sarah Lawrence College and Stanford University.
The Washington office of IPSOS, under the direction of Dr. Alan Roschwalb, fielded the survey. Respondents included 1,000 American Jews, by Internet, who had previously agreed to participate in social research conducted by IPSOS. Survey was conducted April 19 – May 3, 2012.
The results were weighted to reflect the American Jewish population with respect to age, gender, regional distribution, educational attainment, marital status, intermarriage status, and Jewish parentage (none, one, two parents). They were also weighted to reflect registered voters.
The participants in this “poll” were, it would seem, self-selected. All were internet users, which of course automatically eliminates Haredi Jews. The procedure seems, as if by design, to evade all scientific understanding of sampling.
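The “weighting” the description invokes is no remedy for this. A minimal sketch of cell weighting (post-stratification), with purely hypothetical shares, shows what weighting can and cannot do: it rescales each demographic cell to its known population share, but it cannot make the respondents within a cell typical of that cell if they selected themselves in the first place:

```python
# Hypothetical population and sample shares by age group, for
# illustration only.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

# Each respondent's weight rescales his demographic cell back to its
# known population share.
weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

# Younger respondents are counted double, older ones discounted. But if
# the self-selected internet panelists in a cell hold atypical views for
# that cell, no weight can correct for it.
print(weights)
```

This is the crux: weighting repairs known demographic imbalances, not the unknown attitudinal biases of a self-selected internet panel.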
Or did I perhaps miss something? Can something be said by way of reasonable scholarly explanation of this poll? I sent polite separate e-mails to Professors Cohen and Abrams, as well as to IPSOS and even to the WC itself, asking for more details on the sampling method used in the poll. Not one of these bothered to answer my questions.