Agricultural Policy Analysis Center (APAC)


Polls provide useful information, but how poll questions are worded matters

Have you ever wondered about those polls that tell you that a certain percentage of US residents are for or against a particular issue? And then there are those candidate polls in which each candidate declares that the polls in their district show that voters agree with them on a given issue, even though the candidates hold polar-opposite positions on the issue at hand.

In the first place, there are a number of reputable polling organizations that regularly poll US residents on their attitudes about a wide range of issues including their knowledge about and support for various candidates for elective office. These polls are generally conducted by organizations that have no direct stake in the questions they ask or the results they publish.

The other day we answered the phone and ran into one of those other polls. After a series of questions about educational level, home ownership, and attitudes toward various corporations and industrial sectors—all designed to make us comfortable with the interview process—the interviewer got down to business. From the questions that were asked next, it was clear that the poll was sponsored by someone with an interest in the natural gas industry and was designed to elicit support for the process of releasing natural gas from deep rock formations by a process called fracturing.

The interviewer presented us with a piece of information and then began to ask questions about the information just given. In polling parlance, the process of providing a given set of information about a controversial issue and then asking questions about the issue is known as a “push poll.”

Most often a push poll is used in political campaigns to influence the interviewee’s support for candidate A by stating the position of opposing candidate B on issue X. Usually issue X is a controversial issue, and the description of candidate B’s attitude toward that position is presented in a less than straightforward way, designed to elicit a certain response—a response that may or may not accurately reflect the respondent’s overall view of the issue. Though the poll’s originators may report the results of the poll, the results may very well be biased if a goal of the poll was to push the interviewee toward support for the candidate who funded the poll.

In the case of the poll in which we took part, we are very familiar with the issue of fracturing and obtaining natural gas from deep rock formations. And the information provided in the poll did not acknowledge any of the critiques of the process—thus our characterization of the poll as a push poll.

That was bad enough, but then came the questions. First, we were told that before they conduct the fracturing process, companies must submit to governmental reviews and must comply with a set of government regulations governing the fracturing process and the extraction of natural gas.

Then we were asked, “With regard to that information about government oversight and permitting, are you very comfortable, somewhat comfortable, neither comfortable nor uncomfortable, somewhat uncomfortable, or very uncomfortable with that information?” We said we could not answer that question because it was unclear how the survey organization was going to interpret our answer—the question was ambiguous.

To start with, we are neither comfortable nor uncomfortable with information. Information is the lifeblood of what we do. We crave information. Now, we may be uncomfortable with the contents of the information, but information itself is neutral and necessary.

And if we say we are very uncomfortable with the information about government regulation, will the compilers of the results think we are uncomfortable because we oppose most government regulation and think the gas industry should be free to get the gas any way it deems fit? Or will they interpret our answer to mean that we are uncomfortable with that information because we are familiar with the regulations and think that they are too weak?

One of the requirements of writing a survey question is that any given answer should have a clear meaning. Here is another example. The question is “Do you go to church regularly?” How does a person who attends church every Christmas and Easter answer that question? Their attendance is very regular—every Christmas and Easter—though not very frequent. As a result, a yes answer to that question provides little useful information.

In survey design, questions such as these fall into the category of “measurement error” because it is unclear what is being measured. Are we measuring frequency or regularity of attendance? Are we measuring whether the respondent thinks that government regulations are like the three bears’ porridge—too weak, too strong, or just right—or whether the respondent thinks there shouldn’t be any regulations at all?

We are all bombarded by attitude surveys and polling information day in and day out. Our experience as interviewees reminded us that we need to be cautious when evaluating survey results by paying close attention to how the survey questions are worded.

Daryll E. Ray holds the Blasingame Chair of Excellence in Agricultural Policy, Institute of Agriculture, University of Tennessee, and is the Director of UT’s Agricultural Policy Analysis Center (APAC). Harwood D. Schaffer is a Research Assistant Professor at APAC. (865) 974-7407; Fax: (865) 974-7298; dray@utk.edu  and  hdschaffer@utk.edu;  http://www.agpolicy.org.

Reproduction Permission Granted with:
1) Full attribution to Daryll E. Ray and Harwood D. Schaffer, Agricultural Policy Analysis Center, University of Tennessee, Knoxville, TN;
2) An email sent to hdschaffer@utk.edu indicating how often you intend on running the column and your total circulation. Also, please send one copy of the first issue with the column in it to Harwood Schaffer, Agricultural Policy Analysis Center, 309 Morgan Hall, Knoxville, TN 37996-4519