Monday, September 13, 2004

Political Polling
Blowing in the Wind

By Sara Pentz

Each day prior to the 2004 presidential election, an online organization called realclearpolitics.com records rolling averages demonstrating that Bush leads Kerry one day or Kerry leads Bush the next. The Zogby poll, considered to be the most historically accurate, shows the election is a flat-out tie one day and the next it’s a different story. A USA Today/CNN/Gallup poll shows Kerry leading by 1.0 point. An ABC/Washington Post poll shows Bush ahead by 6 points.

Indeed, from minute to minute poll numbers tumble and levitate as if blowing in the wind. However, polls can be much more dangerous than the colorful fall leaves that glide so innocently to Earth. They can be used to influence performance, control thought, verify predetermined theses, counter rational objection, lead segments of society to certain false conclusions or discourage voters from casting ballots, and these are just a few of the more malevolent goals.

Even worse, if a bias is built into the statistical sampling, or if a testing error systematically favors some outcomes over others, a poll's results can be seriously harmful and misleading. The most scientific polls are constantly searching for the truth, for the facts. It is often impossible to determine which is which unless someone takes the time to research the specific circumstances of the poll.

Political polling is based on the art of statistics, an imperfect science at best. The only perfect poll of the American public would be one that interviewed everyone in the United States. Since that is impossible, pollsters take random samples: selections of participants from the population in which each subject is chosen entirely by chance. Polling fewer than about 1,500 people is not considered reliable for our population of nearly 300 million.

The margin of error is anywhere between three percent and five percent for any of the more carefully crafted polls. That margin rises dramatically as the group controlling the poll searches for a prescribed outcome. It is critical to understand that polling errors occur when there are flaws in the wording of questions, in the order of the questions, in the nature of the question response options and in the timing of the poll.
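The sampling arithmetic behind those figures can be sketched with the textbook formula for a simple random sample. This is only the idealized statistical floor, not any particular pollster's method; real polls weight and adjust their samples, which typically widens the true margin.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95%-confidence sampling margin of error for a
    proportion p estimated from a simple random sample of size n.
    p=0.5 gives the largest (most conservative) margin."""
    return z * math.sqrt(p * (1 - p) / n)

# Margins for typical national-poll sample sizes:
for n in (600, 1000, 1500):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
# n=600 gives about +/- 4.0 points; n=1500 about +/- 2.5 points
```

Note that the margin shrinks only with the square root of the sample size, which is why pollsters stop near 1,000 to 1,500 respondents: quadrupling the sample merely halves the error.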

Unless there is a difference of at least ten points between one candidate and his opponent, the outcome is up for grabs, especially in political polls, where one single news event prior to a November presidential election could change the minds of thousands of people.
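The same arithmetic shows why a small lead means little: when both candidates' shares come from the same sample, the margin of error on the gap between them is roughly double the margin on either share alone. A minimal sketch, again assuming an idealized simple random sample:

```python
import math

def lead_margin(n, p1, p2, z=1.96):
    """Approximate 95% margin of error on the lead (p1 - p2) when both
    shares come from the same simple random sample of size n, using the
    multinomial variance of a difference of two proportions."""
    var = (p1 * (1 - p1) + p2 * (1 - p2) + 2 * p1 * p2) / n
    return z * math.sqrt(var)

# A 49-to-46 "lead" in a 1,000-person poll:
m = lead_margin(1000, 0.49, 0.46)
print(f"lead = 3.0 points, margin = +/- {m * 100:.1f} points")
# The margin is about +/- 6 points, so a 3-point lead is statistical noise
```

By this reckoning, a lead inside roughly six points in a typical 1,000-person poll cannot be distinguished from a tie, which is consistent with the article's caution about gaps under ten points.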

With so many conditions that must be in perfect sync, it seems it would be impossible to poll at all. Yet pollsters abound and report their findings, sometimes overnight, sometimes collectively with an assortment of other pollsters, and mostly with impunity.

When appraising the use of polls to determine election outcomes, it is clear there is a deep variation of opinion from differing segments of society that is often not reflected in the polls. There is also a deep variation in the way polls are conducted. Some are honest attempts to reflect opinion and some are not. Some are accurately reported in the media, some not.

According to an article published for the Public Agenda, a nonpartisan opinion research organization, public opinion researchers liken polling to making a big pot of soup. “To taste-test the soup, you don't have to eat the whole pot, or even a whole bowl's worth. You only have to try a bite,” they say. The same is true, they allege, of public opinion. “You don't have to ask every single person in America to find out what Americans think; you only need to ask a few to get the flavor of public opinion.”

In fact, there is a flaw in their soup. You cannot compare one taste of the same soup to the variety of answers of 1,500 different people, because the contents of the soup remain exactly the same while the ideas of people do not. This soup analogy is not valid when talking about the great American melting pot.

Poll watcher and respected commentator Bruce Bartlett, a senior fellow at the National Center for Policy Analysis, writes at Townhall.com: “If polls were truly scientific, if the public were well informed, and if public opinion was stable, this (polling) might help advance political debate. However, none of those things are true. Moreover, it is too easy to load questions so as to get pretty much whatever answer is wanted by whoever is paying for the poll.”

It is important to understand Mr. Bartlett’s pronouncements because, with the proliferation of polling, the cost has decreased, making it more affordable for any ragtag group to create questions to suit partisan goals.

Then again, some people polled do not honestly reflect their true opinions. In fact, in a time-diary analysis done in 1994 to account for every minute of a person’s life, 26 percent of Americans actually went to church weekly, although the Gallup poll for the same period reported the figure at 42 percent. So there can be a vast difference between what people do and what they say they do. Some respondents may not be willing to state their own beliefs, either because of the sensitive nature of the question or the possibility their answer may be considered socially unacceptable.

Pollsters ask questions on many subjects, in many different ways. Here’s an actual example: Do you think Bush’s handling of the war is appropriate? How could an individual answer yes, no or maybe to that question? It is ridiculous to assume that the average citizen could consider—or even know—all the complicated information known only to the president as he guides the nation. Therefore, any reasonable answer to that question must be: How can I know what the President knows?

How people get their information is, of course, at the heart of what judgments they make and what opinions they hold. With accusations of bias aimed at so many of the mainstream media, it is often nearly impossible to know what is fact and what is the opinion of the reporter. Individuals often make faulty judgments about their own personal lives because they do not consider all the relevant facts. If they are not getting all the facts from the news media outlets, or those facts are slanted, how can they possibly answer questions for a poll?

For example, The New York Times has publicly endorsed Mr. Kerry. Its polls consistently show Kerry leading. A combined Times/CBS poll favors Kerry more often than not. It is a documented fact that Times writers consistently write about how the public does not like Bush, and CBS News has been castigated for a discredited report smearing Bush.

Furthermore, bias is a documented fact when the reporting of polls by the elite media is involved. For example, whenever a CBS News poll placed Bush ahead of Kerry, the CBS Evening News ignored the result, according to the Media Research Center (MRC) online.

Again according to MRC, when a Newsweek poll put Kerry ahead of Bush by three points, 49 to 46 percent, the NBC Nightly News touted it, reporting that Kerry’s "performance had sharply improved his standing with voters" and trumpeted Kerry's lead as "a big jump for the challenger after a month of trailing the President." But two weeks later, when a fresh Newsweek poll showed a Bush rebound with the President ahead of Kerry by two points, 48 to 46 percent, the NBC Nightly News did not consider it newsworthy, even though the newscast spent six-and-a-half minutes on campaign coverage. In other words, it consciously ignored the information. Now, that’s bias.

It is important to understand bias: the practice of influencing opinion in a particular, and typically unfair, direction from a prejudicial point of view by favoring certain facts or opinions over others. Often the mainstream media will twist the facts in order to favor a point of view, or they will eliminate certain facts and weight others that are more to their liking. Most reporters resist that label: some for ugly ulterior motives, others for self-serving reasons and most because they simply do not care.

Biased reporting is meant to influence voters. The hope is that as the polls go, so go the voters. Most reporters would say it is their duty to guide the public because only they can know what’s best for ‘the great unwashed masses.’ This attitude deeply permeates the mainstream media and, in much the same way that politicians contradict themselves when playing to their constituents, reporters and anchors hold to this ‘I know best attitude’ dogmatically in order to solidify their job security. In fact, there is no mistaking the fact that at the base of this arrogant attitude is a bald grab for power.

Polling sinks to new depths when self-serving organizations attempt to affect outcomes for political purposes. In fact, a number of academically sponsored, or so-called think tank, organizations pop new studies just prior to elections that are deeply flawed and riddled with assumptions, rough approximations and inaccuracies. Writing for National Review online, Chester E. Finn Jr. points out a report that was created and then contradicted by the prestigious RAND Corporation: “One can't help but recall four years ago when, weeks before the election, a RAND analyst released a "study” purporting to show that (educational) achievement gains in Texas were not as rosy as then-Governor Bush claimed. It was immediately rebutted by, among others, another RAND analysis.”

Speaking of the nadir of poll reporting, The Wall Street Journal’s online Opinion Journal recently discussed the following Baltimore Sun story, "Heavy viewers of the Fox News Channel are nearly four times as likely to hold demonstrably untrue positions about the war in Iraq as media consumers who rely on National Public Radio or the Public Broadcasting System, according to a study released this week by a research center affiliated with the University of Maryland's School of Public Affairs.”

But wait! This "study" turns out to be pure propaganda, according to Opinion Journal. Here are the questions that were asked. They are designed as trick questions, each containing an erroneous belief: 1. "Saddam Hussein has been directly linked with the Sept. 11, 2001 attacks." 2. "Weapons of mass destruction have already been found in Iraq." 3. "World opinion favored the U.S.-led invasion of Iraq." One would have to label each of these statements untrue, therefore slanting the outcome and twisting the survey numbers in a partisan direction.

In the last analysis, one should ask the question: Are political polls a rigged affair or a fair appraisal of public opinion? As in all cases of opinion, one must look at the circumstances, the goals, the motives behind the surveys—keeping in mind that most of the information about polls is filtered through the views of pollsters, journalists, self-interested parties and those who pay the bill. Given those issues, one must always be cautious of relying on polls to form or solidify opinions.

In a broader sense, it is absolutely imperative that we do not allow polls to influence our vote, regulate our minds or dictate law. That is the fundamental caveat implicit in understanding the role of polls.

(Sara Pentz is a media and political commentator. She has been a TV news reporter and an editor/writer for magazines and newspapers. sara@sarapentz.com)

This article was written September 27, 2004