The Atlanta Journal-Constitution ran an online poll soon after Georgia Congresswoman Cynthia McKinney publicly demanded an investigation into 9/11 and possible foreknowledge by the Bush Administration.
The poll asked whether people were convinced that the Bush Administration had no foreknowledge, and the paper apparently expected readers to say they were sure it did not.
Instead, nearly 50% of respondents said otherwise.
Quick to do damage control, the paper felt it necessary to write an article about how the poll meant nothing. Clearly they were caught off guard; otherwise they would never have put the poll up in the first place.
However much they try to tell everyone what to believe, the truth is starting to get out. More and more people are uncomfortable with the false 9/11 story they are supposed to accept as fact.
Public opinion 101: What does a poll really say?
When is a poll not really a poll?
Answer: When its results are virtually meaningless.
That's probably oversimplifying things. There are lots of other answers, which gives us a chance this morning to go over some things you should know about polls, surveys and scientific validity.
Nearly every day in the online version of the newspaper, ajc.com, we invite readers to weigh in on a story in the news by clicking on an instant poll linked to it.
Often in the print version of the paper we tell you that you can register your opinion on the subject by going to ajc.com and "voting."
Such online "polls" are really just opportunities for readers to register an opinion. If you feel strongly about something we put up as an online poll, you can click away all day as the votes get tallied before your very eyes.
The hotter the topic, the more likely the tallies will be skewed by people taking a strong stand.
Last week when we asked online readers to vote whether they agreed with Rep. Cynthia McKinney (D-Ga.) that the Bush administration might have been aware in advance of the Sept. 11 attacks, the responses broke down the tabulator we use to keep track of the votes.
Groups and individuals who believe there is evidence of a conspiracy in the attacks urged friends to vote on ajc.com to send Congress a message of the need to investigate.
They needn't have bothered. Even if we hadn't taken the poll down after a few hours because of mechanical problems, the "results" would have been virtually meaningless.
So lesson No. 1 is that most online polls are really just opportunities to register an opinion. (This week we added a disclaimer about the surveys on ajc.com not being a reflection of the general public.)
Not all Internet-conducted surveys are meaningless. Some have more validity than others.
For instance, the newspaper, ajc.com and Marketing Workshop, a professional polling group, team up regularly to create what we call the Voice of Atlanta. Participants in this survey are e-mail volunteers who respond to online queries from us on certain topics. Since they volunteer and provide us information about themselves in advance of the questioning, we're able to break down the responses and compare the results for sub-categories, such as sex or age or geography.
We do this nearly every weekend on a topic in the Sunday @issue section and provide the overall results of the Voice of Atlanta panel members who responded, as well as some of the subcategories within the survey sample that appear interesting.
These results are much more scientifically based than most online polls, but they still lack the critical factor of randomness. By agreeing to be a part of the Voice of Atlanta panel, the respondents to these polls could never be considered "randomly selected."
Still, surveys such as those conducted through the Voice of Atlanta have much more claim to legitimacy than the instant online polls you see most of the time.
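The column's point about self-selection can be made concrete with a toy simulation. This is a hypothetical sketch, not anything from the paper's own methodology: assume a population where 30% hold a strong opinion, and assume (as the column suggests) that people who feel strongly are far more likely to bother voting in an online poll. The turnout rates below are invented for illustration.

```python
import random

random.seed(42)

# Hypothetical population: 30% say "yes", 70% say "no".
population = ["yes"] * 30_000 + ["no"] * 70_000

# Scientific approach: a simple random sample of 1,000 people.
random_sample = random.sample(population, 1_000)
random_yes = random_sample.count("yes") / len(random_sample)

# Self-selected "online poll": assume the motivated "yes" camp
# is ten times as likely to click as everyone else.
def votes(person):
    turnout = 0.10 if person == "yes" else 0.01
    return random.random() < turnout

online_poll = [p for p in population if votes(p)]
online_yes = online_poll.count("yes") / len(online_poll)

print(f"true share of 'yes':   0.30")
print(f"random sample:         {random_yes:.2f}")
print(f"self-selected poll:    {online_yes:.2f}")
```

The random sample lands near the true 30%, while the self-selected tally comes out wildly inflated, even though far more people "voted" in it. That is the gap between registering opinions and measuring them.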
The last, most scientific, measurement of public opinion is a randomly selected, person-to-person poll of a cross section of the community. Most are conducted by telephone and the pool of participants is usually drawn from blocks of telephone numbers in the geographic area to be sampled.
Politicians, and increasingly special-interest groups, pay for these polls, which can be very expensive and are often performed by professional polling organizations, like Marketing Workshop. News organizations also conduct scientific polls from time to time, especially during hotly contested elections. We also use them to measure community feeling on major issues or simply to keep track of lifestyle and business trends.
The key questions that ought to be answered whenever you read (or we write) about the results of these polls should be: How big is the sample? How big is the margin of error in the results? (The bigger the sample, the smaller the margin of error; the margin matters most when the results are close.) When was it conducted? (The more recent the better, because polls are snapshots in time and the results can change within days.) Who paid for the poll? (You want to know whether it was an independent group, such as a news organization, or a politician or interest group that might have a stake in how the results come out.)
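The sample-size point can be made precise. The standard textbook formula for the margin of error of a polled proportion (a normal-approximation confidence interval, not anything specific to this column) shows why roughly 1,000 respondents is the common size for a scientific poll:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a confidence interval for a proportion.

    p: observed proportion (e.g. 0.50 for a 50/50 split)
    n: sample size
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person poll at a 50/50 split: about +/- 3 points.
print(f"n=1000: +/- {margin_of_error(0.50, 1_000):.3f}")

# Quadrupling the sample only halves the margin, which is why
# bigger polls get expensive fast.
print(f"n=4000: +/- {margin_of_error(0.50, 4_000):.3f}")
```

Note that the formula depends on random selection; it says nothing meaningful about a self-selected online tally, no matter how many clicks it collects.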
Keep these things in mind the next time you hear someone claim there's a poll out there that you might find interesting. The results could be fascinating, but they may also bear no resemblance to actual public opinion.