To Anonymous and Fly on the Wall, thanks for your comments on my blog. I appreciate your well-thought-out and reasoned additions.
I made my own comments in reply.
Hope you stick around throughout the season. It is nice to hear from you.
Here now, another in our series from Jeff Young on How America Elects: This time on polls:
How America Elects
The Power of Polling
Every political campaign, from the race to the White House on down to the local level, uses polling of some sort to provide strategic information critical to success at the ballot box.
Think of the political process as a marketing exercise. Candidates and their parties present themselves to the voters (consumers) as a product they can purchase at the ballot box. Is the product viable in the marketplace? Polling tells campaigns where they stand in the competitive field.
By surveying the voting population, pollsters such as The Gallup Organization, Zogby International, the Mason-Dixon Poll and others provide political campaigns with current marketing information on a candidate's popularity and on the ranking of issues that matter to voters. As in any marketing exercise, polls also compare current information with that obtained in earlier surveys, creating what is called "trending." Trending indicates, for instance, whether a candidate is gaining or losing popularity over time, or whether the public has come to perceive one major party as better able to address a particular problem than the other.
Just as important as the information gained through polling is its analysis. Campaign strategists may see that polling indicates their candidate has weaker than desired support in a particular region or state. Campaigns respond by having the candidate visit that area more often, and by beefing up the campaign organization there. Conversely, areas where the candidate is overwhelmingly supported are areas where the candidate can spend less time and effort.
Such analysis also tells candidates which issues resonate strongest with voters in a particular area, or with a particular age or other demographic group. Candidates then typically develop multiple "messages" to address those groups. For instance, if polling showed that older men put national security at the top of their concerns, the candidate's speeches to, say, veterans groups would stress national security. If the polls showed that voters aged 18 to 35 rank environmental concerns highly, the candidate's appearances involving that demographic group would address his or her plans to clean up the environment.
As for the mechanics of polling, logistics make it impossible to survey an entire country's population. So, pollsters create a "sample group" to represent the population at large using mathematical formulas that have been proven over time. This is how pollsters can survey one thousand people to represent the opinions of an entire country of perhaps 300 million.
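The reason a sample of one thousand can stand in for hundreds of millions is a basic result of sampling statistics: a poll's margin of error shrinks with the square root of the sample size, not with the size of the population. A minimal sketch of that arithmetic (assuming a simple random sample and the worst-case 50/50 split; real pollsters layer demographic weighting on top of this):

```python
import math

def margin_of_error(sample_size, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    Uses the worst-case proportion p = 0.5, which maximizes the error.
    This is an illustrative simplification, not a pollster's full model.
    """
    p = 0.5
    return z * math.sqrt(p * (1 - p) / sample_size)

# 1,000 respondents -> roughly a 3-point margin of error,
# regardless of whether the country has 3 million or 300 million voters.
print(round(margin_of_error(1000) * 100, 1))   # about 3.1 points
print(round(margin_of_error(4000) * 100, 1))   # quadrupling the sample only halves it
```

Note how quadrupling the sample merely halves the margin of error, which is why pollsters rarely go far beyond a thousand or so respondents: the added cost buys little extra precision.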
Nearly all polling is presently done by telephone rather than face-to-face or via the Internet. Internet polling is growing, though pollsters face the challenge of ensuring that data collected over the Web remains "one person, one opinion," given how easily someone could use the Internet to "stuff" the survey box. Another trend affecting telephone polling is younger demographic groups' embrace of mobile phones over traditional landlines. Polling presently isn't done to mobile phone numbers, but perhaps mobiles will have to be included somehow to properly sample certain groups.
Perhaps the most famous U.S. election polling failure of all time took place during the 1936 race for the White House, in which incumbent Democratic President Franklin D. Roosevelt was challenged by Republican Alf Landon. A well-known national magazine, "Literary Digest," announced that its election poll showed Landon handily defeating Roosevelt. The results on election day were quite the opposite: Landon was crushed nearly nationwide.
What happened? Pollsters say Literary Digest constructed a fatally flawed survey sample group. The magazine drew its sample from telephone directories, its own subscribers, and motor vehicle registrations. The problem was that in 1936, with the United States struggling to emerge from the Great Depression, many people did not have automobiles or telephones.
People short of cash also didn't typically spend their money on a subscription to Literary Digest. By building its sample on those factors, the magazine ended up surveying people who did not represent the bulk of the voting public. Incidentally, after that embarrassment, Literary Digest folded not long after the 1936 election.
Jeffrey Young VOA-TV
This look at election polling is part of the VOA-TV series "How America Elects."
If you want to view this and other segments of How America Elects, go to:
http://www.voanews.com/english/HowAmericaElects.cfm
Comment: Before publishing this piece, I surveyed my "staff"--38 percent liked the piece; 35 percent did not like it, 27 percent had no opinion. Just kidding :-)
March 10, 2008