Here's what pollsters think happened with 2020 election surveys

The research, presented at the conference of the American Association for Public Opinion Research (AAPOR), offers some hints about the contours of the problem, which likely involves pollsters’ difficulty in reaching certain segments of Trump’s supporters. Still, it offers few immediate fixes.
One of the most likely culprits, the research suggests, is a failure to fully account for some form of differential nonresponse bias. In other words, a bloc of voters that was more likely than others to support Trump may also have been less likely to be included in polls, either because they weren’t reached by pollsters or they didn’t participate in the surveys.
It’s always been the case that some groups of people are more likely than other groups to respond to surveys. Older Americans, for example, are generally more likely than younger ones to answer the phone. Pollsters correct for those tendencies by weighting their results to match the population they’re trying to measure on demographics such as age, race and education. Some also adjust for political factors, such as partisanship or past vote history. In the past, those adjustments have been enough for survey results to be broadly representative of the full electorate, even as response rates to polls declined.
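To make that weighting step concrete, here is a minimal, hypothetical sketch in Python. The age groups, population shares and responses are invented for the example, and no pollster’s actual procedure is this simple; the point is only that each respondent is weighted by their group’s population share divided by its sample share.

```python
# Minimal sketch of demographic weighting (hypothetical numbers).
# Each respondent gets a weight equal to their group's share of the
# population divided by that group's share of the sample, so groups
# that over-respond count less and groups that under-respond count more.

population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

# Invented raw sample: older respondents are overrepresented.
respondents = [
    {"age": "65+",   "supports_candidate": 1},
    {"age": "65+",   "supports_candidate": 1},
    {"age": "65+",   "supports_candidate": 0},
    {"age": "35-64", "supports_candidate": 0},
    {"age": "35-64", "supports_candidate": 1},
    {"age": "18-34", "supports_candidate": 0},
]

sample_share = {
    group: sum(r["age"] == group for r in respondents) / len(respondents)
    for group in population_share
}

for r in respondents:
    r["weight"] = population_share[r["age"]] / sample_share[r["age"]]

raw = sum(r["supports_candidate"] for r in respondents) / len(respondents)
weighted = (
    sum(r["weight"] * r["supports_candidate"] for r in respondents)
    / sum(r["weight"] for r in respondents)
)
print(f"raw estimate: {raw:.2f}, weighted estimate: {weighted:.2f}")
```

In this toy sample the raw estimate is 50%, but once the overrepresented older respondents are weighted down, the estimate falls to roughly 38%, which is the basic correction that weighting is meant to deliver.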
Those adjustments, however, weren’t enough this time around, which suggests there could be specific types of nonresponse that pollsters haven’t yet homed in on. For instance, it’s possible that the Republicans who were most supportive of Trump were also less likely than other Republicans to answer surveys, meaning that even if a researcher had the right share of Republicans in their survey, the result would not have been representative of Republicans generally. It’s also not yet clear how much of the error might dissipate in an election year without Trump on the ballot, or without the backdrop of a national pandemic.
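To see why weighting on party alone wouldn’t catch that kind of problem, here is a toy simulation with assumed numbers (not drawn from the AAPOR research). The simulated sample is weighted back to the correct Republican share, but because the most pro-Trump Republicans respond at half the rate of other Republicans, the estimate still understates Trump’s support.

```python
import random

random.seed(0)

# Hypothetical electorate: 40% Republicans, of whom 80% back Trump;
# 60% everyone else, of whom 20% back Trump. True support = 0.44.
def draw_voter():
    is_rep = random.random() < 0.40
    if is_rep:
        strong = random.random() < 0.80          # strongly pro-Trump Republican
        backs_trump = strong
        # Assumed twist: the most pro-Trump Republicans respond at half the rate.
        response_rate = 0.25 if strong else 0.50
    else:
        backs_trump = random.random() < 0.20
        response_rate = 0.50
    return is_rep, backs_trump, response_rate

voters = [draw_voter() for _ in range(200_000)]
true_support = sum(b for _, b, _ in voters) / len(voters)

# Simulate nonresponse, then weight the respondents back to 40% Republican.
respondents = [(r, b) for r, b, p in voters if random.random() < p]
rep_share = sum(r for r, _ in respondents) / len(respondents)
weights = [(0.40 / rep_share) if r else (0.60 / (1 - rep_share)) for r, _ in respondents]

weighted_est = sum(w * b for w, (_, b) in zip(weights, respondents)) / sum(weights)
print(f"true support: {true_support:.3f}, party-weighted poll estimate: {weighted_est:.3f}")
```

With these assumed response rates, the party-weighted estimate comes out around 39% against a true 44%, an underestimate of roughly five points even though the partisan mix of the weighted sample is exactly right.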
The research is most conclusive in ruling out some potential problems as major contributing factors. Those include some of the issues most prominently cited in the wake of the 2016 election, such as a surge of late-deciding voters swinging toward Trump and a failure by pollsters to weight results on voters’ educational levels.
Assumptions about how likely different voters were to turn out, which led to an underestimation of Barack Obama’s numbers in 2012, likewise didn’t appear to be the main factor this time around. Changes to how people voted, including a surge of early and absentee votes, didn’t seem to throw off those likely voter models, either. And there’s still not a lot of evidence to support the recurring theory of so-called shy Trump voters lying to pollsters about their true intentions out of embarrassment.
The inquiry into the 2020 polling miss comes amid major upheaval in the polling industry, which has increasingly turned to new methods of reaching voters. In 2016, according to the presentation, 36% of public, national-level election polls consisted of phone calls to numbers selected through a traditional technique known as random digit dialing. In 2020, that method accounted for just 6% of national polls, with the majority instead being conducted online. Pollsters have also increasingly adopted voter files, which are built from publicly available election data. No one method, however, stood out as especially accurate in last year’s election.
Election polls pose a particular challenge because an error of even a few points can dramatically affect how their results are interpreted. Still, other types of surveys have performed relatively well, suggesting the mechanics of polling haven’t entirely broken down. Polls on Americans’ willingness to get vaccinated, for instance, have closely matched up with official statistics.
“Pre-election polling is more challenging than polling the general public on issues, and the 2020 election was particularly challenging to poll on,” Dan Merkle, AAPOR’s president, said in an email to CNN. “While all polling has error, it’s still the best way we have to measure the views of the public.”