Workers count votes at a polling place in Worcester, Mass. Photo: SuperStock via Getty Images

Last Thursday the UK’s Conservative Party stomped to an electoral victory that fairly shocked the country. The Tories won a comfortable majority of seats in Parliament, enabling them to govern the nation without a coalition partner. That result contrasted sharply with the pre-election polls, which (on average) predicted a dead heat between the Conservatives and their long-time rivals in the Labour Party, each projected to take 33.6 percent of the vote. The substantial error has provoked much hand-wringing among British pollsters, whose industry organization is conducting an official review of the fiasco.

So what really went wrong? Nearly a week later, no one seems to have come up with a convincing answer, though a range of potential explanations, some connected, some mutually exclusive, have emerged. In an era when statistical analysis is becoming ever more important to fields as diverse as politics, finance, sports, and even culture, it is worth pinning down the weaknesses of our models and asking how they can be improved.

Here are some of the leading potential explanations:

Late Deciders Picked the Tories

Election experts say a surprising number of people make up their minds right at the end of the campaign, and it can be difficult to capture their opinions in a poll. Michael Bruter, a political scientist at the London School of Economics, told Nature’s Davide Castelvecchi:

There was an obvious gap between what the polls predicted and the results, but I was not surprised. In our research we find in election after election that up to 30 percent of voters make up their minds within one week of the election, and up to 15 percent on the day itself. Some people either don’t know ahead of time or change their mind when they’re in the booth.

Usually what happens is that some of these people cancel each other out. Some who thought they would vote Conservative ended up voting Labour, and vice versa. What seems to have happened yesterday is that more people changed their mind in one direction than in the other.
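
To see how little asymmetry it takes, here is a back-of-the-envelope sketch in Python. The numbers are hypothetical, chosen only to match Bruter’s upper bound of 15 percent deciding on the day: if those late deciders break 60–40 instead of 50–50, that alone moves the final margin by three points.

```python
# Hypothetical numbers, for illustration only: how an asymmetric break
# among on-the-day deciders can move the final margin.

late = 0.15                  # share deciding on election day (Bruter's upper bound)
early = 1 - late             # everyone else

# Suppose the early deciders split evenly between the two main parties.
early_con = early_lab = early / 2

# Case 1: late deciders also split 50/50, so they cancel out.
tie_con = early_con + late * 0.5
tie_lab = early_lab + late * 0.5

# Case 2: late deciders break 60/40 toward the Conservatives.
brk_con = early_con + late * 0.6
brk_lab = early_lab + late * 0.4

print(f"50/50 late break: Con {tie_con:.1%}, Lab {tie_lab:.1%}")
print(f"60/40 late break: Con {brk_con:.1%}, Lab {brk_lab:.1%} "
      f"(margin {brk_con - brk_lab:+.1%})")
```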

Tom Mludzinski, the head of political polling at ComRes, told Susannah Cullinane at CNN:

We actually did a poll on the day of about 4,000 people; around 12–13 percent had made up their mind in the last 24 hours… The night before the election we had 20 percent saying they still might change their minds… We’d like everyone to make up their minds nice and early.

The Conservatives Didn’t Win More Votes So Much as Other Parties Lost Them

Peter Kellner, the president of polling firm YouGov, says this election had a political dynamic similar to that of the 1992 election, when the Conservatives won another surprise victory that the polls failed to predict:

It comes down to human psychology. Voting is a different exercise from answering a poll. It is a choice with consequences, not just an expression of a view. This year, as in 1992, the Tories have a weak image. They are widely thought to be out of touch and for the rich. But, at the margin, there may be some people who both have a poor view of the party but nevertheless think it will run the economy better than Labour. They are “shy Tories” [people who support Conservatives in elections but not in polls] not because they are unwilling to admit their choice of party to a stranger but because they really would like to support someone else but, faced with a ballot paper in the privacy of the polling booth, simply can’t.

Pollsters Hid the True Numbers

The founder and CEO of Survation says the British polling firm actually collected very accurate poll numbers just before the election, but because they conflicted with the other published figures, it simply declined to release them:

The results seemed so “out of line” with all the polling conducted by ourselves and our peers—what poll commentators would term an “outlier”—that I “chickened out” of publishing the figures—something I’m sure I’ll always regret.

Nate Silver, the founder and editor in chief of FiveThirtyEight, says he has seen this problem of hidden outliers before. Last year, he wrote, the polling firm Rasmussen deleted the results of a poll that seemed out of line with what other pollsters were reporting:

In our view, a highly plausible explanation is that this wasn’t an error per se so much as an example of “herding.” Polls such as Rasmussen Reports’ that take methodological shortcuts reach only a tiny fraction of the voter population, resulting in poor raw data. However, as we’ve described, and as other researchers have found, these polls tend to produce more accurate results when there are other polls with stronger methodological standards surveying the same races. Essentially, the cheap polls may be copying or “herding” off their neighbors. This can take the form of a sin of commission: manipulating assumptions like those involving turnout models and demographic weighting to match the stronger firms’ results. Or it may be a sin of omission: suppressing the publication of polls that differ from the consensus.
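
Herding has a detectable statistical fingerprint: published polls that cluster more tightly than random sampling error alone would allow. Below is a minimal simulation of that signature in Python; it is not FiveThirtyEight’s actual test, and the true vote share, sample size, poll count, and shrinkage factor are all assumptions made for illustration.

```python
# A minimal simulation (assumed parameters throughout) of the statistical
# signature of herding: polls clustering more tightly than sampling error allows.
import math
import random
import statistics

random.seed(1)
TRUE_SHARE = 0.34   # hypothetical true vote share
N = 1000            # respondents per poll
POLLS = 200

def honest_poll():
    """One simple random sample: the share of N respondents backing the party."""
    return sum(random.random() < TRUE_SHARE for _ in range(N)) / N

honest = [honest_poll() for _ in range(POLLS)]

# "Herded" pollsters shrink each raw result 70% of the way toward the
# running average of previously published polls.
herded = []
for raw in honest:
    consensus = statistics.mean(herded) if herded else raw
    herded.append(0.3 * raw + 0.7 * consensus)

expected_sd = math.sqrt(TRUE_SHARE * (1 - TRUE_SHARE) / N)
print(f"theoretical sampling SD: {expected_sd:.4f}")
print(f"independent polls SD:    {statistics.stdev(honest):.4f}")   # close to theory
print(f"herded polls SD:         {statistics.stdev(herded):.4f}")   # suspiciously small
```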

Polling Is Inherently Flawed

Polling is a business (even when run as a non-profit), and its practitioners are constrained by limited resources. Bruter pointed to this issue:

My research uses very long questionnaires that can take 15–25 minutes to answer. This takes a lot of time and money. Election polls normally take one minute to answer…

Random samples are much more expensive [than the quota samples used by polling firms]. The other thing that companies use to drive the cost down is the mode of polling. Face-to-face polling would be much more expensive [than calling or doing Internet surveys], but you’d be more likely to have a random sample and to have a real representation of the electorate. Many people on the phone will refuse to answer a survey.

Labour’s election pollster, in fact, told the BBC’s Chris Cook that the party’s private data was better than the public numbers, thanks to its own more time-intensive polling:

The main difference between our polls and the newspaper polls is that we don’t ask the voting intention first… we first ask respondents to think about the country, the economy, their top issues, the parties and the leaders. We think it gets them closer to their ballot box mindset.

[This design] delivers a much lower ‘don’t know’ number—generally half the level found in the public polls… Of course, that requires many more questions and so is more expensive to implement especially for a phone pollster where every minute costs money.

The New, Smart Models Weren’t as Smart as We Thought

Even if the polls are tilted, there are statistical models that aim to suss out the truth from them, or at least the likelihood that they are off-base. Nate Silver and his stat-whiz colleagues at FiveThirtyEight have become renowned for predicting election results by making clever adjustments to publicly available polls, an approach that correctly called the winner of every state in the 2012 U.S. presidential election. This time the models didn’t do so well. Ben Lauderdale, a social researcher at the London School of Economics and co-creator of FiveThirtyEight’s UK election predictions, wrote:

We did try to capture the possibility of a national poll miss. We ran calibrations on historical polling to see how badly we might expect the polling average to miss the national vote shares. Clearly we need to look very carefully at how we did this. The 2015 poll miss was somewhat smaller than 1992 and we did have 1992 in our data. But perhaps this failed to introduce sufficient uncertainty due to the complexity of having multiple parties, each of which could be above or below its polling level.
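
The calibration Lauderdale describes can be sketched as a shared error term layered on top of the polling average. The snippet below is a simplified illustration, not the actual FiveThirtyEight model; the parameters POLL_AVG_SD (noise in the polling average) and HIST_MISS_SD (the size of historical national misses) are assumptions chosen just to show how folding in a possible national miss widens the forecast’s uncertainty.

```python
# A simplified sketch (assumed parameters, not FiveThirtyEight's model) of
# folding a possible national poll miss into a forecast's uncertainty.
import random
import statistics

random.seed(2)
POLL_AVG_CON = POLL_AVG_LAB = 0.336   # final pre-election polling averages
POLL_AVG_SD = 0.01                    # assumed noise in the polling average itself
HIST_MISS_SD = 0.02                   # assumed SD of historical national poll misses

def simulated_lead(include_miss):
    """Draw one simulated Conservative lead over Labour."""
    con = random.gauss(POLL_AVG_CON, POLL_AVG_SD)
    lab = random.gauss(POLL_AVG_LAB, POLL_AVG_SD)
    if include_miss:
        # One shared miss shifts the two parties in opposite directions,
        # as happened in 1992 and 2015.
        miss = random.gauss(0, HIST_MISS_SD)
        con += miss
        lab -= miss
    return con - lab

for label, flag in [("polls only", False), ("with national-miss term", True)]:
    leads = [simulated_lead(flag) for _ in range(100_000)]
    print(f"{label}: SD of Conservative lead = {statistics.stdev(leads):.3f}")
```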

Polling Is Far From an Exact Science—and It’s Getting Harder All the Time

Silver writes:

Polls, in the U.K. and in other places around the world, appear to be getting worse as it becomes more challenging to contact a representative sample of voters. That means forecasters need to be accounting for a greater margin of error.

This is an important point, and a welcome corrective to the common interpretation of the polls, a misreading that may be an even bigger issue than the inaccuracy itself. The consensus before the election was that the race was “too close to call,” and in some sense that was true: The election results were within the margin of error for most of the polls.

But this framing misleads by putting the emphasis in the wrong place. It’s more enlightening to say that the uncertainty in the measurement of the electorate’s opinion was too great to make any confident call about the outcome. A pithier wording would be, “We just don’t know.” Saying the race is “too close to call” gives the impression that polling is very precise and the race was extremely close; the truth of the matter is that polling is fairly inexact, and for all we knew, the race could have yielded anything from a clear win for the Conservatives to a big win for Labour. Polling firms do mention their margins of error, but the significance of that uncertainty often gets forgotten in the rush to provide a tidy, confident answer.
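
For a sense of scale, here is the textbook 95 percent margin of error for a single poll, computed in Python for the roughly 33.6 percent support the final polls showed and an assumed (typical) sample of 1,000 respondents:

```python
# The standard 95% margin of error for a proportion from a simple random
# sample: a reminder of the uncertainty behind a "too close to call" headline.
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

p, n = 0.336, 1000               # assumed: a typical published poll
moe = margin_of_error(p, n)
print(f"{p:.1%} +/- {moe:.1%}")  # roughly 33.6% +/- 2.9%

# Two parties tied at 33.6% could each truly sit anywhere from ~30.7% to
# ~36.5%: anything from a clear Conservative win to a solid Labour one.
```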


Amos Zeeberg is Nautilus’ digital editor.
