None of the polls in the final days of the recent Quebec election campaign came close to the final vote count. Three major published polls called for a tight race between the Liberals and CAQ, but the latter defeated the Couillard Liberals by almost 13 percentage points and captured more than twice as many seats, winning a commanding majority.
Philippe Couillard resigned right after the election. Some pollsters should consider doing the same.
The reason election polls are so often wrong is well known by everyone except the pollsters: it has become increasingly difficult and expensive to get a representative sample of people to respond to polls. Every Canadian who has ever been harassed by unwanted texts, emails or phone calls trying to sell duct cleaning knows this.
Part of the problem is that polling companies often conduct election surveys for no compensation other than the publicity it attracts (oops!). Some pollsters work in partnership with media outlets who are usually bent on spending as little as possible on the polls.
The CBC didn't commission any polls in Quebec during this election; it simply aggregated the polls of other journalistic organizations. Generally, pollsters and the media eschew investments in improved survey methodology. It has been said that sampling once represented 80 per cent of research budgets, but in the competitive world of polling today, sampling costs have been whittled down to almost zero — and much of the data generated is worth just that.
"CBC doesn't really poll anymore. Instead they purloin media polls of mixed quality, without permission, for their aggregator. The potential for steering strategic voters in the wrong direction due to sketchy forecasts is a real concern; particularly in a tight election https://t.co/stwRMWnOHy" — Frank Graves (@VoiceOfFranky) March 21, 2018
Pollsters have publicly complained about the media's parsimony and the misuse of polls, but concrete solutions for improving this state of affairs are wanting.
Increasingly hard-to-reach respondents
After embarrassing polling outcomes such as the 2018 Quebec election, Brexit or the 2016 U.S. election (where, to be fair, the polls got the national popular vote right), pollsters have a well-worn series of explanations and rationales for missed election calls: polls are not meant to be predictions, just a snapshot in time; voter turnout was higher/lower than expected; last-minute changes in voting intention must be the culprit, etc.
In truth, the biggest hurdle is that the general population is increasingly harder to reach and reluctant to respond to surveys.
As the head of research at CBC just after the 1995 Quebec referendum on separation, which had a razor-thin, nearly 50/50 outcome, I commissioned a poll from the same polling firm we had used during the nail-biting campaign. The post-referendum poll was an experiment meant to test how and why respondents had voted in the referendum.
To our surprise, only about 25 per cent said they had voted for separation, half the proportion that had actually voted in favour of separating from Canada. Sometimes people won't tell pollsters the truth. It is difficult to get respondents to share opinions about controversial political choices, especially choices that challenge the status quo. This variable was rarely mentioned during the recent campaign, even though the CAQ had never formed a government.
Low response rates are the polling equivalent of low voter turnout
When things go wrong, pollsters are quick to point to perceived methodological shortcomings in others' polls. The favourite whipping boys include web vs. telephone surveys, landline-only vs. cellphone samples, live interviewers vs. robocalls, random dialling vs. telephone listings, weighting data vs. modelling data, panels vs. random samples, etc. All of these issues have serious effects on survey results.
Plummeting response rates to surveys (except perhaps government surveys, where one can be compelled to respond) lead to inaccurate data. Low response rates are the polling equivalent of low voter turnout: a low participation rate runs the risk that the outcome won't be representative of the population.
The Pew Research Center, an independent "fact tank" in the U.S. that conducts high-quality, nonpartisan public opinion research, recently published a study on the declining rates of response to surveys. Pew revealed that response rates to its own telephone surveys had declined from well over 30 per cent to less than 10 per cent in the past 20 years. The study found that many variables were unaffected by lower response rates but, tellingly, it found those who respond to surveys were as much as five times more likely to be engaged in civic affairs.
Meaning: those less interested in politics and likely to make voting decisions on cursory information are probably not well represented in most election polls.
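The arithmetic behind that distortion is worth spelling out. The following is a minimal back-of-the-envelope sketch in Python; the population shares and response rates are hypothetical numbers chosen purely for illustration, not figures from the Pew study:

```python
# Hypothetical illustration of nonresponse bias: if civically engaged
# citizens are five times likelier to answer a pollster's call, they
# end up dominating the completed sample.
engaged_share = 0.50       # assumed: half the electorate is engaged
response_engaged = 0.05    # assumed response rate among the engaged (5%)
response_other = 0.01      # assumed rate among everyone else (1%): a 5x gap

# Expected composition of the respondent pool
sample_engaged = engaged_share * response_engaged
sample_other = (1 - engaged_share) * response_other
engaged_in_sample = sample_engaged / (sample_engaged + sample_other)

print(f"Engaged share of population: {engaged_share:.0%}")   # 50%
print(f"Engaged share of sample:     {engaged_in_sample:.0%}")  # 83%
```

Even though engaged and unengaged citizens are equally numerous in this toy electorate, five out of every six completed interviews come from the engaged group. That is the distortion an election poll must then try to weight away, using information it may not have about the people who never picked up the phone.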
Canadian pollsters do not conventionally provide information on survey response rates. In the U.S., pollsters are at least willing to address the issue. For example, the New York Times polling unit revealed that it made over 2.8 million calls for its 2018 midterm election polls and admitted to a response rate lower than two per cent. Getting access to similar Canadian data is virtually impossible, and the two industry organizations that could have studied the problem recently shut down.
Many of the polls relied on by Canadian journalists could have response rates as tiny as one or two per cent, or even less, and some, such as many web surveys, don't even use random samples and cannot calculate response rates. Canadian pollsters seem to have forgotten that the foundation of their research is random, representative sampling.
Improving response rates has been the subject of much experimental research over the decades. It is known that additional attempts to reach selected respondents, extending the number of days a survey is in the field, and offering financial incentives (especially to younger people) can all improve response rates.
Most election polls are done over two to three days, so extending the field period of any given poll is a simple fix that leaves room for these participation-boosting efforts. Another idea is for polling companies and media outlets to create an organization like Pew Research to undertake co-operative election polls with higher response rates and better quality generally. Otherwise the polls will continue to undermine trust in the media and knowledge of our political system.