2015 Final Election Polls
Here are the final election polls released by the major companies surveying national voting intentions. The table below gives the 2015 election night results plus the poll results:
How accurate are polling companies' reports of voter support? You can view quick charts of polling accuracy from the 2000-2008 elections and see just how right (and wrong) they can be!
Probing people about their second choice of party gives some indication of each party's room for growth in the near future. There was very little movement in second-choice support from April 2010 until the final week of the campaign, indicating a very settled set of voter preferences. In the dying days of the 2011 campaign, however, the rise of the NDP as a first choice appeared to drain off some of the NDP second choices that had previously been recorded; in their place, the Liberals and Greens grew to their highest levels in a year.
Unsurprisingly, an Angus Reid poll conducted in August 2009 found that the supporters and non-supporters of each political party held very different perceptions of it. This poll is worth exploring to assess the strengths and weaknesses each party faces in trying to attract more voters.
Although polls tend to show some meaningful variations over the course of an election campaign, those changes largely occur because of shifts in opinions held by a minority of Canadians. For example, an Angus Reid poll found that in the 2008 election 54% of Canadians had already made up their minds whom to vote for when the election was first called. But considerable shifts can occur among the remaining voters, making the campaign period very decisive in determining the election outcome. Interestingly, about 15% only made up their minds on election day; many of those voters, however, would already have been leaning towards one party.
Election polling has evolved in recent elections, and it is now standard for major polling companies to conduct polls continuously throughout the election period. They report results on a near-daily basis, with each poll based on the previous three or four days' worth of interviewing.
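The mechanics of such a rolling poll can be sketched as follows. This is a simplified illustration, not any company's actual method: each night's interviews are pooled with the previous nights' to produce the figure reported that day (the function name and the data layout are assumptions for the example).

```python
from collections import deque

def rolling_poll(nightly_samples, window=3):
    """Pool each night's interviews with the previous nights' interviews,
    as rolling campaign polls do, and return one estimate per reporting day.

    nightly_samples: list of (n_interviews, party_share) tuples, where
    party_share is the fraction of that night's respondents backing a party.
    window: how many nights of interviewing each reported poll covers.
    """
    recent = deque(maxlen=window)  # only the last `window` nights are kept
    reports = []
    for n, share in nightly_samples:
        recent.append((n, share))
        total_n = sum(n_i for n_i, _ in recent)
        # weight each night's share by its number of interviews
        pooled = sum(n_i * s_i for n_i, s_i in recent) / total_n
        reports.append(round(pooled, 3))
    return reports
```

Note that because each day's report shares two or three nights of interviews with the previous day's, consecutive rolling-poll results are not independent; day-to-day changes are smaller and smoother than separate one-off polls would show.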
You can look up just how successful Canadian pollsters have been in predicting the last 3 elections - compare the support for the parties in the polls and at the polls!
Reflections on Polling
The Library of Parliament has an interesting background paper on Public Opinion Polling in Canada. Matthew Mendelsohn and Jason Brent of Queen's University provide a useful guide to Understanding Polling Methodology - this is essential reading for an insight into just what significance we can attach to particular survey results. One thing to keep in mind is that most polls are published with the national margin of error reported, but the margin of error will be much higher for any provincial or regional figures that are also reported.
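The gap between national and regional margins of error follows directly from the standard formula for a simple random sample, MoE = z·√(p(1−p)/n). A quick sketch (assuming the conventional 95% confidence level and the worst case p = 0.5; real polls are not simple random samples, so published figures are only approximations):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample of size n,
    at the 95% confidence level (z = 1.96) and worst-case support p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national sample of 1,000 gives roughly +/- 3.1 points,
# but a provincial subsample of 150 from the same poll gives about +/- 8 points.
national = margin_of_error(1000)   # ~0.031
regional = margin_of_error(150)    # ~0.080
```

This is why an apparent five-point regional swing between two polls may be nothing more than sampling noise, even when the national figures look stable.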
The Laurier Institute has an interesting collection of material on opinion polls and electoral support for political parties as well as seat predictions based on the current polls.
When reading opinion poll results, pay close attention to how large the sample size and "undecided" figures are. A large sample makes for more accurate results, particularly at the regional level. A small undecided figure means that the polling company has probed with follow-up questions to find out whom a person is most likely to vote for, and included them in the pool of decided voters. But these leaners, as they are known, are expressing a preference, not a decision. They can also be quite volatile and either change their party preference or even decide not to vote at all. Also be cautious about reading too much into polls conducted with only one day of interviewing. Generally speaking, several days of interviews are thought to be preferable, to improve the chances of capturing a wider cross-section of society in the poll's sample.
Because of differences in polling techniques and the handling of leaners, each polling agency may develop its own structural bias that under- or over-reports support for particular parties.
Mark Pickup provides a very useful chart showing Canadian election polls with their biases corrected and margins of error mapped. This may be the best indication of the "true" levels of support.
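The basic idea behind such a bias correction can be illustrated with a deliberately simplified sketch: estimate each pollster's average historical over- or under-statement of a party's support, then subtract it from the latest reading. (This is a toy version for illustration only; models like Pickup's estimate house effects jointly across pollsters and over time.)

```python
def correct_house_effect(poll_share, past_polls, past_results):
    """Naive house-effect correction: subtract the pollster's average
    historical error (poll minus actual result) from its latest figure.

    poll_share:   the pollster's current reported support for a party
    past_polls:   that pollster's final-poll figures in past elections
    past_results: the actual election results for the same party
    """
    errors = [p - r for p, r in zip(past_polls, past_results)]
    bias = sum(errors) / len(errors)  # positive bias = habitual overstatement
    return poll_share - bias
```

For example, a pollster whose final polls showed a party at 34 and 36 points when it actually won 32 and 34 has a +2 house effect, so its current 35 would be read as roughly 33.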
The reported levels of support for parties can also vary according to the wording of the questions put in a survey. For example, Nanos fairly consistently reports about a third to a half of the Green Party support that Ekos polls do. According to Mr Nanos, this discrepancy arises because Nanos does not list the party names in its survey question while Ekos does; the explicit mention of the Green Party reminds disgruntled voters that there is an alternative to the main parties (Hill Times, June 14, 2010, p.23). The lower support for the Greens in Nanos polls leads, in turn, to higher reported levels of support for the Conservatives and Liberals.
Undefined depth of support for specific parties
A fundamental problem with media reporting on opinion polls is that the initial reports usually give little indication of how soft the reported support is for each party, or of how each polling company probes for voting intentions. Poll results often loosely proclaim that "40% of decided voters" support a party. However, that final figure may be based on having to prod the respondents at least twice into expressing a preference: 1) "If an election were held today, which party would you vote for?" and, if they say they don't know, 2) "Which party are you leaning towards voting for?" The second group are only leaning and should not be viewed as actual support, and yet most polling companies roll the leaners in with the truly decided. Moreover, even those who name a party on the first question may still be actively considering another party (or whether to vote at all). It would take other specific questions to establish that the voter has settled their choice and is not likely to switch parties.
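The arithmetic behind the headline "decided voters" figure can be made explicit. This sketch assumes a simple two-question design as described above (the function and data layout are hypothetical, for illustration): the published share folds the leaners in with those who answered the direct question.

```python
def decided_share(first_choice, leaners, party):
    """Assemble the headline 'x% of decided voters' figure by combining
    respondents who named a party on the direct vote question with those
    who only named one when pressed with the leaning question.

    first_choice, leaners: dicts mapping party name -> respondent count.
    Returns the party's share of the combined 'decided' pool.
    """
    parties = set(first_choice) | set(leaners)
    combined = {p: first_choice.get(p, 0) + leaners.get(p, 0) for p in parties}
    total = sum(combined.values())
    return combined[party] / total
```

With 300 firm supporters and 40 leaners for party A against 200 firm and 60 leaning for party B, A's published "decided" share is 340/600 (about 57%), even though only 300 of those 340 expressed an unprompted preference.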
We in the public often miss the fluid nature of the public opinion captured when polls are reported. It is not often clear how settled the voting intentions reported in a poll are. Thus it is useful when a polling agency releases results of questions that probe the nature of people's voting decisions. A Segma poll released a week before the October 2008 election showed that a third of voters could still change their minds about which party to support.
Several polls during the 2006 election showed large numbers of voters were prepared to change their vote. The potential for voters to switch in the last days of the campaign was realized in the 2004 election, when the Liberals gained and the Conservatives dropped several points in the last 2-3 days before voting day.
Because of this potential for voters to switch their choice of party or candidate, it is important to follow the polls to track the ebb and flow of public opinion during an election campaign.
I welcome any feedback and suggestions for fresh material to add to this site.
Political Science Department -- Simon Fraser University