Sunday 12 November 2017

For the Love of Elections

The final resolution of the New Zealand election a few weeks back once again bucked my prediction of another term of conservative government, proving that however hard predicting an election may be, it’s nothing compared to the uncertainty that follows once politicians come to power.

With any luck we will fare a little better in calling the Queensland election on the 25th of this month. Before then, however, we will have the results of the same-sex marriage poll released on Wednesday, so we should have a quick look at that.

Voter Response Rate

Firstly, here is a graph for the rate of return of the survey papers.

Polling dated from the last day of data collection.

Generally, these types of polls have a margin of error in the realm of 2-3 percentage points. There are two outliers from October 2nd, however, that are well beyond this (marked in red). These results by Newgate Research and ReachTEL suggested 77% and 79% of eligible voters had returned their survey forms, about 30 percentage points higher than the previous day’s result from Essential (47%) and higher than all subsequent polls but the final one, taken one day before submissions closed.

There are many other hints that this data may be in error. The article publishing the Newgate Research figure included scepticism from the Australian Christian Lobby’s director, who said the 77% figure “would surprise me”, while the source for the ReachTEL poll reported a suspiciously low 17.5% ‘No’ response. Crucially, these were the only two polls during the survey (excluding the very first and very last) that did not ask those who had not returned their ballots how likely they were to vote. The 77% and 79% figures are also in the ballpark of the proportion who intended to vote in the polls taken before the survey began. I suspect, therefore, that these high numbers capture not only ballots actually returned but also those that respondents intended to return: perhaps people who had filled out the papers and sealed them in the envelope but not yet posted them.

Whatever the reason, we will ignore these two figures going forward.


Conveniently, with the outliers removed, this line now always trends upward, which makes sense: no one should be able to un-post their papers.

Voter Response

The latest data from Essential (see page 13) suggests that 64% of people who voted ticked ‘Yes’, almost double the combined ‘No’ and ‘Prefer not to say’ votes.

While polling has had some embarrassing moments over the last couple of years, at times wildly diverging from the actual results, this data, being based on historical fact and not subject to change like opinion, should in theory be a reliable indicator of the result.

Nevertheless, for the fun of it, let’s look at some other data collected to predict a result.

Here is a graph of support for the ‘Yes’ and the ‘No’ camps, ignoring the undecided vote (where recorded) on the assumption that undecided respondents are either unlikely to vote or likely to split along broadly similar lines to those who answered:

Polling dated from the last day of data collection. Ipsos poll (9/11/2017) omitted for uncertainty. Results for dates with multiple polls are averaged.

Including the undecided/‘rather not say’/other responses for polls where ‘Yes’ + ‘No’ < 100%, the graph is similar:


This data, however, is a mix of polls of people who have already voted, people who intend to vote, or both. We can separate out some of this data accordingly:


Although voting intention seems to fluctuate dramatically, the actual results are quite flat and featureless compared to the preceding graphs. The available data for these graphs only begins at the start of October which, according to our very first graph, is after half of the submissions had already been cast. This established a sizeable fixed baseline, upon which fluctuations in the slowly accruing later votes had little impact.
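The damping effect of this baseline can be sketched with a quick weighted average. All of the numbers here are hypothetical, chosen only to illustrate the mechanism:

```python
# Hypothetical illustration: a large fixed baseline of returned votes
# damps the effect of later batches, however much those batches swing.
def overall_share(base_returned, base_yes, batch_returned, batch_yes):
    """Combined 'Yes' share after a new batch of returns arrives.

    base_returned, batch_returned: fractions of eligible voters (e.g. 0.50)
    base_yes, batch_yes: 'Yes' share within each group (e.g. 0.62)
    """
    total = base_returned + batch_returned
    return (base_returned * base_yes + batch_returned * batch_yes) / total

# Half the eligible voters have already returned their forms at 62% 'Yes'.
# A further 2% batch at only 40% 'Yes' barely moves the overall figure:
print(round(overall_share(0.50, 0.62, 0.02, 0.40), 4))  # 0.6115
```

Here a 22-point drop in the incoming batch moves the running total by less than a single point.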

Overall, there is a slight growth in the ‘No’ vote later in the survey, but not a particularly concerning one for the ‘Yes’ campaign.

The ‘other’ data among those who have already voted, presumably those declining to answer rather than ‘don’t know’ as many surveys put it, is reasonably constant. By comparison, the ‘other’ among those yet to vote grows over time. This probably does not reflect voters’ wavering so much as a proportional reduction in people intending to vote: as more votes were cast, those not intending to vote became a larger proportion of the people in this graph.

All of the graphs examined so far indicate a safe win for the ‘Yes’ camp, with ‘No’ never exceeding 50% and only approaching it in the final weeks of the survey among those yet to vote. By the end of the yet-to-vote graphs (22/10/2017), 75% of eligible voters’ surveys had already been returned. A further 9% would be sent after this date, limiting the impact of this (comparatively) high ‘No’ support.
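As a rough worst-case check on that claim (the 62% ‘Yes’ share among the early returns is an assumption for illustration, not a polled figure):

```python
# Worst case: the 75% of surveys returned by 22/10 split 62% 'Yes'
# (assumed), and the final 9% of returns went *entirely* to 'No'.
early, late = 0.75, 0.09
early_yes = 0.62  # assumption, not from the polls

yes = early * early_yes           # 'Yes' as a fraction of eligible voters
total = early + late              # all returned surveys
print(f"Yes: {yes / total:.1%}")  # Yes: 55.4%
```

Even with every late return going to ‘No’, ‘Yes’ keeps a clear majority of returned surveys under these assumptions.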

Two More Graphs

With these last four graphs, we can calculate an approximate value for ‘Yes’ and ‘No’ support over any given period from the slope of the dividing line(s).

For the sake of producing something more than mere graphs of publicly available data, here is a table of the daily support for ‘Yes’, ‘No’ and ‘Other’ (‘?’) extrapolated from the known polls (in grey):


And here is a table of the incremental increase in surveys returned over time:


The listed averages have been chosen to align with known points in the increase of surveys returned. Prior to October 1, at least 50% of the returned surveys had voted ‘Yes’. 18% of eligible voters replied between then and October 15, with a calculated average ‘Yes’ vote of 48.13%. 2% of forms were posted in the following 24 hours according to the polls (remembering these dates are somewhat artificial as they are collected over several days), at a calculated 50.13% ‘Yes’. The next 8% before October 22nd had 57.75% voting ‘Yes’. Similar information can be determined for the declared ‘No’ votes and the unknown ‘?’s:
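The averages above come from the slope method described earlier; here is a minimal sketch of the calculation, with hypothetical cumulative figures chosen only to illustrate it:

```python
# The 'Yes' share among surveys returned between two dates is the change
# in cumulative 'Yes' votes divided by the change in cumulative returns.
def interval_share(cum_yes_start, cum_yes_end,
                   cum_returned_start, cum_returned_end):
    """All arguments are cumulative percentages of *eligible* voters."""
    return ((cum_yes_end - cum_yes_start)
            / (cum_returned_end - cum_returned_start))

# e.g. if 'Yes' grew from 25% to 33.66% of eligible voters while total
# returns grew from 47% to 65%, the 18% returned in between split:
print(round(interval_share(25.0, 33.66, 47.0, 65.0), 4))  # 0.4811
```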


A similar graph can be constructed based on reported surveys returned, but the simpler approach is to multiply the percentage of returned votes by the percentage that were ‘Yes’, ‘No’ or ‘?’, like so:
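As a concrete sketch, take the final poll’s 64/31/5 split and the roughly 86% return rate suggested by the response-rate polling (both poll estimates, not official counts):

```python
# Convert shares of *returned* surveys into shares of *all issued* surveys
# by multiplying through by the estimated return rate.
returned = 0.86  # estimated return rate, from polling
split = {"Yes": 0.64, "No": 0.31, "?": 0.05}

of_issued = {camp: returned * share for camp, share in split.items()}
for camp, share in of_issued.items():
    print(f"{camp}: {share:.1%} of issued surveys")
# Yes: 55.0% of issued surveys
# No: 26.7% of issued surveys
# ?: 4.3% of issued surveys
```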


Which yields:


Conclusion

All of the polling throughout the survey has indicated a win for the ‘Yes’ campaign, and polling from the penultimate day of voting reports a 64% ‘Yes’ response (higher still if some of the undeclared votes are also ‘Yes’). According to the final graph above, this makes ‘Yes’ not only more than 50% of the returned votes, but more than 50% of the issued votes. In other words, the ‘Yes’ result would be greater than the ‘No’ result, the undeclared votes, and the unreturned surveys combined.

Previous experience, particularly the British experience with Brexit, has left me a little wary of polling. It is tempting to suspect a hidden vote as we saw in the US presidential race, the Brexit vote and the UK general election. These hidden votes came from both left and right, but always in favour of the underdog, as though people were ashamed to admit they were voting for the less popular option.

On the other hand, over-cautiousness about this exact issue proved unfounded in New Zealand. In NZ, the people expected to vote voted. In the other elections we saw atypical and unpredicted surges in turnout: disenfranchised working classes voting for Trump, whipped-up nationalists voting for Brexit and politically engaged youth voting for Corbyn. The question, then, becomes one of whether we’ve been polling the right demographics in the right proportions, something we cannot know until Wednesday’s result.

However, there are two good reasons in my mind to trust the polling in this case. Firstly, most of the polls offered an option to remain uncommitted or not answer. This allowed a pollster-shy voter base to be captured without declaring their position. If there is a hidden ‘No’ vote, I would expect it to be mostly contained in the thin grey bar in the above graphs.

The second reason I trust the polling data is that this was not an election held on a single day. Unlike all of the bad-polling examples, this vote was conducted over almost two months. As a result we not only have data on how people intended to vote, but on how they claim they actually did vote. This data is free from late swings in attitude, and from people who intended to vote never getting around to it.

That said, completely aside from any science or reason, I will hedge slightly towards a stronger-than-expected ‘No’ vote based on nothing but gut feeling. The last polling has ‘No’ at 31%, plus a 5% undisclosed result. If that all went to ‘No’ it would be a 64:36 (or 16:9) victory for the ‘Yes’ camp. I’ll go a little further still and predict something in the order of 61:39, but nevertheless, by all accounts including my own, we should see a clear ‘Yes’ result on Wednesday.

1 comment:

  1. So according to News.con the ABS only got 76.5% of surveys returned. This is well below the 86% indicated by polling.

    So already there is reason to be sceptical of the polling. Great.
