Social Europe

5 Things Everyone Should Know About Opinion Polls

Peter Kellner 21st August 2015


1) Poll Results Are Snapshots, Not Predictions

Public opinion is seldom fixed. Views change. Good polls tell us what people think at the time they were interviewed. Someone reviewing polling results might make a prediction based on the data, but that is a personal judgement, not a poll finding. Suppose you know that a runner is leading by ten metres in a 1500m race, with one lap to go. You might predict that the runner is likely to win – and you would make your prediction with more confidence knowing the state of the race at that point. But you cannot be certain of the outcome. If you want to know the future, don’t commission a poll. Buy a crystal ball.

2) Good Polls Are Seldom Exactly Right, But Seldom Badly Wrong

Polls obtain responses from a small fraction of the population – typically 1,000-2,000 out of a population of millions. Good polls seek to match their samples to the characteristics of the population as a whole – by age, gender, region, social class etc. But statistical theory warns us that even the best survey is subject to a margin of error.

Suppose a coin is tossed 1,000 times. We would expect it to land heads roughly 500 times, and tails roughly 500 times – but it would be a fluke if it landed EXACTLY 500 times each. Likewise with polls. If 50% of the whole population hold a certain view, a well-conducted poll of 1,000 people should produce a finding between 47% and 53% – but the laws of probability tell us that about one time in 20, even a “good” poll will produce a result outside that range. It is, however, vanishingly unlikely to be, say, ten points adrift of the truth.
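The arithmetic behind that “one time in 20” can be checked with a short simulation. The sketch below (my illustration, not from the article) draws 10,000 hypothetical polls of 1,000 respondents each from a population in which exactly 50% hold a view, and counts how often a poll lands outside the 47–53% band:

```python
import random

random.seed(42)

def run_poll(n=1000, true_share=0.5):
    """Share of n simulated respondents who hold the view."""
    return sum(random.random() < true_share for _ in range(n)) / n

# Draw many independent polls and see how often they miss the 47-53% band.
results = [run_poll() for _ in range(10_000)]
outside = sum(not 0.47 <= r <= 0.53 for r in results) / len(results)
print(f"Polls outside 47-53%: {outside:.1%}")
```

Run it and the share of polls falling outside the band comes out in the region of one in 20, while none of the 10,000 simulated polls strays anywhere near ten points from the truth.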

3) A Small, Representative Sample Is Always Better Than A Big, Unrepresentative Sample

Newspapers and television programmes sometimes invite their readers/viewers to text, phone or email their views. They sometimes then say something like: “We now have the verdict of more than 100,000 people – far more than any opinion poll”, and claim that the size of the exercise makes it better than a poll of just 1,000 people.

Nonsense. Here’s a cautionary tale. In the 1936 US presidential election, a magazine, Literary Digest, elicited the voting intentions of more than two million Americans and predicted that President Roosevelt would be buried after just one term, and that his rival, Alf Landon, would win by a landslide. Never heard of Mr Landon? That’s because he lost badly. Gallup Polls conducted a far smaller, but properly representative, survey – and rightly showed Roosevelt well ahead.


An unrepresentative sample is an unreliable sample – simply making it bigger makes no difference.
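The Literary Digest effect is easy to reproduce in a hypothetical simulation. Assume (my numbers, purely illustrative) that 55% of a population backs a proposal, but opponents are three times keener to respond to a phone-in than supporters. A self-selected “poll” of 100,000 then badly misleads, while a representative sample of just 1,000 does not:

```python
import random

random.seed(1)
TRUE_SUPPORT = 0.55  # assumed true level of support in the population

def representative_poll(n):
    """Every member of the population is equally likely to be sampled."""
    return sum(random.random() < TRUE_SUPPORT for _ in range(n)) / n

def self_selected_poll(n):
    """Opponents are three times as likely to phone in as supporters."""
    respondents = []
    while len(respondents) < n:
        supporter = random.random() < TRUE_SUPPORT
        respond_prob = 0.1 if supporter else 0.3  # assumed response rates
        if random.random() < respond_prob:
            respondents.append(supporter)
    return sum(respondents) / n

rep = representative_poll(1000)
big = self_selected_poll(100_000)
print(f"Small representative poll (n=1,000):  {rep:.1%}")
print(f"Huge self-selected poll (n=100,000): {big:.1%}")
```

Under these assumptions the small representative poll lands within a few points of the true 55%, while the hundred-times-larger self-selected exercise puts support below 30% – bigger, but systematically wrong.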

4) The Details Matter – Dates, Question-wording, Client

So, you see some polling information. You note that it comes from a reputable company and is therefore likely to have been properly conducted. But does it mean what it seems to mean? Here are some tests to apply.

When was it conducted? Polls often look at current controversies, at a time when public opinion might be volatile. If the fieldwork for the poll was, say, two weeks ago, it might be a less accurate guide to current public opinion than one conducted two days ago.

Where does the report of the poll findings appear? If it’s a media report, it might tell only part of the story. If it’s put out by a campaigning organisation, it might select only those findings that suit its case. Either way, the questions and/or results might – wittingly or unwittingly – be abbreviated in a way that ends up being misleading. To be certain about what questions were asked and what results obtained, the best advice is to go to the polling company’s own website.

Who commissioned the survey? Polling clients often have their own agenda – to promote a cause, a party, a candidate, a product or a point of view. This fact does not necessarily invalidate research they commission. Reputable polling companies make sure that the questions they ask are fair and balanced. But when clients have an agenda, it’s especially important to look under the bonnet and check exactly what questions were asked and what the full results show.



5) Apples And Pears Must Be Compared With Care

Here is a fictional example of a real problem. A poll asks people whether they would prefer to spend a sunny summer day in a city or at the seaside. By 60-40%, people say they prefer the seaside. A year later another polling company asks people where they would prefer to spend a sunny summer day: in a city, at the seaside or in the countryside. 45% say the seaside, 30% the countryside and 25% a city. The next day, a new report says that the seaside has slumped in popularity, with the number saying they would like to spend a sunny summer day there down from 60% to 45%.

It’s obviously nonsense, as the first poll offered only two options while the second offered three. It’s just one example of why it is unwise to compare the findings from different polling companies asking different questions. That example is particularly egregious; sometimes the differences are more subtle – for example, telephone surveys often find different numbers of “don’t knows” than online surveys; as a result the numbers for each of the answer options are liable to be different.

The only safe way to be sure of movements in public opinion is to compare surveys through time by the same polling company, using the same interviewing method (online, phone or face-to-face) and asking the same questions each time. And even then, small differences (say by two or three percentage points) may reflect sampling fluctuations rather than real change.
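Even under those ideal conditions, the point about sampling fluctuations can be illustrated with one more sketch (again my own, with assumed numbers): the same pollster asks the same question twelve times of a population whose opinion has not changed at all. The month-to-month wobble below comes entirely from sampling, not from any real movement:

```python
import random

random.seed(7)
TRUE_SHARE = 0.48  # assumed, unchanging, true share of the population

# Twelve identical polls of 1,000 people each, same method, same question.
monthly = [sum(random.random() < TRUE_SHARE for _ in range(1000)) / 1000
           for _ in range(12)]
spread = max(monthly) - min(monthly)
print("Monthly readings:", [f"{m:.0%}" for m in monthly])
print(f"Spread across 12 identical polls: {spread:.1%}")
```

The readings typically differ by a few percentage points from month to month even though nothing real has changed – which is why two- or three-point movements should be treated as noise.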

Peter Kellner

Peter Kellner is a journalist, political commentator and President of the YouGov opinion polling organisation in the United Kingdom.
