Moneycontrol PRO

Podcast | Decoding 2019: Are opinion polls reliable?

Our news media has dozens of opinion polls across many channels and languages.
April 26, 2019 / 16:57 IST

Harish Puppala | Rakesh Sharma 

It is election season, and the biggest the world has witnessed is underway in India. An estimated 900 million people will cast their vote in this year’s Lok Sabha election across one million polling stations. And 2019 is being hyped as the election that will make or break India.

Given the importance of such an event, and the consequences that follow, it is natural that observers, and even lay people, discuss the outcome at length. No matter whom we meet these days, the election is an inescapable topic.

In such an environment, opinion polls are hot commodities. Or, at least, they should be. Our news media runs dozens of opinion polls across many channels and languages, and it runs them all the time. Everything in the world has a season, except, evidently, apples, bananas and opinion polls. But these polls have come under fire of late. While many of them predicted the Congress party’s wins in Madhya Pradesh, Rajasthan and Chhattisgarh, as well as the coalition government in Karnataka, they most famously went horribly wrong in 2014. Remember the denials of a “Modi wave”, and the denials of those denials, and so on? So where do we stand today? Are opinion polls even worth our time? That’s the subject of today’s deep dive.

The dangers of forecasting

Let’s look at what constitutes an opinion poll. Take, for instance, the India Today-Karvy poll from January of this year. India Today explained that its poll “was conducted...to gauge the nation's atmosphere ahead of the all-important Lok Sabha elections. A total of 12,166 interviews were conducted - 69% rural and 31% urban - across 97 parliamentary constituencies in 194 assembly constituencies in 19 states. An additional 1,103 interviews were conducted in 20 parliamentary constituencies across Uttar Pradesh to analyse the most populous state in the country, and therefore, one of the most pivotal states in the upcoming Lok Sabha elections. The addition of extra interviews conducted in Yogi Adityanath-led Uttar Pradesh takes the total sample size of the survey to 13,179…(it) followed a multi-stage stratified random sample design, where all interviews were conducted face-to-face using a standard structured interview questionnaire.”

Those are some impressive numbers, albeit still minuscule given the large population of this country.

A cursory look at the results of opinion polls between early 2013 and May 2014 shows that they got things wrong, especially on whether the BJP would get a full majority. The most that the NDA, the alliance the BJP is a part of, received in over 15 such opinion polls was 275 Lok Sabha seats, while the BJP by itself was expected to win around 220 seats. Come result day, the BJP won a full majority with 282 seats, and the NDA decimated the opposition to win 336 Lok Sabha seats. Even more damning was the number of seats predicted for the Congress party. The party won 44 seats, only around 50% of the absolute minimum number predicted. The ruling coalition, the UPA, won just 60 seats, again only half the seats predicted by various opinion polls. It is no surprise, then, that the fortunes of the fortune tellers fell rapidly after that. The exit polls were not particularly accurate either, which didn’t help reputations.

The errors in our predictions

Yogendra Yadav - previously a member of the Aam Aadmi Party, now national president of Swaraj India, and a psephologist who is a regular on news channels - wrote in The Print, “I do not think the pollsters are deliberately misleading us. I have been in this business of election forecasting. I know that pollsters hate to get their forecast wrong. And they pay a big professional price when they do. The reason for a possible failure is technical.”

He explained that there are different types of errors. One error comes about because, according to Yadav, “They are risk averse, and hence play it safe. They get the winner right, but underestimate the extent of victory. Typical recent examples would be the AAP’s or the BJP’s sweep in Delhi or Uttar Pradesh, respectively. Most pollsters predicted the victor but almost no one got the extent of the victory right.” Another type of error, Yadav claims, is “...where one gets the victor wrong or predicts a big sweep that does not happen. This is every pollster’s nightmare, and often happens when the pollster takes a risk and goes strictly by what the numbers indicate. My own misadventure in projecting a clear defeat for the BJP in Gujarat in 2017 illustrates (this). I saw two credible polls showing a decline for the BJP in the last four weeks and simply projected it onto the final outcome. Pollsters who predicted a clean sweep for the Congress in Rajasthan based on correct pre-poll data made the same error.”

Yadav supports his claim by pointing to opinion polls from the NDA’s famous loss in 2004. He wrote, “When Vajpayee advanced the general elections by six months, every poll expected the NDA to come back with over 300 seats. Pre-election polls closer to the election date gave an average of 271 seats to the NDA...When the results came, the NDA won 187 seats….the BJP’s own tally...a paltry 138...” So what went wrong in 2004? Yadav explains, “Basically, all the pollsters played it safe in estimating the loss of seats for the BJP...since they made the same error in a number of states, the accumulation of small errors led to a big blunder....in 2004, all pollsters played it safe by understating the BJP’s losses in most of the states. Instead of cancelling one another, forecasting errors stacked up…(it was) one of the most embarrassing polling blunders of recent times.”
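Yadav’s point that same-direction errors stack up instead of cancelling can be illustrated with a quick simulation. This is a toy sketch, not any pollster’s actual model; the state count and error sizes are invented. When per-state seat errors are independent they largely cancel at the national level, but a shared “play it safe” bias pushes every state the same way.

```python
import random
import statistics

def national_error(n_states=20, trials=10_000, rho_bias=0.0):
    """Simulate per-state seat-forecast errors and return the spread of
    the resulting national (summed) error. With rho_bias=0 the state
    errors are independent and mostly cancel; with rho_bias=1 a shared
    bias (same sign everywhere, as Yadav describes for 2004) stacks up."""
    totals = []
    for _ in range(trials):
        shared = random.gauss(0, 5) * rho_bias     # common 'play it safe' bias
        errs = [shared + random.gauss(0, 5) for _ in range(n_states)]
        totals.append(sum(errs))
    return statistics.stdev(totals)

random.seed(1)
print(f"independent errors: ±{national_error(rho_bias=0.0):.0f} seats")
print(f"shared bias:        ±{national_error(rho_bias=1.0):.0f} seats")
```

With these invented numbers, the shared-bias case produces a national error several times larger than the independent case, which is exactly the “errors stacked up” effect Yadav describes.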

M Ramesh, writing in Business Line, essentially concurred with Yadav. He wrote, “when media houses know their methodology is not good and the surveys are very different from the general predictions of all others, they lose faith in their data. In such cases, they tend to play it safe and give out a number which is more in line with the trend, rather than stick to what their data show — widening the prediction-result chasm.”

Karthik Shashidhar observed in Mint back in April 2014 that opinion polls need to be credible, and that the situation in India was far from it. Apart from CSDS-Lokniti, no opinion pollster in India deems it necessary to publish its methodology. We are forced to accept the numbers without any information on how many people were surveyed, how they were chosen, and what the margin of error in the prediction is.

Aakar Patel, who heads Amnesty International in India, observed in a column in Outlook, “One of the biggest problems pollsters have is that people often lie in India. Not because we are devious...but because we have no confidence in how this information will be used and whether revealing it will harm us in some way. The agency CSDS does a post-poll survey in which it sends its staff to people’s homes and conducts in-depth 30-minute interviews. (In contrast) Often the polling agencies seem to be merely guessing.” An Economic Times article from 2014 reported that BofA-ML had highlighted the fact that opinion polls tend to get it wrong in Indian elections, and that any observations based upon them must therefore be taken with appropriate caution.

A piece in Hindu Business Line explained that mathematician and psephologist Rajeeva Karandikar believes the ideal sample size to be around 50,000 respondents across the country, if the objective is a simple determination of who will win among the NDA, the UPA and others. A smaller sample would be inadequate, while a larger one adds no value. Also, these polls cost money. The Business Line analysis observed that “costs work out typically to Rs 600 for every person interviewed, or Rs 3 crore for a 50,000 sample. Few media houses today are prepared to spend such a sum on an opinion poll — even if they could afford to, the temptation is to take the cheaper offer.” And this is crucial because “error creeps in right at that stage. A low budget opinion poll cuts corners...Ideally, once the sample size is fixed, selection of interviewees ought to be done by ‘systematic’ (or circular) sampling...If, say, 20% of the Lok Sabha constituencies are to be randomly identified, then systematic sampling goes like this: take the list of constituencies, pick one randomly from among the first five, then choose every 5th from that number.”

The analysis added, “This method is well-known to pollsters, but not everyone of them adopts this, because of time and money constraints. Instead, they yield to the temptation of sending a team to, say, shopping malls or railway stations with an instruction to pick interviewees of varied profiles in terms of age, gender and social status. You hardly get a representative sample.”
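The systematic (circular) sampling procedure the analysis describes, a random start among the first k units followed by every k-th unit after it, can be sketched in a few lines of Python. The constituency names here are placeholders, not real data.

```python
import random

def systematic_sample(items, fraction):
    """Systematic (circular) sampling: pick a random start within the
    first k items, then take every k-th item after it, where k = 1/fraction.
    For fraction = 0.20 this is exactly the 'pick one of the first five,
    then every 5th' rule described in the Business Line analysis."""
    k = round(1 / fraction)          # e.g. 0.20 -> every 5th item
    start = random.randrange(k)      # random start among the first k
    return [items[i] for i in range(start, len(items), k)]

# Placeholder list standing in for the 543 Lok Sabha constituencies
constituencies = [f"PC-{n}" for n in range(1, 544)]
sample = systematic_sample(constituencies, 0.20)
print(len(sample))  # about a fifth of 543, i.e. 108 or 109
```

Because every k-th unit is taken from a randomly chosen start, every constituency has an equal chance of selection, unlike the shopping-mall approach the analysis criticises.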

Praveen Chakravarty, chairman of the data analytics department of the All India Congress Committee, told India Today, "The increasing inaccuracy in electoral predictions is due to multiple factors. Most forecasters rely on the old style of forecasting using a vote share to seat share model which has lost its relevance today." Rahul Verma of the Centre for Policy Research said, “Polling agencies conduct a survey among electorates which helps them in making vote share estimates...then, statistical models are used to arrive at a seat forecast. It is extremely difficult to make seat forecasts in India for a variety of reasons. I see seat prediction as a second-order problem. I'm rather more worried about vote estimates going horribly wrong.”

Such issues are further compounded because, according to Verma, “most social science departments in India, even many journalists who present poll numbers on the TV screen, do not have basic training in probability and statistics. This has created an environment of unwanted expectations from election polls.”

Are more accurate opinion polls possible?

As we’ve demonstrated, opinion polls can go wrong. So should we all just ignore these polls and the talking heads who can’t stop, well, talking about them? Yadav said, “I am not recommending that you stop following the opinion polls and the forecast this election...I have always argued that a half-decent survey is more informative than drawing room or newsroom gossip. I would just suggest that you focus on the real and interesting trends, instead of just looking at the forecast of the number of seats for each party. The more interesting and useful information in an election survey is about percentage of votes for each party, its distribution across social segments, and the evaluation of the government and its leaders.”

Another thing to bear in mind, according to the Economic Times, is that the margin of error in extrapolating from a sample to something as gigantic and diverse as India’s population can itself make a huge difference to the accuracy of the final tally. When the vote share is then used to estimate the number of seats, the scope for missing the mark becomes large enough to make a mockery of the survey itself. The lay audience for such opinion poll findings is unaware of these qualifications, and gets misled. That presents a case for regulation. Karthik Shashidhar warned in 2014, “If the Indian opinion-polling industry wants to save itself, it is imperative that it come together to form an association, which then imposes strict standards on its members. The primary objective of these standards should be to regain the confidence of the public in opinion polls. The polling agencies need to understand that in the absence of this, their very survival will be in doubt (it doesn’t matter if they are not banned—loss of public confidence is enough of a killer).”
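The margin of error being referred to can be computed with the standard formula for a simple random sample. This sketch assumes a worst-case 50% vote share and a 95% confidence level; it also shows why, as Karandikar argues, samples much beyond 50,000 add little precision.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a vote-share estimate from a
    simple random sample of size n (p = assumed share; 0.5 is worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes: a small TV poll, the India Today-Karvy poll, Karandikar's
# recommended 50,000, and a four-times-larger sample for comparison
for n in (1_000, 13_179, 50_000, 200_000):
    print(f"n={n:>7}: ±{margin_of_error(n) * 100:.2f} percentage points")
```

Quadrupling the sample from 50,000 to 200,000 only halves the margin of error, from roughly ±0.44 to ±0.22 points, while an error of even one point in vote share can swing dozens of seats.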

ET recommended in the same year that all pollsters submit their sampling design, questionnaire and raw data to the Election Commission, and that these be made available to the public at large for independent scrutiny and critique. An expert panel of the Election Commission could then vet each survey before it is disclosed to the public, and different experts could offer their own critiques of the poll findings.

One column in Hindustan Times claims that pollsters would be better off following something called a Bayesian model. In simple terms, it means treating the forecast as a probability that is updated as new evidence arrives. Nate Silver, who correctly called 49 of the 50 states in the 2008 US presidential election, is a big proponent of Bayesian models. Interestingly, Bayesian models have been used in politics, sports, weather and also, for kicks, to predict the winners of reality shows.
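A minimal illustration of the Bayesian idea: treat a party’s vote share as a Beta-distributed belief, start from a prior (say, the last election), and update it with each new poll. The prior and the poll numbers below are invented for illustration; real Bayesian election models are far more sophisticated.

```python
def beta_update(alpha, beta, successes, failures):
    """Bayesian update of a Beta(alpha, beta) belief about a party's vote
    share, after observing poll respondents for and against the party."""
    return alpha + successes, beta + failures

# Invented prior: roughly a 40% share, held with the weight of only
# 100 observations, so new evidence can move it substantially
alpha, beta = 40, 60
# Fold in a hypothetical new poll: 1,000 respondents, 430 backing the party
alpha, beta = beta_update(alpha, beta, 430, 570)
mean = alpha / (alpha + beta)
print(f"updated vote-share estimate: {mean:.1%}")  # 42.7%
```

Each new poll shifts the estimate in proportion to its sample size, so a string of consistent polls gradually overwhelms the prior, which is precisely the “factoring probability into the exercise” the column describes.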

We haven’t heard the last of such criticism of opinion polls in India. According to Chakravarty, there is no accountability for wrong forecasts. He said, “My analysis shows that forecasts were wrong 85% of the time in state elections post-2014. Yet, the same pollsters continue to predict without any accountability. At the very least, viewers should be told every agency's past accuracy record.”

I suppose the one conclusion we can arrive at, given all that we’ve heard during this podcast, is that opinion polls should be taken with a pinch of salt. Que sera sera.

Moneycontrol Contributor
