Flawed practices behind India’s low democracy index rankings

The institutions that publish democracy index rankings are not biased, but they fail to filter out the biases of people who conduct the studies, says Salvatore Babones.

The low democracy index rankings assigned to India never fail to polarise public opinion in the country. Salvatore Babones, associate professor at the University of Sydney, says a clear bias is affecting India’s ranking on the democracy indices. He explains the problems in the practices and methodologies used by the ranking institutions in a free-wheeling interview with senior journalist Yatish Rajawat. Edited excerpts:

Why do you say that the three indices that track global democracies — the ones by the Economist Intelligence Unit, Varieties of Democracy Institute, and Freedom House — do not give the correct picture of Indian democracy?

I want to make it clear that I can't tell you what the correct picture of India's democracy is, because that would imply that I have done my own ranking. What I can tell you is that the evidence produced by these three organisations is very weak. In some cases the data is from the wrong period, and in others the evidence cited does not correspond to the rankings assigned to India. The evidence these three organisations provide does not justify the low and rapidly deteriorating rankings of the last four years.

Do you think there is a conscious effort by these organisations to bring down India's ranking? The EIU democracy index has bracketed India with 53 other countries which they call flawed democracies. Is it possible to introduce a bias into an index that is supposed to be reflective of reality?

I don't think any of these organisations has an anti-India bias. In fact, there is a lot of goodwill towards India in the West, certainly among Western governments, investors and business people. The problem is that the rankings reflect the general opinion of the analysts and intellectuals who know the country. When the Economist Intelligence Unit ranks India, it does not send a team to do fact-finding. Instead, it sends out a survey. All three organisations work the same way: they send a survey to people who are in the field.

In India's case, the survey is answered by Indian intellectuals such as university professors, or by non-resident Indians who teach or research outside India. The work can also go to journalists outside India, authors and public intellectuals. We don't have access to the list of experts they rely on. The only thing we do know is that for each country they rely on people with some acknowledged expertise on that country. It is evident that Indian intellectuals and university professors have a clear bias against Narendra Modi and the BJP.


Well, it is true for some elite universities where there is a clear bias for left-wing ideology…

You are right. To be precise, professors of the humanities and social sciences at elite universities. The surveys do not go to physicists, engineers, or mathematicians. They go to social scientists from top institutions and to non-resident Indians who work at places like Harvard, Stanford, and the University of Sydney. I have met a lot of the people in this social class, and I've seen their writings, which reflect their political biases. I am not saying that their opinions are insincere or not based on facts. However, this is not an unbiased group of people looking dispassionately at the data.

To put it another way, I am a comparative social scientist who is not an expert on India. But I understand the ranking systems, and I see a lot of problems with the methodology that these three organisations use. People inevitably bring their political biases into their evaluations of a country. In my own country, America, more than 95% of university professors donate to the Democratic Party.

If you try to evaluate Donald Trump's presidency by asking a group of academics, you are going to get a very negative evaluation regardless of what the reality of the situation might be. Now, it is possible that the reality is as bad as this group of experts says. Still, they are the ones who decide the rankings, not the ranking organisations.

So, you are saying that the panels of experts are biased against the government. What role do Indian intellectuals abroad play in this?

They play a big role. Again, I don't know the names of the people on the panels, but we can safely assume that they involve the 'usual suspects': journalists at places like the New York Times, professors at places like Harvard and Stanford. They are considered authorities on India, and organisations such as Freedom House, the Varieties of Democracy Institute, and the Economist Intelligence Unit depend on them.

There is nothing wrong with that — if you want to know about democracy in a country, you should go to experts on that country. The problem arises when there is a lack of editorial oversight and attention to detail. I can give you examples of reporting errors in the evidence cited in support of the rankings.


Even the US and France are classified as flawed democracies along with India. These two countries did not oppose the ranking…

There are two reasons for that. The first is that the US and France are at the top of that category, not in the middle or bottom. These are broad categories. So, the US and France have not been downgraded to anywhere near the extent that India was.

India only moved down two ranks in the ranking…

I don't have the numbers at my fingertips, but we can look them up. The second reason is that France and the US don't care about any democracy index. They couldn't care less what organisations in Washington or Sweden say about their country… The problem is not that Indians have thin skin. Indians should care, because India requires the cooperation of the rest of the world in a way that France or the US do not. France is a member of NATO and the European Union. France is not facing unwilling investors. It doesn't have to worry about access to high technology. It's not going to face sanctions.

For India, this is a problem. There is a real likelihood of the US suspending technology cooperation with India if American legislators feel that India is not democratic. Look at US-China relations: there is an absolute ban on US technology going to China because there is a recognition that China is a totalitarian state. I routinely see Narendra Modi being called a fascist dictator on websites and on Twitter.

Ranking agencies don't quote such claims, but they are influenced by them when they classify India alongside Russia. Many American legislators might be hesitant about entering into agreements with India. They know France and the French people; they know America. But when someone says such things about India, they are not sure whether it really is a democratic country. And when they Google it, they get the democracy index rankings.

So you're saying that the rankings have a geostrategic impact on the way India's international relationships are going to be structured. Is there a conspiracy behind this popular narrative and the rankings?

There's no conspiracy, but there is a lot of laziness. Anyone who reads newspapers knows that the top Indian humanities scholars and social scientists associated with the democracy index processes are either Marxists or secular liberals. They are overwhelmingly anti-BJP. The laziness comes in when the editors at the ranking organisations fail to verify the information they gather.

Let us examine the suggestion that journalists are not safe in India because more journalists are killed in India than in any other country (we don't have data for China). But India has a population of 1.4 billion. In India, about 3.5 journalists per billion people are killed every year; in the rest of the world, the figure is 6.3 per billion. By that measure, India is one of the safest countries for journalists.
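The per-capita adjustment Babones describes can be sketched in a few lines. The raw counts below are illustrative figures reverse-engineered from the rates quoted in the interview (3.5 and 6.3 killings per billion people per year), not official statistics:

```python
# Illustrative sketch of the population normalisation described in the interview.
# The killing counts and populations are rough, assumed figures chosen to match
# the per-billion rates quoted above; they are not real press-freedom data.

def rate_per_billion(killings_per_year: float, population: float) -> float:
    """Annual killings normalised to a population of one billion people."""
    return killings_per_year / population * 1_000_000_000

# India: roughly 5 journalist killings a year against 1.4 billion people
india_rate = rate_per_billion(5, 1_400_000_000)    # ≈ 3.6 per billion

# Rest of the world: roughly 43 killings a year against ~6.8 billion people
rest_rate = rate_per_billion(43, 6_800_000_000)    # ≈ 6.3 per billion

print(f"India: {india_rate:.1f}/bn vs rest of world: {rest_rate:.1f}/bn")
```

Comparing raw counts would make the most populous countries look the most dangerous by default; dividing by population is what makes the comparison meaningful.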

So, if you are preparing an index, you should adjust the figures for population, unless you are trying to project an image that is not correct. Let us look at the institutions that do the rankings. The Varieties of Democracy Institute is supported and funded by institutions based in Sweden, Norway and Denmark. This is mentioned on their Facebook page.

We have also been funded by the Europeans. They are funded by the European Union and the European Social Research Council as well, and I believe they receive funding from the US National Science Foundation. These are reputed funders; there is no problem with the funding sources.

Even if the structure of funding is beyond doubt, the rankings are prone to biased reporting, since they depend on data and samples that are biased. Are you saying that there is mendacity in the editorial attitude, in the sense that there can be wilful malice…

I am not sure if it is a wilful mistake or laziness. When you are doing rankings for 200-plus countries, it is a huge effort. A lot goes into it. A lot of trust is placed in the people who contribute to the reports and who write them. There is a lot of delegation that goes into creating these reports. Remember, they are not producing a standalone India report. The problem is not in the overall process.

The problem is the failure of the editors and administrators to ask the right questions: Is that right? Can we really use that figure? Has that figure been properly compiled? I have interviewed some Indian academics. I went through every single footnote in these reports and chased down the original sources, and I asked whether the original sources support the interpretation in the final report. That is where all the evidence in my paper comes from.

It is technically not a paper but an article, because it was not published in a journal…

Call it whatever you like. It is not a peer-reviewed paper in a scientific journal. It is a critique published in a literary magazine called Quadrant, and later republished on the website of ThePrint in India.

Why didn't you go in for a peer-reviewed publication? The piece could then have been used to rectify the ranking systems or the processes that these three institutions follow.

I planned this simply as a working paper at the University of Sydney, but the editor of Quadrant expressed an interest in it. I might turn it into a peer-reviewed paper, but it would be very difficult for me to publish it. People romanticise the academic peer review process, as if peer review meant some endorsement of quality.

I have a Google Scholar h-index of 26, which is very good for a social scientist, and I have published dozens of research papers. Even so, it is very difficult to publish anything with which your reviewers politically disagree, and I would be very surprised if the paper made it through peer review. I want to emphasise again that I have not rated Indian democracy in this article. I am not saying it is good, bad or indifferent. I am saying that there are a number of errors in the ratings, and that should be clear to anyone who reads me.

Could you please give us some examples of such errors that are mentioned in your article?

Sure. Sedition charges are very controversial in India, and the use of sedition charges has routinely been cited as an example of how the government is repressing political dissent. Until 2014, sedition did not figure in reporting on the Indian justice system; an NGO actually went back and compiled the numbers from the year 2000 onwards.

The V-Dem report says sedition charges spiked during the decade 2010-2020. A decade is 10 years — it should be either 2011-2020 or 2010-2019. Arbitrarily choosing beginning and end years is tricky: if you go a year either way, you can get a different result, and if you strategically choose the range, you can manipulate the results.

I will give another example: UAPA arrests. As an American, I am not a fan of emergency regulations. I would be very happy if international democracy organisations rated India downward because of its use of emergency powers like the UAPA. I would like to see them do the same for Canada, which recently declared a state of emergency when truckers protested in Ottawa. I am much more concerned about the UAPA. The incidence of its use has been flat, except for a spike of about 10% in 2018-19, after which it is flat again.

So, if you strategically pick a window, say 2017 to 2019, there was a rise in the number of people charged under the UAPA. But if you choose any other period, you just get a flat line. Yet the number of UAPA cases between 2018 and 2019 has been used to assess the current status of Indian democracy.
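The window-picking problem Babones describes can be made concrete with a toy series. The case counts below are invented to mimic a series that is flat except for one small step, not real UAPA statistics:

```python
# Hypothetical illustration of how the choice of start and end years changes
# the reported trend. The counts are invented for illustration only.
cases = {2015: 100, 2016: 100, 2017: 100, 2018: 110, 2019: 110, 2020: 110}

def pct_change(series: dict, start: int, end: int) -> float:
    """Percentage change between the chosen start and end years."""
    return (series[end] - series[start]) / series[start] * 100

# Straddle the step and you can report "a roughly 10% rise"...
print(pct_change(cases, 2017, 2019))   # ≈ 10

# ...pick almost any other window and the series is flat.
print(pct_change(cases, 2018, 2020))   # ≈ 0
print(pct_change(cases, 2015, 2017))   # ≈ 0
```

The same underlying data yields either "a sharp rise" or "no change at all" depending entirely on which two years the analyst chooses to compare.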

Some of the arrested people had been charged by the earlier government on similar grounds, for their association with organisations banned in 2009. Yet these arrests are used to explain the decline in democracy under the current government. These examples show the flaws of the ranking system: the agencies rely on people who filter the details. Editorial supervision is needed to make sure that these kinds of claims do not get into the reports.

Is there one common organisation helping the rating agencies cherry-pick data? Or is it done by two or three different individuals who work for all these organisations?

No. This is a class issue. I can see reports in the New York Times and the Washington Post, and academic publications from Harvard and Stanford, that cast India in a negative light. I don't know who is Indian and who is of Indian descent. But it is pretty obvious that intellectuals don't like the BJP, and the BJP frankly doesn't seem to like intellectuals.

I don't agree with that part. It is a problem with a set of intellectuals who have been side-lined by the regime…

I don't have data for India, but I have data for the US, where it is overwhelmingly clear that intellectuals tend to support the Democrats and oppose the Republicans. The same pattern would replicate in India, and it should not surprise anybody, though it may be politically incorrect to say so. I can't give you the data for India; no one has done a survey as far as I know, but it is one of those things you know, and people can make their own judgment. The problem is with the filtering of data for the democracy index. The three agencies do not send out fact-finding missions to 200 countries. Their surveys are answered by people who are bona fide experts on India.

So, would you say that these three organisations need to widen their pool and sample size?

No. I think they need to be cautious about their democracy index being drawn into domestic politics. If a country like Bhutan is ranked low because of methodology issues, it won't make headlines or be reported in major Western newspapers. That is not the case for India. The agencies need to be careful about their methodologies and processes, and utmost care is needed not to become a proxy weapon in domestic political battles. I think that is happening with the rankings on India.

So, you’re saying that democracy index rankings have been politically weaponised by Indians?

I suspect that the pool of intellectuals who study India genuinely hold the opinions they hold, but they are susceptible to reinforcing their own biases. It is a perfectly legitimate opinion; there is no problem in being Indian and being anti-BJP. The problem comes when you allow your personal political viewpoint to colour your evaluation.

I do know from the citations and footnotes provided to substantiate the rankings that a large number of people are misrepresenting India. They ask, why are you supporting that fascist? And I hear this from highly credentialed colleagues, not from people on Twitter. I am not supporting any regime; I am simply working in the interest of truth. I don't have any evaluation to offer on Indian democracy. I have an evaluation of the people who evaluate.

(K Yatish Rajawat is the founder and CEO of the Centre for Innovation in Public Policy, a Gurgaon-based think tank.)