The British Broadcasting Corporation (BBC) has published a report [pdf] on the fake news industry in India. In the report, titled “Beyond Fake News”, BBC reaches outlandish conclusions like “Nationalism is driving the spread of fake news” based on dubious data, incorrect interpretations, biased sample selection and selective information.

It is almost as if BBC decided the conclusion of their research first and then worked backwards from the data to suit that conclusion.

Following are just 5 aspects that prove beyond reasonable doubt that the BBC research is unreliable at best and fake news at worst.

1. Sample size

The BBC “research” mentions its sample size in no uncertain terms. For a publication as reputed as BBC, one would imagine that before publishing any research paper, their survey sample would be comprehensive both qualitatively and quantitatively.


The BBC sample size as mentioned in their document is as follows:

As can be seen from BBC’s own document, the sample size on which they based their conclusions was 40 interviewees: 10 each from 4 age brackets.

Even at first glance, a rookie would say that the sample size is abysmally low for a vast and diverse country like India.

If one explores the generally accepted norms for sample selection and sample size, one realises that BBC is way off the mark, and that the sample size alone negates their entire survey, research and the conclusions reached thereof.

SurveyMonkey, a reputed survey website, uses the following rule of thumb:

The chart above explains that the sample size is decided on the basis of the margin of error the surveyor is targeting and the confidence level the surveyor wants in the survey.

Confidence level simply means how sure one wants to be that the sample accurately represents the population one wishes to draw conclusions about. Margin of error means how much the answers from the sample may deviate from the true views of the population.

To quote SurveyMonkey:

“So if, for example, 90% of your sample likes grape bubble gum. A 5% margin of error would add 5% on either side of that number, meaning that actually, 85-95% of your sample likes grape bubble gum. 5% is the most commonly used margin of error, but you may want anywhere from 1-10% for a margin of error depending on your survey. Increasing your margin of error above 10% is not recommended”.

According to SurveyMonkey, 5% is the most common margin of error. Even for a small population of 100, one would have to interview approximately 80 people to keep the margin of error within 5%, according to the SurveyMonkey chart.

To put things in perspective: India has a population of 1.3 billion people. There are over 7.83 million active Twitter users in India, a number set to rise to over 34 million in 2019. As of October 2018, there were as many as 294 million Facebook users in India. BBC’s survey sample size was 40.

Let me repeat that. With the entire population of India exceeding a billion, and with millions upon millions of users on Twitter and Facebook, BBC decided to conclude that ‘Nationalism is the driving force behind fake news’ on the basis of 40 people. BBC’s total sample was 80 people across 3 countries, of which 40 were from India.

Even if we take a population size of 1 million people (since the required sample size does not grow in the same ratio as the population), BBC would have had to interview 384 people to stay within a 5% margin of error. I reiterate: they interviewed 40 people.
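The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of Cochran's sample-size formula with a finite-population correction, the same rule of thumb that charts like SurveyMonkey's are based on; the function name and parameter defaults here are our own, chosen for illustration.

```python
def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Minimum sample size via Cochran's formula with a
    finite-population correction. z = 1.96 corresponds to ~95%
    confidence; p = 0.5 is the most conservative assumed proportion."""
    # Base sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Correct downwards for a finite population
    return round(n0 / (1 + (n0 - 1) / population))

print(required_sample_size(100))        # ≈ 80, matching the SurveyMonkey chart
print(required_sample_size(1_000_000))  # ≈ 384, nearly ten times BBC's 40
```

Note how the required size plateaus: a population of 100 needs about 80 respondents, yet a million people still need 384. This is exactly why a sample of 40 cannot be excused by appealing to India's size.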

It suffices to say that the shoddy sample size alone should be grounds to dismiss this BBC report as bogus, propaganda-driven and motivated.

In their defence, BBC has claimed that their analysis is qualitative. But consider the 19-24 age group, a very large and diverse group that is also the main consumer of the internet in India. For this group, the sample size was 10, half of the minimum requirement even if one buys BBC’s justification of their sample size. Though unconvincing, we are including BBC’s response here in fairness, a courtesy we don’t expect from BBC.

Even if we accept BBC’s theory, by the law of large numbers and the Central Limit Theorem, the conventional bare minimum sample size is 30 (n = 30) for the sample estimates to be reliably representative. This is important for consistency and for ensuring that the estimators are close to the population value. By extension of this logic, BBC should have picked at least 30 samples for each age-wise segment individually. Instead, BBC picked a total of 40 samples, rendering their sample inordinately inaccurate, biased and far too dubious to serve as the basis of any serious research on the subject.

This sample was used to determine “why people share fake news”. For such a qualitative assessment, where a humongous population is being painted with a broad brush, the statistics don’t add up.
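Running the same formula in reverse makes the point even starker. Here is a hedged sketch, assuming the standard 95% confidence level (z = 1.96) and the worst-case proportion p = 0.5, of the margin of error implied by a given sample size; the function name is ours, for illustration only.

```python
import math

def implied_margin_of_error(n, z=1.96, p=0.5):
    """Margin of error implied by a simple random sample of size n,
    assuming 95% confidence (z = 1.96) and worst-case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"{implied_margin_of_error(40):.1%}")   # a sample of 40: roughly 15.5%
print(f"{implied_margin_of_error(384):.1%}")  # the conventional 384: roughly 5.0%
```

In other words, even treated as a simple random sample, 40 respondents carry a margin of error of roughly 15.5%, well above the 10% ceiling that SurveyMonkey itself advises never to exceed.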

It does not matter whether the study is qualitative or quantitative in nature. The early stage of designing a research study involves deciding upon a sample size that will render the estimated value as close as possible to the population value. Hence, they got the design of their study wrong at the very initial stage of the research itself.

Besides, BBC doesn’t appear to have great knowledge of Qualitative Research. One of the ontological positions of Qualitative Research is that there are multiple realities and that there is no single objective truth. The purpose of Qualitative Research is not to arrive at objective truths that apply across all situations but to generate hypotheses that can then be tested using Quantitative methods. The focus of Qualitative Research is always on the subjects selected as participants, and great care must be taken to ensure that the participants are genuinely representative of the larger population if one is to derive any meaningful understanding of the larger world. If the selection of subjects is not done in a reliable fashion, then the conclusions derived from them won’t be applicable to any other group.

However, unlike BBC, we like to explore a little more, and hence the other glaring flaws in the report must be pointed out as well. This counter can only be indicative, because countering the entire report would require a book running over a hundred pages.

(Note: This point has been updated with relevant information)

2. BBC passed off truth as ‘fake news’ in its ‘research’

The BBC research on fake news is so shoddy that they have passed off genuine news as “fake news” just to establish the “Nationalism leads to fake news” narrative.

In their research document, they have included one such example.

This news of the citizenry fearing the IBC law and 2100 companies repaying bank loans worth Rs. 83,000 crores is NOT fake news.

It is verified and the absolute truth.

This news was covered by several mainstream media outlets. Following is a screenshot of a report published in Times of India.

This news was covered by OpIndia.com as well. One wonders then if the BBC is passing off real news as fake news and OpIndia is covering the real news, who is the fake news purveyor among the two? But I digress.

The calibre of the researchers behind this BBC fake news research paper is so abysmal that while the snapshot they have included says “diwaliya”, which means bankruptcy, they have transcribed it as “Diwali”, the festival of lights, in the following text. Had they known that it means bankruptcy, they would have recalled that BBC itself had praised the law introduced by Prime Minister Modi as being “good for business”.

3. Doctored data set

BBC has labelled some handles as seeders, amplifiers, etc. of fake news, and the selection of these handles is itself based on ‘sources’ that are biased and have often peddled fake news, as we will see later in this article. This selection is not just biased, but absurd.

Based on “fake news sources” identified by these fake news peddlers, the BBC report makes the following conclusion:

“After plotting the sources who have published fake news, on this network map a pattern does emerge, where we see that handles that have published fake news sit in the pro-BJP cluster”

First, BBC has included a Twitter handle called RealHistoryPix as a purveyor of fake news. This handle is a parody handle, as clearly mentioned in its Twitter bio, and often tweets things which are not ‘news’ or makes assertions on current affairs for satirical purposes. For BBC to identify this handle as a purveyor of fake news is simply outlandish.

On page 87, they name OpIndia.com, R Jagannathan and Swati Goel Sharma as three of the several accounts followed by Prime Minister Modi who are purveyors of ‘fake news’. The BBC report says, “And of the 30 sources known to have published at least one piece of fake news, that sit in the pro-BJP cluster, the @narendramodi account follows 15 of them.” This is where OpIndia is listed. If “at least one piece of fake news” is the criterion, BBC qualifies liberally. That this is an absurd criterion will become clear in the omissions section later in this article, where we show that BBC ignored many outlets that have published not just one but numerous pieces of fake news.

The BBC then goes on to cover their shenanigans with some disclaimers. They say it is possible that these accounts may have spread fake news because of slippages in journalistic standards, or because the ‘entities’ BELIEVE in the truthfulness of the claims. And then the clincher. BBC says, “And finally, it is possible, though unlikely, that the independent fact checkers who have identified these sources as having published fake news have made errors”.

In all of this, they have provided no data as to why OpIndia, Mr Jagannathan or Swati Goel Sharma have been branded fake news purveyors. None at all.
The BBC survey conveniently ignores the deliberate fake news spread by their own sources while branding several others fake news purveyors, with a disclaimer that they might be wrong. The only reason one can imagine these three entities being included in this list is that all three have often exposed the bias and fake news furthered by the very “sources” that selected the handles, like this exposé by Swati Goel Sharma on Indiaspend, one of BBC’s sources.

I would like to take this opportunity to challenge BBC to show instances where OpIndia has deliberately spread ‘fake news’. The two times or so that we did make a mistake, we updated our articles and the Editor tendered an unconditional apology. We challenge BBC to show us where the same integrity has been shown by the very sources of fake news that they treated as an authority for their research.

In fact, here is something that shows the lack of rigour of the researchers and of the BBC report itself. There is a website called ‘ViralIndia.net’. This website was run by one Abhishek Mishra, who was extremely close to senior Congress leader Digvijay Singh and Aam Aadmi Party supremo Arvind Kejriwal, and who had used the website to spread Islamist propaganda and anti-BJP, anti-Modi fake news. In the BBC research paper, this website was listed on page 88 in the pro-BJP cluster. This itself is fake news. Then, as if BBC was fact-checking its own fake news, on page 102 the same website was listed in the anti-BJP cluster. Now, there could very well be two separate pages with that name; however, we couldn’t find one. For a ‘research’ to have presented no details of its dataset, this is an apology for research.

Update: As we had guessed, there indeed appear to be two pages of the same name. However, BBC’s decision to include the supposedly ‘pro-BJP’ page in this analysis map is another example of shoddy work. You can read the details here.

4. Inherent bias in the introduction of the report itself

The introduction to the entire report itself indicates the BBC report’s bias and why they might have concluded that ‘Nationalism is the source of fake news’ in India. In their introduction to the long report, BBC writes: