Opening the Guardian app this morning I was surprised to see an article on religious affiliation as the main story. The bombastic headline explained that ‘Christians now in a minority as the UK becomes less religious’ (emphasis added by me) and the subheadline elaborated that the “proportion of population who identify as having no religion rose from 25% in 2011 to 48.5% in 2014”. If true, this would suggest a major new trend in religious affiliation in the UK, one that warrants a front-page news headline, but has there been one? To spoil the punchline… No. Instead, all we have here is long-observed trends being misrepresented by dodgy statistics and exaggerated reporting. Details follow below the cross:

Reading through the article I was disappointed (but not surprised) to discover that, despite ostensibly covering the findings of some new study, it provided no relevant details to help you locate the actual study. There wasn’t even the usual link to a university press release. The lack of any reference to the primary source is not unusual in how mainstream media reports on new research, but the widespread prevalence of the practice makes it no less frustrating. Instead, like an amateur detective, the reader needs to piece together the likely details of the study in question, specifically:

The lead author of the study is “Stephen Bullivant, senior lecturer in theology and ethics at St Mary’s Catholic University in Twickenham”. The report “analysed data collected through British Social Attitudes (BSA) surveys over three decades” and thus “did not examine data from Scotland or Northern Ireland.”

So far, so straightforward.

As such it might seem reasonable to presume that Stephen Bullivant’s new analysis of the British Social Attitudes (BSA) data is where the reported sensational rise in people identifying as non-religious comes from. But that assumption would be wrong. The figure of 25% cited in the article is taken from the last British census in 2011, whereas the figure of 48.5% comes from a BSA survey from 2014. This discrepancy is significant because the results of a survey can vary dramatically depending on the wording of questions and the categorising of answers. This makes drawing any grand conclusions from differences between percentages collected by different surveys an extremely perilous endeavour. But does the wording of the questions or the answer categories differ significantly between the census and the BSA’s items on religious affiliation? Yes, on both counts.

Specifically, the census asked respondents “What is your religion?”, which implicitly suggests the respondent has one, while the BSA survey asks “Do you regard yourself as belonging to any particular religion?”, which instead carries the connotation that the respondent may not regard themselves as belonging, and also raises the issue of what qualifies as belonging. That asking the questions in these differing ways results in very different responses is not really news, as the BSA stated clearly in its report on religion from 2012:

The difference between the proportions of the population identified as belonging to a religion by the 2001 census and British Social Attitudes can be partly explained by question wording… The difference may also be due to the response options offered; with the census listing the major world religions, and British Social Attitudes listing specific denominations; respondents answering the former would be most likely to see this as a question concerned with ‘cultural classification’ rather than religion (Voas and Bruce, 2004). Finally, the context of the questions is significant, with the census question following one on ethnicity, arguably causing ‘contamination’ of responses (ibid.).

Given that this excerpt is discussing a difference noted in 2001(!) it appears to be a well-established discrepancy between the surveys. Further evidence that this issue is not news comes from a 57-page document produced by the Office for National Statistics (ONS), which covers the in-depth research, consultation, and pre-testing that was conducted to select the specific wording of the religion questions in the census. In this document the difference with the BSA question is discussed in some depth, before concluding that, despite offering some benefits, the question used by the BSA is:

…unclear as to how ‘belonging’ should be interpreted. Most respondents were forced to choose between the two broadest interpretations (being brought up in a faith or belief in a religion and active participation or ‘practising’ a religion). Therefore, asking a question about ‘belonging’ would not be consistent with the decision to collect information on religious affiliation.

Given such well-known differences, why did the author of today’s article, the Guardian’s religious correspondent Harriet Sherwood, choose to compare these two statistics? Being charitable, you might assume it is due to there being a lack of relevant BSA data for 2011 - but you would be wrong. In fact, not only is there BSA survey data for 2011, but their data goes all the way back to 1983! So what figure for the proportion of non-religious did they collect, using essentially the same methodology, in 2011? Take a look at the table below, where I’ve highlighted the relevant results from 2010-2012.

Using these figures, the dramatic increase in the non-religious looks remarkably less dramatic. If we focus only on 2011, the figure rose from 46% to 48.5% in 2014 - an increase of 2.5 percentage points, which is only noteworthy if you completely ignore important things like confidence intervals. To illustrate why this is problematic, consider that if you instead compare against the result from just one year earlier, 2010, we find a 1.5-point decrease by 2014 rather than a rise. Taking an average of the figures from 2010-2012 yields a slightly more reliable baseline of 48%, which makes the breathlessly reported ‘rise’ to 48.5% seem significantly less deserving of the headline space. Indeed, according to the BSA data shown above, those with no religious affiliation have outnumbered Christians in Britain since at least 2009.
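To make the confidence-interval point concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes simple random samples of roughly 3,000 respondents (the BSA’s actual design uses clustering and weighting, so its real uncertainty is somewhat larger than this, making the point even stronger):

```python
import math

def moe(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion,
    assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Assumed inputs: BSA samples of roughly 3,000 respondents per year
p_2011, p_2014, n = 0.46, 0.485, 3000

print(f"2011 figure: 46.0% ± {moe(p_2011, n):.1%}")
print(f"2014 figure: 48.5% ± {moe(p_2014, n):.1%}")

# Standard error of the difference between two independent proportions
se_diff = math.sqrt(p_2011 * (1 - p_2011) / n + p_2014 * (1 - p_2014) / n)
half_width = 1.96 * se_diff
print(f"Observed gap: {p_2014 - p_2011:.1%}, "
      f"95% noise band for the gap: ±{half_width:.1%}")
```

Each yearly figure carries a margin of error of roughly ±1.8 percentage points under these assumptions, and the noise band for the difference between two years is about ±2.5 points, so the celebrated 2.5-point ‘rise’ sits right at the edge of what sampling variation alone could produce.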

However, the census data demonstrates why accepting even this conclusion could be misleading. First, the census is by its nature far more comprehensive: it costs hundreds of millions of pounds to administer, failure to respond can lead to criminal prosecution, and it is estimated to have collected responses from 93.9% of the British population. In comparison, the BSA collects around 3,000 responses each year, and although these are carefully selected and weighted to be as representative as possible, they obviously cannot compete with the coverage granted by a census. Second, the census focuses on collecting data about affiliation with a religious identity rather than strength of belief or extent of participation. The ‘belonging’ item used by the BSA cuts primarily across affiliation and participation in particular denominations, but this doesn’t tell us directly about the strength of an individual’s religious beliefs, or lack thereof.

Setting aside such complexities, the figures over the 30 years since the BSA survey launched demonstrate a definite increasing trend in the number with no religious affiliation, but it’s a trend that has pretty much levelled off in the past 15 years. Additionally, while Church of England affiliation continues to drop, the rates of affiliation with Roman Catholicism and ‘Other Christian’ denominations are fairly steady, while non-Christian religious affiliation has shown a slight increase. The point is that how you interpret these kinds of statistics, and which trends you emphasise, is likely to be highly influenced by your ideological biases, so it is crucial to be extra careful when you read about a study that supports a conclusion you want to be true. Religious belief is declining in the UK, but the proportion of non-religious people didn’t almost double in three years. That’s just bad reporting, and we can’t even tell whether it is the fault of the journalist or the study author, because the article provides no useful detail on the new study.

Despite this, I’m actually inclined to believe that the author of the new study, Stephen Bullivant, is not to blame, and that his study simply provided the necessary grist for a pre-prepared sensationalist narrative that journalists covering religion trot out every few years. If that sounds a little too conspiratorial, consider this excerpt from the BSA’s report on religion from 2012: