Over the past week, we have witnessed a reigniting of the rancorous debate about qualifications in Scotland’s Curriculum for Excellence (CfE). This debate has surfaced periodically since 2015, and has focused on allegedly falling attainment and a narrowing of curriculum choice in the senior phase of secondary schooling.

Most recently, further controversy was sparked by coverage of a claim in Professor Jim Scott’s latest report that the percentage of those who gain national qualifications has fallen sharply since the introduction of the new curriculum.

This debate has been of great interest to us, given the focus of our recent publications (Shapira & Priestley, 2018, 2019). We, too, have been critical of many aspects of CfE – the articulation of policy, its implementation and particularly the trend towards curriculum narrowing in the senior phase. Nevertheless, we are also cognizant of the dangers of using research to support political agendas, as appears to be the case in the current furore about qualifications.


Thus, we have some concerns about the methodological rigour of the reports used as the main evidence for supporting an argument that the Curriculum for Excellence is failing young people in Scotland. For example, in our own analysis of attainment data (for years 2011-2017) we have seen evidence of attainment at National 5 and Higher having risen, both in overall percentage passes (out of total number of entries into qualifications, grades A-C), as well as in percentages of pupils who attained five A-C grades at National 5 and Higher levels.

Is Curriculum for Excellence failing young people?

So what is going on? How is it possible to have two quite conflicting interpretations from the same data? We suggest that the issue lies in a lack of robust methodology in the underpinning research; this, in turn, then produces results – and subsequent claims – that are at best dubious, and which at worst misrepresent the data. We cannot address the full range of claims made in the report, but we offer two examples to illustrate claims that are problematic.

Let us first address one of the claims made in the media based on the report that “attainment in Scottish national qualification …in S4… has dropped by at least 32.9 per cent for each level since CfE was introduced in 2013”.

The report uses figures obtained from the Scottish Qualifications Authority (SQA) official statistics. These show 335,397 passes in 2018-2019, compared with 503,221 passes in 2012-2013. Simple maths thus suggests that the total number of passes in 2018-19 stands at 66.6 per cent of the total number of passes in 2012-2013. But can we conclude from this that attainment at National levels 3-5 dropped by 33.3 per cent? The answer is no, and it is necessary to explain why.
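The simple maths behind the headline claim can be reproduced in a few lines. This is only the raw ratio of total passes in the two years, as reported in the SQA figures quoted above; as we argue below, it is not a measure of attainment.

```python
# Raw comparison of total passes, using the SQA figures quoted above.
passes_2012_13 = 503_221  # total passes at National levels 3-5, 2012-2013
passes_2018_19 = 335_397  # total passes at National levels 3-5, 2018-2019

ratio = passes_2018_19 / passes_2012_13      # ~0.666
drop_pct = (1 - ratio) * 100                 # ~33.3 per cent fall in raw passes

print(f"2018-19 passes are {ratio * 100:.1f}% of 2012-13 passes")
print(f"Fall in the raw number of passes: {drop_pct:.1f}%")
```

The calculation is correct as arithmetic, but it compares raw counts across years in which double-counting practices, curriculum structure and cohort size all changed.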

Let’s start with an examination of what is meant by "attainment". Looking at publicly available SQA data, we can see that "attainment" is counted as all exam/coursework passes, across all year groups. Young people who sit multiple qualifications, and pass, are counted multiple times. Attainment at National 5 level is considered to be a pass at grades A-C. Thus attainment is conceptualised as the total number of qualifications gained in a year across all subjects.

There is a major caveat here. Claiming that the total number of qualifications achieved has fallen is not the same as saying that grades have fallen; to claim the former as a fall in attainment is misleading. The reason for this is that a drop in the total number of qualifications achieved is not necessarily evidence of a decline in standards; it may simply be that fewer qualifications are being taken, and there are various factors that need to be considered when analysing this. To simply compare raw numbers from year to year will not account for these.

Over-time comparisons are important because they allow us to understand these factors. Selecting a "baseline year" and comparing the rest of the data to that year is one way of making these comparisons. Yet the selection of the baseline year should be justified, and one should make sure that the comparison is meaningful, comparing "like with like" and avoiding spurious comparisons.

With Curriculum for Excellence, there is a need to reflect on both demographic issues (eg, declining school rolls) and changes in school practices after the introduction of the new curriculum, as these may explain changes in the total number of passes at National levels 3-5 after the introduction of CfE.

Below are some of these factors:

Before introduction of new national qualifications under the CfE, there was a widespread practice of double-counting of passes at SCQF (Scottish Credit and Qualifications Framework) levels 4 and 5 (see our blogpost for fuller explanation).

There was a curriculum narrowing in terms of the number of qualifications that students enter at SCQF levels 4/5 (see our recent research, Shapira & Priestley, 2018, 2019), and the main reduction in the number of subject entries took place during 2012-2013 and 2013-2014.

There was a continuous reduction in the size of the cohort aged 12-18.

Taking these (and other) changes into account, using the year 2012-2013 as a baseline is simply wrong, and a comparison between the total number of passes before and after the introduction of CfE is essentially meaningless.

To meaningfully compare the number of passes over time, we must do so as a proportion of the total number of entrants or awards for each year. Using the data, we can calculate the proportion of students attaining at each level over time, but only if we calculate the proportion at each level using that specific year’s total as a base. Then we can compare trends in the proportion of those who achieve qualifications at a certain level.
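The proportion-based comparison described above can be sketched as follows. The entry and pass figures in this example are made up purely for illustration (they are not SQA statistics); the point is that each year's pass rate is calculated against that year's own total, so a falling raw count can coexist with a rising pass rate.

```python
# Illustrative sketch: compare pass RATES, not raw pass counts.
# The figures below are invented for demonstration only; they are NOT SQA data.
data = {
    "2012-13": {"entries": 100_000, "passes_a_to_c": 53_000},
    "2018-19": {"entries": 70_000, "passes_a_to_c": 47_000},
}

rates = {}
for year, d in data.items():
    # Each year's proportion uses that year's own entries as the base.
    rates[year] = d["passes_a_to_c"] / d["entries"] * 100
    print(f"{year}: {rates[year]:.0f}% of entries passed (A-C)")
```

In this invented example, raw passes fall from 53,000 to 47,000, yet the pass rate rises from 53 per cent to 67 per cent, which mirrors the pattern we describe for National 5 below.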

When we do so, we can see (see charts in our blog) that, after the introduction of CfE, there was a reduction in the proportion of passes at National 3 and National 4 levels but a 14 percentage-point increase in the proportion of passes at National 5 level (from 53 per cent in 2011-2012 to 67 per cent in 2018-2019). This strongly suggests that attainment is actually rising.

It is important to have rigorous independent research to support both policy formation and critique of that policy. We believe that studying the impact of curriculum reform is essential, yet it is necessary to acknowledge the complexities involved and address them by using rigorous research methodologies and avoiding a simplistic collation of figures. There is also a need to look at a broad range of outcomes, including qualification results, the overall level and range of qualifications achieved, and the transitions made and destinations reached after leaving school.

One argument that we have been making in our recent work is that there is simply not enough evidence on the impact of the Curriculum for Excellence. Our new two-year research project, funded by the Nuffield Foundation, will go some way towards addressing this evidence gap.

Marina Shapira, Camilla Barnett, Tracey Peace-Hughes, Mark Priestley and Michelle Ritchie are researchers at the University of Stirling. This is a shortened version of a piece originally published as a blog post.