Pisa, Timss and Pirls of Wisdom

Much is made of international comparison, and the government seems to be obsessed with England’s relative performance on a world stage. However, just what do these international comparisons show? Here is a little information:


The latest edition of the Programme for International Student Assessment (Pisa), run by the Organisation for Economic Co-operation and Development (OECD) and based on tests taken in 2015, shows that:

  • The UK is still lagging behind leading countries and has made little progress in international rankings since results three years ago.
  • When we focus upon the top 10% of pupils in science, England is among the world’s leading countries. In only three countries (Singapore, Taiwan and Japan) are the top 10% of pupils more than a school term ahead of the top 10% of pupils in England in science.
  • In maths, the UK is ranked 27th, slipping down a place from three years ago – its lowest position since it began participating in the Pisa tests in 2000.
  • In reading, the UK is ranked 22nd, up from 23rd, having fallen out of the top 20 in 2006.
  • The UK’s most successful subject is science, up from 21st to 15th place – the highest placing since 2006, although the test score has declined.


The four-yearly Trends in International Mathematics and Science Study (TIMSS), most recently conducted in 2015, shows that:

  • England is above average in maths – and ahead of many European countries – but it has not made any significant progress in rankings, despite the ambitions of ministers that overhauling the school system would tackle “stagnating” performance.
  • In these latest TIMSS tests, England has fallen one place in maths at both primary level (from ninth to 10th) and secondary level (from 10th to 11th).
  • In science, England’s primary pupils remain in 15th place, but have risen from ninth to eighth place at secondary level.


The Progress in International Reading Literacy Study (PIRLS), undertaken every five years and involving children aged about 10 in 40 countries, indicated in 2011 that:

The reading performance of children in England had fallen from third to 19th in the world.

A randomised sample of 170 schools from across England was selected to take part in PIRLS 2016. The results will be published this year.


RE Assessment simplified (at last!)

For a good many years, assessment in RE has been against a series of ‘I Can’ statements, linked to notional levels similar to those in the pre-2014 National Curriculum. Now, as a National Society (Church of England) school inspector and an assessor for the RE Quality Mark, I encounter school after school struggling to make sense of assessment in RE. This is often because they are confused about the use of the I Can statements and don’t understand how the ‘levels’ in RE can stand up in a world where NC levels have gone.

My colleague, Emily Norman, and I have now created a simple revision of the existing system – it keeps the familiar but makes it manageable. The materials, including assessment Excel sheets for Years R to 9, have been placed online and are freely downloadable from: http://bit.ly/2hUx0Xl

This simple view is based on the following key principles:

1. It is important to acknowledge that Attainment Target 1 (learning about religion) and Attainment Target 2 (learning from religion) are essential components of RE planning, but they are part of a child’s overall development in the skills and understanding of RE. Therefore, they should be part of any assessment system but not necessarily separated: the separation is a function of planning and formative feedback rather than summative assessment.

2. The I Can statements are laid out against the RE Council’s Six Areas of Enquiry. This makes perfect sense because it aligns the development of skills in RE with the content. Our table therefore includes the REC’s key questions which underpin each area, making it easier for teachers to see how pupils demonstrate their understanding. It also blends current thinking about the content domain and the cognitive domain, and so aligns with NC approaches.

3. It is no longer appropriate to think about Levels in RE or in any other subject. Therefore, our table is laid out in age-appropriate expectations, with an additional line for Year R, and an indication of when pupils are working towards (WT) or working at greater depth (GD).

4. To depersonalise assessment – since it is the teacher who is making the judgement – the first-person references in the old I Can statements have been re-written in the third person.

It is a key principle of assessment that pupils do not progress linearly, yet there was a tendency to use the I Can statements as a ‘best fit’ system which expected linear progression. The most sensible approach, therefore, is to acknowledge that pupils progress at different rates and to reflect this in whatever assessment recording system we use. It would then be simple for teachers to highlight the relevant statements. Where this is used robustly, of course, they would be able to identify evidence that each statement applies. Thus assessment can be much more forensic than the rather hit-and-miss system that sometimes characterises a school’s approach.

Ours is not a revolution, more a sensible revision, but we hope that it will help schools to bring RE assessment into line with other subjects, rather than still expecting some kind of artificial levels. We also see AT1 and AT2 as planning tools; their impact on pupils’ thinking is reflected in the statements.

The knowledge question

The recent post by teachingbattleground, centred on the arguments around a knowledge-based curriculum, reminded me that, while nobody – surely – is arguing that we should not teach children knowledge, it might be timely to explore what we mean by teacher knowledge.

There has always been a narrative around the skills (knowledge? art? craft?) of teaching and, since I frequently get invited to provide CPD in this sphere, I find it helpful to link thinking to the Teachers’ Standards. As a teacher trainer, I often refer trainees to the old QCDA fourfold explanation of teacher knowledge because it sits quite well with aspects of the 2012 standards.

Teachers’ Standard 3 has three key elements: secure subject knowledge, critical understanding of developments in the subject, and the correct use of standard English. The two other aspects relate to knowledge of phonics and early maths teaching. Add to this the expectation, in Standard 2, that teachers should know how children learn, and we can see that the QCDA fourfold explanation is a helpful model. It explains teacher knowledge as:

  • Subject knowledge per se;
  • Pedagogical theory and practice;
  • Understanding how children learn; and
  • The teacher’s own attitudes to learning.

So this means that teachers, and trainee teachers, must pay particular attention to:

  • Their own subject knowledge and its application so that they teach accurately and in sufficient depth.
  • The way that children learn in order to present material that can be quickly assimilated.
  • The effectiveness of the pedagogies they employ so that material can be effectively deconstructed and presented conceptually.
  • Their own openness to key ideas about teaching and learning so that they do not limit their teaching by what is familiar to them.

If the re-think of the national curriculum and its assessment did anything, it moved the focus away from what the outgoing HMCI once called ‘a stultifying methodology’ towards a simpler view of ‘what works’ and, as Sir Michael pointed out, ‘what’s good is what works’. We are still waiting for many headteachers to catch up with this and move away from the straitjacket of WALT, WILF, or whatever set methodology they expect. Improve teachers’ fourfold knowledge and we should not need the straitjacket.