Pisa, Timss and Pirls of Wisdom

Much is made of international comparison, and the government seems to be obsessed with England’s relative performance on a world stage. However, just what do these international comparisons show? Here is a little information:


The latest edition of the Programme for International Student Assessment (Pisa) from the Organisation for Economic Co-operation and Development (OECD), published in 2015, shows that:

  • The UK is still lagging behind leading countries and has made little progress in international rankings since results three years ago.
  • When we focus upon the top 10% of pupils in science, England is among the world’s leading countries. In only three countries (Singapore, Taiwan and Japan) are the top 10% of pupils more than a school term ahead of the top 10% of pupils in England in science.
  • In maths, the UK is ranked 27th, slipping down a place from three years ago and the lowest since it began participating in the Pisa tests in 2000.
  • In reading, the UK is ranked 22nd, up from 23rd, having fallen out of the top 20 in 2006.
  • The UK’s most successful subject is science, up from 21st to 15th place, the highest placing since 2006, although the test score has declined.


The four-yearly Trends in International Mathematics and Science Study (TIMSS), published in 2015, shows that:

  • England is above average in maths – and ahead of many European countries – but it has not made any significant progress in rankings, despite the ambitions of ministers that overhauling the school system would tackle “stagnating” performance.
  • In these latest international TIMSS tests, England has slipped one place in maths at both primary level (from ninth to 10th) and secondary level (from 10th to 11th).
  • In science, England’s primary pupils remain in 15th place, but have risen from ninth to eighth place at secondary level.


The Progress in International Reading Literacy Study (PIRLS), undertaken every five years with children aged about 10 in 40 countries, indicated in 2011 that:

The reading performance of children in England has fallen from third to 19th in the world.

A randomised sample of 170 schools from across England was selected to take part in PIRLS 2016. The results will be published this year.

The knowledge question

The recent post by teachingbattleground, centred on the arguments around a knowledge-based curriculum, reminded me that, while nobody – surely – is arguing that we should not teach children knowledge, it might be timely to explore what we mean by teacher knowledge.

There has always been a narrative around the skills (knowledge? art? craft?) of teaching and, since I frequently get invited to provide CPD in this sphere, I find it helpful to link thinking to the Teachers’ Standards. As a teacher trainer, I often refer trainees to the old QCDA fourfold explanation of teacher knowledge because it sits quite well with aspects of the 2012 standards.

Teachers’ Standard 3 has three key elements: secure subject knowledge, critical understanding of developments in the subject, and the teacher’s own correct use of English. The two other aspects relate to knowledge of phonics and early maths teaching. Add to this the expectation, in Standard 2, that teachers should know how children learn, and we can see that the QCDA fourfold explanation is a helpful model. It explains teacher knowledge as:

  • Subject knowledge per se;
  • Pedagogical theory and practice;
  • Understanding how children learn; and
  • The teacher’s own attitudes to learning.

So, this means that teachers and trainee teachers must pay particular attention to:

  • Their own subject knowledge and its application so that they teach accurately and in sufficient depth.
  • The way that children learn in order to present material that can be quickly assimilated.
  • The effectiveness of the pedagogies they employ so that material can be effectively deconstructed and presented conceptually.
  • Their own openness to key ideas about teaching and learning so that they do not limit their teaching by what is familiar to them.

If the re-think of the national curriculum and its assessment did anything, it moved the focus away from what the outgoing HMCI once called ‘a stultifying methodology’ towards a simpler view of ‘what works’ and, as Sir Michael pointed out, ‘what’s good is what works’. We are still waiting for many headteachers to catch up with this and move away from the straitjacket of WALT, WILF, or whatever set methodology they expect. Improve teachers’ fourfold subject knowledge and we should not need the straitjacket.


Some more thoughts about pedagogy

It was way back in 2012 that the Chief Inspector, speaking at the RSA, referred to the ‘stultifying’ effect of formulaic lesson planning. I agreed then and I agree now. And we’ve moved on a bit from 2012; since then we’ve had the Mike Cladingbowl missive on why Ofsted will no longer be grading teachers or lessons, we’ve had the Ofsted document about what they don’t expect schools to do, and we’ve had it all bolted together in the Common Inspection Framework, giving a clear lead to inspectors NOT to expect any particular pedagogy or assessment system, or frequency of lesson observations.

This is very reassuring, so why is it that schools and school leaders still get nervous and require a particular approach? I don’t want to get political here – that’s the OTHER Education Monkey on blogger – so, having spent much time advising teachers on how to plan lessons, I thought it worth a few minutes questioning the whole thing. In the first instance I’ll stick to the learning objective. This is nothing new, I’ve said all this before, but I’m repeating it because it continues to be a source of stress to teachers and leaders.

Let’s not get hung up on what it’s called: lesson objective, learning objective, learning outcome… who really cares? And if you think the kids do, then you’re wrong. Sure, kids like to know what the lesson is about, and teachers MUST understand the specifics of what they expect pupils to learn, but, you know, these two things do not necessarily join up! This is heresy to some school leaders, so I’ll whisper it – you really don’t have to display the learning objective at all. Some of the best lessons I’ve seen have not involved WALT, or WILF, or TIB, or any other members of this menagerie of characters that populate the world of lesson planning. Sometimes kids like to work out for themselves what the focus of the lesson has been. This way, you can be sure that they will let you know if YOU weren’t clear about it.

Quite often they like the challenge in the form of a question. And, as soon as you make it a question, you shift the focus onto learning. So, rather than the tedious ‘We are learning to…’ or ‘I can…’ – both of which are too frequently followed by an activity and not the learning (we are learning to write an autumn poem; I can name the parts of a plant…) – why not try a question? Ask yourself, how exciting is it to a ten-year-old to copy down ‘We are learning to understand the key features of a play script’ (yawn)? So why not simply ask, ‘How can actors remember all those words?’ with the supplementary question, ‘…and how do they know what to do?’ I guarantee you’ll spark kids’ interest more keenly, you’ve saved all that copying out of a convoluted objective, and you’ve immediately given yourself the beginnings of an assessment protocol. Dylan Wiliam, making a similar point, asks, ‘Why can’t we just ask students, “Why is it colder on top of a mountain when it’s nearer the sun?”’

And don’t give me that nonsense about writing down the LO as the title so that you can assess whether or not it was met! If you are clear about the intended learning then surely both you and the pupil can remember that a lesson entitled ‘How can ice and steam be the same?’ was about solids, liquids and gases.

It takes a bit of getting used to, this use of learning questions, but it’s fun. It’s a lot more fun than some of the stultifying lesson aims I’ve seen. You need to practise. Have a go, now. I won’t tell anyone.