Do England’s teacher trainees ‘do worse’ in maths tests?

The BBC article on the latest research by the CfBT Education Trust would make you think that UK teachers are hopeless at maths.

“England’s trainee teachers do worse in mathematical tests than their peers in some major economic competitors, a study suggests. Teacher trainees in schools in Japan, China and Russia, easily outperformed those from England in the tests. England’s primary trainee teachers came second to last out of eight countries with a score of 32.2 out of 60. Japan led the pack with 52.9 out of 60, followed by China on 43.1 and Russia on 41.7. England was also more narrowly outperformed by Finland, Ireland, Hungary but finished above Czech Republic. At secondary level, where teachers specialise in subjects, maths trainees in England came second to last out of seven countries, with only those in Hungary finishing lower. Russia, China, Japan and Singapore were significantly ahead of England’s score of 26 out of 40. The greatest variation between individual trainees’ results was found in the English group, pointing to a variable quality of new teachers.”

Scary stuff but did the study for charity CfBT Education Trust really find a huge variation in the mathematical knowledge of England’s trainees?

There are 400,000+ teachers in the UK. If we take that figure as an average for each of the nine countries in the study, that gives a total population of roughly 3.6 million teachers.

Now this two-year study carried out by researchers at the Centre for Innovation in Mathematics Teaching at Plymouth University subjected ONLY 1,400 teachers to a series of mathematical tests.

In other words, a sample of roughly 0.04% of that population took part in this research.
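To put that figure in perspective, the back-of-the-envelope arithmetic looks like this (the 400,000-teachers-per-country average is, as above, an assumption rather than a counted figure):

```python
# Back-of-the-envelope scale of the sample, using the assumptions above.
teachers_per_country = 400_000   # assumed average per country, not a counted figure
countries = 9                    # England plus eight comparator countries
population = teachers_per_country * countries  # 3,600,000

sample = 1_400                   # trainees actually tested in the study
fraction = sample / population

print(f"Sample fraction: {fraction * 100:.4f}% of the assumed population")
```

So even on the study's most generous reading, fewer than four in every ten thousand teachers were tested.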

Moreover, the selection of volunteers in the report itself isn’t made clear. The methodology section is where you should find this and there’s not much there. It does say on page 7 that “each country had a co-ordinator with a background in mathematics teacher training, in both the primary and secondary sectors. Typically, the co-ordinators were front line teacher trainers with good access to other teacher training institutions and to schools used for teaching practice.”

That is to say, the research relied upon co-ordinators to find volunteers, but no mention is made of how those volunteers were chosen, i.e. the selection criteria. Were they chosen because they wanted to do the test? Were they told they would have a chance to practise their maths? Were they told they could improve their maths by taking the test? Were only trainees and teachers chosen who thought they were bad at maths or had lower grades?

All of this matters a great deal because selection effects, especially with the convenience sample that appears to have been used here, bias the results of the research: the sample is not representative. Indeed, sampling bias is precisely why blinded randomised controlled trials are preferred.
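A quick, purely illustrative simulation (invented numbers, not the study's data) shows why this matters: if volunteers are more likely to come from one end of the ability range, the sample mean stays biased no matter how many volunteers are recruited.

```python
import random

random.seed(42)

# A hypothetical population of test scores with a true mean of about 40.
population = [random.gauss(40, 8) for _ in range(100_000)]

def convenience_sample(pop, n):
    # Weight selection towards lower scorers, e.g. trainees encouraged
    # to volunteer so they can "brush up" on their maths.
    weights = [max(60 - score, 1) for score in pop]
    return random.choices(pop, weights=weights, k=n)

random_mean = sum(random.sample(population, 1400)) / 1400
biased_mean = sum(convenience_sample(population, 1400)) / 1400

print(f"random sample mean:      {random_mean:.1f}")
print(f"convenience sample mean: {biased_mean:.1f}")
```

With this (made-up) self-selection pattern the convenience sample underestimates the true mean by several points, and taking more volunteers does not fix it; only random selection does.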

We guess the real reason for the research can be found in the recommendations. Indeed, the writing is on the wall before it’s even begun. We note on p.8 that the attitude of the research team is very clear: “Mathematics education continues to be an area of concern to the UK and indeed other countries.” The conclusion of the research is already known before it is conducted.

Indeed, the beauty of league tables or rankings is that anything less than first place always looks like a failure, even when it is not. This is the fundamental problem – and raison d’être – of international comparative studies like this and others such as PISA.

Generalising from a sample of roughly 0.04% across only nine countries, Professor David Burghes, the research head, then predictably concludes that UK maths teachers aren’t good enough: “I don’t think many of our trainee teachers have enough conceptual understanding of mathematics at the primary level. Countries that do well at mathematics tend to have a strong foundation at primary school.”

The report then calls for new university training schools to be set up in which trainees would be based, and for secondary maths teachers, who are currently expected to have an A-level in maths, to take specialised mathematics enhancement courses.

So a university and an educational consultancy publish a report that calls for new university training schools and enhancement courses. What a surprise! Of course they are going to recommend this. They are hardly going to say there is nothing wrong, especially in the current economic climate.

This ‘problem’ that the research has invented is then used by other interested parties to promote their ends. Education director at CfBT Education Trust, Tony McAleavy, wades in to say “The establishment of university practice schools otherwise known as university training schools, is the most important decision that could be made for taking the profession forward. This would ensure less variation in standards and would ensure that there would be peer support for new teachers in their first practice; something that has currently been lacking.”

And who is going to help run those practice schools? CfBT perhaps?

The other party with a vested interest in creating a fictional crisis that a fictional solution can mend is the Government. And lo and behold, the DfES, with words such as ‘languish’ and ‘plummeting’, then uses this almost laughable research to vindicate the Tory White Paper, which will sadly see teachers placed in schools without proper training, and perfectly suitable candidates lose teacher-training funding because they did not get a 2:2 in a subject that may not make them a good teacher in the first place.

It may be the case that levels of maths are low among teachers in the UK but this research has not demonstrated that. In fact, this study is a good indication of how not to conduct social research, and why only the results of independently-funded and blinded randomised controlled studies can be relied upon.

In short: bad research, bad policy.

© 2017 EducationState: the education news blog. All Rights Reserved.