Archive for July, 2010

Monday, July 26th

Most journalists – I like to think, at least – try to probe beneath the surface a little in seeking insight into what is going on. Sometimes, this process can be frustrating, especially when evidence comes in which does not fit a preconception, hypothesis…or strong news line.

Late last week, I had some experience of this when doing some number-crunching on this year’s national test results.

I wanted to find out if the thousands of schools boycotting this year’s tests – some 26 per cent of the total – would have any effect on the national data generated at the end of the process.

My thinking went along the following lines. I knew that strenuous efforts were made by the testing authorities – principally, the Qualifications and Curriculum Development Agency (QCDA) – to hold the standard of the tests constant every year.

That is: the QCDA oversees a complex process, including “pre-testing”, whereby it is supposed to ensure that it is just as easy – or difficult – for a child of a certain ability and understanding of their subject to achieve a certain level one year, as it would be the next.

The argument is that, when entire national cohorts of 11-year-olds take the tests every year and the difficulty* of the tests is held constant, the numbers passing them should give a good idea of the overall national standard of understanding of that particular cohort, at least in the tested parts of each subject.

However, what if, in the case of a boycott, the average ability** levels of pupils taking the tests in one year changed? What if pupils from the boycotting schools tended to come overwhelmingly from those who would have been expected to achieve the Government’s target level very easily? In that case, assuming there was no major change in the overall national ability profile compared to the previous year, the sample of pupils taking the tests would have been skewed: the number of high-achieving pupils sitting them would have fallen, because many would have been part of the boycott.

If the tests were just as hard as in previous years, results would then fall because of the number of likely high achievers taken out of the test-taking “sample”. But this would say more about the effect of the boycott than it would reflect any fall in national standards overall.

Conversely, if the boycotting schools tended to have a pupil population of lower-than-average ability in the tested subjects, then results might rise as the proportion of high-ability youngsters taking the tests went up.
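To put rough numbers on that reasoning, here is a minimal sketch (the cohort size and pass rates are invented figures, purely for illustration, not real 2010 data) of how removing boycotting schools with different ability profiles from the test-taking sample moves the headline pass rate, even when the underlying national standard is unchanged:

```python
# Illustrative only: the cohort size and pass rates below are invented, not real 2010 data.
COHORT = 600_000            # rough size of a national Year 6 cohort (assumption)
NATIONAL_PASS_RATE = 0.80   # share who would reach level 4 if every pupil sat the tests
BOYCOTT_SHARE = 0.26        # roughly one in four schools boycotted

def observed_pass_rate(boycott_pass_rate: float) -> float:
    """Pass rate among pupils who actually sit the tests, if the boycotting
    schools' pupils would have passed at the given underlying rate."""
    boycotting = COHORT * BOYCOTT_SHARE
    sitting = COHORT - boycotting
    passers_overall = COHORT * NATIONAL_PASS_RATE
    passers_in_boycott = boycotting * boycott_pass_rate
    return (passers_overall - passers_in_boycott) / sitting

# Boycotting schools full of likely high achievers: headline results fall.
print(f"{observed_pass_rate(0.90):.1%}")  # about 76.5%
# Boycotting schools with lower-than-average attainment: headline results rise.
print(f"{observed_pass_rate(0.70):.1%}")  # about 83.5%
```

Either way, the shift in the headline figure reflects who sat the tests, not any change in what the full cohort knows.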

A couple of weeks ago, after getting hold of the list of schools which boycotted the tests, I began to wonder if the second scenario might hold true. I wrote an analysis piece which ranked the regions of England on take-up for the boycott.

Although finding precise patterns was tricky, it did seem that regions with higher proportions of pupils eligible for free school meals tended to have higher support for the boycott, while those with lower free school meal eligibility tended to have higher numbers of pupils taking Sats as normal.

Since a great deal of research has established that pupils eligible for free school meals are less likely to do well in tests, I thought this might mean that the boycott had disproportionately removed from the test-taking sample children who, on average, might have been expected not to do well in the tests.

If this were the case, I thought, even if the results of those taking the tests rose, this might say less about national standards rising, and more about the effects of the boycott.

But this was only a hypothesis. Fortunately, it was testable, if only in a rough way. I got hold of the national test results for 2009 of all primary-age schools in England. I then checked to see if the schools which boycotted the tests this year had higher-than-average results in 2009, or lower-than-average.

And the answer? Well, to my surprise their pupils’ performance, at least in terms of the numbers achieving the Government’s “expected” level four, was almost spot on the national average last year by my calculations.

Nationally, 80 per cent of pupils achieved level four in English, 79 per cent in maths and 88 per cent in science in 2009. Among the schools which boycotted the tests this year, the respective percentages were 79.6, 79.2 and 88.5. The average points score, which the Government calculates by simply adding up the percentage of level fours across all three subjects in each school, was 247 nationally last year and 247.4 in the 2010 boycott schools, I found.
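For anyone who wants to reproduce this kind of check, here is a minimal sketch of the comparison (the file and column names are my own assumptions for illustration, not the format of the official data):

```python
import csv

# Hypothetical input: one row per school, with its 2009 level-four percentages
# and a flag recording whether it boycotted the 2010 tests. Names are assumptions.
totals = {"english": 0.0, "maths": 0.0, "science": 0.0}
n = 0

with open("ks2_2009_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["boycotted_2010"] == "yes":
            totals["english"] += float(row["pct_level4_english"])
            totals["maths"] += float(row["pct_level4_maths"])
            totals["science"] += float(row["pct_level4_science"])
            n += 1

means = {subject: total / n for subject, total in totals.items()}
# "Average points score" as described above: the three percentages added together.
aps = sum(means.values())
print(means, round(aps, 1))
```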

So, there I was expecting to write about how the sample had been skewed massively by the effect of the boycott, and how we should be able to read even less into the national scores than ever this year. Yet the numbers appear not to support such a conclusion.

That said, this may just be luck on the part of the Government. I have checked with the QCDA, whose answer suggests to me that it did not investigate the characteristics of pupils in schools boycotting the tests this year.

It said that its job was to ensure the tests were kept at a constant difficulty level each year and that this process was not affected by the industrial action.

In other words, if for any reason only a proportion of the cohort took the tests in any year and this affected what could be read into the results as a guide to overall national standards, this was not a matter for the QCDA because it was concerned with the difficulty of the tests themselves, rather than the nature of the cohort taking them.

I find that answer strengthens my belief in the merits of national sample tests/assessments as measures of overall standards for England as a whole. Under this system, the testing authorities would, by design, select only a proportion of pupils to take the tests each year, but they would do so in a way that ensured the sample was nationally representative. In other words, they would have to control the sample of pupils taking the tests each year to ensure the results could be read as standing for England as a whole.

By contrast, if the Government sticks with the current testing model, any future boycott action could have an effect on what could be read into the results as guides to national standards.

I think this is more evidence that the problem with the current system is that these tests serve multiple purposes: they are not designed solely as a check on national standards, of course, but also as a check on schools, teachers and pupils, as well as providing evidence for many other aspects of education. So the QCDA seeks to ensure that the tests are comparable to previous years for pupils and schools, but does not see it as its job to investigate any effect on national standards caused by changes in the sample of pupils taking the tests.

The Department for Education told me: “Results from the 2010 Key Stage 2 National Curriculum Tests will be published on 3 August. This publication will include any relevant commentary about the impact of industrial action on the results, including issues of representativeness.”

That looks to me as if the Government will offer some disclaimer next week, urging caution about how much can be read into this year’s results as indicators of national standards, because of the boycott. My number-crunching suggests one should indeed be cautious, but perhaps not as cautious as I thought before looking a bit more closely.

*By “difficulty”, I mean not just how hard the questions are, but how easy or hard it is for a pupil of a given ability to achieve a certain level on a paper each year.

** I am using the word “ability” loosely here, to mean the overall likely capability of an 11-year-old to do well in the national tests. I realise that there are debates over whether ability is fixed from a young age, or, of course, whether test-taking ability reflects overall understanding of a subject.

- Warwick Mansell

No Comments
posted on July 26th, 2010

 

Sunday, July 25th

An article in today’s Observer, setting out a teacher’s observations about how schools have changed during his near 40-year career, includes as “lows” the advent of Ofsted, targets and league tables.

The teacher, Alan Hemsworth, says: “I hate league tables – I think they are so destructive – and it has a spin-off in the classroom, because then everything becomes focused on results, results, results.”

The highs included German exchanges, in my view an example of an educational experience which surely has value aside from the grades it may or may not have generated for each pupil at the end.

Well worth a read:

http://www.guardian.co.uk/education/2010/jul/25/teacher-remembers-40-years

- Warwick Mansell

No Comments
posted on July 25th, 2010

Wednesday, July 7th

A report in today’s Telegraph quoted university admissions tutors saying they found it difficult to choose between A-level candidates partly because of “teaching to the test”:

http://www.telegraph.co.uk/education/educationnews/7872573/Universities-criticise-exam-grade-inflation.html

- Warwick Mansell

No Comments
posted on July 7th, 2010

Wednesday, July 7th

What to make of yesterday’s announcement that just over one in four primary and junior schools took part in the boycott of this year’s key stage 2 tests?

One head I spoke to on Monday, before the final figure was announced, said he thought that overall, take-up of the boycott had been disappointing. The head, who backed it strongly, said this had been the profession’s one chance to really stand up to government on the issue, and it had bottled it.

I guess many of those sympathetic to the boycott might hold this view about the one-in-four figure, and clearly, I am an outsider to this process who has only come to be familiar with the intricacies of this debate in the last few years. Hands up here, too: I also write a regular blog, often on assessment issues, for the National Association of Head Teachers.

But I’m not sure that a figure of 26 per cent non-compliance with the testing regime as it stands is insignificant. One in four heads were so unhappy that they were prepared to take a decision which looks to me quite brave, in the face of reports of pressure from local authorities – and, no doubt, some governing bodies – to go through with the tests; I think that says something about the level of dissatisfaction out there. School leaders, it has been argued persuasively, are not naturally disposed to industrial action of this sort. And the figure was reached despite some heads’ reservations about the timing of the action: for legal reasons, the unions were unable to ballot for a boycott earlier in the academic year, meaning that many schools had already put in months of test preparation before being asked to join the boycott. I think this will have put off some heads who dislike the tests from refusing to administer them.

In fact, the position we find ourselves in, in 2010, with the testing system now having been in place for a decade and a half, and schools having had to work within that structure throughout, is quite remarkable, I think. Having seen this system close up for all that time, one in four heads are not only still very unhappy with it; they are so unhappy that they are prepared to boycott high-stakes testing altogether.

A government which looked at these figures dispassionately could hardly conclude that the system as it currently exists (that is, not just the tests, but the tests backed by hyper-accountability) commands much confidence from those who know it best.

- Interestingly, for those against the boycott who might think that heads took decisions on avoiding the tests against the wishes of parents, the head I spoke to said all his year six parents were asked whether they favoured administering the tests as normal and sending the results off to the government (ie not taking part in the boycott), or teacher-controlled assessment.

They opted, he said, overwhelmingly for the latter, as had parents at other local schools. The school assessed the pupils itself, including using some Sats tests from previous years, and used teachers from another school, which was also taking part in the boycott, to conduct moderation on the marks its staff gave to pupils for some of the tests.

- An analysis of the socio-economic backgrounds and prior achievements of pupils in boycotting schools would also be interesting. The testing authorities will need to have looked at this data to ensure that the sample of pupils taking the tests this year is not markedly different from that of previous years. If it is, national results, due out in early August, could be skewed.

- Warwick Mansell

No Comments
posted on July 7th, 2010

Sunday, July 4th

 I bought today’s Sunday Times intrigued by one of the stories on its front page, under the headline: “Gove plans A-level exam revolution”.

Michael Gove, the education secretary, it said, has announced plans to make the A-level more “rigorous”, by scrapping AS levels.

Universities, it said, would be invited to design new A-levels, which would be “modelled on the new Cambridge Pre-U qualification, taken by a number of leading state and independent schools in preference to A-levels”.

Gove said: “We will see fewer modules and more exams at the end of two years of sixth form and, as a result, a revival of the art of deep thought.”

This, though likely to prove controversial among teachers and sixth formers, sounded fair enough, I thought. Of course, pupils spend far too long preparing for and taking exams in their last years at school – the last four years, for teenagers taking GCSEs and A-levels, are now dominated by them. Addressing that problem is important, I think, for all the arguments that were advanced for the benefits of the AS system during the Curriculum 2000 reforms.

However, one sentence in this report had me shaking my head in disbelief. “The existing exam boards…could continue to offer the AS/A2 combination, but Gove believes schools will abandon these exams as it becomes clear that they do not meet university requirements,” said the report.

Well maybe, but expecting Gove’s suggested return to traditional A-levels to thrive and attract candidates in serious numbers alongside the current incarnation of the exam looks very optimistic to me.

Why? Well, any exam that looks like it could be harder than what is on offer at the moment will struggle to win favour from students, and certainly from their schools, if it is plunged into a marketplace which still features these existing qualifications, and where results pressures are huge.

If teachers and schools think they can get an A grade from something which is called an A-level now, and which gives them the luxury of a half-way result – helping them both to gauge their likely overall success in the qualification and to resit early papers if needed – why abandon it for a system without these qualities?

Gove would no doubt respond that schools and students would do so because his exams would be more highly regarded by university tutors. But I’m not sure. I think the Cambridge Pre-U itself, which sounds very like the Gove A-level, has faced a battle to convince even leading independent schools to abandon the A-level because, however much some might like the Cambridge qualification educationally, A-levels as currently constituted would be perceived by many to give them more control over achieving good results.

Don’t get me wrong: as I have argued, on the face of it there are strong grounds for thinking that Gove’s A-levels would lead to a better learning experience for the pupil. But, as my book argued, schools and pupils are not necessarily using this criterion to select courses, given the results pressures they face.

Launching this exam and then expecting it to win in the A-level “marketplace” looks optimistic at best, and naive at worst. If Gove really wants this reform to succeed, given the huge pressures on schools and students to achieve results, I think he is going to have to go further and abolish the route that he clearly thinks stands to be an “easier” option: the current AS/A2 A-level.

- Warwick Mansell

2 Comments
posted on July 4th, 2010