Archive for March, 2009

 

Right, this post is to introduce a new page I’m setting up on the site, which will be concerned with the use of Fischer Family Trust (FFT) data in schools.

This is an area that I’ve been trying to get to grips with over the last year, amid tales from teachers and others pointing to concerns about the workings of this data system, which estimates pupils’ future exam performance from their previous test scores and other information.
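
To make the general principle concrete, here is a deliberately simplified sketch of how an estimate of this kind might be produced: a statistical model that weights a pupil’s prior test results and then adjusts for contextual factors (the trust is known to draw on, among other things, postcode and free school meal information, as a teacher quoted below notes). I should stress that the variable names and numbers below are invented purely for illustration – this is not the Fischer Family Trust’s actual model.

# A hypothetical, much-simplified illustration of a prior-attainment model.
# The weights and adjustments are invented for this example and are not
# the Fischer Family Trust's actual method.

def estimate_gcse_points(ks2_average_level, ks3_average_level,
                         eligible_for_free_school_meals, neighbourhood_index):
    """Return an illustrative estimated GCSE points score for one pupil."""
    # Prior attainment carries most of the weight (illustrative coefficients).
    estimate = 8.0 * ks2_average_level + 4.0 * ks3_average_level

    # Contextual adjustments, again purely illustrative.
    if eligible_for_free_school_meals:
        estimate -= 3.0
    estimate += 2.0 * neighbourhood_index

    return estimate

# Example: a pupil with average level 4 at key stage 2 and level 6 at key stage 3.
print(estimate_gcse_points(4.0, 6.0, eligible_for_free_school_meals=False,
                           neighbourhood_index=0.5))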

Many of the underlying issues with the use of FFT data in schools seem linked to wider concerns about the implications of England’s test-driven education system. These include the dangers of using information for purposes for which it was never designed, the problems of over-interpreting data, and the inherent conflict between using information on a pupil’s exam performance to help that child improve and using it to hold their teacher to account.

I wrote a piece for the TES on this shortly before I left the paper, which would be a good starting point, I guess, for those interested in this subject. It’s available here.

Since then, I’ve been contacted by a teacher and, separately, a parent, both of whom have made the interesting observation that FFT data can actually act to demotivate children.

The teacher said that she teaches in a school where many children are from poorer backgrounds and arrive with low predictions under the Fischer Family Trust’s “D” indicator – often F or G grades at GCSE.

The teacher adds:  “Firstly, I don’t have to do much to meet my targets for these groups of children and wonder if FFT data therefore helps to maintain the status quo – ie we are not expected to achieve much with the most socially deprived pupils we teach.  Secondly, my experience shows that pupils at this end of the spectrum often have far more ability than is predicted by FFT.  In my first year of using FFT data, a pupil predicted grade F achieved grade B and pupils have regularly exceeded targets in these groups – quite often pupils achieve two or three grades higher than predictions.  This tells me that poverty is not necessarily correlated with ability and we do pupils a disservice if we use data based on this assumption (FFT use of post code and FSM).”

The teacher goes on: “My school (and I suspect many others) uses FFT data not only with teachers but in setting pupil targets. On target setting day, an individual pupil’s FFT data is published for them and parents/carers to see and this is expected to be their aspirational target. At the top end of ability that may be appropriate but at the bottom end – is an E, F or G at GCSE meant to be aspirational? It is certainly demotivating for the pupil. I find myself telling pupils and parents to ignore it, that the data is irrelevant and not applicable to the child – as indeed it often is not.”

Recently, a parent got in contact about the way FFT data was being used in her son’s school. Her son is in year 10. She said:

“He achieved level 4’s at KS2 and level 6’s at KS3, and he has just had the result of the first part of his GCSE double science award in which he achieved a grade ‘A.’ [Yet] his target grades based on Fischer Family Trust analysis done by the school are for grade ‘D’s in every single subject. I have asked the school how this can be so given that he managed level 4’s and 6’s in key stage tests. I was always led to believe that a level 6 would equate to a grade B at GCSE. He is currently attaining grade B level grades in most subjects but he has not had his target grades increased to match his performance and hard work. I am incensed that his diligence and the good teaching he is having are not being recognised by the school as an institution.”

I would welcome any comments on this, particularly as it seems to me that Fischer Family Trust data is increasingly influential in terms of how teachers and even pupils are judged.

I will also shortly be introducing another new category, called Data watch: Ofsted, which will look at the use of statistics in inspections.

 

- Warwick Mansell

7 Comments
posted on March 31st, 2009

A powerful article was published yesterday on the effects of Government targets on the quality of care in NHS hospitals. It was written under a pseudonym, which of course gives a clue as to the difficulty people often feel in speaking out about problems with the regime they work under.

I include a link to it here because I, and I’m sure others, can see a pattern emerging in the coverage of what happened at this hospital and the Baby P incident, and a link with my observation of what has happened in schools over the past 20 years.

It is, of course, that the obsession with statistical indicators as a way of holding people to account can actually get in the way of what should be the core purpose of public service professionals’ work. And the lack of trust embodied in the use of statistical output measures becomes counterproductive because it effectively writes off the sense of service and commitment that many working within each of these services undoubtedly have. This does not mean, of course, that we can go back to simply trusting professionals unconditionally. But a high price is being paid under the present system, which frankly looks stranger and stranger the more I find out about it.

I hope to write more in the future about what appear to me to be links between these cases.

- Warwick Mansell

No Comments
posted on March 20th, 2009

I struggle, sometimes, to believe what I’m reading from those who have supposedly been amongst the best-informed and most influential experts on education in England.

These are the Government’s policy advisers, who have helped shape its attitude to what goes on in our schools, and whose opinions may have an impact on the educational experience of millions of pupils.

Prime in my sights at the moment is Conor Ryan, the former adviser to David Blunkett during his time as education secretary, and then to Tony Blair in the years before he stood down as Prime Minister.

Last month, Ryan wrote a piece in the Independent defending national tests, which prompted me to respond.

Now, after reading his latest test-related missive, penned in response to the recent Cambridge Primary Review report on the curriculum, I want to make some further observations.

Ryan argues in his blog that the Cambridge review should not be seen as independent because its members have an ideological opposition “against standards and effective teaching of the 3Rs in our primary schools”. Not only that, but the members of the review want to turn back the clock to “1970s primary education”.

Comparing the contents of this blog, which amounts to little more than politically-driven vitriol, with the depth of understanding and analysis in the Cambridge Review’s 129-page document on the curriculum is like comparing night with day.

Much of the Cambridge report is spent not on reaching hasty judgements on the future of the primary curriculum, but on a careful weighing of evidence and arguments coming from hundreds of people and organisations who submitted views. In doing so, it is guided by the need to seek answers to the most fundamental of questions in this field, including the overall purpose of primary education.

Among its findings, which you may or may not agree with and which I highlight only to show that this is about as far as you can get from a kneejerk critique from a group of ideologues, are that the Government’s numeracy strategy has been accepted as useful by many within the profession in a way that the literacy strategy has not; that claims that pupils don’t need to know much these days, because they can look up information on the internet, are a “travesty” of what good education should involve; and that it is misleading and unhelpful to lament uncritically the organisation of primary teaching by individual subjects.

The report also appears to look critically at some aspects of education thinking which were popular with teachers in the 1970s, including the reluctance to teach in “subjects” and what it sees as the increasingly cliched use, during that period, of the term “development” to describe children’s learning.

The report also comes with what an outsider to England’s education system would surely regard as yet more worrying evidence of the side-effects of the testing regime, which should be taken seriously. And I’m sorry if I’m beginning (well, more than beginning…) to sound like a stuck record on this. But it is, you know, the purpose of this site to document the downsides. Evidence mentioned in the report includes:

- The Music Education Council arguing that many schools are “shackled” by league tables and performance statistics [relating to English, maths and science] and are therefore reluctant to divert resources into music. Teacher trainers, added the report, “argued that music had become so marginalised that it could disappear from the curriculum altogether”.

- On reading for pleasure, “one witness, citing her experience as an English teacher, primary head and English examiner, condemned the ‘abject state of affairs’” where this “has disappeared under the pressure to pass tests”.

- The National Inspectors and Advisers Group for Science (NAIGS) reported that “the key stage 2 science Sat has changed the way that science is taught in KS2 classrooms (and viewed by senior managers), to the detriment of the children’s learning experience and love of science”.

- The Association for Science Education said that “teaching to the KS2 test reduces both the extent and quality of primary science at the top end of the primary school”.

Interestingly, and unsurprisingly, the review takes a different view from Ryan as to whether teaching a truly broad and balanced curriculum represents a threat to pupils' mastery of literacy and numeracy, or the three Rs, to use Ryan's shorthand.

Ryan argues: "There is a very real conflict between recognising the need to single literacy and numeracy out for extra time over the other subjects as with the dedicated literacy and numeracy lessons, and making them just another aspect of primary schooling that pupils may or may not pick up along the way....

"A return to a situation where the teaching of these basics is subsumed again into a process of osmosis would destroy [Really?? How many would die?]another generation of primary schoolchildren in the same way that the children of the seventies were failed.”

The Cambridge review says: “The HMI and Ofsted evidence, consistent over several decades, [is] that far from being a threat to achieved standards in ‘the basics’, a broad, rich, balanced and well-managed curriculum is actually the prerequisite for those standards, and this has been demonstrated consistently in school inspection. It is perhaps worth recalling this finding, confirmed in 1978, 1985, 1997 and 2002, in case there be any who wish to defuse the chorus of complaint from our witnesses by arguing that the loss of curriculum breadth and balance since 1997 has been a necessary sacrifice in the cause of improved standards in literacy and numeracy. The evidence could not be clearer. If breadth is attained, so are standards. If breadth is sacrificed, so are standards.”

But perhaps I digress. Claims that this report should not be listened to, because those involved in its publication simply want a return to the 1970s, are ludicrous. The idea that there is nothing to be learned from such a painstaking inquiry really does leave one wondering whether Ryan believes in the concept of education – learning by weighing up such facts as one can get one’s hands on, and actually listening to others’ points of view – at all.

Education in England has changed fundamentally since the time I was at primary school (I left primary in 1982), and those who criticise and question high-stakes testing and other aspects of the political take-over of schools over the last 20 years do not want a repeat of the mistakes of the past. Of course, there are ways of reforming the system without ditching all accountability. And there have been gains. Yet it is, surely, right to question the effectiveness of much of current education policy, especially if one stands back and asks what it is really trying to achieve.

Ryan – who, judging from his posting, I doubt has actually read the full Cambridge review report – and other New Labour advisers have sought to distance the party and its policies from what went before. But, increasingly, this authoritarian attempt to paint those who disagree with them as Luddites and unprincipled oppositionalists – we’re right and everyone else hasn’t read their education history – betrays a sense in which they themselves are trapped in the prison of old debates.

There is not going to be a return to the 1970s in education, even if the Cambridge Review were being taken seriously by ministers. Ryan should relax about that, except that it suits him more to brand his “opponents” in a certain way, so that we must ignore the review’s contents, noting as he does in his post that the review team “have been permanent critics of the changes of recent decades. And it is only in that light that the review’s conclusions can be understood.” In other words, they should not be listened to because they have been critics of this Government’s education policies in the past.

In a flippant moment, I might regard this as scare tactics redolent of the Conservatives’ botched “New Labour, New Danger” advertising campaign against Tony Blair in the run-up to the 1997 election. Or as something out of Animal Farm.
 
Ryan and his ilk are simply not willing to engage with the evidence of the impact of the policies they themselves have created. From a party-political point of view, that may be unsurprising. From the point of view of attempting to safeguard the best interests of our education system as a whole, it is unforgivable.

- As a postscript, there is yet more evidence of ministers somehow not even managing to investigate the impact of teaching to the test. Here is a question asked last month by Adrian Sanders, Liberal Democrat MP for Torbay, of Jim Knight, the schools minister, and the Parliamentary written answer that followed.

Mr. Sanders: To ask the Secretary of State for Children, Schools and Families what research his Department has commissioned to assess the extent of teaching to the test practices in schools. [253304]

Jim Knight: The Department has undertaken investigative research in a small number of primary schools to look at good practice in preparing pupils for key stage 2 national curriculum tests. The findings will be considered by the Expert Group on Assessment as part of their wider research into the future of testing. The group is due to report in March 2009. [Source: Hansard]

In other words, the Government has not done any detailed work on this.

- Warwick Mansell

1 Comment
posted on March 4th, 2009