Assessment can be dangerous…

Professor Stephen Ball hung his head in despair at the end of his answer to the final question after his Robert Owen Centre lecture on Tuesday night (23/02/2016). I don’t think anyone was all that surprised. His talk, which explored lessons from around the world on the use and abuse of assessment as policy, set an agenda of concerns carefully chosen for his Scottish audience. These concerns related to the question of how far crude data garnered from assessment practice are accepted as a major policy driver, and whether or not this is a dangerous thing.
Stephen Ball did not claim to be telling us anything new. Instead, he seemed to be confirming Scotland’s place at the neo-liberal education reformers’ table, but no-one was really in the mood for celebrating. He showed us the different ways we’d got there.
Firstly, we have bought into what Pasi Sahlberg calls the GERM (Global Educational Reform Movement) agenda. Ball shared different examples of education reforms underpinned by assessment and showed the commonality across these systems, now in the company of our very own latest policy offering: the new National Improvement Framework.
Economic performance and competitiveness are at the heart of all these reforms (the neo-liberal agenda), where educational data are harnessed for international comparison. Again, no secrets here: the NIF was offered as a response to the OECD report on Scottish education (for more analysis of the NIF, see Professor Mark Priestley’s blog). The arguments and research on neo-liberalism and education are plentiful, well documented and well rehearsed; see the links on Mark’s blog.
Secondly, in common with other recent education system reforms, the NIF exemplifies what Ball identifies as the policy ratchet: there are no more grand gestures in education reform, just a constant stream of small manoeuvres. We can see this happening here. We have had a number of small reforms relating to assessment (the NIF, new national tests, new Highers), but also to curriculum (early years education and 1+2 languages, for example) and to teachers’ professionalism and practices (Teaching Scotland’s Future, Professional Update). The important questions are: how do they all relate to each other over time, and whose agenda are they responding to? Small reforms don’t announce themselves as loudly as big ones, however, and so they may not invite as many questions.
The NIF is presented to us as a means of closing the attainment gap. There are implications of causality in this: claims about causal relationships are embedded in these reforms, the assertion being that narrowing the attainment gap will equalise opportunities. Schools, however, as we all know, account for only a small proportion of the improvement effect (11–15% according to Stephen Ball); the rest is explained by non-school factors. But the elements in this complex relationship cannot be isolated from each other; to do so is to over-simplify the problem. Conveniently, though, shifting attention from general inequalities to the role of schools depoliticises the poverty problem, according to Ball, recasting inequality as an educational failure rather than a political problem. The solutions are presented as technical, not social or political, and simple, not complex, and are identified as the responsibility of schools. And if it doesn’t work? Well, we all know who gets the blame, and further reforms can be justified.
The sharpened focus on schools in this debate illuminates features of the inexorable but flawed logic of neo-liberalism:
  • management by data, through targets and reporting: performance management
  • evaluating learning and teaching through teacher performances: surveillance
  • rewarding schools and teachers for good performance: exacerbating the problem
  • seeing students as an asset to be invested in, from whom economic advantages can be garnered
  • turning reform into a profit opportunity, generating commercial commodities (such as assessments) which become essential to the process

Ironically, there is a strong case for negative causality here: with the focus increasingly on high achievement, most of these features serve to widen the attainment gap, not narrow it.

This logic is also providing us with new reference points for research. Policy actors such as PISA, McKinsey, Schleicher, Tucker and the OECD are the new voices in research and the policy process, changing the ways research is understood, how policy is done, and the modes of governance. Their data, which command our attention, are presented as scientific practice offering quasi-scientific claims; the Education Endowment Foundation provides a good example of this, with its reductive cost-versus-effectiveness equations to help teachers and leaders evaluate a range of interventions. Not as helpful as you might think; as Biesta (2007) argues,
‘we need to widen the scope of our thinking about the relation between research, policy, and practice, so as to make sure that the discussion is no longer restricted to finding the most effective ways to achieve certain ends but also addresses questions about the desirability of the ends themselves.’
The EEF example narrows our thinking about this: data can be dangerous too.
So there was a general sense of despondency at the end, along with a feeling that we need a different agenda for reform. Reasons to be cheerful were in short supply. Professor Louise Hayward, who introduced the event, urged us to keep the conversations going through networks, and to make sure they are heard. Maybe that’s all we can do, but it’s worth doing all the same. So comments welcome, as ever, please 🙂
Biesta, G. (2007). Why “what works” won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1–22. doi:10.1111/j.1741-5446.2006.00241.x

About catrionao

I'm a lecturer at UWS and a PhD student at Stirling University, studying a school-based practice of teacher professional learning.

2 Responses to Assessment can be dangerous…

  1. renfrews says:

    Thank you for a very informative and thoughtful article. It’s good to see that there are professionals who are able to understand the mechanisms behind current educational changes and to challenge them.
