Assessment viewed sideways

My last post asserted that the world looks at assessment the wrong way. ‘Sideways’, so to speak. I did this with no real evidence to back up my view, so I guess it’s incumbent upon me to at least offer some sort of proof.

To do this I submit the Australian NAPLAN testing program as exhibit A, and the Victorian Certificate of Education’s GAT as exhibit B. As neither is widely known outside Australia, I’ll attempt to set the scene a little along the way.

Exhibit A: NAPLAN

In its own words, the National Assessment Program – Literacy and Numeracy (NAPLAN) “is an annual assessment for students in Years 3, 5, 7 and 9. [It] tests the sorts of skills that are essential for every child to progress through school and life, such as reading, writing, spelling and numeracy.”

However, these are not the words some people involved in the process would use. In the opinion of some, NAPLAN is a pariah. For example, it is the cause of much angst in schools whose published results appear unflattering, and of significant anxiety for students and parents who worry about failing to perform well.

Here’s the thing: NAPLAN is an exercise in meta-data collection. Its value lies in establishing the literacy and numeracy standards of present-day students across the Australian educational landscape, and in locating where there is a regional, institutional, or systemic need for attention or overhaul. The Tasmanian education system, for example, was flagged in the last 12 months as worthy of scrutiny based on the data raised by NAPLAN.

So it’s useful. In fact, I’d go so far as to say NAPLAN is absolutely essential to education policy makers in this country. However, because it’s big-picture stuff, its implications for schools and families are limited and easily misconstrued. Every student receives their results, and the performance of every school’s collective student body is published on the My School website. What that means in effect is that everyone who cannot gloat over unambiguously strong results is left feeling defensive and threatened by appearing mediocre.

I regularly hear accounts of schools that take time out of their curriculum schedule to prepare their students for these literacy tests. I also hear accounts of schools tapping individual families on the shoulder and suggesting it would be a good idea if their little angel were away sick for the testing week. I have anecdotal evidence of students who have developed anxiety over school as a result of these tests. I also know that when my school (an open-enrolment school) asks parents of prospective students to supply NAPLAN data, the request is treated with suspicion and fear that the data will be used as an unofficial form of selection criteria.

I’ve lost count of the number of times I’ve tried to set the record straight on NAPLAN. It has no direct bearing on the achievements of any Australian student in the curriculum they are learning in. Scrutinizing it for meaning is like looking at climate records to predict whether or not it will rain tomorrow. Not exactly a futile activity but equally not a guarantee of precise information. It is nothing more than a diagnostic tool. Being ‘good’ at it is nice. Being ‘bad’ at it is fascinating and worth investigation rather than concealment. I may as well howl at the moon though. People cannot help layering results with status and stigma. Interestingly, the same cannot be said of…

Exhibit B: GAT

The high school finishing certificate of the masses here in Victoria, Australia is the Victorian Certificate of Education, or VCE, and an integral piece of that process is the General Achievement Test, aka the GAT.

A quick look at the official website explains that the GAT is ‘a test of general knowledge and skills in written communication, mathematics, science and technology, and humanities, the arts and social sciences’ – basically an exercise in literacy and critical thinking.

And once again, I’m convinced the attitude to this assessment is lopsided. My circumstantial experience is that students sit this test in the middle of the year (also the coldest part of the year here) and treat it as an exercise in begrudging compliance. It’s a three-hour test under exam conditions with two writing tasks and a set of multiple-choice questions and … blah blah blah.

Students really just don’t want to know about it, and schools want to get it over and done with quickly. Everyone is at the end of a long term, the first semester has just wrapped up, and they’re feeling flat. Schools, for their part, make some effort to prep their students for the test, but my sense is that this is something most staff don’t fully appreciate, and parents are completely uninformed about the whole thing.

But the GAT matters! Unlike NAPLAN, which has no direct effect on curricular assessment, a student’s GAT result comes into play when there is a discrepancy between the school-assessed results and the end-of-year exam. In such situations the system favours the level of achievement that the GAT most complements. Therefore, should a student achieve well throughout the year but then bomb out in the exams, a GAT performed to the best of their ability is there to protect all that hard work. It’s credit in the bank!

The status and usefulness of this assessment seem obscured to me. Not that I want anyone getting anxious about it, you understand; I just think there are a lot of students who overlook its direct value to them.

Sideways Perspectives Corrected

Both testing regimes sit largely outside the daily grind of student curriculum, and both claim to be indicators of general aptitude. For this reason I often hear people say that ‘you can’t study for these things’, which is another unhelpful perspective in my opinion. Students can always practice sample questions and writing genres, because in doing so they develop familiarity with the nature of the questions, the nuances of the wording, and assurance in how to complete the tasks.

It may be that suggesting the tests can’t be prepared for is a way of managing anxiety, but if that’s true then we’re feeding ignorance by missing an opportunity to look at assessment properly.

The question people really need answered is not ‘what is it?’ but ‘what is it for?’ More specifically, what is the assessment designed to measure? In the case of NAPLAN, frankly, there’s not much more in it for individuals than bragging rights. It needs to be taken seriously and completed properly, but its ramifications are for the big picture. The GAT, on the other hand, is undertaken solely for the benefit of establishing individual students’ standards. A strong performance in the GAT safeguards against poor outcomes at the end of the year.

If we could all resist the urge to be seduced by some odd sense of test prestige and focus more on what assessment is designed to do, I think we’d all see things much more clearly.