The UK Citizenship test: Making sure that all new citizens have a good short-term memory.

October 12, 2011

assessment is about measuring that which we should be trying to measure.

Phil Race, Making Learning Happen.

The Guardian website quiz ‘Life in the UK: could you pass the citizenship test?’ has been provoking a lot of discussion amongst my friends. None of my friends, UK citizens or otherwise, have been able to pass the citizenship test yet.

I suspect that the Guardian has selected some ‘greatest hits’ from among the questions and that the most obscure ones have been deliberately chosen. But if the citizenship test is really about assessing British values, British history and British culture, it is a total failure. We can’t be sure that new British citizens are able to participate fully in British society, appreciate British history and understand British customs, but we can be sure that all our new citizens are successful learners of trivia.

Does it measure what we are trying to measure? The Home Office need to read Phil Race.

A critical examination of proofreading (from latest edition of Studies in Higher Education)

June 21, 2011

I find proofreading difficult, especially proofreading my own work. I’ve long taken the view that proofreading my own work is beyond my abilities, particularly when a manuscript has gone through multiple drafts. Friends and colleagues generally concur; “You’re too close to the text,” they sometimes say. I’m always grateful for the professionals who perform this service on my journal articles.

Joan Turner’s critical examination of the nature of proofreading in the most recent edition of Studies in Higher Education is the first treatment of the subject I have come across (not that I have especially been looking out for an article like this, but it caught my attention when the e-mail alert from the journal came into my inbox). Student support centres which provide guidance on writing often emphasise that they are NOT a proofreading service. She writes:

 Such services offer some analysis of issues of style, grammar or rhetorical organisation that students should be aware of and attempt to resolve in their own writing, but they do not provide a ‘clean’ copy or ‘proof’ that the student can immediately submit for assessment (p. 427).

The article engages the question of proofreading from different angles. For example:

  1. Is proofreading a skill which all students should acquire, particularly students whose first language is not English? Is it part of learning to write well?
  2. There is an ambiguous boundary between teaching writing skills and proofreading.
  3. There is a moral question about whether getting someone else to read an assessed paper is unfair. And is there an ethical difference between asking a friend to read your work and paying a professional (or non-professional) proof-reader?
  4. Will a proof-reader ‘just’ improve the writing, or will they also improve the content of the text? At what point does using a proof-reader become cheating? What is being assessed: the writing or the content? Is it even possible to separate writing style from content?
  5. Does use of, or overreliance on, a proof-reader lead to lower standards? Does it prevent students from learning how to write well?

Article reference

Joan Turner, “Rewriting writing in higher education: the contested spaces of proofreading,” Studies in Higher Education 36, no. 4 (2011): 427-440.

My top five Teaching and Learning websites (in no particular order).

May 13, 2011

I’ve never written a top 5 list before, but here is a “top 5” list from me.

Phil Race

Emeritus Professor at Leeds Metropolitan University and Educational Development Consultant Phil Race has shared lots of his materials on his website. The compendium of his writings on assessment is ideal for experienced teachers as well as lecturers starting out. He also shares his thoughts on the National Student Survey and his page “If I were in charge…” motivates reflection on the way universities operate.

Oxford Centre for Staff and Learning Development

Lots of great resources here on almost everything you can think of from assessment to plagiarism, internationalism to course design. I’ve found its page on writing learning objectives with its extensive list of appropriate verbs valuable on numerous occasions.

E-learning blog “Don’t waste your time”

I found the website of David Hopkins, Learning Technologist at Bournemouth University, when trying to find out what a QR code was (I’ve yet to feel that the purchase of a smartphone is justified, but all the students seem to have them). The poster downloads on topics like using Blackboard are helpful, and so are the tips on making effective use of blogs and Twitter (lots for me to learn here).


HumBox

Ok, I might be a bit biased here as some of my colleagues at the Subject Centre for Languages, Linguistics and Area Studies led the project to build this teaching resource repository. The HumBox describes itself as “…a new way of storing, managing and publishing your Humanities teaching resources on the web.” The beauty of HumBox is its remarkable simplicity of use. Once you have set up an account you can upload and download resources in virtually any format, as easily as is technologically possible. I really started appreciating HumBox when trying (often unsuccessfully) to use some other repositories (no names will be mentioned here).


A mixed bag as you might expect, but some great material and good production standards. As I write I am listening to a round table discussion on “Why French Matters”. A highlight for me is Dan Judge’s statistics lectures which succeed in making a difficult subject (for me) so engaging.

What does summative assessment measure?

May 4, 2011

Mantz Yorke’s article in the latest edition of Studies in Higher Education, “Summative assessment: dealing with the ‘measurement fallacy’,” has caught my eye. In my view assessment is (and arguably should be) anxiety-producing for the assessor as well as the student. Yorke’s concern, as he notes in the first paragraph, is for the students who are neither very strong nor very weak, the bulk of students “where differences between individuals tend to be small, but can have a large impact on opportunities” (p. 251).

Reflected in its over 70 references, the article covers a lot of ground, including the UK degree classification system and the appropriateness of a classification grade which somehow summarises all summative assessment over the course of three or four years of study. Exchanging the classification for a letter grade or an overall percentage is not in itself any better. Yorke quotes the Vice-Chancellor of Bedfordshire University’s evidence to the House of Commons Universities, Science and Skills Committee: “… as a chemist I would be telling my students not to average the unaverageable, then I would walk into an examination board and do exactly that”.

Yorke identifies many of the familiar and less familiar flaws in assessment and concludes that we need to acknowledge these before any improvements can be made. He writes of taking an overtly judgemental approach to summative assessment in order to challenge the measurement (and seemingly scientific) fallacy of summative assessment. This would (my summary here of Yorke’s points on pp. 262-265):

  1. Make it unambiguously clear (to students and others) that summative assessments are judgements, not refined measurements.
  2. Help deal with the variability in the use of the percentage scale (if a percentage scale is really an accurate way of describing the marks given to students) between disciplines and for different types of assessment within disciplines.
  3. Require students to accept that assessment grades are judgements of their work and not precise measurements. (Perhaps this would be the most difficult sell.)

I can’t do this article justice in summary form—it is well worth a read for anyone who deals with assessment in higher education.

Reference: Mantz Yorke, “Summative assessment: dealing with the ‘measurement fallacy’,” Studies in Higher Education 36, no. 3 (2011): 251-273.