Best Practice

Exam preparation: Extended writing questions

As part of her work on improving feedback to students, Helen Webb focused on improving performance in GCSE science extended writing questions. She discusses her findings.

At its last inspection, Ofsted rated our school “good”. One of the reasons inspectors gave for the school not yet being “outstanding” was that the quality of marking and feedback was not consistently high across the school.

As a result, one of the aims of our School Improvement Plan was to increase the proportion of outstanding teaching by making sure that the quality of marking and feedback was consistently high across the school.

In response to this, I chose to dedicate an action research project to supporting this aim, focusing my attention initially on my own science department.

This project specifically aimed to improve the provision of feedback for extended writing questions used in examinations for the key stage 4 OCR 21st Century GCSE Science Suite. These are tasks that students found challenging to complete and teachers often found difficult to mark.

I reviewed the literature focusing specifically on issues surrounding marking and feedback and have already written for SecEd on many of my findings (see What does effective student feedback look like? Parts 1 & 2, SecEd, September 2016 – follow the link at the end of this article to find all previous pieces).

Following this, I collated a set of top tips for staff to improve the quality of feedback in general and, by reviewing examiner reports, I sought specific advice that would allow students to make greater progress on extended writing tasks in particular (Effective student feedback: Top tips for teachers, SecEd, September 2016).

I further developed the idea of using targeted comment banks to deliver feedback more effectively to students on these tasks. This strategy also led to significant reductions in marking workload (Effective student feedback: Creating and using comment banks, SecEd, September 2016).

Effective feedback

Feedback aims to bridge the gap between prior or current achievement and the success criteria. According to Wiggins (2012), “helpful feedback is goal-referenced; tangible and transparent; actionable; user-friendly (specific and personalised); timely; on-going; and consistent”.

Ultimately, feedback is successful if it allows each student to make progress as a result of what you say. This became my key focus during this project – what do we say to students so that they can make the most possible progress on these tasks?

Extended writing questions

OCR introduced this style of six-mark extended writing question in 2011, and students have found these questions challenging. According to previous OCR examiner reports, many students did not even attempt to answer them during examinations.

Many student responses lacked appropriate scientific detail and clarity. For example, students frequently wrote “it” or “they” rather than naming what they were actually referring to.

Often students did not read the question carefully or did not use the information given to help them. The examiner reports also advised that students needed further practice writing balanced arguments as many would only discuss one aspect of the problem.

Many students also had difficulty structuring their answers, and did not give the quality of their written communication due care and attention. In response, we now give feedback dedicated to QWC (quality of written communication) and exam technique as much priority as scientific understanding.

The examiner reports stated that candidates should be reminded that written communication is not limited to continuous writing.

Answers that used bullet points or annotated diagrams often resulted in clear communication of all the salient points, and so were able to gain the maximum mark. This information is now routinely reiterated to our students when sharing success criteria.

The reports pointed out that some candidates had a tendency to rewrite the information in the question or to fail to answer the question actually set. As such, they recommended that schools train candidates in strategies such as highlighting significant words in the question, so that they can structure their answer around those points.

They suggested that schools encourage candidates to read the instructions in the question stems carefully; many candidates apparently lost marks by answering a question different from the one in the paper. Candidates who underlined or highlighted key words and phrases in the question stems made that error less frequently.

I have since found that it is useful to ask students not only to circle command words but also to look out for the word “and”, which indicates two or more parts to the question, e.g. “Describe and explain the differences in the three shoots, A, B and C, after 12 hours.”

Much of this advice is now not only routinely integrated into the teacher’s explanation of the task, but also reiterated where appropriate in the specific feedback given to the student on completion of the task.

The examiner reports identified that candidates who had had some practice in organising their thoughts into a coherent sequence tended to contradict themselves much less frequently, and as such scored more highly.

Consequently, to give students regular practice with these extended writing questions, teachers in my department were encouraged to set at least two of these six-mark questions per topic.

When these tasks were set as homework, colleagues, myself included, reported that in many cases they were not submitted at all, were submitted late, or were rushed and therefore poorly completed.

As such, much precious time was wasted chasing students; self or peer-assessment was difficult because not all students had a completed question in class to mark; and teacher marking time was wasted writing comments such as “please take more time and effort with your homework” rather than giving specific, targeted feedback to improve individual skills or knowledge. With many classes, I therefore chose to complete these important tasks in class, with much greater success.

Model answers

Exemplar student answers can be particularly useful in modelling A* responses. We wrote several model answers of varying quality, for different questions, into each scheme of work. This gave students a chance to evaluate the responses ahead of the task, discuss success criteria and start to access mark schemes.

This strategy also provided a starting point for students struggling to get started with their own answer or having difficulty framing their response. By using the targeted comment banks (detailed in my previous piece for SecEd) to provide feedback, I was also able to supply model answers retrospectively for all or part of a question that students had struggled with, and to offer more detailed suggestions or examples of ideas that could be included in their answers.

Feedback given in this way is both specific and individualised, and is far easier for students to interpret than the teacher mark schemes, so it enables students to progress more easily. This technique is also far quicker than writing detailed responses for each student individually.

Peer and self-assessment

The mark schemes for these tasks are not straightforward and not easily interpreted by students. It is also commonly reported that the use of grades can have a diluting effect on the feedback you provide. As such, when organising peer feedback I moved away from students attempting to score each other’s work, and instead encouraged them to select appropriate feedback from a pre-prepared comment bank that would most help their partner to improve their answer.

This strategy has dramatically improved the quality of peer feedback, which in the past was littered with weak comments such as “good work” and “could be more detailed”, with no specific focus on what was good or what detail was needed.

The number of statements displayed on each comment bank depended on the ability of the group. These statements would generally include model answers for each part of the question, common misconceptions, generic feedback for common errors and typical comments relating to the quality of written communication.

Students receiving feedback would respond in the same way as they would to teacher feedback. A rather serendipitous outcome of this strategy was that when students were given more open peer feedback tasks for other activities, the quality of the feedback they gave was notably improved, no doubt because high-quality feedback had been consistently modelled during these tasks. (If this strategy interests you, I have written some suggestions of generic feedback statements for use with targeted comment banks in a previous article to help you get started: Effective student feedback: Ideas for feedback statements, SecEd, October 2016.)

New specifications

From September 2016 we are once again embarking on new GCSE science specifications. While the style of assessment will be modified, the lessons learnt from this action research are still applicable: a focus on sharing clear success criteria; referring to examiner reports for advice and guidance on exam technique and the quality of written communication; using exemplars and model answers not only to help frame students’ answers but also to correct scientific errors and explain misconceptions; and regular practice of difficult tasks.

Ultimately, using specific, individualised and relevant feedback will lead to greater progress in any task.

  • Helen Webb is an experienced science and biology teacher with a professional interest in developing CPD for teachers. She works at Lutterworth College in Leicestershire. You can follow her @helenfwebb. To read Helen’s previous articles for SecEd, visit http://bit.ly/2cLa6UZ

References

  • OCR General Certificate of Secondary Education J241 GCSE Science A, Twenty First Century Science Suite, OCR Report to Centres, January 2013: http://bit.ly/2mruR8y
  • OCR General Certificate of Secondary Education J242 GCSE Additional Science A, Twenty First Century Science Suite, OCR Report to Centres, January 2013: http://bit.ly/2nrf0be
  • OCR General Certificate of Secondary Education J241 GCSE Science A, Twenty First Century Science Suite, OCR Report to Centres, June 2012: http://bit.ly/2nreZEc
  • Seven Keys to Effective Feedback, Wiggins (2012), Educational Leadership (Feedback for Learning, Volume 70, Number 1, Pages 10 to 16): http://bit.ly/2bLx5vI