Giving Students Control over Writing Assessment

Charles R. Duke, Rebecca Sanchez
English Journal, April 1994, p. 47
Teachers often see any kind of external assessment as an evil to be endured but certainly not embraced. And as more and more states enter the writing-assessment arena, the fear and hostility indexes among teachers have risen dramatically. Some of this apprehension is understandable, since most statewide assessments have not provided much useful information to teachers or to students. In addition, the design of the assessment has not always been clear to teachers, and there has been uncertainty about how results would be used.

Pennsylvania is a good case in point. Because new curriculum regulations included more emphasis upon writing, the state department of education decided in 1992 that a statewide writing assessment was needed. Grades six and nine were the targeted population. With the help of teachers, test experts, and state-department personnel, a holistic writing assessment was designed and field tested. A percentage of districts will be tested each year so that all districts will have been assessed within a three-year cycle (see Pennsylvania State Assessment System, 1992, Writing Assessment Handbook, Pennsylvania Department of Education, Harrisburg, PA).

The Pennsylvania holistic writing assessment is similar to many other such assessments; it uses a six-point scale applied to five key characteristics of effective writing: focus, content, organization, style, and conventions. Two readers' scores provide the holistic assessment; disagreement between readers is arbitrated by a third reader. A student, therefore, could receive a score anywhere from 0 (unscorable) to 12 (a sketch of this arithmetic appears at the end of this section).

As a result of participating in the training provided by the state department of education, we began thinking about how our students could directly benefit from the assessment. We also hoped for a means to provide classroom teachers with ways to incorporate aspects of the assessment in their teaching. One of our principal goals was to give students more control in the assessment process. So we began with the students' ideas about effective writing and helped them to derive assessment criteria that they could use individually and in peer-response groups. We aimed at analytical scoring, which draws upon the same premise as holistic scoring (the whole is more than the sum of its parts) but which also offers a way to talk about writing in a language that is readily accessible to students and which, therefore, helps them with revision. Analytical scoring identifies the key characteristics of writing and provides descriptors of these traits in terms of strengths and weaknesses similar to those appearing in student samples of writing. (See Vicki Spandel and Richard J. Stiggins, 1990, Creating Writers: Linking Assessment and Writing Instruction, White Plains, NY: Longman, 26-72.)

Frankly, we were uncertain how students might take to designing their own assessment criteria, since they were more accustomed to the teacher-driven model of assessment and the "hidden agenda" ordinarily involved in evaluating papers, which required students to guess what the teacher wanted.
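For readers who want the two-reader arithmetic described above made concrete, the following Python sketch is illustrative only: the article does not spell out an arbitration rule, so the `combine_scores` function, the one-point disagreement threshold, and the keep-the-closer-score convention are assumptions added for clarity, not Pennsylvania's documented procedure.

```python
# Illustrative sketch of a two-reader holistic scoring protocol
# (hypothetical conventions; not from the article itself).
# Each reader assigns a whole-number score from 1 to 6; a paper
# judged unscorable receives 0 overall.

def combine_scores(reader_a: int, reader_b: int, arbitrate=None) -> int:
    """Combine two readers' scores into a holistic score from 0 to 12."""
    if reader_a == 0 or reader_b == 0:
        return 0  # paper judged unscorable
    if abs(reader_a - reader_b) > 1 and arbitrate is not None:
        third = arbitrate()  # third reader supplies an arbitrating score
        # Assumed convention: keep whichever original score is closer
        # to the third reader's, then sum the two.
        closer = min((reader_a, reader_b), key=lambda s: abs(s - third))
        return closer + third
    return reader_a + reader_b

print(combine_scores(4, 5))                       # 9: readers agree closely
print(combine_scores(2, 6, arbitrate=lambda: 5))  # 11: third reader resolves
```

Whatever the exact arbitration rule, the sum of two 1-to-6 scores yields the 2-to-12 range, with 0 reserved for unscorable papers, which is how a student's score can fall anywhere from 0 to 12.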
doi:10.2307/821085