New: Journal of Writing Assessment Issue 17.2 Published

Dear colleagues,

The editorial team at The Journal of Writing Assessment (JWA) is excited to announce the publication of Volume 17, Issue 2 of JWA. The issue features five full articles focused especially on issues of fairness and equity in writing program assessment and writing placement.

Thank you to the authors for all of their incredible work, and to the team of reviewers who contributed to this issue.

Volume 17, Issue 2 of JWA includes:

  • Editor’s Introduction: The “Accidental California Issue” – Critical Questions about Fairness and Equity in Writing Assessment and Placement, by Carl Whithaus
  • Construct Validity and the Demise of the Analytical Writing Placement Examination (AWPE) at the University of California: A Tale of Social Mobility, by Daniel M. Gross

    In 2021, the University of California System ended its decades-old timed writing assessment for course placement, due in part to challenges presented by the COVID-19 pandemic. Beyond the practical crisis, however, the event marks a sea change in educational philosophy away from a universalizing model of cognitive development, which dominated in the 1970s and 1980s, toward a concern for social mobility and student self-assessment. The article explores the historical factors that led to this change, including the emergence of the social mobility index as a new method for evaluating student success. It also unpacks UC’s discourse on preparatory education and levels of proficiency, emphasizing instead fairness in writing assessment.

  • Assessment is Constructed and Contextual: Identity, Information Literacy, and Interview-Based Methodologies in the First-Year Writing Classroom, by Julia Voss, Loring Pfeiffer, and Nicole Branch

    Over the past twenty years, the field of writing assessment has moved from critical theories that question traditional models of validity and objectivity (Huot, 2002; Lynne, 2004) to scholarship that exposes how traditional assessment perpetuates inequality (Inoue, 2015, 2019) and advocates new approaches that take social justice as their central goal (Poe et al., 2018). We report on a collaboration between two writing instructors and one librarian that assessed first-year writing (FYW) students’ information literacy when researching and writing with popular news sources. In addition to the typical practice of analyzing students’ written work, this project used interviews as an assessment methodology. This research produced three important findings: 1) minoritized students demonstrated superior critical information literacy skills compared to majoritized students; 2) these differences were made visible through the use of multiple measures (written artifacts and interviews); and 3) the use of interviews is an assessment methodology that invites students to engage in counterstory and draw on personal experiences, revealing new sources of knowledge and countering narratives of deficit. Ultimately, we argue that interviews hold promise for antiracist revamping of student learning outcomes as well as assessment practices.

  • The Strange Loop of Self-Efficacy and the Value of Focus Groups in Writing Program Assessment, by Edward Comstock

    It’s long been presumed that increases in self-efficacy are correlated with other “habits of mind,” including more effective metacognitive strategies that will enable writing skills to transfer to different situations. Similarly, it’s long been understood that high self-efficacy is associated with more productive habits of mind and more positive emotional dispositions towards writing tasks. However, this two-year assessment of College Writing classes at a private, mid-sized, urban four-year university complicates these assumptions. By supplementing substantial survey data with the analysis of data collected in focus groups, we found that the development of self-efficacy does not necessarily correlate with the development of more sophisticated epistemological beliefs—beliefs about how learning happens—nor the development of rhetorically effective “writing dispositions.” In short, suggesting the value of focus groups in assessment, we discovered a “strange loop” of self-efficacy in which gains made towards self-efficacy frequently have an unanticipated, complex, and problematic relation to our desired learning outcomes.

  • Collaborative Writing Placement: Partnering with Students in the Placement Process, by Sarah Hirsch, Kenny Smith, and Madeleine Sorapure

    This paper will discuss how the Writing Program at the University of California, Santa Barbara “flipped the script” on placement by implementing a model that emphasizes the importance of student voices. Our Collaborative Writing Placement (CWP) shares many similarities with Directed Self-Placement (DSP) in that its instrument consists of survey questions and reflective writing opportunities (Aull, 2021; Gere et al., 2013). But it differs from DSP in that students work with writing faculty in choosing the first-year course that is the best fit for them. Through an examination of our initial data and the first two years of CWP’s implementation, our paper will discuss how the CWP offers another avenue for promoting student agency and generating more equitable placement outcomes.

  • Multifaceted Equity: Critiquing a First-Year Writing Assessment through Curricular, Performance, and Reliability Lenses, by Julie Prebel and Justin Li

    This article examines whether a college’s new portfolio-based first-year writing assessment process is equitable. We build on the existing literature by arguing that equity must be assessed through multiple, complementary facets within the writing assessment ecology. We present and operationalize three lenses through which to examine the equity of a first-year writing assessment process. The curricular lens shows that the new writing assessment is better aligned with, and has improved, the classroom pedagogy of our first-year seminars. The performance lens shows the ongoing disparities between students across demographic backgrounds. Finally, the reliability lens reveals faculty differences in how they interpret the writing rubric. We conclude that while the new portfolio-based writing assessment is more equitable, it is also constrained by institutional structures and systems of power that prevent it from being equitable, period.

Matt Gomes, PhD
Assistant Professor of English
Santa Clara University
Associate Editor and Social Media Coordinator, The Journal of Writing Assessment
pronouns: he/him