Utility in a Fallible Tool: A Multi-Site Case Study of Automated Writing Evaluation

Authors

  • Douglas Grimes University of California, Irvine
  • Mark Warschauer University of California, Irvine

Keywords:

automated essay scoring, automated writing evaluation, artificial intelligence, writing, composition

Abstract

Automated writing evaluation (AWE) software uses artificial intelligence (AI) to score student essays and support revision. We studied how an AWE program called MY Access!® was used in eight middle schools in Southern California over a three-year period. Although many teachers and students considered automated scoring unreliable, and teachers' use of AWE was limited by their preference for conventional writing methods, use of the software still brought important benefits. Observations, interviews, and a survey indicated that using AWE simplified classroom management and increased students' motivation to write and revise.

Published

2010-03-02

How to Cite

Grimes, D., & Warschauer, M. (2010). Utility in a Fallible Tool: A Multi-Site Case Study of Automated Writing Evaluation. The Journal of Technology, Learning and Assessment, 8(6). Retrieved from https://ejournals.bc.edu/index.php/jtla/article/view/1625

Section

Articles