The Influence of Variance in Learner Answers on Automatic Content Scoring

Horbach, Andrea; Zesch, Torsten

Automatic content scoring is an important application in the area of automatic educational assessment. Short texts written by learners are scored based on their content, while spelling and grammar mistakes are usually ignored. The difficulty of automatically scoring such texts varies with the variance within the learner answers. In this paper, we first discuss factors that influence variance in learner answers, so that practitioners can better estimate whether automatic scoring might be applicable to their usage scenario. We then compare the two main paradigms in content scoring, (i) similarity-based and (ii) instance-based methods, and discuss how well each can deal with the variance-inducing factors described above.
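The two paradigms named in the abstract can be contrasted in a minimal sketch. Everything below is illustrative and not taken from the paper: the toy Jaccard token overlap stands in for whatever similarity measure a real system would use, and the 1-nearest-neighbour rule stands in for a trained classifier; all function names, data, and the threshold are assumptions.

```python
# Hedged sketch of the two content-scoring paradigms.
# All data, names, and thresholds are illustrative, not from the paper.

def tokens(text):
    """Lowercased bag of word tokens (toy preprocessing)."""
    return set(text.lower().split())

def similarity(a, b):
    """Jaccard token overlap as a stand-in for a similarity measure."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

# (i) Similarity-based: compare the learner answer to reference answers
# and accept it if it is close enough to any of them.
def score_similarity_based(answer, reference_answers, threshold=0.5):
    best = max(similarity(answer, ref) for ref in reference_answers)
    return 1 if best >= threshold else 0

# (ii) Instance-based: learn from previously scored learner answers
# (here: a 1-nearest-neighbour rule over the same token overlap).
def score_instance_based(answer, scored_answers):
    nearest = max(scored_answers, key=lambda pair: similarity(answer, pair[0]))
    return nearest[1]

reference_answers = ["the earth orbits the sun"]
scored_answers = [
    ("the earth orbits the sun", 1),
    ("the earth goes around the sun", 1),
    ("the sun orbits the earth", 0),
]

print(score_similarity_based("the earth orbits the sun", reference_answers))
print(score_instance_based("the earth moves around the sun", scored_answers))
```

The sketch also hints at the trade-off the paper examines: the similarity-based scorer only needs a few reference answers, whereas the instance-based scorer needs a body of already-scored learner answers whose variance it can learn from.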

Citation:

Horbach, A., Zesch, T., 2019. The Influence of Variance in Learner Answers on Automatic Content Scoring. https://doi.org/10.3389/feduc.2019.00028

Rights

Use and reproduction:
This work may be used under a Creative Commons Attribution 4.0 License (CC BY 4.0).
