Text Mining in Students' Course Evaluations - Relationships between Open-ended Comments and Quantitative Scores
Tamara Sliusarenko, Line Harder Clemmensen, Bjarne Kjær Ersbøll
2013
Abstract
Extensive research has been done on student evaluations of teachers and courses based on quantitative data from evaluation questionnaires, but little research has examined students' written responses to open-ended questions and their relationship with quantitative scores. This paper analyzes this relationship for a well-established course at the Technical University of Denmark using statistical methods. A keyphrase extraction tool was used to identify the main topics of students' comments, on the basis of which the qualitative feedback was transformed into quantitative data for further statistical analysis. Factor analysis helped to reveal important issues and the structure hidden in the students' written comments, while regression analysis showed that some of the revealed factors have a significant impact on how students rate a course.
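The pipeline outlined in the abstract (keyphrase extraction, conversion of comments into quantitative data, factor analysis, regression on overall ratings) can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the paper uses the Likey keyphrase extractor (Paukkeri and Honkela, 2010), whereas here TF-IDF-weighted terms stand in for extracted keyphrases, and the comments and ratings are invented purely for illustration.

```python
# Minimal sketch of the comments-to-scores analysis, assuming simplified inputs;
# not the authors' implementation (the paper uses the Likey keyphrase extractor,
# here TF-IDF terms stand in for keyphrases, and the data below are invented).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FactorAnalysis
import statsmodels.api as sm

# Hypothetical evaluation data: one open-ended comment and one overall
# course rating (e.g., a 5-point Likert score) per student.
comments = [
    "good lectures but too much homework",
    "homework load was heavy and the exercises were unclear",
    "great teacher, lectures were well structured",
    "the project work was useful, lectures a bit slow",
    "unclear exercises and far too much homework",
    "well structured course with a useful project",
]
ratings = np.array([4, 2, 5, 4, 2, 5])

# Step 1: quantify the comments as a term matrix
# (a simplified stand-in for keyphrase extraction).
vectorizer = TfidfVectorizer(stop_words="english", min_df=2)
X = vectorizer.fit_transform(comments).toarray()

# Step 2: factor analysis to reveal latent topics in the written comments.
fa = FactorAnalysis(n_components=2, random_state=0)
factors = fa.fit_transform(X)

# Step 3: regress the quantitative rating on the comment-derived factors
# to see which topics are associated with how students rate the course.
ols = sm.OLS(ratings, sm.add_constant(factors)).fit()
print(ols.params)    # intercept and factor coefficients
print(ols.pvalues)   # significance of each factor
```

In the paper itself the factor scores come from the students' actual written comments and the response is the official course rating, so the regression coefficients indicate which comment topics move the quantitative score.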
References
- Abrami, P. C. (2001). Improving judgments about teaching effectiveness using teacher rating forms. In Theall, M., Abrami, P. C., and Mets, L. A., editors, The Student Ratings Debate: Are They Valid? How Can We Best Use Them? [Special issue], New Directions for Institutional Research, pages 59-87.
- Abrami, P. C., d'Apollonia, S., and Rosenfield, S. (2007). The dimensionality of student ratings of instruction: What we know and what we do not. In Perry, R. P. and Smart, J. C., editors, Effective Teaching in Higher Education: Research and Practice, pages 385-456. Agathon Press, New York.
- Alhija, F. N. A. and Fresko, B. (2009). Student evaluation of instruction: What can be learned from students' written comments? Studies in Educational Evaluation, 35(1):37-44.
- Cohen, P. A. (1981). Student rating of instruction and student achievement. Review of Educational Research, 51(3):281-309.
- Damerau, F. (1993). Generating and evaluating domain-oriented multi-word terms from text. Information Processing and Management, 29:433-447.
- Feldman, K. A. (1989). The association between student ratings of specific instructional dimensions and student achievement: Refining and extending the synthesis of data from multisection validity studies. Research in Higher Education, 30(6).
- Greene, W. H. (2006). Econometric Analysis. Prentice Hall, 5th edition.
- Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., and Tatham, R. L. (2006). Multivariate Data Analysis. Prentice Hall, 6th edition.
- Hodge, V. and Austin, J. (2004). A survey of outlier detection methodologies. Artif. Intell. Rev., 22(2):85-126.
- Hodges, L. C. and Stanton, K. (2007). Translating comments on student evaluations into the language of learning. Innovative Higher Education, 31(5):279-286.
- Koehn, P. (2005). Europarl: A parallel corpus for statistical machine translation. In MT Summit, pages 79-86.
- Lewis, K. G. (2001). Making sense of written student comments. New Directions for Teaching and Learning, 87:25-32.
- Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 140:1-55.
- Manning, C. D. and Schütze, H. (1999). Foundations of Statistical Natural Language Processing. MIT Press.
- McKeachie, W. J. (1997). Student ratings: The validity of use. American Psychologist, 52:1218-1225.
- Paukkeri, M.-S. and Honkela, T. (2010). Likey: Unsupervised Language-Independent Keyphrase Extraction. In Proceedings of the 5th International Workshop on Semantic Evaluation (SemEval), pages 162-165, Uppsala, Sweden. Association for Computational Linguistics.
- Paukkeri, M.-S., Nieminen, I. T., Pöllä, M., and Honkela, T. (2008). A language-independent approach to keyphrase extraction and evaluation. In Proceedings of COLING.
- Romero, C. and Ventura, S. (2007). Educational data mining: A survey from 1995 to 2005. Expert Systems with Applications, 33(1):135-146.
- Salton, G. and Buckley, C. (1988). Term-weighting approaches in automatic text retrieval. Information Processing and Management, 24:513-523.
- Seldin, P. (1999). Changing Practices in Evaluating Teaching: A Practical Guide to Improved Faculty Performance for Promotion/Tenure Decisions. Bolton, MA: Anker.
- Sheehan, E. and DuPrey, T. (1999). Student evaluations of university teaching. Journal of Instructional Psychology, 26(3):135-146.
- Wright, R. (2006). Student evaluations of faculty: Concerns raised in the literature, and possible solutions. College Student Journal, 40(2):417-422.
Paper Citation
in Harvard Style
Sliusarenko T., Harder Clemmensen L. and Ersbøll B. (2013). Text Mining in Students' Course Evaluations - Relationships between Open-ended Comments and Quantitative Scores. In Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU, ISBN 978-989-8565-53-2, pages 564-573. DOI: 10.5220/0004384705640573
in Bibtex Style
@conference{csedu13,
author={Tamara Sliusarenko and Line Harder Clemmensen and Bjarne Kjær Ersbøll},
title={Text Mining in Students' Course Evaluations - Relationships between Open-ended Comments and Quantitative Scores},
booktitle={Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU},
year={2013},
pages={564-573},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004384705640573},
isbn={978-989-8565-53-2},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU
TI - Text Mining in Students' Course Evaluations - Relationships between Open-ended Comments and Quantitative Scores
SN - 978-989-8565-53-2
AU - Sliusarenko T.
AU - Harder Clemmensen L.
AU - Ersbøll B.
PY - 2013
SP - 564
EP - 573
DO - 10.5220/0004384705640573