Predicting Students’ Examination Performance with Discovery-embedded Assessment using a Prototype Diagnostic Tool

Kai Pan Mark, Lilian L. P. Vrijmoed, Paula Hodgson

Abstract

Early detection of learners' performance, combined with timely feedback, is essential to allow learners to take prompt action to improve their approach to learning. Although students may learn how they are performing from mid-term quizzes, those results cannot reflect how they might perform in the final examination, and quiz results based on set answers cannot demonstrate the comprehensive skills and knowledge expected in university study. This paper reports on the use of a diagnostic tool to analyse the process of students working on a discovery-embedded assessment task in the collaborative learning environment of a microbiology course. The diagnostic tool identified that learners who performed less well in the assessment tasks also performed less well in the final examination. The tool can therefore provide early detection of students facing learning challenges in comprehensive assessment tasks, so that educators can provide appropriate support.



Paper Citation


in Harvard Style

Mark, K. P., Vrijmoed, L. L. P. and Hodgson, P. (2013). Predicting Students’ Examination Performance with Discovery-embedded Assessment using a Prototype Diagnostic Tool. In Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU, ISBN 978-989-8565-53-2, pages 361-368. DOI: 10.5220/0004389203610368


in Bibtex Style

@conference{csedu13,
author={Kai Pan Mark and Lilian L. P. Vrijmoed and Paula Hodgson},
title={Predicting Students’ Examination Performance with Discovery-embedded Assessment using a Prototype Diagnostic Tool},
booktitle={Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU},
year={2013},
pages={361-368},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004389203610368},
isbn={978-989-8565-53-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 5th International Conference on Computer Supported Education - Volume 1: CSEDU
TI - Predicting Students’ Examination Performance with Discovery-embedded Assessment using a Prototype Diagnostic Tool
SN - 978-989-8565-53-2
AU - Mark, K. P.
AU - Vrijmoed, L. L. P.
AU - Hodgson, P.
PY - 2013
SP - 361
EP - 368
DO - 10.5220/0004389203610368