New Perspectives in Science Education

Edition 14

Accepted Abstracts

Towards Efficient Assessment: Personalized Quizzing through Machine Learning

Satanshu Mishra, The University of British Columbia (Canada)

Mostafa Mohamed, University of British Columbia – Okanagan (Canada)

Abdallah Mohamed, University of British Columbia – Okanagan (Canada)

Abstract

In today's technology-driven world, the demand for computing skills spans disciplines, necessitating tailored learning experiences. Current e-learning systems often employ a "one size fits all" approach that fails to accommodate diverse student skill levels and learning styles. This paper introduces uLearn, an Adaptive Learning tool designed to personalize the learning pathway for students in introductory programming courses. Using Machine Learning and Item Response Theory, the proposed algorithm dynamically adjusts question type and difficulty based on overall class performance and individual student ability. Moreover, uLearn employs a mastery tracking system that monitors student competency across topics and categories in order to deliver targeted learning.
Findings from an exploratory study revealed positive results in student performance and comprehension, as well as increased engagement and a desire to continue using Adaptive Learning in the future.
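To make the adaptive mechanism concrete, the sketch below shows one plausible way an IRT-driven quiz engine of this kind could pick the next question and track topic mastery. It is a minimal illustration only, assuming a standard two-parameter logistic (2PL) IRT model, a simple gradient-style ability update, and hypothetical names (Question, select_next_question, MasteryTracker); it is not the authors' uLearn implementation.

import math
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    topic: str
    difficulty: float       # IRT b-parameter (hypothetical field names)
    discrimination: float   # IRT a-parameter

def p_correct(theta: float, q: Question) -> float:
    # 2PL IRT: probability that a student with ability theta answers q correctly.
    return 1.0 / (1.0 + math.exp(-q.discrimination * (theta - q.difficulty)))

def update_ability(theta: float, q: Question, correct: bool, lr: float = 0.4) -> float:
    # One gradient step on the 2PL log-likelihood after observing a response.
    return theta + lr * q.discrimination * ((1.0 if correct else 0.0) - p_correct(theta, q))

def select_next_question(theta: float, pool: list) -> Question:
    # Choose the item whose predicted success probability is closest to 50%,
    # i.e. the most informative item under the 2PL model.
    return min(pool, key=lambda q: abs(p_correct(theta, q) - 0.5))

@dataclass
class MasteryTracker:
    # Running per-topic accuracy; a topic counts as mastered above a threshold.
    threshold: float = 0.8
    stats: dict = field(default_factory=dict)   # topic -> (correct, attempted)

    def record(self, topic: str, correct: bool) -> None:
        c, n = self.stats.get(topic, (0, 0))
        self.stats[topic] = (c + int(correct), n + 1)

    def mastered(self, topic: str) -> bool:
        c, n = self.stats.get(topic, (0, 0))
        return n > 0 and c / n >= self.threshold

In such a loop, each response updates the student's ability estimate and per-topic mastery, and the next question is drawn from not-yet-mastered topics at a difficulty near the student's current ability, which corresponds to the adaptive behaviour described in the abstract.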
 
Keywords: Adaptive Learning, Item Response Theory, Education 
 
