
6 CONCLUSION AND FUTURE DIRECTIONS
The evaluation of learning materials used in learning situations provides valuable feedback to different stakeholders, such as practitioners, publishers, and educators. This feedback loop supports the iterative improvement of these materials, allowing for updates and revisions based on the evolving needs of learners and changes in the educational landscape.
As future work, we consider that several controlled experiments must be conducted to assess the usefulness of this approach.