Step-by-step guide for the implementation and assessment of Adaptive Learning Platforms

Adaptive Learning Platforms (ALPs), also referred to as adaptive e-learning environments (AEEs), are sophisticated digital educational systems that dynamically adjust the presentation of instructional content, the sequence of learning activities, the level of difficulty, and the type of feedback provided to individual learners. This adaptation is based on real-time analysis of the learner’s performance, interactions, knowledge state, preferences, and/or other defined characteristics, often utilizing algorithms, data analytics, and Artificial Intelligence (AI). The core aim of ALPs in the context of antimicrobial resistance (AMR) and antimicrobial stewardship (AMS) is to create a personalized, efficient, and effective learning experience that caters to the unique needs and pace of each professional, optimizing their path towards mastering specific AMR/AMS competencies.

Planning an Adaptive Learning Platform

The following steps should be taken into consideration when planning the use of an Adaptive Learning Platform (ALP) for AMR/AMS education:

  1. Define Granular AMR/AMS Learning Objectives and Competencies: Clearly identify the specific AMR/AMS knowledge, skills, and competencies that the ALP will target. Break down complex topics into smaller, measurable learning objects or concepts.
  2. Platform Selection or Development: Choose a suitable existing Adaptive Learning Platform (e.g., Smart Sparrow, CogBooks, Area9 Lyceum, Knewton, Realizeit, or others mentioned in general educational technology literature) that allows for the desired level of adaptivity and content integration, or plan for custom development if resources permit. (Samulski et al., 2016, used Smart Sparrow for cytopathology modules; Putra et al., 2022, used CogBooks for veterinary dermatology).
  3. Content Creation and Curation for Adaptivity: Develop or curate a rich repository of diverse AMR/AMS learning resources in various formats (e.g., text, short videos, interactive simulations, case studies, image banks, audio clips, links to guidelines). Content needs to be modular and tagged appropriately to allow the system to select and present it dynamically.
  4. Authoring Adaptive Rules, Pathways, and Logic: This is a critical step. Define the rules, algorithms, and decision trees that will govern how the platform adapts to the learner. This includes setting up prerequisite relationships between learning objects, defining mastery thresholds for concepts, determining how learner performance on embedded assessments will trigger different content pathways (e.g., remedial loops, advanced topics, alternative explanations), and specifying the nature of adaptive feedback. (Samulski et al., 2016, describe the process of creating adaptive tutorials with branching logic).
  5. Integration of Formative Assessments: Embed frequent, varied formative assessment activities (e.g., multiple-choice questions, open-ended questions, problem-solving tasks, virtual patient interactions) throughout the learning experience. These assessments provide the data that fuels the adaptation engine.
  6. User Interface (UI) and User Experience (UX) Design: Ensure the platform has an intuitive and engaging interface that clearly presents the personalized learning path and makes it easy for learners to navigate and interact with the adaptive content.
  7. Pilot Testing and Iterative Refinement: Conduct thorough pilot testing with representatives of the target AMR/AMS learners to evaluate the platform’s usability, the effectiveness of the adaptive logic, learner engagement, and the achievement of learning objectives. Use feedback and performance data to iteratively refine the content, adaptive pathways, and platform configuration.
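The branching logic described in step 4 can be sketched as a simple rule set. This is a minimal illustration under assumed parameters, not any specific platform’s authoring model: the mastery threshold, score bands, and class names are all hypothetical.

```python
# Minimal sketch of rule-based adaptation logic (illustrative only;
# platforms such as Smart Sparrow use their own authoring tools).
from dataclasses import dataclass, field

MASTERY_THRESHOLD = 0.8  # assumed mastery cutoff on embedded assessments


@dataclass
class LearningObject:
    id: str
    # ids of learning objects that must be mastered before this one unlocks
    prerequisites: list = field(default_factory=list)


def next_pathway(score: float, obj: LearningObject) -> str:
    """Decide the next step from a learner's score on an embedded assessment."""
    if score >= MASTERY_THRESHOLD:
        return "advance"        # unlock dependent learning objects
    elif score >= 0.5:
        return "alternative"    # present an alternative explanation
    else:
        return "remedial"       # loop back to prerequisite content


def unlocked(obj: LearningObject, mastered: set) -> bool:
    """A learning object becomes available once all prerequisites are mastered."""
    return all(p in mastered for p in obj.prerequisites)
```

In practice these rules are authored in the platform’s own interface, but expressing them explicitly like this is a useful design exercise before configuration begins.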

Defining roles in an Adaptive Learning Platform

Facilitator’s role (Instructional Designer/Content Expert/Platform Manager/Learning Analyst): This role is heavily concentrated in the upfront design and development phases: defining learning architectures, authoring adaptive rules and pathways, creating or curating granular, high-quality AMR/AMS content, and configuring the platform. During course delivery, the facilitator monitors overall learner progress at a cohort level using platform analytics, identifies common points of difficulty that might indicate a need to refine pathways or content, updates content based on new AMR/AMS evidence, and may provide high-level support for complex issues that the platform’s automated feedback cannot resolve. Direct, individualized tutoring is less frequent as the platform is designed to handle much of the personalized guidance.

Participant’s role (Learner): Actively engage with the Adaptive Learning Platform and diligently complete learning activities and embedded assessments. The learner’s performance, responses, and interactions directly shape the personalization of their learning path. Learners must be comfortable with a technology-driven, often self-paced, learning experience and be proactive in using the adaptive features to their benefit.

Assessing an Adaptive Learning Platform

Methods

  • Platform-Generated Learning Analytics: ALPs provide rich data on individual learning paths, time spent on specific learning objects or topics, performance on all embedded assessments, identified areas of difficulty or misconception, rates of progress, and levels of mastery achieved for specific AMR/AMS competencies.
  • Pre- and Post-Intervention Knowledge/Skill Tests: Use validated instruments or custom-developed tests to measure overall gains in AMR/AMS knowledge, improvements in clinical reasoning or decision-making skills, or changes in attitudes/confidence before and after engaging with the ALP. (Fontaine et al., 2019, in their meta-analysis found AEEs can improve knowledge and skills; Putra et al., 2022, demonstrated higher post-test scores in an adaptive learning group).
  • Performance on Embedded Summative Assessments: Evaluation of performance on specific quizzes, simulations, or problem-solving tasks designed to assess mastery of key AMR/AMS learning objectives within the platform.
  • Learner Satisfaction and Engagement Surveys: Collect feedback from users on their experience with the platform, including its usability, the perceived effectiveness and relevance of the personalized learning path, engagement levels, and overall satisfaction. (Yakin and Linden, 2021, and Putra et al., 2022, assessed student perceptions and engagement).
  • Learning Efficiency Analysis: Where appropriate, compare the time taken by learners to achieve mastery of AMR/AMS concepts using the ALP versus traditional or non-adaptive e-learning approaches. (Fontaine et al., 2019, suggest AEEs may be more efficient).
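For the pre- and post-test method above, one common effect measure is Hake’s normalized gain: the fraction of the possible improvement a learner actually achieved. A minimal sketch (the function name and the default maximum score are assumptions):

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake's normalized gain: (post - pre) / (max_score - pre).

    1.0 means the learner reached the maximum score; 0.0 means no change.
    """
    if max_score == pre:
        return 0.0  # no room for improvement; report zero rather than divide by zero
    return (post - pre) / (max_score - pre)
```

For example, a learner moving from 40 to 70 out of 100 achieves a normalized gain of 0.5, i.e., half of the available improvement.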

Tools

  • The Adaptive Learning Platform’s built-in analytics dashboards and reporting functionalities.
  • Standardized or custom-designed AMR/AMS knowledge tests and clinical reasoning assessments.
  • Validated user experience (UX) and satisfaction questionnaires.
  • Where available, eye-tracking or detailed interaction log analysis for deeper research into learning behaviors within the adaptive environment.

Suggested Adaptive Learning Platform prototype

Target Audience: Clinical professionals (nurses, pharmacists, junior doctors), researchers, and educators.

Learning Objectives:

  • Enable learners to achieve mastery of specific, pre-defined AMR/AMS competencies (e.g., interpreting complex antibiograms for MDROs, understanding the application of PK/PD principles for antibiotic optimization, developing communication strategies for AMS interventions, tailoring AMS advice for different patient populations or agricultural settings) through a personalized learning pathway.
  • Identify and remediate individual knowledge gaps related to core and advanced AMR/AMS topics.
  • Optimize learning time by allowing learners to bypass already mastered content and focus on areas needing development.

Curriculum/Activities:

  • A comprehensive repository of granular learning objects covering a wide spectrum of AMR/AMS topics (e.g., mechanisms of resistance, spectrum of activity of different antimicrobials, diagnostic stewardship, IPC measures, One Health aspects of AMR, AMS in specific clinical/veterinary areas). Content formats include short explanatory texts, video snippets, interactive diagrams, virtual patient case fragments, and image-based questions (as in Samulski et al., 2016).
  • Initial Diagnostic Assessment: Learners begin with a diagnostic test to assess their existing knowledge across various AMR/AMS competencies.
  • Personalized Learning Path Generation: Based on the diagnostic results and ongoing performance, the ALP’s algorithm directs each learner through a unique sequence of learning objects and activities. For example, a learner struggling with interpreting susceptibility reports might receive more foundational content and practice exercises on this topic, while a learner demonstrating mastery might be presented with more complex case studies or advanced research articles.
  • Adaptive Feedback and Remediation: Immediate, targeted feedback is provided on all assessments. If a learner makes an error, the platform might offer a different explanation, a simpler related concept, or additional practice exercises before allowing them to proceed.
  • Mastery-Based Progression: Learners advance to new topics or more complex material only after demonstrating mastery of prerequisite concepts.
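The diagnostic-driven sequencing and mastery-based progression described above can be sketched as a traversal of a prerequisite graph that skips already-mastered competencies. The competency names, the tiny graph, and the 0.8 threshold are illustrative assumptions, not part of the prototype specification:

```python
# Illustrative sketch: build a personalized path from diagnostic scores,
# skipping competencies already mastered (threshold assumed at 0.8).

PREREQS = {  # hypothetical prerequisite graph: competency -> required competencies
    "resistance_mechanisms": [],
    "antibiogram_basics": ["resistance_mechanisms"],
    "mdro_antibiograms": ["antibiogram_basics"],
}


def personalized_path(diagnostic: dict, threshold: float = 0.8) -> list:
    """Return a prerequisite-ordered list of competencies still to be learned."""
    mastered = {c for c, score in diagnostic.items() if score >= threshold}
    path = []

    def visit(comp):
        if comp in mastered or comp in path:
            return
        for prereq in PREREQS.get(comp, []):
            visit(prereq)  # prerequisites are scheduled first
        path.append(comp)

    for comp in PREREQS:
        visit(comp)
    return path
```

A learner who already demonstrates mastery of resistance mechanisms but scores poorly on antibiogram interpretation would thus start with the basics of antibiograms before reaching MDRO cases, while a learner mastering everything receives an empty remediation path.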

Evaluation of the Prototype’s Effectiveness:

  • Detailed analysis of individual and cohort learning pathways and progression rates using platform analytics to understand how personalization is functioning.
  • Measurement of significant improvements in specific AMR/AMS competencies, as assessed by pre- and post-platform standardized tests or performance on embedded mastery challenges. (Fontaine et al., 2019, found AEEs effective for knowledge and skill acquisition).
  • High levels of learner satisfaction with the adaptive learning experience, perceived relevance of the personalized content, and ease of use of the platform.
  • Comparison of learning efficiency (time to achieve competency) for learners using the ALP versus those in non-adaptive control groups (if research is conducted).
  • Qualitative feedback from learners on their experience with the adaptive pathways and the type of support received from the platform.
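If a controlled comparison of learning efficiency is conducted, the headline figure is the relative reduction in mean time-to-mastery for the ALP group versus a non-adaptive control group. A minimal sketch (the function name and example data are hypothetical; a real study would also test for statistical significance):

```python
from statistics import mean


def efficiency_gain(alp_hours: list, control_hours: list) -> float:
    """Percent reduction in mean time-to-mastery for the ALP group
    relative to a non-adaptive control group."""
    m_alp, m_ctrl = mean(alp_hours), mean(control_hours)
    return 100.0 * (m_ctrl - m_alp) / m_ctrl
```

For instance, ALP learners averaging 5 hours to mastery against a control average of 10 hours would represent a 50% efficiency gain.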

References

  • Fontaine, G., Cossette, S., Maheu-Cadotte, M.-A., Mailhot, T., Deschênes, M.-F., & Mathieu-Dupuis, G. (2017). Effectiveness of Adaptive E-Learning Environments on Knowledge, Competence, and Behavior in Health Professionals and Students: Protocol for a Systematic Review and Meta-Analysis. JMIR Research Protocols, 6(7), e128. https://doi.org/10.2196/resprot.8085
  • Fontaine, G., Cossette, S., Maheu-Cadotte, M.-A., Mailhot, T., Deschênes, M.-F., Mathieu-Dupuis, G., Côté, J., Gagnon, M.-P., & Dubé, V. (2019). Efficacy of adaptive e-learning for health professionals and students: A systematic review and meta-analysis. BMJ Open, 9(8), e025252. https://doi.org/10.1136/bmjopen-2018-025252
  • Pfeiffer, C. N., & Jabbar, A. (2019). Adaptive e-Learning: Emerging Digital Tools for Teaching Parasitology. Trends in Parasitology, 35(4), 270–274. https://doi.org/10.1016/j.pt.2019.01.008
  • Putra, A., Gram, D., Stefanou, C., & Santoro, D. (2022). The Use of Adaptive Learning Technology to Enhance Learning in Clinical Veterinary Dermatology. Journal of Veterinary Medical Education, 49(1), 118–125. https://doi.org/10.3138/jvme-2020-0069
  • Samulski, T. D., La, T., & Wu, R. I. (2016). Adaptive eLearning modules for cytopathology education: A review and approach. Diagnostic Cytopathology, 44(11), 944–951. https://doi.org/10.1002/dc.23558
  • Scallan Walter, E. J., Mousavi, C. T., Elnicki, J., & Davis, S. (2022). Training Public Health Professionals on Adaptive Challenges—An Innovative Approach Using Remote Learning Modalities. Journal of Public Health Management and Practice, 28(Supplement 5), S240–S248. https://doi.org/10.1097/phh.0000000000001522
  • Yakin, M., & Linden, K. (2021). Adaptive e‐learning platforms can improve student performance and engagement in dental education. Journal of Dental Education, 85(7), 1309–1315. https://doi.org/10.1002/jdd.12609