Purpose: Clinical reasoning is a critical component of physical therapy practice. The Commission on Accreditation in Physical Therapy Education (CAPTE) requires that all physical therapist education programs develop and assess clinical reasoning skills as a professional practice outcome expectation1. Further, a call to strengthen the profession through a series of recommendations from Jensen et al2 states, “The profession must establish a comprehensive, longitudinal approach for the explicit development of learners’ clinical reasoning skills that spans entry-level through clinical residencies and continues across the therapist’s career. This comprehensive approach to clinical reasoning will require that academic and clinical faculty develop robust teaching, learning, and assessment strategies.” Recent survey work on the teaching and learning of clinical reasoning revealed highly variable definitions and strategies for teaching and assessment across programs3. To address the challenges of such variability, Huhn et al4 completed a concept analysis defining clinical reasoning as “integrating cognitive, psychomotor, and affective skills. It is contextual in nature and involves both therapist and client perspectives. It is adaptive, iterative, and collaborative with the intended outcome being a biopsychosocial approach to patient/client management.” Christensen et al3 support building clinical reasoning educational interventions on 1) constructivist learning theory, which allows learners to create their own knowledge through experience and reflection in order to establish meaningful learning, and 2) situational or experiential learning theory, which promotes learning by problem solving in a relevant context. Simulation is one method to promote and develop experiential clinical reasoning in health professions education prior to clinical experience.
The structure promotes learning to problem solve in a relevant context with opportunities for reflection, often incorporates the patient perspective, and typically requires integration of cognitive, affective, and/or psychomotor skills to achieve success. However, simulation models often include labor-intensive in-person group activities using costly standardized patients or high-fidelity mannequins that require extensive planning and time to execute5. The common alternative of paper cases often lacks context and authenticity but is attractive in that it requires fewer resources. Because both approaches typically proceed along a linear storyline, neither allows for multiple pathways of iterative decision-making with real-time formative feedback and reflection for learners developing their clinical reasoning. Digital platforms provide a reasonable middle ground to create simulated patient cases that can be built on sound learning theories, utilize fewer resources, and provide more authentic clinical context than paper cases6. Learners can navigate digital platforms across multiple pathways, engage as individuals or in collaborative small groups, receive real-time feedback, and resume the problem solving before them in a manner authentic to the clinical context. Many digital platforms exist, but none are specific to the needs of physical therapy clinical reasoning. Therefore, existing platforms must be adapted to meet the needs of the curriculum, an approach that often carries limitations5. The purpose of this session is to support and develop learners’ clinical reasoning through the use of orthopedic patient case simulation on a digital platform in an effective and efficient manner for use by instructors in the didactic curriculum. Methods and/or Description of Project: A concept mapping program was used to establish the scaffold of a simple orthopedic patient case already in use, which was then built and tested on multiple simulation platforms for feasibility.
To build simulations effectively and efficiently, the following six elements were deemed necessary: ease of use for instructor development and learner operation, accessibility in a didactic setting, low cost, data collection to capture learner navigation paths and short-answer reasoning responses, the option to progress and regress through clinical reasoning pathways based on feedback, and a scoring mechanism with response-weighting capabilities. Branch Track© was selected as the platform that best met these criteria and included additional advantages such as a polished interface presentation, higher-fidelity virtual environments and characters, and the ability to upload media files. Cases designed on the Branch Track© platform represent a series of authentic clinical interactions in which the learner engages with a virtual clinical instructor (CI) and a context-rich patient to make clinical decisions about physical therapy care management. Scenes are presented in a “challenge-choice-consequence” fashion that allows the virtual CI or patient to pose “challenges”, the learner to select from multiple therapeutic intervention “choices”, and the “consequences” to play out in the form of direct CI feedback or simulated emotional or clinical patient responses. Individuals who fall into traps experience feedback consistent with the clinical context, such as the virtual CI intervening to ensure patient safety and reviewing key points. When a trap relates to professional behaviors, intervention by the virtual Site Coordinator of Clinical Education (SCCE) or Director of Clinical Education (DCE) is introduced. Similarly, some scenes result in learners choosing an appropriate course of action while the virtual patient response is not what would be desired or expected, consistent with realistic clinical practice. Learners are then given the opportunity to either problem solve the issue or abandon the course of action.
Points are weighted to reflect the significance of the clinical reasoning in the context of the case and are awarded according to the appropriateness of the learner’s selection. Throughout the simulation, learners are also provided free-text opportunities to articulate their clinical reasoning either before or after the consequence of the scene is known, which provides practice in both reflection “in action” and reflection “on action”. These responses are manually graded by the instructor, who provides additional feedback at a later date. Following the completion of each scene, learners may either return to the scene to explore other choices or leave the scene and move to another interaction in the case. All simulated cases finish with a free-text opportunity for learners to give the instructor feedback on the case content, the simulation platform, and the experience overall. Cases were implemented into the orthopedic curriculum in two ways: 1) independent homework, framed as an opportunity to apply recently learned treatment concepts to a single simulated treatment session for feedback on clinical reasoning, assigned early and midway through the two-semester orthopedic sequence, and 2) a small-group in-class active learning activity, framed as an opportunity to apply comprehensive postoperative management clinical reasoning to a simulated case over three treatment sessions spanning the course of an individual’s plan of care, implemented at the end of the second semester. The instructor assumed the role of support and guide for the small groups. In each case, learners were encouraged to aim for their best outcome on the first attempt, and then to consider replaying the simulation to explore other options they might have considered or the consequences of poor choices.
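The branching logic described above — a challenge posed by the virtual CI or patient, a set of weighted choices, and a consequence that plays out as feedback — can be sketched in a few lines of Python. This is an illustrative model only; the class names, scene content, and point values are hypothetical and do not reflect the Branch Track© platform's internal implementation:

```python
from dataclasses import dataclass

@dataclass
class Choice:
    text: str
    points: int          # weighted by clinical significance (illustrative values)
    consequence: str     # direct CI feedback or simulated patient response
    is_trap: bool = False

@dataclass
class Scene:
    challenge: str       # posed by the virtual CI or patient
    choices: list

    def select(self, index):
        """Return (points_awarded, consequence) for the learner's choice."""
        choice = self.choices[index]
        return choice.points, choice.consequence

# Hypothetical scene: treatment-session prioritization
scene = Scene(
    challenge="CI: What do you want to do first with this patient?",
    choices=[
        Choice("Review precautions and vital signs", points=10,
               consequence="CI: Good -- safety first."),
        Choice("Begin aggressive resisted strengthening", points=0, is_trap=True,
               consequence="CI intervenes to ensure patient safety "
                           "and reviews key points."),
    ],
)

points, feedback = scene.select(0)
```

After each selection the learner could be routed back into the scene to explore the other choices or forward to the next interaction, mirroring the progress/regress pathways described above.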
Results/Outcomes: Branch Track© data collection provided useful outcomes information to assist in fine-tuning the instruction and scaffolding of the cases and to identify learners who were on track (higher simulation scores) and those who were struggling (lower simulation scores). Aggregate learner performance scores for each simulated challenge identified which elements of the case were most difficult for students to navigate, directing the instructor to modify either the delivery of that content or the format of the simulated scene. Data identifying which learners were struggling in their clinical reasoning allowed the instructor to intervene individually to address performance during the didactic portion of the curriculum, prior to full-time clinical experiences. Much like simple outcomes on a multiple-choice exam, the instructor could see frequency counts for each of the options provided and determine whether the majority of the class was on the right track and avoiding clinical reasoning traps. These data were valuable in building complexity into the cases as the curriculum progressed: the instructor was able to determine that learners were successfully avoiding obvious traps and therefore would be more appropriately challenged with less obvious, more nuanced options. In contrast, when learner responses were inconsistent with the instructor’s expectations, there was an opportunity to direct learners more clearly by adjusting the prompts in the simulation. Initial trials showed learners taking a casual approach to their free-text responses, writing incomplete responses or addressing the instructor rather than the virtual CI or patient. This prompted the instructor to modify free-text prompts to be more conversational and realistic to the clinical experience.
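The aggregate analysis described here amounts to per-option frequency counts for each simulated challenge. A minimal sketch of that tally follows; the option labels and the learner selection data are fabricated for illustration and are not actual study data:

```python
from collections import Counter

# Hypothetical learner selections for one simulated challenge
# (option labels and response data are illustrative only)
responses = ["safe_option", "trap_option", "safe_option",
             "safe_option", "trap_option", "safe_option"]

counts = Counter(responses)
# Is the majority of the class avoiding the clinical reasoning trap?
majority_on_track = counts["safe_option"] > len(responses) / 2
```

A count like this, viewed per challenge, is what would let an instructor decide whether obvious traps have been mastered and more nuanced options are warranted.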
When learners failed to demonstrate prioritization of interventions and organization of a treatment session, the instructor adjusted the virtual CI’s prompt to state, “What do you want to do first?” These modifications were further reinforced during the instructor’s verbal orientation to the simulation experience, with explicit instructions to consider the order of interventions and to direct all free-text responses to the intended audience. Some learners expressed a desire to provide their own treatment options rather than selecting from a predetermined set of choices. Free-text responses could allow for such additions, but they would significantly disrupt the flow of the simulation. As a result, the instructor encouraged learners to physically cover the options and determine what answer they would give before considering the provided choices. Data collection allowed the instructor to identify individuals or small groups who fell into clinical reasoning traps or abandoned a course of action without adequate exploration. The instructor was then able to follow up directly with individuals to debrief the scene, identify faulty reasoning, and support correction. The specifics of the case disruption directed the instructor toward either content-related issues, such as steering the virtual patient toward an ineffective course of treatment, or personal barriers related to resilience, self-efficacy, and trust in one’s own reasoning. Learner feedback indicated that the simulations were an enjoyable format in which to apply knowledge and provided adequate challenge; learners cited the activity as easy to navigate and preferable to paper cases used elsewhere in the curriculum. The additional layer of small-group collaboration in clinical reasoning processing was especially valued, with learners expressing appreciation for the opportunity to open their ideas to inspection.
This approach promoted self-assessment through comparison to their peers and allowed learners to collect additional perspectives to fill their gaps in understanding. The instructor also found the simulation platform to be user-friendly in building cases, administering to learners, and collecting data. In-class debriefing allowed the instructor to observe rich discussion among learners as they distilled their varied perspectives to a consensus and articulated their collective thoughts into free text responses. Conclusions/Relevance to the conference theme: The development of and reflection upon clinical reasoning skills in care management is a necessary component of physical therapy education and practice, but it is difficult to guide learners and provide feedback regarding contextually relevant clinical reasoning in the didactic setting. Use of the Branch Track© simulation platform allowed learners to practice prioritization, organization, clinical reasoning, and reflection for intervention choices consistent with realistic orthopedic clinical cases, with value added beyond the typical ability of an instructor to deliver, measure, and provide feedback on learners’ clinical reasoning in the classroom. Each participation experience allows learners to chart a new course in their clinical reasoning through case management enhanced by simulation that promotes contextual learning through self, peer, virtual CI and patient, and course instructor feedback. This pedagogical approach also sets sail into the unknown of clinical reasoning best practice in physical therapy education, moving the profession forward in its journey toward becoming front-line primary care providers of the future.