The current model of clinical education delivery faces multiple challenges, one of which may be the approach used to pair students with clinical sites. Academic programs face ongoing competition for slots, which has been accentuated by the national increase in programs and the expansion of existing cohorts. Additionally, the proliferation of payment-for-slot models has created more uncertainty within the market. Clinical facilities face ongoing market disruption given constant change in healthcare and understandably must streamline processes while focusing on value for the customer/client. As such, facilities face difficult decisions regarding participation in, and continuation of, student clinical education. Academic programs may not always possess strong relationships with each of their clinical sites and are therefore left in a reactive stance when informed of clinical site decisions regarding clinical education. Finally, it is not always possible to fully separate clinical education experiences from imposed student expenses. The heightened cost of education and the prolongation of the learner role, given the expansion of residency and fellowship models, force students to weigh their finances more heavily during clinical education placement. Despite these influences, Directors of Clinical Education (DCEs) must work to maximize the efficiency and effectiveness of clinical education in a manner that best serves the needs of students and clinical facilities alike.
Methods and/or Description of Project
Through clinical facility benchmarking and the quantification of students' educational and professional attributes, “best-fit matching” may be achieved. Academic programs must determine benchmark thresholds for clinical education offerings and pursue relationships with sites that meet or exceed those standards. The implication, therefore, is that sites that fall below a benchmarked standard will not be used. For this to occur, a rigorous benchmarking system is required for all sites, specific to the setting and population served, and informed by industry or professional accreditations and certifications as appropriate and available. Benchmarking relies upon a resourceful partnership mindset that encourages open and respectful relationships so the DCE may fully comprehend facility expectations and needs. Concurrently, the academic program must quantify student performance data, inclusive of academic, professional, and clinical data as known, to generate comprehensive student profiles. Caution must be exercised in safeguarding student data so as not to violate student privacy, as directed by FERPA. Combining the benchmarked clinical site data with the student profiles through a systematic matching process yields a final site-student listing that adheres to a best-fit method of clinical slot placement. Such placement takes into account both the clinical facility’s performance expectations and the student’s setting-specific interests, educational trajectory, and career goals. This presentation will illustrate a model for best-fit matching of sites and students during clinical placement. The benchmarking process for sites and the development of the student profile will be discussed by the panel members, comprising a physical therapist program director and members of the academic clinical education team.
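The systematic matching step described above can be sketched in code. The following is a minimal, hypothetical illustration, not the program's actual process: all field names, scoring weights, and data are invented for this sketch. Each student-site pairing receives a fit score built from the site's benchmark rating plus bonuses for alignment with the student's setting-specific interests and career goal; slots are then filled greedily from the highest-scoring pairings down.

```python
# Hypothetical sketch of a scoring-based best-fit match.
# All weights and field names are illustrative assumptions.

def fit_score(student, site):
    """Score one student-site pairing from benchmark quality and alignment."""
    score = site["benchmark"]                      # benchmarked site quality rating
    if site["setting"] in student["interests"]:    # setting-specific interest bonus
        score += 5
    if site["setting"] == student["career_goal"]:  # career-goal alignment bonus
        score += 3
    return score

def best_fit_match(students, sites):
    """Greedily assign students to slots, highest-scoring pairings first."""
    slot_limits = {s["name"]: s["slots"] for s in sites}
    # Score every possible pairing, then sort best-first.
    pairs = sorted(
        ((fit_score(st, si), st["name"], si["name"])
         for st in students for si in sites),
        reverse=True,
    )
    matches = {}
    used = {name: 0 for name in slot_limits}
    for score, student, site in pairs:
        # Skip students already placed and sites whose slots are full.
        if student in matches or used[site] >= slot_limits[site]:
            continue
        matches[student] = site
        used[site] += 1
    return matches
```

In practice, a greedy pass like this could be replaced by an optimal-assignment method when cohort sizes warrant; the point of the sketch is only that benchmarked site data and quantified student profiles reduce each pairing to a comparable score.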
During this session, presenters will provide insight into the successes and drawbacks of this best-fit matching process, as developed and revised by the program over the course of several years. Participants will be shown how the process achieves the goal of appropriately challenging each student within their clinical education placements while still achieving entry-level standards across all clinical internships and students. Outcome data will be shared regarding alignment of clinical site placement with student-reported clinical interests, student-expressed educational goals, and student-generated slot requests. These results will be compared and contrasted with the outcome of a computer-generated placement match using the same variables. The student data will additionally be appraised against clinical facility feedback on student performance and preparedness, as well as student satisfaction with the internship experience. Anecdotal reports of student match, obtained incidentally during internship communications, will also be shared. Finally, the panel will lead a discussion of the resources used to ensure the success of this process, including but not limited to time, personnel, and departmental resources beyond the clinical education team.
Conclusions/Relevance to the conference theme: Through the Looking Glass: Transforming Physical Therapy Education
Academicians must grapple with questions about the future directions of clinical education and, specific to this presentation, whether alignment of student needs with clinical facility educational offerings best meets the needs of the profession and, ultimately, the clients served. If so, what is the best method for this alignment? Some site assignment methods are arguably not evidence-based: a lottery system relies on student-centric evidence that is incomplete and biased, without full regard for clinical facility needs and expectations. Benchmarking clinical sites using objective measures, and applying a logical system for matching students to those sites, provides a more effective process for the assignment of students, ultimately establishing a stronger foundation for site and student development and for the clinical education program.
Baldry Currens JA, Bithell CP. Clinical education: listening to different perspectives. Physiotherapy. 2000;86(12):645-653.
Deusinger S, Crowner B, Burlis T, Stith J. Meeting contemporary expectations for physical therapists: imperatives, challenges, and proposed solutions for professional education. Journal of Physical Therapy Education. 2013;28(1):56-61.
Jette D, Nelson L, Palaima M, Wetherbee E. How do we improve quality in clinical education? Examination of structures, processes, and outcomes. Journal of Physical Therapy Education. 2013;28(1):6-12.
McCallum CA, Mosher PD, Jacobson PJ, Gallivan SP, Giuffre SM. Quality in physical therapist clinical education: a systematic review. Physical Therapy. 2013;93(10):1298-1311.
McCallum CA, Mosher P, Howman J, Engelhard C, Euype S, Cook C. Development of regional core networks for the administration of physical therapist clinical education. Journal of Physical Therapy Education. 2014;28(Suppl 1):39-47.
Stiller K, Lynch E, Phillips AC, Lambert P. Clinical education of physiotherapy students in Australia: perceptions of current models. Australian Journal of Physiotherapy. 2004;50(4):243-247.
Teel CS, MacIntyre RC, Murray TA, Rock KZ. Common themes in clinical education partnerships. Journal of Nursing Education. 2011;50(7):365-372.
At the conclusion of this session, participants will be able to:
1. Discern the value to invested stakeholders (i.e., students, clinical sites, academic programs) when clinical site placement methods are applied strategically.
2. Compare and contrast the best-fit matching method proffered in this presentation with lottery-based and computer-generated methods for clinical site placement of students in physical therapy education programs.
3. Analyze clinical site benchmarking methods, and discuss how these methods might be useful to each participant’s home program.
4. Analyze the utility of quantification of student data for generation of student profiles for use in best-fit matching methods of clinical site placement.
Lecture, panel-led discussion
I. Defining the issue: 10 minutes
II. Best-fit matching process: 20 minutes
a. Process for student profile development
b. Process for clinical facility benchmarking
III. Outcomes related to matching process: 20 minutes
a. Presentation of post-internship feedback
b. Compare and contrast the success rate of the matching process vs. a computer-generated match
IV. Resource allocation: 20 minutes
a. Within the clinical education team
b. Within the department at large
V. Q & A with panel: 20 minutes