Validity of a Self-Assessment Survey and Its Implications for Clinical Education and Practice Framework

Date

2020-05

Abstract

Background: Researchers, educators, and accreditors in the medical and health professions have increasingly emphasized competency-based education and assessment. Professional requirements for evidence-based practice and continuing professional development require clinicians to self-assess their performance and make decisions about practice limitations and continuing education throughout their careers. The clinical training portion of a health professions program relies upon both the learner and the supervisor to make judgments about clinical performance in order to determine competency. In orthotics and prosthetics, professional competency is structured around the American Board for Certification's Practice Analysis (2015) framework, which has implications for educational standards and certification exam construction. The Analysis outlines the following domains: patient assessment; formulation of the treatment plan; plan implementation; follow-up to the treatment plan; practice management; and promotion of competency and enhancement of professional practice. While researchers have studied the accuracy of self-assessment in clinical education, none has presented a validated tool for self-assessment in orthotics and prosthetics clinical education. This study used Kane's Validity Framework (2006) to explore a competency self-assessment tool used in orthotics and prosthetics education.

Purpose: The objectives of this study were to evaluate the reliability of items in the self-assessment survey, examine the latent common factors measured by the survey, use inferences from clinical practice to refine and reduce the items in the survey, and assess the relationship between clinical autonomy and self-assessment.

Methods: Retrospective data from students in one orthotics and prosthetics education program from July 2017 to December 2019 were used for analysis. At multiple points during the educational program, students completed a self-assessment survey comprising 29 items addressing the National Commission on Orthotic and Prosthetic Education's residency objectives. The researcher analyzed the reliability of the survey using Cronbach's alpha. Evaluation of latent common factors began with a six-factor confirmatory factor analysis using principal axis factoring and direct oblimin rotation. Following confirmatory factor analysis, the researcher used exploratory factor analysis to identify additional items for reduction and the most appropriate model fit. Finally, clinical autonomy was compared with the summed self-assessment score using bivariate Pearson correlation.

Results: Reliability analysis demonstrated a robust instrument, with a Cronbach's alpha of 0.927, and indicated the potential to drop four items. Confirmatory factor analysis indicated a poor fit of the six-factor model, and exploratory factor analysis and further item evaluation resulted in a total reduction of 15 items from the survey. The final and best-fitting model suggested four latent common factors: patient centeredness, regulatory awareness, device evaluation, and professional responsibility. The summed scores of the self-assessment survey did not correlate significantly with clinical autonomy. Review and revision of the self-assessment items resulted in a revised fourteen-item instrument for use in additional research.

Conclusion: The results of the study imply a need to reexamine the current clinical practice framework in orthotics and prosthetics. Additionally, future research should evaluate the shortened self-assessment survey, determine the extent to which the results extrapolate, and consider implications for educational practices.
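The reliability step described in the Methods (Cronbach's alpha over the 29 survey items) can be sketched as follows. This is an illustrative, generic implementation on synthetic Likert-style data, not the study's actual analysis code; the function name and the simulated data are assumptions for demonstration only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic example: 100 respondents answering 29 Likert items (1-5)
# driven by a single latent trait plus noise (hypothetical data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(100, 29))), 1, 5)
print(round(cronbach_alpha(scores), 3))
```

Values near or above 0.9, as reported for this survey, indicate high internal consistency, though very high alpha can also suggest item redundancy, consistent with the item reductions the study went on to make.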

Keywords

orthotics and prosthetics, clinical education, health professions education
