Document Type: Original Article

Authors

1 Department of E‑learning in Medical Education, Virtual School, Center for Excellence in E‑learning in Medical Education, Tehran University of Medical Sciences, Tehran, Iran

2 Department of Medical Education, Isfahan University of Medical Sciences, Isfahan, Iran

3 Department of Evaluation and Measurement, School of Education and Educational Psychology, Allameh Tabatabaei University, Tehran, Iran

Abstract

BACKGROUND: Kane’s validity framework examines the validity of test score interpretations at four levels of inference: scoring, generalization, extrapolation, and implication. No model has yet been proposed for applying this framework specifically to a system of assessment. This study developed a model, based on Kane’s framework, for validating the internal medicine residents’ assessment system.
MATERIALS AND METHODS: The study was conducted in five stages. First, the literature was reviewed to extract the methods used, and the challenges reported, in applying Kane’s framework. Second, possible assumptions about the design and implementation of residents’ tests, together with proposed methods for validating them at each of the four inferences of Kane’s framework, were compiled into two tables. Third, the assumptions and proposed validation methods were reviewed in a focus group session. Fourth, seven internal medicine professors were asked for their opinions on the focus group results. Finally, the assumptions and the final validation model were prepared.
RESULTS: The proposed tables were modified in the focus group, and a validation table was developed covering the tests used at each level of Miller’s pyramid. The results were approved by five of the internal medicine professors. The final table has five rows: four for the levels of Knows and Knows How, Shows How, Shows, and Does, and a fifth for the residents’ final scores. The columns of the table set out the measures necessary for validation at each of the four inferences of Kane’s framework.
CONCLUSION: The proposed model helps ensure the validity of the internal medicine specialty residency assessment system, based on Kane’s framework, especially at the implication level.
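
For illustration, the table structure described in the Results can be sketched as a five-row by four-column matrix whose rows are the assessment levels and whose columns are Kane’s four inferences. The sketch below is a hypothetical rendering of that structure only, not the authors’ published table; the cell contents are placeholders for the validation measures detailed in the body of the article.

```python
# Illustrative sketch only (hypothetical, not the authors' published table):
# the validation model as a 5-row x 4-column matrix. Row labels follow the
# abstract; cell contents stand in for the validation measures given in the
# full article.

KANE_INFERENCES = ["scoring", "generalization", "extrapolation", "implication"]

ROWS = [
    "Knows and Knows How",
    "Shows How",
    "Shows",
    "Does",
    "Final scores of residents",
]

# Each cell would hold the measures required to support validity of the
# corresponding tests (or final scores) at the corresponding inference.
validation_model = {
    row: {inference: "<validation measures>" for inference in KANE_INFERENCES}
    for row in ROWS
}

for row, cells in validation_model.items():
    print(f"{row}: {', '.join(cells)}")
```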

Keywords
