Document Type : Original Article
Authors
- Shahnam Sedigh Maroufi
- Parisa Moradimajd 1
- Maryam Jalali 2
- Ghobad Ramezani 3
- Somayeh Alizadeh 4
1 Department of Anesthesia, Faculty of Allied Medicine, Iran University of Medical Sciences, Tehran, Iran
2 Department of Orthotics and Prosthetics, School of Rehabilitation Sciences, Iran University of Medical Sciences, Tehran, Iran
3 Department of Medical Education, Center for Educational Research in Medical Science, School of Medicine, Iran University of Medical Sciences, Tehran, Iran
4 Medical Education Development Center, North Khorasan University of Medical Science, Bojnurd, Iran
Abstract
BACKGROUND: Medical education has special features due to the need for assessment across various areas of
learning. The present study was conducted to provide a complete picture of the student evaluation system
at Iran University of Medical Sciences, with the aim of improving the evaluation system and medical sciences
examinations.
MATERIALS AND METHODS: This cross‑sectional study was conducted through
self‑reporting by educational departments, as a comprehensive review of the student evaluation system
in the affiliated faculties of Iran University of Medical Sciences from 2017 to 2018. Faculty
members and heads of nine faculties and 80 departments participated in this study. The research
tool was a researcher‑made questionnaire comprising two parts: (1) 10 general questions about the
activities of the educational departments regarding the student evaluation system and (2) 20 questions
about the types and quality of examinations.
RESULTS: Of the 80 questionnaires, 71 were completed by the heads of the departments. The
results showed that 62% of the faculty members in the educational departments had not participated
in a workshop on evaluation methods and test construction in the previous 2 years. Fifty‑six percent
of the faculties had a designated body for continuous monitoring of student assessment and evaluation,
and in 87% of cases, the examination content was aligned with the educational objectives. Logbooks
were the most commonly used method (28%) for assessing practical skills.
CONCLUSION: Empowering faculty members in the use of various assessment tools, strengthening the
supervision of formative evaluation, and employing medical education graduates to promote evaluation
methods seem necessary.
Keywords