Document Type: Original Article
Authors
Department of Obstetrics and Gynecology, School of Medicine, Isfahan University of Medical Sciences, Isfahan, Iran
Abstract
INTRODUCTION: Clinical education is one of the most important components of medical students’ education and plays a major role in training qualified, professional practitioners. Therefore, this study was conducted to determine the effect of applying Direct Observation of Procedural Skills (DOPS) on midwifery students’ clinical skills.
MATERIALS AND METHODS: This quasi‑experimental, two‑group study was conducted as a pre‑ and post‑test study on midwifery students in 2017–2018, using cluster random sampling. The procedures assessed were three core skills: vaginal examination, pelvic examination, and vaginal delivery. In the intervention group, practical skills were assessed with the DOPS method on three occasions: on the first day, 1 day later, and at least 1 week later; the control group was assessed with the usual logbook method. Both groups were evaluated at the end of the midwifery course using the Comprehensive Final Midwifery checklist. The instruments were checked for validity and reliability, and data were analyzed using descriptive and analytical statistics.
RESULTS: There was no statistically significant difference between the two groups in demographic variables such as age, grade, marital status, or initial assessment score (P > 0.05). The mean final scores for normal delivery, vaginal examination, and pelvimetry were significantly higher in the intervention group (P < 0.001). Moreover, the practical performance of students in the intervention group improved significantly in normal delivery and pelvimetry (P < 0.05), whereas the difference was not significant for vaginal examination. In addition, the mean scores of students before and after the DOPS method differed significantly for every skill on the Comprehensive Final Midwifery checklist (P < 0.05).
CONCLUSIONS: The DOPS assessment method is not only a useful tool for clinical evaluation but also an effective tool for students’ clinical learning. Accordingly, it is suggested that midwifery faculty members devote sufficient time to designing and implementing the DOPS method for these procedures.
Keywords