Document Type: Original Article
Authors
1 Department of Infectious Diseases, School of Medicine, Infectious Diseases Research Center, Birjand University of Medical Sciences, Birjand, Iran
2 Department of Community Medicine, School of Medicine, Birjand University of Medical Sciences, Birjand, Iran
3 Department of Oral and Maxillofacial Medicine, School of Dentistry, Infectious Diseases Research Center, Birjand University of Medical Sciences, Birjand, Iran
4 e‑Learning Center, Birjand University of Medical Sciences, Birjand, Iran
5 Department of Epidemiology and Biostatistics, School of Health, Social Determinants of Health Research Center, Birjand University of Medical Sciences, Birjand, Iran
Abstract
BACKGROUND: Education and assessment changed during the COVID‑19 pandemic as online courses
replaced face‑to‑face classes to maintain social distancing. The quality of the online assessments
conducted during the pandemic is an important subject to address. In this study, the quality of
online assessments held in two consecutive semesters was investigated.
MATERIALS AND METHODS: One thousand two hundred and sixty‑nine multiple‑choice online
examinations held in the first (n = 535) and second (n = 734) semesters at Birjand University of
Medical Sciences during 2020–2021 were examined. The mean, standard deviation, number of
questions, skewness, kurtosis, difficulty index, and discrimination index of the tests were calculated.
Data mining with the k‑means clustering approach was applied to classify the tests.
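The abstract does not give computational details; the following is a minimal sketch, assuming the standard classical test theory definitions of the two item indices (difficulty = proportion of correct responses; discrimination = difference in proportion correct between the top and bottom 27% of examinees ranked by total score) and scikit‑learn's KMeans for the clustering step. The simulated response matrix, the per‑exam feature table, the choice of four clusters, and all variable names are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: data and cluster count are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def difficulty_index(item_correct: np.ndarray) -> float:
    """Classical difficulty (p-value): proportion of examinees answering the item correctly."""
    return float(item_correct.mean())

def discrimination_index(item_correct: np.ndarray, total_scores: np.ndarray,
                         frac: float = 0.27) -> float:
    """Upper-lower discrimination: p(upper 27%) minus p(lower 27%),
    with groups formed by ranking examinees on their total test score."""
    k = max(1, int(round(frac * len(total_scores))))
    order = np.argsort(total_scores)
    lower, upper = order[:k], order[-k:]
    return float(item_correct[upper].mean() - item_correct[lower].mean())

rng = np.random.default_rng(0)

# Example item analysis on a simulated 0/1 response matrix (examinees x items).
responses = (rng.random((200, 40)) < 0.7).astype(int)
totals = responses.sum(axis=1)
p = difficulty_index(responses[:, 0])
d = discrimination_index(responses[:, 0], totals)

# Cluster the 1269 exams on per-test summary features
# (e.g., mean, SD, difficulty, discrimination, skewness, kurtosis).
exam_features = rng.normal(size=(1269, 6))          # placeholder for the real feature table
X = StandardScaler().fit_transform(exam_features)   # k-means is sensitive to feature scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
```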
RESULTS: The mean percentage of correct answers across all tests was 69.97 ± 19.16, and the mean
number of questions was 34.48 ± 18.75. There was no significant difference in the difficulty of the
examinations between the two semesters (P = 0.84); however, there were significant differences in
the discrimination index, skewness, and kurtosis of the tests (P < 0.001). Moreover, according to the
clustering analysis, in the first semester 43% of the tests were very hard, 16% hard, and 7% moderate,
whereas in the second semester 43% were hard, 16% moderate, and 41% easy.
CONCLUSION: Calculating difficulty and discrimination indices is not sufficient to evaluate the
quality of tests; many other factors affect test quality. Furthermore, the experience of the first
semester changed the characteristics of the second‑semester examinations. To enhance the quality
of online tests, establishing appropriate rules for holding examinations and using questions at
higher taxonomy levels are recommended.
Keywords