[1] UNICEF. (2021). Formative assessment for quality, inclusive digital and distance learning during and beyond the COVID-19 pandemic. Switzerland: United Nations Children's Fund Regional Office for Europe and Central Asia.
[2] DiBattista, D., & Kurzawa, L. (2011). Examination of the quality of multiple-choice items on classroom tests. Canadian Journal for the Scholarship of Teaching and Learning, 2(2), 4.
[3] Johnson, C. I., & Mayer, R. E. (2009). A testing effect with multimedia learning. Journal of Educational Psychology, 101(3), 621.
[4] Pereira, D., Flores, M. A., & Niklasson, L. (2016). Assessment revisited: a review of research in Assessment and Evaluation in Higher Education. Assessment & Evaluation in Higher Education, 41(7), 1008-1032.
[5] Irons, A., & Elkington, S. (2021). Enhancing learning through formative assessment and feedback. Routledge.
[6] Brown, A. (1993). The role of test-taker feedback in the test development process: Test-takers' reactions to a tape-mediated test of proficiency in spoken Japanese. Language Testing, 10(3), 277-301.
[7] Klein, S. P., Kuh, G., Chun, M., Hamilton, L., & Shavelson, R. (2005). An approach to measuring cognitive outcomes across higher education institutions. Research in Higher Education, 46, 251-276.
[8] Zimmerman, B. J. (2013). Theories of self-regulated learning and academic achievement: An overview and analysis. Self-regulated learning and academic achievement, 1-36.
[9] Boitshwarelo, B., Reedy, A. K., & Billany, T. (2017). Envisioning the use of online tests in assessing twenty-first century learning: a literature review. Research and Practice in Technology Enhanced Learning, 12(1), 1-16.
[10] Brown, G. T., & Abdulnabi, H. H. (2017, June). Evaluating the quality of higher education instructor-constructed multiple-choice tests: Impact on student grades. In Frontiers in Education (Vol. 2, p. 24). Frontiers Media SA.
[11] Przymuszała, P., Piotrowska, K., Lipski, D., Marciniak, R., & Cerbin-Koczorowska, M. (2020). Guidelines on writing multiple choice questions: a well-received and effective faculty development intervention. SAGE Open, 10(3), 2158244020947432.
[12] Abad, F. J., Olea, J., & Ponsoda, V. (2001). Analysis of the optimum number of alternatives from the Item Response Theory. Psicothema, 13(1), 152-158.
[13] Baghaei, P., & Amrahi, N. (2011). The effects of the number of options on the psychometric characteristics of multiple-choice items. Psychological Test and Assessment Modelling, 53(2), 192-211.
[14] Epstein, R.M. (2007). Medical education: Assessment in medical education. The New England Journal of Medicine, 356(4), 387-396. https://doi.org/10.1056/NEJMra054784
[15] Black, P. (1999). Assessment, learning theories and testing systems. Learners, learning and assessment, 118-134.
[16] Dehnad, A., Nasser, H., & Hosseini, A. F. (2014). A comparison between three- and four-option multiple choice questions. Procedia-Social and Behavioral Sciences, 98, 398-403.
[17] Shizuka, T., Takeuchi, O., Yashima, T., & Yoshizawa, K. (2006). A comparison of three- and four-option English tests for university entrance selection purposes in Japan. Language Testing, 23(1), 35-57.
[18] Tarrant, M., & Ware, J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), 198-206.
[19] Bozkurt, A., Jung, I., Xiao, J., Vladimirschi, V., Schuwer, R., Egorov, G., ... & Paskevicius, M. (2020). A global outlook to the interruption of education due to COVID-19 pandemic: Navigating in a time of uncertainty and crisis. Asian Journal of Distance Education, 15(1), 1-126.
[20] Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5-31.
[21] Tarrant, M., & Ware, J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), 198-206.
[22] Delen, E., & Liew, J. (2016). The use of interactive environments to promote self-regulation in online learning: A literature review. European Journal of Contemporary Education, 15(1), 24-33.
[23] Munro, N. (2014). Exceptional academic achievement in South African higher education (Doctoral dissertation).
[24] Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13.
[25] Sidick, J. T., Barrett, G. V., & Doverspike, D. (1994). Three‐alternative multiple choice tests: An attractive option. Personnel Psychology, 47(4), 829-835.
[26] Owen, S. V., & Froman, R. D. (1987). What's wrong with three-option multiple choice items? Educational and Psychological Measurement, 47(2), 513-522.
[27] Rogers, W. T., & Harley, D. (1999). An empirical comparison of three- and four-choice items and tests: Susceptibility to testwiseness and internal consistency reliability. Educational and Psychological Measurement, 59(2), 234-247.
[28] Shizuka, T., Takeuchi, O., Yashima, T., & Yoshizawa, K. (2006). A comparison of three- and four-option English tests for university entrance selection purposes in Japan. Language Testing, 23(1), 35-57.
[29] Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Medical Education, 9(1), 40.
[30] Sidick, J. T., Barrett, G. V., & Doverspike, D. (1994). Three-alternative multiple-choice tests: An attractive option. Personnel Psychology, 47(4), 829-835. https://doi.org/10.1111/j.1744-6570.1994.tb01579.x
[31] Lee, H., & Winke, P. (2013). The differences among three-, four-, and five-option-item formats in the context of a high-stakes English-language listening test. Language Testing, 30(1), 99-123. https://doi.org/10.1177/0265532212451235
[32] Manalu, H. F., & Anggraeni, D. (2020). The optimal number of options used in multiple-choice test format for national examinations in Indonesia. Humanities & Social Sciences Reviews, 8(2), 824-834.
[33] Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
[34] Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-333.
[35] Delgado, J. A., & Rivera, C. A. (2008). Concept mapping as an assessment tool in higher education activities. In The Third International conference on Concept Mapping, Tallinn, Estonia & Helsinki, Finland.
[36] Rodriguez, M. C. (1997, April). The art & science of item writing: A meta-analysis of multiple-choice item format effects. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
[37] Frisbie, D. A., & Sweeney, D. C. (1982). The relative merits of multiple true-false achievement tests. Journal of Educational Measurement, 29-35.
[38] Ikebukuro, K. (1999). A new multiple-choice question format more parallel to the knowledge quantity. Relationship between expected correct answer ratio and knowledge quantity. Medical Education, 15-20.
[39] Siddiqui, N. I., Bhavsar, V. H., Bhavsar, A. V., & Bose, S. (2016). Contemplation on marking scheme for Type X multiple choice questions, and an illustration of a practically applicable scheme. Indian journal of pharmacology, 48(2), 114.
[40] Qadir, J., Taha, A. E. M., Yau, K. L. A., Ponciano, J., Hussain, S., Al-Fuqaha, A., & Imran, M. A. (2020). Leveraging the force of formative assessment & feedback for effective engineering education.