An Item Analysis of English Test During Online Learning

Umi Ma’rifah, Nyanuar Algiovan, Cucu Sutarsyah

Abstract


The assessment process in education serves a crucial and fundamental function, since the quality of education is measured through assessment. How language teaching is assessed therefore became an interesting object of study during online learning. This study aimed to examine the quality of English test items in terms of three key indices: item difficulty, item discrimination, and distractor effectiveness. The study used a quantitative research approach. The sample consisted of 49 students' answer sheets from a school examination, which the researchers analyzed with the Iteman application program. The results revealed that the test had low reliability. In terms of item difficulty, the majority of the items (52%) were difficult, 34% were moderate, and 14% were easy. No item was rated very good in item discrimination; some items were accepted but needed revision, and the rest were rejected. As for the distractors, 38 items had effective distractors and the remainder were ineffective. It is concluded that test makers' understanding of test quality is needed to construct good and appropriate language measurements.
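To make the two main indices concrete, the following is a minimal sketch of the classical item-analysis statistics the abstract refers to: item difficulty (the proportion of students answering correctly) and item discrimination (the upper-group minus lower-group proportion correct). This is an illustration of the standard formulas, not the Iteman program itself; the student responses and answer key below are hypothetical.

```python
responses = [          # one row per student: the option chosen for each item
    ["A", "B", "C"],
    ["A", "C", "C"],
    ["B", "B", "A"],
    ["A", "B", "D"],
]
key = ["A", "B", "C"]  # correct option for each item

n = len(responses)
totals = [sum(c == k for c, k in zip(row, key)) for row in responses]

# Rank students by total score; the top and bottom halves serve as the
# "upper" and "lower" groups for the discrimination index.
ranked = sorted(range(n), key=lambda i: totals[i], reverse=True)
half = n // 2
upper, lower = ranked[:half], ranked[n - half:]

stats = []
for item, answer in enumerate(key):
    p = sum(row[item] == answer for row in responses) / n  # difficulty index
    up = sum(responses[i][item] == answer for i in upper)
    lo = sum(responses[i][item] == answer for i in lower)
    d = (up - lo) / half                                   # discrimination index
    stats.append((p, d))
    print(f"item {item + 1}: difficulty={p:.2f} discrimination={d:.2f}")
```

By common conventions, a difficulty index roughly between 0.30 and 0.70 is classed as moderate (lower values mean a harder item), and a discrimination index near or below 0.20 is considered poor, though exact cut-offs vary across guidelines.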

Keywords


English Test; Item Analysis; Iteman; Online Learning






DOI: http://dx.doi.org/10.18415/ijmmu.v8i12.3396


Copyright (c) 2022 International Journal of Multicultural and Multireligious Understanding

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

International Journal of Multicultural and Multireligious Understanding (IJMMU) ISSN 2364-5369