Vol. 11 No. 7 (2025): July
Open Access
Peer Reviewed

Systematic Literature Review: Three-Tier Diagnostic Test to Identifying Misconceptions in Chemistry

Authors

Beatrice Ruth Nathania Simanjuntak, Wiji Wiji, Tuszie Widhiyanti

DOI:

10.29303/jppipa.v11i7.11011

Published:

2025-07-25

Abstract

This study presents a systematic literature review on the use of the three-tier diagnostic test for identifying students’ misconceptions in chemistry and offers insights into its effectiveness in educational evaluation. A review of studies published from 2019 to 2024 shows that 71.43% of these tests use closed-ended questions, favored for their efficiency in time, cost, and data analysis, while 28.57% use open-ended questions. The reviewed studies report that the test has successfully identified misconceptions across a range of chemistry topics, including chemical equilibrium, chemical bonding, ionization energy, and acids and bases. Its ability to detect misconceptions makes the three-tier diagnostic test an effective tool for improving students’ conceptual understanding, and its use has shown promise in supporting more accurate and in-depth assessment than traditional evaluation methods. The review suggests that this type of diagnostic testing can enhance the quality of formative assessment and contribute to improved instructional planning.

Keywords:

Chemistry; Closed-ended; Misconception; Open-ended; Three-tier diagnostic test

References

Baburajan, V., de Abreu e Silva, J., & Pereira, F. C. (2022). Open vs Closed-Ended Questions in Attitudinal Surveys – Comparing, Combining, and Interpreting Using Natural Language Processing. Transportation Research Part C: Emerging Technologies, 137, 103589. https://doi.org/10.1016/j.trc.2022.103589

Barke, H.-D., Hazari, A., & Yitbarek, S. (2009). Students’ Misconceptions and How to Overcome Them. Misconceptions in Chemistry, 21–36. https://doi.org/10.1007/978-3-540-70989-3_3

Behmke, D. A., & Atwood, C. H. (2013). Implementation and Assessment of Cognitive Load Theory (CLT) Based Questions in an Electronic Homework and Testing System. Chemistry Education Research and Practice, 14(3), 247–256. https://doi.org/10.1039/C3RP20153H

Bingölbali, E., Bingölbali, F., & Prof, A. (2021). An Examination of Open-Ended Mathematics Questions’ Affordances. International Journal of Progressive Education, 17(4), 1–16. https://doi.org/10.29329/ijpe.2021.366.1

Birenbaum, M., & Tatsuoka, K. K. (1987). Open-Ended Versus Multiple-Choice Response Formats—It Does Make a Difference for Diagnostic Purposes. Applied Psychological Measurement, 11(4), 385–395. https://doi.org/10.1177/014662168701100404

Caleon, I. S., & Subramaniam, R. (2010). Do Students Know What They Know and What They Don’t Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students’ Alternative Conceptions. Research in Science Education, 40(3), 313–337. https://doi.org/10.1007/s11165-009-9122-4

Cooper, C., Booth, A., Varley-Campbell, J., Britten, N., & Garside, R. (2018). Defining the Process to Literature Searching in Systematic Reviews: A Literature Review of Guidance and Supporting Studies. BMC Medical Research Methodology, 18(1), 1–14. https://doi.org/10.1186/s12874-018-0545-3

Desai, S., & Reimers, S. (2019). Comparing the Use of Open and Closed Questions for Web-Based Measures of the Continued-Influence Effect. Behavior Research Methods, 51(3), 1426–1440. https://doi.org/10.3758/s13428-018-1066-z

Ekawisudawati, E., Wijaya, M., & Danial, M. (2021). Analisis Miskonsepsi Peserta Didik pada Materi Asam Basa Menggunakan Instrumen Three-Tier Diagnostic Test. Chemistry Education Review (CER), 5(1), 62–72. https://doi.org/10.26858/cer.v5i1.26359

Fanani, R. D., Supardi, Z. A. I., & Suprapto, N. (2023). The Development of HOTS-Based Question Instrument in Temperature and Heat Material. Jurnal Penelitian Pendidikan IPA, 9(9), 6890–6895. https://doi.org/10.29303/jppipa.v9i9.3282

Fanfiana, R. O., Hadisaputra, S., & Supriadi, S. (2024). Identification of Student Misconceptions Using A Three-Tier Test on The Concept of Atoms, Ions, and Molecules. Chemistry Education Practice, 7(1), 75–81. https://doi.org/10.29303/cep.v7i1.6133

Farzad, M., MacDermid, J. C., Lu, Z., & Shafiee, E. (2020). Validation of Persian Version of Patient-Rated Wrist and Hand Evaluation: Confirmatory Factor Analysis and Rasch Analysis. Archives of Rehabilitation Research and Clinical Translation, 2(4), 100076. https://doi.org/10.1016/j.arrct.2020.100076

Febliza, A., Kadarohman, A., Aisyah, S., & Abdullah, N. (2024). Development and Validation of a Three-Tier Test for Identifying Misconceptions in Organic Chemistry Course. Journal of Innovative Science Education, 13(3), 124–133. https://doi.org/10.15294/jise.v13i3.13578

Gurel, D. K., Eryilmaz, A., & McDermott, L. C. (2015). A Review and Comparison of Diagnostic Instruments to Identify Students’ Misconceptions in Science. Eurasia Journal of Mathematics, Science and Technology Education, 11(5), 989–1008. https://doi.org/10.12973/eurasia.2015.1369a

Gusmanida, G., Sujati, H., & Herwin, H. (2024). Testing the Construct Validity and Reliability of the Student Learning Motivation Scale Using Confirmatory Factor Analysis (CFA). Jurnal Penelitian Pendidikan IPA, 10(7), 4227–4234. https://doi.org/10.29303/jppipa.v10i7.7518

Hale, L. V. A., Lutter, J. C., & Shultz, G. V. (2016). The Development of a Tool for Measuring Graduate Students’ Topic Specific Pedagogical Content Knowledge of Thin Layer Chromatography. Chemistry Education Research and Practice, 17(4), 700–710. https://doi.org/10.1039/C5RP00190K

Irfandi, I., Murwindra, R., Musdansi, D. P., N, W. A., & Hanri, C. (2022). Identification and Analysis of Students’ Misconceptions Using Three-Tier Multiple Choice Diagnostic Instruments on Thermochemistry Topic. IJECA (International Journal of Education and Curriculum Application), 5(3), 306–316. https://doi.org/10.31764/ijeca.v5i3.11613

Istiyani, R., Muchyidin, A., & Rahardjo, D. H. (2018). Analysis of Student Misconception on Geometry Concepts Using Three-Tier Diagnostic Test. Cakrawala Pendidikan, 37(2), 261628. https://doi.org/10.21831/cp.v37i2.14493

Johnstone, A. H. (1993). The Development of Chemistry Teaching: A Changing Response to Changing Demand. Journal of Chemical Education, 70(9), 701–705. https://doi.org/10.1021/ed070p701

Jusniar, J., Effendy, E., Budiasih, E., & Sutrisno, S. (2020). Developing a Three-Tier Diagnostic Instrument on Chemical Equilibrium (TT-DICE). Educación Química, 31(3), 84–102. https://doi.org/10.22201/fq.18708404e.2020.3.72133

Lim, W. M., Kumar, S., & Ali, F. (2022). Advancing Knowledge Through Literature Reviews: ‘What’, ‘Why’, and ‘How to Contribute.’ The Service Industries Journal, 42(7–8), 481–513. https://doi.org/10.1080/02642069.2022.2047941

Linnenluecke, M. K., Marrone, M., & Singh, A. K. (2020). Conducting Systematic Literature Reviews and Bibliometric Analyses. Australian Journal of Management, 45(2), 175–194. https://doi.org/10.1177/0312896219877678

Lu, S., & Bi, H. (2016). Development of a Measurement Instrument to Assess Students’ Electrolyte Conceptual Understanding. Chemistry Education Research and Practice, 17(4), 1030–1040. https://doi.org/10.1039/C6RP00137H

Mardiyyaningsih, A. N., Erlina, E., Ulfah, M., & Wafiq, A. F. (2023). Validity and Reliability of the Two-tier Diagnostic Test to Identify Students’ Alternative Conceptions of Intermolecular Forces. Jurnal Penelitian Pendidikan IPA, 9(6), 4375–4381. https://doi.org/10.29303/jppipa.v9i6.2797

Marsh, H. W., Guo, J., Dicke, T., Parker, P. D., & Craven, R. G. (2020). Confirmatory Factor Analysis (CFA), Exploratory Structural Equation Modeling (ESEM), and Set-ESEM: Optimal Balance Between Goodness of Fit and Parsimony. Multivariate Behavioral Research, 55(1), 102–119. https://doi.org/10.1080/00273171.2019.1602503

Meiliyadi, L. A. D., Asyari, A., & Arizona, K. (2023). Identification of Tadris Biology Students Level Understanding and Misconceptions on the Material of Quantities and Units Using 3-Tier Diagnostic Method. Jurnal Penelitian Pendidikan IPA, 9(12), 12042–12048. https://doi.org/10.29303/jppipa.v9i12.6122

Mellyzar, M. (2021). Analysis of Understanding Chemical Bond Concepts in Students with Three-Tier Multiple Choice. JEC, 3(1), 53–66. https://doi.org/10.21580/jec.2021.3.1.7560

Mulyana, V., & Desnita, D. (2023). Empirical Validity and Reliability of the Scientific Literacy Assessment Instrument Based on the Tornado Physics Enrichment Book. Jurnal Penelitian Pendidikan IPA, 9(5), 3961–3967. https://doi.org/10.29303/jppipa.v9i5.3290

Natalia, P. D., & Sudrajat, A. (2023). Development of Three-Tier Diagnostic Test Instrument to Measure Misconceptions of Class XI Students on Reaction Rate Materials. Jurnal Teknologi Pendidikan: Jurnal Penelitian dan Pengembangan Pembelajaran, 8(1), 1–11. https://doi.org/10.33394/jtp.v8i1.6192

Paul, J., & Criado, A. R. (2020). The Art of Writing Literature Review: What Do We Know and What Do We Need to Know? International Business Review, 29(4), 101717. https://doi.org/10.1016/j.ibusrev.2020.101717

Prodjosantoso, A. K., Hertina, A. M., & Irwanto, I. (2019). The Misconception Diagnosis on Ionic and Covalent Bonds Concepts with Three Tier Diagnostic Test. International Journal of Instruction, 12(1), 1477–1488. https://doi.org/10.29333/iji.2019.12194a

Rismaningsih, F., & Nurhafsari, A. (2022). Identify Hydrostatic Misconceptions Using Four Tier Diagnostic Tests with the Help of iSpring Suite 9 (Case Study in UNIS Faculty of Engineering Students). Jurnal Penelitian Pendidikan IPA, 8(2), 773–780. https://doi.org/10.29303/jppipa.v8i2.1290

Santoso, B., Marchira, C. R., & Sumarni, P. (2017). Development and Validity and Reliability Tests of Professionalism Assessment Instrument in Psychiatry Residents. Jurnal Pendidikan Kedokteran Indonesia: The Indonesian Journal of Medical Education, 6(1), 60–65. https://doi.org/10.22146/jpki.25369

Sari, D. N., Arif, K., Yurnetti, Y., & Putri, A. N. (2024). Identification of Students’ Misconceptions in Junior High Schools Accredited A Using the Three Tier Test Instrument in Science Learning. Jurnal Penelitian Pendidikan IPA, 10(1), 1–11. https://doi.org/10.29303/jppipa.v10i1.5064

Selina, S., Muharini, R., Lestari, I., Masriani, M., & Rasmawan, R. (2024). Analysis of Understanding the Concept of Alkenes through the Three-tier Multiple Choice Diagnostic Test Instrument. Jurnal Penelitian Pendidikan IPA, 10(6), 3463–3472. https://doi.org/10.29303/jppipa.v10i6.6510

Septian, I. D., Susilaningsih, E., & Sumarti, S. S. (2020). A Misconception Analysis of Buffer Material Using Three Tier Multiple Choice Test assisted by CBT for SMAN 9 Semarang. Journal of Innovative Science Education, 9(1), 40–49. https://doi.org/10.15294/jise.v8i1.32027

Setiawan, N. C. E., & Ilahi, P. R. (2022). Identification of Misconceptions in Chemical Bonding Materials Using Three Tier Diagnostic Test. Journal of Natural Science and Integration, 5(1), 77–89. https://doi.org/10.24014/jnsi.v5i1.16860

Shiddiqi, M. H. A., Arthamena, V. D., Ayyubi, M., Manarisip, A. J., & Aznam, N. (2024). Systematic Literature Review: Analysis of Misconception Problems and Diagnostic Instruments for Learning Chemistry. Jurnal Penelitian Pendidikan IPA, 10(4), 168–179. https://doi.org/10.29303/jppipa.v10i4.5189

Smith, B. (2018). Doing a Literature Review: Releasing the Research Imagination. Journal of Perioperative Practice, 28(12), 318–318. https://doi.org/10.1177/1750458918810149

Suprapto, N. (2020). Do We Experience Misconceptions?: An Ontological Review of Misconceptions in Science. Studies in Philosophy of Science and Education, 1(2), 50–55. https://doi.org/10.46627/sipose.v1i2.24

Swarni, A., Herwin, H., & Sujati, S. (2024). Testing the Construct Validity and Reliability of the Student Learning Interest Scale Using Confirmatory Factor Analysis (CFA). Jurnal Penelitian Pendidikan IPA, 10(9), 6322–6330. https://doi.org/10.29303/jppipa.v10i9.8794

Treagust, D. F., Chittleborough, G., & Mamiala, T. L. (2003). The Role of Submicroscopic and Symbolic Representations in Chemical Explanations. International Journal of Science Education, 25(11), 1353–1368. https://doi.org/10.1080/0950069032000070306

Vaingankar, J. A., Abdin, E., Dam, R. M. V., Chong, S. A., Tan, L. W. L., Sambasivam, R., Seow, E., Chua, B. Y., Wee, H. L., Lim, W. Y., & Subramaniam, M. (2020). Development and Validation of the Rapid Positive Mental Health Instrument (R-PMHI) for Measuring Mental Health Outcomes in the Population. BMC Public Health, 20(1), 1–12. https://doi.org/10.1186/s12889-020-08569-w

Wahyudi, A., Richardo, R., Eilks, I., & Kulgemeyer, C. (2023). Development of Three Tier Open-Ended Instrument to Measure Chemistry Students’ Critical Thinking Disposition Using Rasch Analysis. International Journal of Instruction, 16(3), 191–204. https://doi.org/10.29333/iji.2023.16311a

Wang, T. L., Dai, F., Hu, R., Seager, S., Febriyanti, F., Wiji, W., & Widhiyanti, T. (2019). Thermochemistry Multiple Representation Analysis for Developing Intertextual Learning Strategy Based on Predict Observe Explain (POE). Journal of Physics: Conference Series, 1157(4), 042042. https://doi.org/10.1088/1742-6596/1157/4/042042

Wiji, W., Widhiyanti, T., Delisma, D., & Mulyani, S. (2021). The Intertextuality Study of the Conception, Threshold Concept, and Troublesome Knowledge on Redox Reaction. Journal of Engineering Science and Technology, 16(2), 1356–1369. Retrieved from https://jestec.taylors.edu.my/Vol%2016%20issue%202%20April%202021/16_2_33.pdf

Winarni, S., Effendy, E., Budiasih, E., Wonorahardjo, S., & Winarni, S. (2022). Constructing ‘Concept Approval Strategy,’ A Chemistry Learning Idea to Prevent Misconceptions. Educación Química, 33(2), 159–180. https://doi.org/10.22201/fq.18708404e.2022.2.79841

Winarsih, S., & Priatmoko, S. (2019). Analisis Pemahaman Konsep Menggunakan Three-Tier Multiple Choice Test pada Pembelajaran Hidrolisis Berbantuan Metode Blended Learning Berbasis Inkuiri Terbimbing. Chemistry in Education, 8(2), 29–36. Retrieved from https://journal.unnes.ac.id/sju/chemined/article/view/39128

Wolfswinkel, J. F., Furtmueller, E., & Wilderom, C. P. M. (2013). Using Grounded Theory as a Method for Rigorously Reviewing Literature. European Journal of Information Systems, 22(1), 45–55. https://doi.org/10.1057/ejis.2011.51

Wren, D., & Barbera, J. (2014). Psychometric Analysis of the Thermochemistry Concept Inventory. Chemistry Education Research and Practice, 15(3), 380–390. https://doi.org/10.1039/C3RP00170A

Yusrizal, Y., & Halim, A. (2017). The Effect of The One-Tier, Two-Tier, and Three-Tier Diagnostic Test Toward the Students’ Confidence and Understanding Toward the Concepts of Atomic Nuclear. Unnes Science Education Journal, 6(2), 1583–1590. https://doi.org/10.15294/usej.v6i2.15856

Author Biographies

Beatrice Ruth Nathania Simanjuntak, Universitas Pendidikan Indonesia

Author Origin: Indonesia

Wiji Wiji, Universitas Pendidikan Indonesia

Author Origin: Indonesia

Tuszie Widhiyanti, Universitas Pendidikan Indonesia

Author Origin: Indonesia


How to Cite

Simanjuntak, B. R. N., Wiji, W., & Widhiyanti, T. (2025). Systematic Literature Review: Three-Tier Diagnostic Test to Identifying Misconceptions in Chemistry. Jurnal Penelitian Pendidikan IPA, 11(7), 37–47. https://doi.org/10.29303/jppipa.v11i7.11011