STUDENTS’ DEPTH OF COGNITIVE PERFORMANCE ON SECONDARY SCHOOL STATE MATHEMATICS TESTS

Authors

  • Marta Mikite, University of Latvia (LV)
  • Ilze France, University of Latvia (LV)

DOI:

https://doi.org/10.17770/sie2023vol1.7086

Keywords:

cognitive abilities, item analysis, mathematics assessment, national large-scale assessment, performance analysis, secondary school exams

Abstract

Latvia began implementing its curriculum reform in 2020. Not only has the learning content changed, but students can now also choose one of three levels at which to study mathematics in secondary school. This requires new exams at the end of each of these courses. The aim of this study is to investigate student performance on the new exams and the extent to which students can apply higher-order cognitive skills. Results were analyzed using Classical Test Theory and Item Response Theory. A comparison of the new exams using Wright maps shows that, although flaws remain, they represent the relevant population better than the previous exam did. A group of experts determined the cognitive level of each task and analyzed students' performance. The results show that there are tasks on which students can demonstrate the highest level of cognitive ability. However, such tasks are relatively few, and they are purely mathematical rather than problems set in a real-world context. Performance analysis shows that students have difficulty expressing their reasoning, which is one of the reasons why only a small number of students solve the challenging problems.
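The analysis described above combines Classical Test Theory statistics (item difficulty, discrimination, reliability) with a Rasch-model view that places persons and items on a shared logit scale, the Wright map. The following is a minimal illustrative sketch of that kind of pipeline, not the authors' code: the response matrix is simulated, and the Rasch estimates use a crude logit approximation in place of the full maximum-likelihood estimation a real analysis would use.

import numpy as np

rng = np.random.default_rng(0)
n_students, n_items = 500, 10
theta = rng.normal(0.0, 1.0, n_students)      # simulated student abilities
b = np.linspace(-2.0, 2.0, n_items)           # simulated item difficulties
# Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
X = (rng.random((n_students, n_items)) < p).astype(int)   # 1 = correct answer

# Classical Test Theory statistics
p_correct = X.mean(axis=0)                    # item difficulty: proportion correct
total = X.sum(axis=1)                         # raw score per student
disc = np.array([np.corrcoef(X[:, i], total)[0, 1]
                 for i in range(n_items)])    # point-biserial discrimination
k = n_items                                   # Cronbach's alpha (cf. Tavakol & Dennick, 2011)
alpha = k / (k - 1) * (1.0 - X.var(axis=0, ddof=1).sum() / total.var(ddof=1))

# Rasch-style estimates via logit transforms (a proper analysis would fit the
# model by maximum likelihood, cf. Bond & Fox, 2013)
pc = p_correct.clip(0.005, 0.995)             # guard against 0/1 proportions
b_hat = np.log((1.0 - pc) / pc)               # item difficulties in logits
t = total.clip(1, k - 1)                      # avoid infinite person logits
theta_hat = np.log(t / (k - t))               # person abilities in logits

print(f"Cronbach's alpha = {alpha:.2f}")
print("logits | persons      | items")        # text-only Wright map
for lo in range(2, -4, -1):
    n_persons = int(((theta_hat >= lo) & (theta_hat < lo + 1)).sum())
    items = " ".join(f"i{i + 1}" for i in range(k) if lo <= b_hat[i] < lo + 1)
    print(f"{lo:+d}..{lo + 1:+d} | {'#' * (n_persons // 10):<12} | {items}")

Reading such maps side by side for two exams shows whether the item difficulties cover the ability range of the tested population, which is the sense in which a Wright-map comparison can judge one exam more representative than another.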

References

Anderson, L. W., & Krathwohl, D. R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives: Complete Edition. New York: Longman.

Betels, Dž. (2003). Rokasgrāmata pārbaudes darbu veidotājiem [Handbook for test developers]. Rīga: Izglītības un zinātnes ministrija.

Biggs, J., & Collis, K. (1982). Evaluating the quality of learning: The SOLO taxonomy. New York: Academic Press.

Bloom, B., Englehart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, Toronto: Longmans, Green.

Bond, T. G., & Fox, C. M. (2013). Applying the Rasch model: Fundamental measurement in the human sciences. New Jersey: Lawrence Erlbaum Associates Publishers.

Boone, W. J. (2016). Rasch analysis for instrument development: Why, when, and how? CBE—Life Sciences Education, 15(4), rm4.

Cimen, E. E. (2010). How compatible are the 9th grade mathematics written exams with mathematical power assessment criteria? Procedia-Social and Behavioral Sciences, 2(2), 4462-4467.

Eacott, S., & Holmes, K. (2010). Leading Reform in Mathematics Education: Solving a Complex Equation. Mathematics Teacher Education and Development, 12(2), 84–97.

Eklöf, H., & Nyroos, M. (2013). Pupil perceptions of national tests in science: perceived importance, invested effort, and test anxiety. European Journal of Psychology of Education, 28(2), 497-510.

Ekolu, S. O., & Quainoo, H. (2019). Reliability of assessments in engineering education using Cronbach’s alpha, KR and split-half methods. Global Journal of Engineering Education, 21(1), 24-29.

France, I., Cakane, L., Namsone, D., & Cirulis, A. (2017). Cognitive Demand in Observed Lessons and National Testing Compared to PISA Mathematics Results in Latvia. EDULEARN Proceedings. DOI: 10.21125/edulearn.2017.1102

Gneezy, U., List, J. A., Livingston, J. A., Qin, X., Sadoff, S., & Xu, Y. (2019). Measuring success in education: the role of effort on the test itself. American Economic Review: Insights, 1(3), 291-308.

Heyneman, S. P., & Ransom, A. W. (1990). Using examinations and testing to improve educational quality. Educational Policy, 4(3), 177-192.

Jakaitiene, A., Želvys, R., Vaitekaitis, J., Raižiene, S., & Dukynaite, R. (2021). Centralised Mathematics Assessments of Lithuanian Secondary School Students: Population Analysis. Informatics in Education, 20(3), 439–462. DOI: 10.15388/infedu.2021.18

Krishnan, V. (2013). The Early Child Development Instrument (EDI): An item analysis using Classical Test Theory (CTT) on Alberta’s data.

Latifi, S., Bulut, O., Gierl, M., Christie, T., & Jeeva, S. (2016). Differential performance on national exams: Evaluating item and bundle functioning methods using English, mathematics, and science assessments. Sage Open, 6(2). DOI: https://doi.org/10.1177/2158244016653791

Ministru kabinets (2019). Noteikumi par valsts vispārējās vidējās izglītības standartu un vispārējās vidējās izglītības programmu paraugiem [Regulations on the state general secondary education standard and sample general secondary education programmes], Nr. 416. Available at: https://likumi.lv/ta/id/309597-noteikumi-par-valsts-visparejas-videjas-izglitibas-standartu-un-visparejas-videjas-izglitibas-programmu-paraugiem

Ministru kabinets (2020). Noteikumi par valsts profesionālās vidējās izglītības standartu un valsts arodizglītības standartu [Regulations on the state vocational secondary education standard and the state vocational education standard], Nr. 332. Available at: https://likumi.lv/ta/id/315146-noteikumi-par-valsts-profesionalas-videjas-izglitibas-standartu-un-valsts-arodizglitibas-standartu

Namsone, D., & Oliņa, Z. (2018). Kā vērtē kompleksu sniegumu [How complex performance is assessed]. In D. Namsone (Ed.), Mācīšanās lietpratībai [Learning for Proficiency] (pp. 44-65). Rīga: LU Akadēmiskais apgāds. DOI: https://doi.org/10.22364/ml.2018

Oliņa, Z., Namsone, D., & France, I. (2018). Kompetence kā komplekss skolēna mācīšanās rezultāts [Competence as a complex student learning outcome]. In D. Namsone (Ed.), Mācīšanās lietpratībai [Learning for Proficiency] (pp. 18-43). Rīga: LU Akadēmiskais apgāds. DOI: https://doi.org/10.22364/ml.2018.1

Paul, R., & Nosich, G. M. (1992). A Model for the National Assessment of Higher Order Thinking. In R. Paul (Ed.), Critical Thinking: What Every Student Needs to Survive in a Rapidly Changing World (pp. 78-123). Dillon Beach, CA: Foundation for Critical Thinking.

Pestovs, P., Saleniece, I., & Namsone, D. (2019). Science Large-Scale Assessment to the Revised Science Curriculum. Proceedings of the 3rd International Baltic Symposium on Science and Technology Education, BalticSTE2019, Šiauliai, Lithuania.

Planinic, M., Boone, W. J., Susac, A., & Ivanjek, L. (2019). Rasch analysis in physics education research: Why measurement matters. Physical Review Physics Education Research, 15(2), 020111.

Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen: Danmarks Paedagogiske Institut.

Schraw, G., & Robinson, D. R. (2011). Conceptualizing and assessing higher order thinking skills. In G. Schraw & D. R. Robinson (Eds.), Assessment of higher order thinking skills (pp. 1–15). IAP Information Age Publishing.

Shiel, G., & Cartwright, F. (2015). National Assessments of Educational Achievement, Volume 4: Analyzing Data from a National Assessment of Educational Achievement. World Bank Publications.

Smith, M. S., & Stein, M. K. (1998). Selecting and creating mathematical tasks: From research to practice. Mathematics Teaching in the Middle School, 3(5), 344–350.

Swank, J. M., & Mullen, P. R. (2017). Evaluating evidence for conceptually related constructs using bivariate correlations. Measurement and Evaluation in Counseling and Development, 50(4), 270-274.

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach's alpha. International Journal of Medical Education, 2, 53-55. DOI: 10.5116/ijme.4dfb.8dfd

VISC. (2023). Valsts pārbaudes darbu programmas [State examination programmes]. Available at: https://www.visc.gov.lv/lv/valsts-parbaudes-darbu-programmas

Published

2023-07-03

How to Cite

Mikite, M., & France, I. (2023). STUDENTS’ DEPTH OF COGNITIVE PERFORMANCE ON SECONDARY SCHOOL STATE MATHEMATICS TESTS. SOCIETY. INTEGRATION. EDUCATION. Proceedings of the International Scientific Conference, 1, 528-540. https://doi.org/10.17770/sie2023vol1.7086