Educational Sciences: Theory & Practice

ISSN: 2630-5984

Examining Differential Item Functioning in Test Forms with Different Item Orderings According to Item Difficulty Levels

Ömay Çokluk
Department of Measurement and Evaluation, Faculty of Educational Sciences, Ankara University, Ankara, Turkey
Emrah Gül
Department of Measurement and Evaluation, Hakkari University, Hakkâri, Turkey
Çilem Doğan Gül
Department of Measurement and Evaluation, Ankara University, Ankara, Turkey


This study examines whether differential item functioning (DIF) is present across three test forms containing the same items in different orders: a random order and two sequential orders (easy-to-hard and hard-to-easy). DIF was investigated with methods based on both Classical Test Theory (CTT) and Item Response Theory (IRT), taking item difficulty levels into account. In this correlational study, data were gathered from 578 seventh graders using an Atomic Structures Achievement Test. All analyses were carried out in the R programming language with the “difR” package. The comparison of IRT- and CTT-based methods indicated a greater number of items with significant differential item functioning. Different item orderings led students at the same ability level to perform differently on the same items; that is, item order changed the probability of a correct response for examinees of equal ability. A test form ordered from easy to hard conferred an advantage over the hard-to-easy and randomly ordered forms. The findings show that tests used to make decisions about people must be assembled in accordance with psychometric principles.

Keywords: Achievement test, Test form, Item order, Item difficulty, Classical test theory, Item response theory, Differential item functioning, R programming language.
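To illustrate the kind of CTT-based DIF analysis the abstract refers to, the sketch below computes the Mantel-Haenszel common odds ratio and the ETS delta statistic for a single item, one of the methods implemented in the "difR" package used in the study. This is a minimal Python illustration, not the study's actual R code, and the per-stratum counts are hypothetical: each stratum is a total-score level, with cells (reference correct, reference wrong, focal correct, focal wrong).

```python
import math

# Hypothetical 2x2 counts per total-score stratum for one item:
# (ref_correct, ref_wrong, focal_correct, focal_wrong)
strata = [
    (30, 20, 22, 28),   # low total-score group
    (40, 10, 30, 20),   # middle total-score group
    (45,  5, 40, 10),   # high total-score group
]

num = den = 0.0
for a, b, c, d in strata:
    n = a + b + c + d
    num += a * d / n    # reference-correct x focal-wrong, weighted by stratum size
    den += b * c / n    # reference-wrong  x focal-correct, weighted by stratum size

alpha_mh = num / den                   # Mantel-Haenszel common odds ratio
delta_mh = -2.35 * math.log(alpha_mh)  # ETS delta scale

# ETS convention: |delta| >= 1.5 is classified as large ("C-level") DIF.
print(round(alpha_mh, 3), round(delta_mh, 3))
```

With these counts the odds ratio exceeds 1 and delta falls below -1.5, so the item would be flagged as showing large DIF favoring the reference group; difR's `difMH` function reports the analogous statistics for every item in a test at once.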