Articles related to the keyword:

Item Response Theory (IRT)


1.

Differential Item Functioning (DIF) in Terms of Gender in the Reading Comprehension Subtest of a High-Stakes Test (Ministry of Science research article)

Keywords: Validity, Test validation, Test fairness, Differential Item Functioning (DIF), Logistic Regression (LR), Item Response Theory (IRT)

Validation is an important enterprise, especially when a test is a high-stakes one. Demographic variables such as gender and field of study can affect test results and their interpretations. Differential Item Functioning (DIF) analysis is a way to make sure that a test does not favor one group of test takers over another. This study investigated gender DIF in the reading comprehension subtest (35 items) of a high-stakes test using the three-step logistic regression procedure (Zumbo, 1999). The participants were 3,398 test takers, both male and female, who took the test in question (the UTEPT) as a partial requirement for entering a PhD program at the University of Tehran. Three sets of criteria, from Cohen (1988), Zumbo (1999), and Jodoin and Gierl (2001), were applied to flag DIF. It was revealed that, although the 35 items show "small" effect sizes according to Cohen's classification, they do not display DIF under the other two criteria. Therefore, it can be concluded that the reading comprehension subtest of the UTEPT favors neither males nor females.
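The logistic-regression DIF procedure described in this abstract can be sketched in a few lines: for each item, fit nested logistic regressions predicting the item response from the total score alone and then from the total score, group membership, and their interaction, and compare the models with a likelihood-ratio test and a pseudo-R² effect size (Jodoin and Gierl's criterion treats a ΔR² below 0.035 as negligible DIF). The sketch below is an illustration on simulated no-DIF data with a toy gradient-ascent fitter, not the study's actual analysis; the data, sample size, and fitter are assumptions for demonstration only.

```python
# Minimal sketch of Zumbo's (1999) logistic-regression DIF test on SIMULATED data.
# fit_logistic is a toy gradient-ascent fitter; a real analysis would use a
# statistics package. All data below are fabricated for illustration.
import math
import random

def fit_logistic(X, y, iters=2000, lr=0.8):
    """Fit a logistic regression by gradient ascent; return its log-likelihood."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi))
            pr = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))
            for j in range(p):
                grad[j] += (yi - pr) * xi[j]
        for j in range(p):
            w[j] += lr * grad[j] / n          # averaged-gradient ascent step
    ll = 0.0
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi))
        pr = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))
        pr = min(max(pr, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += yi * math.log(pr) + (1.0 - yi) * math.log(1.0 - pr)
    return ll

def zumbo_dif(item, total, group):
    """LR chi-square (2 df) and Cox-Snell pseudo-R^2 gain for one item."""
    n = len(item)
    # Step 1: conditioning variable (total score) only.
    ll1 = fit_logistic([[1.0, t] for t in total], item)
    # Step 3: add group (uniform DIF) and score-by-group interaction (non-uniform DIF).
    ll3 = fit_logistic([[1.0, t, g, t * g] for t, g in zip(total, group)], item)
    p_bar = sum(item) / n
    ll0 = n * (p_bar * math.log(p_bar) + (1 - p_bar) * math.log(1 - p_bar))  # null model
    chi2 = 2.0 * (ll3 - ll1)                  # omnibus DIF test, 2 df
    r2 = lambda ll: 1.0 - math.exp(2.0 * (ll0 - ll) / n)
    return chi2, r2(ll3) - r2(ll1)

# Simulated no-DIF data: the item depends on ability only, never on group.
random.seed(7)
n = 400
theta = [random.gauss(0, 1) for _ in range(n)]
group = [random.randint(0, 1) for _ in range(n)]               # arbitrary group labels
item = [1 if random.random() < 1 / (1 + math.exp(-t)) else 0 for t in theta]
total = [t + random.gauss(0, 0.5) for t in theta]              # standardized score proxy

chi2, delta_r2 = zumbo_dif(item, total, group)
print(f"chi2(2 df) = {chi2:.2f}, delta R^2 = {delta_r2:.4f}")
```

Because the simulated item is group-neutral, ΔR² should fall well below Jodoin and Gierl's 0.035 negligible-DIF cutoff; on real data the same statistics would be computed for each of the 35 items.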
2.

Applying IRT Model to Determine Gender and Discipline-based DIF and DDF: A Study of the IAU English Proficiency Test

Keywords: Differential Distractor Functioning (DDF), Differential Item Functioning (DIF), English Proficiency Test (EPT), Item Response Theory (IRT), Test Bias

The purpose of this study was to examine gender- and discipline-based Differential Item Functioning (DIF) and Differential Distractor Functioning (DDF) on the Islamic Azad University English Proficiency Test (IAUEPT). DIF and DDF were evaluated across genders and disciplines using the Rasch model. For the discipline analysis, the examinees were divided into two groups: Humanities and Social Sciences (HSS) and Non-Humanities and Social Sciences (N-HSS). The DIF analysis showed that four of the 100 items exhibited gender DIF and two exhibited discipline DIF. The gender DDF analysis flagged one item each for Options A, B, and C, and four items for Option D; the discipline DDF analysis flagged one item for Option A, three for Option B, four for Option C, and three for Option D. These findings have significant implications for test developers: identifying potential biases in high-stakes proficiency tests helps ensure fairness and equity for all examinees, and identifying gender DIF can shed light on gender-based gaps in the curriculum, highlighting areas where male or female learners may be disadvantaged or underrepresented in terms of knowledge or skills.
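Rasch-based DIF of the kind this abstract reports is often screened by separate calibration: estimate item difficulties independently in the reference and focal groups and flag items whose difficulty contrast exceeds roughly 0.5 logits, a commonly used Rasch DIF cutoff. The sketch below uses a crude centered log-odds approximation to Rasch difficulty (not a full Rasch fit, and not the study's actual software or data); the simulated groups, the planted 1.2-logit shift, and the 0.5-logit threshold are illustrative assumptions.

```python
# Crude separate-calibration Rasch DIF sketch on SIMULATED data.
# Difficulties are approximated by centered log-odds of item facility,
# a rough PROX-style shortcut rather than a full Rasch estimation.
import math
import random

def rasch_difficulties(resp):
    """Approximate item difficulties as centered log-odds of item facility."""
    n, n_items = len(resp), len(resp[0])
    b = []
    for i in range(n_items):
        s = sum(row[i] for row in resp)
        s = min(max(s, 0.5), n - 0.5)         # keep log-odds finite at extreme scores
        b.append(math.log((n - s) / s))       # harder item -> fewer correct -> larger b
    mean_b = sum(b) / n_items
    return [bi - mean_b for bi in b]          # center so both groups share an origin

sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
true_b = [-1.0, -0.5, 0.0, 0.5, 1.0]          # fabricated difficulties for 5 items

def simulate(n, shift_item=None, shift=0.0):
    """Simulate Rasch responses; optionally make one item harder for this group."""
    data = []
    for _ in range(n):
        theta = random.gauss(0, 1)
        row = []
        for i, bi in enumerate(true_b):
            d = bi + (shift if i == shift_item else 0.0)
            row.append(1 if random.random() < sigmoid(theta - d) else 0)
        data.append(row)
    return data

random.seed(1)
ref = simulate(3000)                           # reference group: no planted DIF
foc = simulate(3000, shift_item=2, shift=1.2)  # item 2 is 1.2 logits harder for focal group
contrast = [f - r for f, r in zip(rasch_difficulties(foc), rasch_difficulties(ref))]
flagged = [i for i, c in enumerate(contrast) if abs(c) > 0.5]
print("DIF contrasts:", [round(c, 2) for c in contrast])
print("Flagged items:", flagged)
```

With a planted 1.2-logit shift only item 2 should exceed the 0.5-logit contrast; a full analysis like the one in the study would also model distractor choices (DDF), which this binary sketch does not attempt.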