Performance of exploratory structural equation model (ESEM) in detecting differential item functioning



Musa Adekunle Ayanwale

The validity of a standardised test is called into question when a construct irrelevant to the ability being measured accounts for examinees' performance on the test items. A test must estimate an examinee's ability with the same precision regardless of the sub-population to which the examinee belongs on any demographic variable. This paper explored the effects of gender and school location as covariates on the West African Examinations Council (WAEC) mathematics items among examinees (N = 2,866), using Exploratory Structural Equation Modeling (ESEM). The results showed that the test is multidimensional (six factors), with fit indices of χ²(940) = 4882.024, p < 0.05, CFI = 0.962, TLI = 0.930, RMSEA = 0.038 (90 % CI = 0.037-0.039), SRMR = 0.030, Akaike information criterion (AIC) = 147290.577, Bayesian information criterion (BIC) = 149585.436 and sample-size adjusted BIC = 148362.154. In addition, 10 items (20 %) showed significant DIF with respect to gender, while 3 items (6 %) showed significant DIF with respect to school location. The observed DIF items alert test developers that DIF can differentially affect the performance of examinees of the same ability level, and the consequences for examinees are severe. Hence, accurate and unbiased assessment should be a basic principle of test-item measurement, and test developers need to verify psychometrically that items are free from bias.
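For readers unfamiliar with how covariates enter an ESEM, the DIF analysis described above can be read in terms of the standard MIMIC-type formulation. The equations below are a generic sketch of that model, not an excerpt from the paper; the symbols are assumed notation.

\[
y^{*} = \nu + \Lambda \eta + K x + \varepsilon, \qquad \eta = \Gamma x + \zeta ,
\]

where \(y^{*}\) is the vector of latent responses underlying the dichotomous items, \(\Lambda\) is the rotated exploratory loading matrix for the six factors \(\eta\), \(x\) contains the covariates (gender, school location), \(\Gamma\) captures impact (true differences in the factors between groups), and \(K\) holds direct covariate-to-item effects. A statistically significant element of \(K\) flags the corresponding item as exhibiting uniform DIF, since group membership then predicts the item response over and above the factors.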


How to cite paper:

Ayanwale, M. A. (2022). Performance of exploratory structural equation model (ESEM) in detecting differential item functioning. EUREKA: Social and Humanities, 1, 58-73. https://doi.org/10.21303/2504-5571.2022.002254