
Expert Prediction Versus Difficulty Index Measured by Psychometric Analysis; A Mixed Method Study Interpreted through Diagnostic Judgment by Cognitive Modeling Framework


Article Information

Title: Expert Prediction Versus Difficulty Index Measured by Psychometric Analysis; A Mixed Method Study Interpreted through Diagnostic Judgment by Cognitive Modeling Framework

Authors: Memoona Mansoor, Shazia Imran, Ali Tayyab, Rehmah Sarfraz

Journal: Journal of University College of Medicine and Dentistry

HEC Recognition History
Category    From          To
Y           2024-10-01    2025-12-31
Y           2023-07-01    2024-09-30

Publisher: The University of Lahore

Country: Pakistan

Year: 2024

Volume: 3

Issue: 2

Language: English

DOI: 10.51846/jucmd.v3i2.3047

Keywords: Assessment, Item difficulty, Expert Prediction, Dia Com Framework, Test Psychometrics

Abstract

Objective: Item difficulty can be determined in two ways: one relies on expert judgment, the other on psychometric analysis. This study compared item developers' perceptions of item difficulty with psychometric analysis results and explored their thought processes in categorizing items.

Methodology: This explanatory sequential mixed method study was conducted from October to December 2022 in three phases (quantitative, qualitative, and mixed method strands). Difficulty rankings of items by 20 subject experts for all preclinical years' end-of-module exams were compared with those obtained by psychometric analysis from the OMR (Optical Mark Reader). Cohen's Kappa was used to check agreement, and Pearson's correlation was used to assess the correlation between the two measures (item writers' perception of item difficulty and Rightmark analysis). All 20 item developers were interviewed through an open-ended two-item questionnaire. Interviews were recorded and transcribed, and themes and subthemes were identified from the interview data through manual coding. The anonymity of the participants was maintained.

Results: A total of 1150 items from Anatomy, Physiology, Biochemistry, Pharmacology, Pathology, and Forensic Medicine, developed by 20 content experts, were compared. There was a weak positive (r=0.11) but significant correlation (p=0.00) between faculty perception and Rightmark analysis of item difficulty. However, there was no agreement between the two measurements (Cohen's Kappa k=0.042, p=0.027). The interviews of item developers identified four major themes: academic performance, learning habits, the content targeted, and the item's construction.

Conclusion: Experts consider contextual factors, covering content and student background, when ranking items, whereas psychometric analysis is based on item performance data. These contextual nuances may explain the differences in judgment.
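To make the comparison of the two measures concrete, the following Python sketch shows one way agreement (Cohen's Kappa) and correlation (Pearson's r) between expert difficulty categories and a psychometric difficulty index could be computed. The data, category labels, and binning thresholds are illustrative assumptions only; they are not the authors' dataset, their analysis code, or the Rightmark output format.

```python
# Illustrative sketch (not the authors' analysis code).
# Assumptions: the psychometric difficulty index is the proportion of
# examinees answering correctly, and both measures are binned into
# easy / moderate / difficult categories with arbitrary example thresholds.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical data for 10 items
expert_category = np.array(["easy", "moderate", "difficult", "moderate", "easy",
                            "difficult", "moderate", "easy", "difficult", "moderate"])
proportion_correct = np.array([0.82, 0.64, 0.35, 0.58, 0.91,
                               0.28, 0.71, 0.88, 0.40, 0.55])

def categorize(p):
    # Bin the difficulty index into the same categories as the experts used
    # (cut-offs here are placeholders, not the study's actual criteria).
    if p >= 0.70:
        return "easy"
    elif p >= 0.40:
        return "moderate"
    return "difficult"

psychometric_category = np.array([categorize(p) for p in proportion_correct])

# Agreement between the two categorical ratings
kappa = cohen_kappa_score(expert_category, psychometric_category)

# Correlation between numeric codings of the two measures
# (1 - proportion_correct so that higher values mean harder items)
codes = {"easy": 1, "moderate": 2, "difficult": 3}
expert_numeric = np.array([codes[c] for c in expert_category])
r, p_value = pearsonr(expert_numeric, 1 - proportion_correct)

print(f"Cohen's kappa: {kappa:.3f}")
print(f"Pearson r: {r:.3f} (p = {p_value:.3f})")
```

In this framing, a low kappa alongside a weak positive correlation, as reported in the abstract, would indicate that expert and psychometric rankings move loosely in the same direction without agreeing on individual items' difficulty categories.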
