Abstract
The equivalence of scores and one-parameter logistic model item difficulty estimates (bs) obtained from computer-based and paper-and-pencil (PP) forms of a licensure examination was evaluated across testing modes. Item response theory was used to construct forms similar to the operational PP and computer-based examinations, though with fewer items. Forms were administered in both testing modes to examinees during a single testing session. A repeated measures experimental design permitted statistically powerful tests of several factors that have been documented to affect scores and bs. There was no effect of either order or mode of administration on bs. An order × mode-of-administration interaction produced a significant difference between PP and computer-based ability estimates when a PP form was administered prior to, but not after, a computer-based form. Ability estimates were equivalent for examinees administered a computer-based form first. The presence of an order × mode interaction demonstrates the importance of controlling for order of administration when evaluating the equivalence of scores across modes.
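For reference, the item difficulty estimates (bs) and ability estimates discussed above come from the one-parameter logistic (Rasch) model; the following is the standard formulation of that model, not notation taken from the article itself:

\[
P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)},
\]

where \(X_{ij}\) is the scored (0/1) response of examinee \(i\) to item \(j\), \(\theta_i\) is the examinee's ability estimate, and \(b_j\) is the item difficulty estimate compared across the PP and computer-based modes.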
