Abstract
Studies comparing online self-report survey responses across different item formats have yielded inconclusive results, and none has used appropriate methods for thoroughly and correctly examining equivalence across conditions. In two studies, we examined the comparability of survey responses across four item formats: horizontal radio button, text box, drop-down menu, and vertical radio button. The second study added two responding conditions: optional responding and forced responding. Participants, college students at two institutions of higher education, were randomly assigned to conditions and completed measures of computer self-efficacy, personality, and social desirability. Both studies indicated quantitative (mean scores) and qualitative (internal consistency estimates and scale intercorrelations) equivalence. However, auxiliary equivalence differed notably: participants in the text box condition had less missing data than those in the other conditions, those in the horizontal radio button condition completed the study in the shortest amount of time, and participants across conditions generally preferred drop-down menus to the other item formats.