Abstract
Online crowdsourcing platforms have become an option for researchers to rapidly recruit participants for tasks that require human ingenuity. However, a growing concern is that workers on such platforms, including Amazon Mechanical Turk (AMT), receive unfair compensation for the tasks they complete. In this article, we explored the effects of the income level of participants’ countries and the rate of payment on perceived payment fairness, task quality, and subjective experience. We tested our hypotheses using a three-way ANOVA and a chi-square test of independence. The results showed that lower compensation increased the number of participants whose data might be unusable for research. Participants in the lower compensation group reported better perceived performance than those who received higher compensation. We also found that participants from high-income countries reported less perceived effort and frustration than participants from lower-middle-income countries.
