Abstract
The system of least prompts response prompting procedure has a rich history in special education research and practice. Recently, two independent systematic reviews were conducted to determine whether the system of least prompts met the criteria to be classified as an evidence-based practice. Both reviews used single-case design standards developed by the What Works Clearinghouse to evaluate the rigor and effects of studies; however, findings and implications varied substantially across the reviews. We examined the data supporting each review and discuss how two reviews on the same topic, using the same standards for evaluating studies, could arrive at different conclusions. Results indicate that differing search parameters, visual analysis protocols, and the flexibility allotted by the design standards may have contributed to the discrepancies. We discuss the importance of multiple literature reviews on the same topic in the context of replication research in special education. In addition, we highlight the necessity of open data in such reviews. Finally, we recommend how practitioners and researchers should collectively interpret the differing findings and conclusions from the reviews examining the system of least prompts.
