Knowledge of the best uses, limitations, and quality criteria for single-subject design (SSD) research has grown over the decades. High-quality SSDs provide replicable results and advance knowledge. With modern computational tools, meta-analysis and systematic review methods can be applied to synthesize SSD results, providing best evidence on numerous important interventions (e.g., those involving assistive technology). Larger numbers of high-quality SSDs are needed.
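As one illustration of how SSD results can be quantified for synthesis, the sketch below computes the percentage of non-overlapping data (PND), a widely used non-overlap effect-size metric for single-subject designs. The phase data are hypothetical, chosen only to show the calculation, and this is just one of several metrics used in SSD meta-analysis.

```python
def pnd(baseline, treatment, increase_expected=True):
    """Percentage of Non-overlapping Data (PND): the percent of
    treatment-phase points that fall beyond the most extreme
    baseline-phase point, in the therapeutic direction."""
    if increase_expected:
        threshold = max(baseline)
        beyond = [x for x in treatment if x > threshold]
    else:
        threshold = min(baseline)
        beyond = [x for x in treatment if x < threshold]
    return 100.0 * len(beyond) / len(treatment)

# Hypothetical A-B design data (illustrative only):
baseline = [2, 3, 2, 4, 3]      # A-phase observations
treatment = [5, 6, 4, 7, 6, 8]  # B-phase observations

# Five of six treatment points exceed the baseline maximum of 4.
print(round(pnd(baseline, treatment), 1))
```

Higher PND values indicate stronger treatment effects; the metric is simple but, like visual inspection, can be affected by outliers and trend in the baseline.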