Abstract
The production of official statistics has become increasingly challenging over the past 25 years because of concerns about data quality, particularly declining response rates and complex coverage issues that require the use of multiple sampling frames. Concurrently, government agencies are collecting more data through administrative sources. Although these data sources offer potential cost efficiencies and timeliness, they present their own quality challenges that must be systematically addressed to ensure their viability for official statistical purposes. Building on the Total Survey Error paradigm, we propose a related paradigm that can be used to evaluate the quality and potential error sources of administrative data. Our Total Data Quality paradigm extends beyond traditional probability surveys to encompass administrative data, found data, and other nonprobability data sources. It incorporates several features tailored to the specific challenges inherent in administrative records, with the aim of identifying and mitigating these issues effectively. We illustrate the application of this paradigm through a case study of the National Incident-Based Reporting System, the primary administrative source used to produce statistics for crimes recorded by law enforcement in the United States.
