Abstract
Statistical theory indicates that restricting the range of possible values of normally distributed variables, and of many non-normal variables, reduces correlations relative to those in the unrestricted population. Contrary to this typical outcome, results of a simulation study show that range restriction sometimes increased the correlation between variables having outlier-prone distributions. This result occurred for exponential and ex-Gaussian distributions, which are encountered in experimental studies involving response times; it did not occur for truncated versions of the same densities. Chance occurrence of outliers in contaminated-normal, or mixed-normal, distributions reduced the correlation relative to that found in samples from uncontaminated populations. Conversely, detection and downweighting of outliers increased the magnitude of sample correlations, and a similar result occurred for many other outlier-prone distributions. Practical implications of these findings are discussed.
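The range-restriction effect summarized above can be explored with a small Monte Carlo sketch in Python/NumPy. The construction below (a shared exponential component producing a correlated, skewed, outlier-prone pair) is an illustrative assumption, not the simulation design used in the study; it only shows how one would compare the full-sample and range-restricted correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Correlated pair with skewed, outlier-prone marginals, built from a
# shared exponential component (hypothetical construction for illustration).
shared = rng.exponential(scale=1.0, size=n)
x = shared + rng.exponential(scale=1.0, size=n)
y = shared + rng.exponential(scale=1.0, size=n)

# Pearson correlation in the full (unrestricted) sample.
r_full = np.corrcoef(x, y)[0, 1]

# Direct range restriction: keep only cases with x in its lower 80%.
cutoff = np.quantile(x, 0.8)
keep = x <= cutoff
r_restricted = np.corrcoef(x[keep], y[keep])[0, 1]

print(f"full-sample r = {r_full:.3f}, restricted r = {r_restricted:.3f}")
```

Varying the marginal distribution (e.g., normal versus exponential) and the restriction cutoff in this sketch is one way to see when restriction attenuates the correlation and when, for heavy-tailed variables, it can fail to do so.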
