Abstract
Robustness of normal test theory for correlation coefficients is at least asymptotically ensured for bivariate distributions satisfying a linearity and a homoscedasticity condition for the null theory, and a further kurtosis condition for the nonnull theory. If any one of these conditions fails, it can be demonstrated that robustness may fail as well. This result is applied to the study of the point biserial and multiserial correlation coefficients and the ψ-coefficient.
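As a hypothetical illustration (not drawn from the article), the point biserial correlation mentioned above can be computed as the Pearson correlation between a dichotomous (0/1) variable and a continuous one; the function name and data below are assumptions for demonstration only. A minimal sketch using only the Python standard library:

```python
# Illustrative sketch: point biserial correlation, assuming the standard
# textbook formula r_pb = (M1 - M0) / s * sqrt(p * q), where M1, M0 are
# group means, s is the SD of all scores, and p, q are group proportions.
from statistics import mean, pstdev

def point_biserial(groups, values):
    """groups: list of 0/1 labels; values: matched continuous scores."""
    if len(groups) != len(values):
        raise ValueError("inputs must have equal length")
    ones = [v for g, v in zip(groups, values) if g == 1]
    zeros = [v for g, v in zip(groups, values) if g == 0]
    n = len(values)
    p = len(ones) / n           # proportion of cases labeled 1
    q = 1 - p                   # proportion of cases labeled 0
    s = pstdev(values)          # population SD of all scores
    return (mean(ones) - mean(zeros)) / s * (p * q) ** 0.5

# Example: scores that separate cleanly by group give a large coefficient.
g = [0, 0, 0, 1, 1, 1]
v = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
print(round(point_biserial(g, v), 4))  # → 0.8783
```

This value is identical to the ordinary Pearson correlation computed on the same pairs, which is why the abstract's robustness conditions for Pearson-type coefficients bear on the point biserial case.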