This article addresses conceptual and methodological shortcomings in the conduct and interpretation of intelligence test factor analytic research that appeared in Decker, S. L., Bridges, R. M., Luedke, J. C., & Eason, M. J. (2020). Dimensional evaluation of cognitive measures: Methodological confounds and theoretical concerns. Journal of Psychoeducational Assessment. Advance online publication.
Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 3-25. doi:10.1037/amp0000191
Arbuckle, J. L. (2019). Amos (Version 26.0) [Computer program]. Chicago: IBM SPSS.
Beaujean, A. A. (2015). John Carroll’s views on intelligence: Bi-factor vs. higher-order models. Journal of Intelligence, 3(4), 121-136. doi:10.3390/jintelligence3040121
Benson, N. F., Beaujean, A. A., McGill, R. J., & Dombrowski, S. C. (2018). Revisiting Carroll’s survey of factor-analytic studies: Implications for the clinical assessment of intelligence. Psychological Assessment, 30(8), 1028-1038. doi:10.1037/pas0000556
Bentler, P. M., & Wu, E. J. C. (2016). EQS for Windows [Computer software]. Multivariate Software.
Bonifay, W., Lane, S. P., & Reise, S. P. (2017). Three concerns with applying a bifactor model as a structure of psychopathology. Clinical Psychological Science, 5(1), 184-186. doi:10.1177/2167702616657069
Brown, T. A. (2015). Confirmatory factor analysis for applied research (2nd ed.). New York: Guilford.
Canivez, G. L., Watkins, M. W., & Dombrowski, S. C. (2017). Structural validity of the Wechsler intelligence scale for children-fifth edition: Confirmatory factor analyses with the 16 primary and secondary subtests. Psychological Assessment, 29(4), 458-472. doi:10.1037/pas0000358
Canivez, G. L., Watkins, M. W., Good, R., James, K., & James, T. (2017). Construct validity of the WISC–IVUK with a referred Irish sample: Wechsler and CHC model comparisons with 15 subtests. British Journal of Educational Psychology, 87(3), 383-407. doi:10.1111/bjep.12155
Canivez, G. L., Watkins, M. W., & McGill, R. J. (2019). Construct validity of the Wechsler intelligence scale for children-fifth UK edition: Exploratory and confirmatory factor analyses of the 16 primary and secondary subtests. The British Journal of Educational Psychology, 89(2), 195-224. doi:10.1111/bjep.12230
Carretta, T. R., & Ree, M. J. (2001). Pitfalls of ability research. International Journal of Selection and Assessment, 9(4), 325-335. doi:10.1111/1468-2389.00184
Carroll, J. B. (1993). Human cognitive abilities: A survey of factor analytic studies. Cambridge: Cambridge University Press. doi:10.1017/CBO9780511571312
Carroll, J. B. (1998). Human cognitive abilities: A critique. In McArdle, J. J., & Woodcock, R. W. (Eds.), Human cognitive abilities in theory and practice (pp. 5-23). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Chen, F. F., West, S., & Sousa, K. (2006). A comparison of bifactor and second-order models of quality of life. Multivariate Behavioral Research, 41(2), 189-225. doi:10.1207/s15327906mbr4102_5
Decker, S. L., Bridges, R. M., Luedke, J. C., & Eason, M. J. (2020). Dimensional evaluation of cognitive measures: Methodological confounds and theoretical concerns. Journal of Psychoeducational Assessment. Advance online publication. doi:10.1177/0734282920940879
Dombrowski, S. C. (2020). A newly proposed framework and a clarion call to improve practice. In Dombrowski, S. C. (Ed.), Psychoeducational assessment and report writing (2nd ed., pp. 9-59). Cham, Switzerland: Springer Nature. doi:10.1007/978-3-030-44641-3_2
Dombrowski, S. C., Beaujean, A. A., McGill, R. J., Benson, N. F., & Schneider, W. J. (2019). Using exploratory bifactor analysis to understand the latent structure of multidimensional psychological measures: An example featuring the WISC-V. Structural Equation Modeling, 26(6), 847-860. doi:10.1080/10705511.2019.1622421
Dombrowski, S. C., Golay, P., McGill, R. J., & Canivez, G. L. (2018). Investigating the theoretical structure of the DAS-II core battery at school age using Bayesian structural equation modeling. Psychology in the Schools, 55(2), 190-207. doi:10.1002/pits.22096
Dombrowski, S. C., McGill, R. J., Canivez, G. L., & Peterson, C. H. (2019). Investigating the theoretical structure of the differential ability scales-second edition through hierarchical exploratory factor analysis. Journal of Psychoeducational Assessment, 37(1), 91-104. doi:10.1177/0734282918760724
Dombrowski, S. C., McGill, R. J., & Morgan, G. W. (2019). Monte Carlo modeling of contemporary intelligence test (IQ) factor structure: Implications for IQ assessment, interpretation and theory. Assessment. Advance online publication. doi:10.1177/1073191119869828
Fenollar-Cortés, J., & Watkins, M. W. (2019). Construct validity of the Spanish version of the Wechsler intelligence scale for children fifth edition (WISC-VSpain). International Journal of School & Educational Psychology, 7(3), 150-164. doi:10.1080/21683603.2017.1414006
Gignac, G. E. (2016). The higher-order model imposes a proportionality constraint: That is why the bifactor model tends to fit better. Intelligence, 55, 57-68. doi:10.1016/j.intell.2016.01.006
Giordano, C., & Waller, N. G. (2020). Recovering bifactor models: A comparison of seven methods. Psychological Methods, 25(2), 143-156. doi:10.1037/met0000227
Gorsuch, R. L. (1983). Factor analysis (2nd ed.). New Jersey: Erlbaum.
Horn, J. L. (1968). Organization of abilities and the development of intelligence. Psychological Review, 75(3), 242-259. doi:10.1037/h0025662
IBM Corp. (2020). IBM SPSS Statistics for Windows, Version 27.0. Armonk, NY: IBM Corp.
Jensen, A. R. (1987). The g beyond factor analysis. In Ronning, R. R., Glover, J. A., Conoley, J. C., & Witt, J. C. (Eds.), The influence of cognitive psychology on testing (pp. 87-142). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Jensen, A. R. (1998). The g factor: The science of mental ability. Santa Barbara, CA: Praeger.
Johnson, W., & Bouchard, T. J. (2005). The structure of human intelligence: It is verbal, perceptual, and image rotation (VPR), not fluid and crystallized. Intelligence, 33(4), 393-416. doi:10.1016/j.intell.2004.12.002
Keith, T. Z., & Reynolds, M. R. (2018). Using confirmatory factor analysis to aid in understanding the constructs measured by intelligence tests. In Flanagan, D. P., & McDonough, E. M. (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (4th ed., pp. 853-900). New York: Guilford.
Kline, R. B. (2016). Principles and practice of structural equation modeling (4th ed.). New York: Guilford.
Kovacs, K., & Conway, A. R. A. (2016). Process overlap theory: A unified account of the general factor of intelligence. Psychological Inquiry, 27(3), 151-177. doi:10.1080/1047840X.2016.1153946
Lecerf, T., & Canivez, G. L. (2018). Complementary exploratory and confirmatory factor analyses of the French WISC–V: Analyses based on the standardization sample. Psychological Assessment, 30(6), 1009-1808. doi:10.1037/pas0000526
Loehlin, J. C., & Beaujean, A. A. (2016). Latent variable models: An introduction to factor, path, and structural equation analysis (5th ed.). New York: Routledge.
Lubinski, D. (2004). Introduction to the special section on cognitive abilities: 100 years after Spearman’s (1904) “‘General intelligence,’ objectively determined and measured.” Journal of Personality and Social Psychology, 86(1), 96-111. doi:10.1037/0022-3514.86.1.96
McGill, R. J. (2020). An instrument in search of a theory: Structural validity of the Kaufman assessment battery for children-second edition normative update at school-age. Psychology in the Schools, 57(2), 247-264. doi:10.1002/pits.22304
McGill, R. J., Dombrowski, S. C., & Canivez, G. L. (2018). Cognitive profile analysis in school psychology: History, issues, and continued concerns. Journal of School Psychology, 71, 108-121. doi:10.1016/j.jsp.2018.10.007
Mansolf, M., & Reise, S. P. (2017). When and why the second-order and bifactor models are distinguishable. Intelligence, 61, 120-129. doi:10.1016/j.intell.2017.01.012
Maydeu-Olivares, A., & Coffman, D. L. (2006). Random intercept item factor analysis. Psychological Methods, 11(4), 344-362. doi:10.1037/1082-989X.11.4.344
Morgan, G. B., Hodge, K. J., Wells, K. E., & Watkins, M. W. (2015). Are fit indices biased in favor of bi-factor models in cognitive ability research? A comparison of fit in correlated factors, higher-order, and bi-factor models via Monte Carlo simulations. Journal of Intelligence, 3(1), 2-20. doi:10.3390/jintelligence3010002
Muthén, L. K., & Muthén, B. O. (1998–2017). Mplus user’s guide (8th ed.). Los Angeles, CA: Muthén & Muthén.
Murray, A. L., & Johnson, W. (2013). The limitations of model fit in comparing the bi-factor versus higher-order models of human cognitive ability structure. Intelligence, 41(5), 407-422. doi:10.1016/j.intell.2013.06.004
Reise, S. P. (2012). The rediscovery of bifactor measurement models. Multivariate Behavioral Research, 47(5), 667-696. doi:10.1080/00273171.2012.715555
Reise, S. P., Bonifay, W. E., & Haviland, M. G. (2013). Scoring and modeling psychological measures in the presence of multidimensionality. Journal of Personality Assessment, 95(2), 129-140. doi:10.1080/00223891.2012.725437
Rodriguez, A., Reise, S. P., & Haviland, M. G. (2016a). Evaluating bifactor models: Calculating and interpreting statistical indices. Psychological Methods, 21(2), 137-150. doi:10.1037/met0000045
Rodriguez, A., Reise, S. P., & Haviland, M. G. (2016b). Applying bifactor statistical indices in the evaluation of psychological measures. Journal of Personality Assessment, 98(3), 223-237. doi:10.1080/00223891.2015.1089249
Sasso, G. M. (2001). The retreat from inquiry and knowledge in special education. Journal of Special Education, 34(4), 178-193. doi:10.1177/002246690103400401
Schmid, J., & Leiman, J. M. (1957). The development of hierarchical factor solutions. Psychometrika, 22, 53-61. doi:10.1007/BF02289209
Schmiedek, F., & Li, S.-C. (2004). Toward an alternative representation for disentangling age-associated differences in general and specific cognitive abilities. Psychology and Aging, 19(1), 40-56. doi:10.1037/0882-7974.19.1.40
Schneider, W. J., & Kaufman, A. S. (2016). Commentary on current practices and future directions for the assessment of child and adolescent intelligence in schools around the world. International Journal of School & Educational Psychology, 4(4), 283-288. doi:10.1080/21683603.2016.1206383
Sellbom, M., & Tellegen, A. (2019). Factor analysis in psychological assessment research: Common pitfalls and recommendations. Psychological Assessment, 31(12), 1428-1441. doi:10.1037/pas0000623
Sternberg, R. J. (2003). “My house is a very very very fine house” – but it is not the only house. In Nyborg, H. (Ed.), The scientific study of general intelligence: Tribute to Arthur R. Jensen (pp. 373-395). Oxford, UK: Elsevier Science, Ltd.
Strickland, T., Watkins, M. W., & Caterino, L. C. (2015). Structure of the Woodcock-Johnson III cognitive tests in a referral sample of elementary school students. Psychological Assessment, 27(2), 689-697. doi:10.1037/pas0000052
Warne, R. T., & Burningham, C. (2019). Spearman’s g found in 31 non-Western nations: Strong evidence that g is a universal phenomenon. Psychological Bulletin, 145(3), 237-272. doi:10.1037/bul0000184
Watkins, M. W. (2018). Exploratory factor analysis: A guide to best practice. Journal of Black Psychology, 44(3), 219-246. doi:10.1177/0095798418771807
Wolff, H.-G., & Preising, K. (2005). Exploring item and higher order factor structure with the Schmid–Leiman solution: Syntax codes for SPSS and SAS. Behavior Research Methods, 37, 48-58. doi:10.3758/BF03206397
Woodcock, R. W., McGrew, K. S., & Mather, N. (2001). Woodcock-Johnson III Tests of Cognitive Abilities. Itasca, IL: Riverside Publishing.
Yung, Y. F., Thissen, D., & McLeod, L. D. (1999). On the relationship between the higher-order factor model and the hierarchical factor model. Psychometrika, 64(2), 113-128. doi:10.1007/BF02294531