Abstract
The multifactor dimensionality reduction (MDR) method is a machine learning algorithm for detecting nonlinear interactions. MDR analysis combines factor selection by classification accuracy, model selection by prediction accuracy and cross-validation consistency of classification accuracy, and assessment of statistical significance by permutation testing. In this paper, we compare the performance of the standard MDR method with that of a modified method in which the best model is selected by the area under the receiver operating characteristic (ROC) curve and the cross-validation consistency of that area. We conducted simulation studies with 1,000 replicates per parameter setting. The proposed MDR shows a 1–8% increase in power to detect nonlinear interactions, while false discovery rates remain the same as those of the original MDR. As an illustration, we applied both methods to pharmacogenomic data on antiepileptic drugs.
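To make the modification concrete, the following is a minimal sketch, not the authors' implementation, of the two pieces the abstract contrasts: the MDR pooling step, which scores each multilocus genotype cell by its case:control ratio, and an AUC computed from those cell-level risk scores via the Mann-Whitney U statistic. All function names and the 3x3 genotype encoding are illustrative assumptions.

```python
import numpy as np

def mdr_risk_scores(geno_pair, case):
    """MDR pooling step (sketch): collapse a two-locus genotype
    combination into cells and score each cell by its case:control
    ratio. Cells with ratio above a threshold (commonly 1.0) would
    be labeled high-risk in standard MDR; here the raw ratio is kept
    as a continuous score so an ROC curve can be drawn."""
    # geno_pair: (n, 2) array of genotype codes 0/1/2; case: (n,) 0/1 labels
    cells = geno_pair[:, 0] * 3 + geno_pair[:, 1]  # index into a 3x3 grid
    scores = np.zeros(len(case), dtype=float)
    for c in np.unique(cells):
        mask = cells == c
        n_case = int(case[mask].sum())
        n_ctrl = int((case[mask] == 0).sum())
        scores[mask] = n_case / max(n_ctrl, 1)  # guard against empty controls
    return scores

def auc(scores, case):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (case, control) pairs in which the case receives
    the higher score, counting ties as one half."""
    pos = scores[case == 1]
    neg = scores[case == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy usage: two perfectly separating loci give AUC = 1.0
geno = np.array([[0, 0], [0, 0], [1, 1], [1, 1]])
case = np.array([1, 1, 0, 0])
print(auc(mdr_risk_scores(geno, case), case))  # → 1.0
```

In the modified method the abstract describes, this AUC (averaged over cross-validation folds, together with the cross-validation consistency of the best AUC model) would replace prediction accuracy as the model-selection criterion.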
