1. Berk RA (2016) Classification and regression trees (CART). In: Regression Trees. New York, NY: Springer, pp. 129–186.
2. Bologna G, Hayashi Y (2018) A comparison study on rule extraction from neural network ensembles, boosted shallow trees, and SVMs. Applied Computational Intelligence and Soft Computing 2018: 1–20.
3. Chen Y, Li Y, Cheng X, et al. (2006) Survey and taxonomy of feature selection algorithms in intrusion detection system. Beijing: Springer, pp. 153–167.
4. Clarke R, Tyson JJ, Dixon JM (2015) Endocrine resistance in breast cancer—An overview and update. Molecular and Cellular Endocrinology 418(3): 220–234.
5. Cutler A, Cutler DR, Stevens JR (2012) Random forests. In: Ensemble Machine Learning. New York, NY: Springer Science+Business Media, pp. 157–176.
6. Han J, Moraga C (1995) The influence of the sigmoid function parameters on the speed of backpropagation learning. In: Proceedings of the International Workshop on Artificial Neural Networks: From Natural to Artificial Neural Computation. London: Springer, pp. 195–201.
7. Huang M-W, Chen C-W, Lin W-C, et al. (2017) SVM and SVM ensembles in breast cancer prediction. PLoS One 12(1): e0161501.
9. Li J, Cheng K, Wang S, et al. (2018) Feature selection: A data perspective. ACM Computing Surveys 50(6): 94.
10. Padilla JMC, Murillo JAO, Blanco MDRM, et al. (2016) Breast cancer tumor classification using LASSO method selection approach. Proc ISSSD 3: 260–270.
11. Rong M, Gong D, Gao XZ (2019) Feature selection and its use in big data: Challenges, methods and trends. IEEE Access 7: 1–18.
12. Siu AL (2016) Screening for breast cancer: U.S. Preventive Services Task Force recommendation statement. Annals of Internal Medicine 164(4): 279–296.
13. Ul Haq A, Zhang D, Peng H, et al. (2019) Combining multiple feature-ranking techniques and clustering of variables for feature selection. IEEE Access 7: 151482–151492.
14. Wuniri Q, Huangfu W, Liu Y, et al. (2019) Generic-driven wrapper embedded with feature-type-aware hybrid Bayesian classifier. IEEE Access 7: 119931–119942.