[1] V. Bolón-Canedo, N. Sánchez-Maroño, A. Alonso-Betanzos, A review of feature selection methods on synthetic
data, Knowledge and Information Systems, 34(3) (2013), 483-519.
[2] V. Bolón-Canedo, N. Sánchez-Maroño, A. Alonso-Betanzos, J. M. Benítez, F. Herrera, A review of microarray
datasets and applied feature selection methods, Information Sciences, 282 (2014), 111-135.
[3] G. Chandrashekar, F. Sahin, A survey on feature selection methods, Computers & Electrical Engineering, 40(1)
(2014), 16-28.
[4] N. Chen, Z. Xu, M. Xia, Correlation coefficients of hesitant fuzzy sets and their applications to clustering analysis,
Applied Mathematical Modelling, 37(4) (2013), 2197-2211.
[5] Y. Chen, Y. Xue, Y. Ma, F. Xu, Measures of uncertainty for neighborhood rough sets, Knowledge-Based Systems,
120 (2017), 226-235.
[6] Y. Chen, Z. Zhang, J. Zheng, Y. Ma, Y. Xue, Gene selection for tumor classification using neighborhood rough sets
and entropy measures, Journal of Biomedical Informatics, 67 (2017), 59-68.
[7] B. Choi, H. Kim, W. Cha, A comparative study on discretization algorithms for data mining, Communications
for Statistical Applications and Methods, 18(1) (2011), 89-102.
[8] A. Chouchoulas, Q. Shen, Rough set-aided keyword reduction for text categorization, Applied Artificial Intelligence,
15(9) (2001), 843-873.
[9] J. Dai, Q. Xu, Attribute selection based on information gain ratio in fuzzy rough set theory with application to
tumor classification, Applied Soft Computing, 13(1) (2013), 211-221.
[10] M. K. Ebrahimpour, M. Eftekhari, Ensemble of feature selection methods: A hesitant fuzzy sets approach, Applied
Soft Computing, 50 (2017), 300-312.
[11] M. K. Ebrahimpour, M. Zare, M. Eftekhari, G. Aghamolaei, Occam's razor in dimension reduction: Using reduced
row echelon form for finding linear independent features in high dimensional microarray datasets, Engineering
Applications of Artificial Intelligence, 62 (2017), 214-221.
[12] U. M. Fayyad, K. B. Irani, Multi-interval discretization of continuous-valued attributes for classification learning,
in: Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), Chambéry, France, 6(1) (1993),
1022-1029.
[13] I. Guyon, A. Elisseeff, An introduction to variable and feature selection, Journal of Machine Learning Research,
3(1) (2003), 1157-1182.
[14] M. A. Hall, Correlation-based feature selection for machine learning, Ph.D. Thesis, University of Waikato, Hamilton, (1999),
51-151.
[15] R. Jensen, Q. Shen, New approaches to fuzzy-rough feature selection, IEEE Transactions on Fuzzy Systems, 17(4)
(2009), 824-838.
[16] K. Kaneiwa, A rough set approach to multiple dataset analysis, Applied Soft Computing, 11(2) (2011), 2538-2547.
[17] I. Kononenko, Estimating attributes: analysis and extensions of RELIEF, European Conference on Machine
Learning, (1994), 171-182.
[18] M. Kudo, J. Sklansky, Comparison of algorithms that select features for pattern classifiers, Pattern Recognition,
33(1) (2000), 25-41.
[19] J. Liu, Q. Hu, D. Yu, A comparative study on rough set based class imbalance learning, Knowledge-Based Systems,
21(8) (2008), 753-763.
[20] J. Liu, Q. Hu, D. Yu, A weighted rough set based method developed for class imbalance learning, Information
Sciences, 178(4) (2008), 1235-1256.
[21] P. Maji, A rough hypercuboid approach for feature selection in approximation spaces, IEEE Transactions on
Knowledge and Data Engineering, 26(1) (2014), 16-29.
[22] P. Maji, P. Garai, On fuzzy-rough attribute selection: criteria of max-dependency, max-relevance, min-redundancy,
and max-significance, Applied Soft Computing, 13(9) (2013), 3968-3980.
[23] P. Maji, S. K. Pal, Fuzzy rough sets for information measures and selection of relevant genes from microarray
data, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 40(3) (2010), 741-752.
[24] P. Maji, S. Paul, Rough set based maximum relevance-maximum significance criterion and gene selection from
microarray data, International Journal of Approximate Reasoning, 52(3) (2011), 408-426.
[25] P. E. Meyer, Information-theoretic variable selection and network inference from microarray data, Ph.D. Thesis,
Université Libre de Bruxelles, (2008), 19-84.
[26] M. Moradkhani, A. Amiri, M. Javaheri, H. Safari, A hybrid algorithm for feature subset selection in high-dimensional
datasets using FICA and IWSSr algorithm, Applied Soft Computing, 25 (2015), 123-135.
[27] J. Moreno-Torres, J. Sáez, F. Herrera, Study on the impact of partition-induced dataset shift on k-fold
cross-validation, IEEE Transactions on Neural Networks and Learning Systems, 23(8) (2012), 1304-1312.
[28] Z. Pawlak, Rough sets, International Journal of Parallel Programming, 11(5) (1982), 341-356.
[29] H. Peng, F. Long, C. Ding, Feature selection based on mutual information criteria of max-dependency,
max-relevance, and min-redundancy, IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8) (2005),
1226-1238.
[30] D. Sheskin, Handbook of parametric and nonparametric statistical procedures, CRC Press, (2003), 225-239.
[31] A. Statnikov, I. Tsamardinos, Y. Dosbayev, C. F. Aliferis, GEMS: a system for automated cancer diagnosis and
biomarker discovery from microarray gene expression data, International Journal of Medical Informatics, 74(7)
(2005), 491-503.
[32] V. Torra, Hesitant fuzzy sets, International Journal of Intelligent Systems, 25(6) (2010), 529-539.
[33] E. Tuv, A. Borisov, G. Runger, K. Torkkola, Feature selection with ensembles, artificial variables, and redundancy
elimination, Journal of Machine Learning Research, 10(Jul) (2009), 1341-1366.
[34] H. Uğuz, A two-stage feature selection method for text categorization by using information gain, principal
component analysis and genetic algorithm, Knowledge-Based Systems, 24(7) (2011), 1024-1032.
[35] C. Wang, M. Shao, Q. He, Y. Qian, Y. Qi, Feature subset selection based on fuzzy neighborhood rough sets,
Knowledge-Based Systems, 111 (2016), 173-179.
[36] X. Zhang, C. Mei, D. Chen, J. Li, Feature selection in mixed data: A method using a novel fuzzy rough set-based
information entropy, Pattern Recognition, 56 (2016), 1-15.