Feature Selection
Feature Selection is the process of selecting a subset of features that contribute most to the output variable. If a dataset has n features and only a few of them are relevant to the output variable, then instead of training the model on all n features we can train it on just that relevant subset. This improves the model's performance and helps overcome the curse of dimensionality.
Filter Methods
Filter methods check the relevance of each feature to the output variable using statistical tests, independently of any model; a minimal sketch follows the list below.
1) Chi-squared test
2) ANOVA test
3) Correlation coefficient
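As an illustration, here is a minimal sketch of a filter method using scikit-learn's SelectKBest with the chi-squared test; the Iris dataset and k=2 are assumptions made for the example, and score_func could be swapped for f_classif to use the ANOVA F-test instead.

```python
# Filter method sketch: score each feature against the target with a
# chi-squared test and keep the k highest-scoring features.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_iris(return_X_y=True)             # 4 non-negative features
selector = SelectKBest(score_func=chi2, k=2)  # keep the 2 best features
X_new = selector.fit_transform(X, y)

print("chi2 scores per feature:", selector.scores_)
print("selected feature indices:", selector.get_support(indices=True))
print("reduced shape:", X_new.shape)
```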
Wrapper Methods
Wrapper methods train a model on different subsets of features and use its performance to decide which features to keep; a minimal sketch follows the list below.
1) Forward selection
2) Backward elimination
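Below is a minimal sketch of forward selection using scikit-learn's SequentialFeatureSelector; the logistic regression estimator, cv=5, and n_features_to_select=2 are illustrative assumptions. Setting direction="backward" would give backward elimination instead.

```python
# Wrapper method sketch: greedily add one feature at a time, keeping the
# feature that most improves cross-validated accuracy of the estimator.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
estimator = LogisticRegression(max_iter=1000)

sfs = SequentialFeatureSelector(
    estimator, n_features_to_select=2, direction="forward", cv=5
)
sfs.fit(X, y)

print("selected feature indices:", sfs.get_support(indices=True))
```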
Embedded Methods
Embedded methods learn the feature selection as part of building the model itself; a minimal sketch follows the list below.
1) Decision Tree
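As a sketch, a decision tree computes feature importances while it is trained, and SelectFromModel can then keep only the features above an importance threshold; the dataset, max_depth, and the default (mean importance) threshold are assumptions made for the example.

```python
# Embedded method sketch: the tree learns feature importances during
# training; SelectFromModel keeps features above the mean importance.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectFromModel
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

print("feature importances:", tree.feature_importances_)

selector = SelectFromModel(tree, prefit=True)
X_new = selector.transform(X)
print("selected feature indices:", selector.get_support(indices=True))
print("reduced shape:", X_new.shape)
```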