Basics:
In the Classification part, we worked with datasets that had only two independent variables.
Two dimensions make it much easier to visualize how Machine Learning models work, since we can plot the prediction regions and the prediction boundary for each model.
When a dataset has more than two independent variables, we can often reduce it to two by applying an appropriate Dimensionality Reduction technique.
There are two types of Dimensionality Reduction techniques:
- Feature Selection (covered in Part 2 - Regression):
  - Backward Elimination
  - Forward Selection
  - Bidirectional Elimination
  - Score Comparison
- Feature Extraction:
  - Principal Component Analysis (PCA) - works on linearly separable data
  - Linear Discriminant Analysis (LDA) - works on linearly separable data
  - Kernel PCA - works on non-linearly separable data
  - Quadratic Discriminant Analysis (QDA)
We'll cover Feature Extraction techniques here.
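As a quick illustration of the Feature Extraction techniques listed above, here is a minimal sketch that reduces a many-feature dataset down to two components with PCA, LDA, and Kernel PCA. It assumes scikit-learn is installed; the wine dataset and the RBF kernel choice are just illustrative, not part of the original post.

```python
# Sketch: reducing a 13-feature dataset to 2 extracted features.
# Assumes scikit-learn is available; dataset and kernel are illustrative.
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA, KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_wine(return_X_y=True)        # 178 samples, 13 independent variables
X = StandardScaler().fit_transform(X)    # feature scaling before extraction

# PCA: unsupervised, finds directions of maximum variance (linear).
X_pca = PCA(n_components=2).fit_transform(X)

# LDA: supervised, finds directions that best separate the classes (linear).
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# Kernel PCA: PCA in a kernel-induced feature space (handles non-linear data).
X_kpca = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)

print(X.shape, X_pca.shape, X_lda.shape, X_kpca.shape)
```

With two extracted features, each model's prediction regions and boundary can then be plotted exactly as in the Classification part.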
Hope this helps!!!
Arun Manglick