Random Forest

Important Interview Questions On Random Forest Machine Learning Algorithm

In this video we will discuss the important interview questions on the Random Forest algorithm.

Important Interview Questions:
  1. Decision Tree
  2. Entropy, Information Gain, Gini Impurity
  3. Decision Tree Working For Categorical and Numerical Features
  4. What are the scenarios where Decision Tree works well
  5. Decision Tree: Low Bias And High Variance (Overfitting)
  6. Hyperparameter Tuning Techniques
  7. Library used for constructing a decision tree (see the sketch after this list)
  8. Impact of Outliers on a Decision Tree
  9. Impact of missing values on a Decision Tree
  10. Does a Decision Tree require Feature Scaling?
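The library typically used for constructing a decision tree is scikit-learn. A minimal sketch (the iris dataset and depth limit are illustrative choices) showing both the Gini and entropy criteria from points 2 and 7, with export_text as a quick way to inspect the learned splits:

    # Sketch: building decision trees with the gini and entropy criteria
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    for criterion in ("gini", "entropy"):
        # criterion controls the impurity measure used to choose each split
        tree = DecisionTreeClassifier(criterion=criterion, max_depth=3, random_state=42)
        tree.fit(X_train, y_train)
        print(criterion, "test accuracy:", tree.score(X_test, y_test))

    # export_text prints the learned splits, a simple way to visualize the tree
    print(export_text(tree, feature_names=load_iris().feature_names))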
Random Forest Classifier And Regressor
  1. Ensemble Techniques (Boosting And Bagging)
  2. Working of Random Forest Classifier
  3. Working of Random Forest Regressor
  4. Hyperparameter Tuning (Grid Search And Random Search; a sketch follows this list)
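A minimal sketch of both tuning approaches with scikit-learn; the parameter grid values here are illustrative assumptions, not recommended settings:

    # Sketch: tuning a Random Forest with Grid Search and Random Search
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

    X, y = load_breast_cancer(return_X_y=True)

    param_grid = {
        "n_estimators": [100, 300],
        "max_depth": [None, 5, 10],
        "max_features": ["sqrt", "log2"],
    }

    # Grid Search: exhaustively tries every combination in param_grid
    grid = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5)
    grid.fit(X, y)
    print("Grid Search best params:", grid.best_params_)

    # Random Search: samples only n_iter combinations, cheaper on large grids
    rand = RandomizedSearchCV(RandomForestClassifier(random_state=42), param_grid,
                              n_iter=5, cv=5, random_state=42)
    rand.fit(X, y)
    print("Random Search best params:", rand.best_params_)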

Theoretical Understanding (Decision Tree):

  1. Tutorial 37: Entropy In Decision Tree: https://www.youtube.com/watch?v=1IQOtJ4NI_0
  2. Tutorial 38: Information Gain: https://www.youtube.com/watch?v=FuTRucXB9rA
  3. Tutorial 39: Gini Impurity: https://www.youtube.com/watch?v=5aIFgrrTqOw
  4. Tutorial 40: Decision Tree For Numerical Features: https://www.youtube.com/watch?v=5O8HvA9pMew
  5. How To Visualize a Decision Tree: https://www.youtube.com/watch?v=ot75kOmpYjI

Theoretical Understanding (Random Forest):

  1. Ensemble Technique (Bagging): https://www.youtube.com/watch?v=KIOeZ5cFZ50
  2. Random Forest Classifier And Regressor: https://www.youtube.com/watch?v=nxFG5xdpDto
  3. Constructing Decision Trees and how they work in Random Forest: https://www.youtube.com/watch?v=WQ0iJSbnnZA&t=406s
 

Important properties of Random Forest Classifiers

  1. Decision Tree: Low Bias And High Variance (a single deep tree fits its training data closely and overfits)

  2. Ensemble Bagging (Random Forest Classifier): Low Bias And Low Variance (averaging many trees reduces the variance; a sketch illustrating this follows)
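A minimal sketch of this variance reduction (the dataset choice is an assumption): the spread of cross-validation scores across folds is a rough proxy for variance, and the forest's spread should typically be smaller than the single tree's:

    # Sketch: single decision tree vs. bagged Random Forest across CV folds
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    for name, model in [("Decision Tree", DecisionTreeClassifier(random_state=42)),
                        ("Random Forest", RandomForestClassifier(random_state=42))]:
        scores = cross_val_score(model, X, y, cv=10)
        # a lower std across folds is a rough proxy for lower variance
        print(f"{name}: mean={scores.mean():.3f}, std={scores.std():.3f}")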

 
1. What Are the Basic Assumptions?

There are no such assumptions; Random Forest makes no strong assumptions about the distribution of the data.

 
2. Advantages

Advantages of Random Forest

  1. Resistant to overfitting, since bagging averages many high-variance trees

  2. A favourite algorithm for Kaggle competitions

  3. Less hyperparameter tuning required

  4. Can handle both continuous and categorical variables

  5. No feature scaling (standardization or normalization) required, as it uses Decision Trees internally (a sketch covering points 4 and 5 follows this list)

  6. Suitable for a wide range of supervised ML problems
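A minimal sketch of advantages 4 and 5 (the column names and rows are made up for illustration). Note that scikit-learn still needs categories encoded as numbers; a simple ordinal code is enough for trees, and no scaler appears anywhere in the pipeline:

    # Sketch: Random Forest on mixed categorical/numerical features, no scaling
    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OrdinalEncoder

    df = pd.DataFrame({
        "age": [25, 47, 35, 52, 23, 41],               # numerical, left unscaled
        "city": ["NY", "SF", "NY", "LA", "SF", "LA"],  # categorical
        "bought": [0, 1, 0, 1, 0, 1],
    })

    pre = ColumnTransformer([
        ("cat", OrdinalEncoder(), ["city"]),  # trees only need an integer code
    ], remainder="passthrough")               # numeric columns pass through raw

    clf = Pipeline([("prep", pre), ("rf", RandomForestClassifier(random_state=42))])
    clf.fit(df[["age", "city"]], df["bought"])
    print(clf.predict(df[["age", "city"]]))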

 
3. Disadvantages

Disadvantages of Random Forest

  1. Biased towards features with many categories

  2. Biased towards the more frequent classes in multiclass classification problems
 
4. Whether Feature Scaling is required?

No. Tree splits only compare feature values against thresholds, so monotonically rescaling a feature does not change the learned partitions (a sketch follows).
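A minimal sketch suggesting why: standardization is a monotonic transform, so with the same random_state the forest should learn equivalent splits on raw and scaled data (the exact-match check below may occasionally differ due to floating-point tie-breaking):

    # Sketch: a Random Forest gives matching predictions with and without scaling
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    X_scaled = StandardScaler().fit_transform(X)

    raw = RandomForestClassifier(random_state=42).fit(X, y)
    scaled = RandomForestClassifier(random_state=42).fit(X_scaled, y)

    # expected: True, since scaling preserves the ordering of feature values
    print(np.array_equal(raw.predict(X), scaled.predict(X_scaled)))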

5. Impact of outliers?

Robust to outliers, since splits depend on the ordering of feature values rather than on their magnitude.

 
Types of Problems it can solve (Supervised)
  1. Classification
  2. Regression
 
 
 
Performance Metrics (a sketch computing these follows the lists)

Classification
  1. Confusion Matrix
  2. Precision, Recall, F1 Score
Regression
  1. R2, Adjusted R2
  2. MSE, RMSE, MAE
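A minimal sketch (the toy labels and values are made up) computing these metrics with scikit-learn; Adjusted R2 has no built-in helper, so it is computed from its formula:

    # Sketch: classification and regression metrics for a Random Forest's output
    import numpy as np
    from sklearn.metrics import (confusion_matrix, precision_score, recall_score,
                                 f1_score, r2_score, mean_squared_error,
                                 mean_absolute_error)

    # Classification metrics
    y_true = [0, 1, 1, 0, 1, 0]
    y_pred = [0, 1, 0, 0, 1, 1]
    print(confusion_matrix(y_true, y_pred))
    print(precision_score(y_true, y_pred), recall_score(y_true, y_pred),
          f1_score(y_true, y_pred))

    # Regression metrics
    y_true_r = [3.0, 5.0, 2.5, 7.0]
    y_pred_r = [2.8, 5.3, 2.9, 6.5]
    r2 = r2_score(y_true_r, y_pred_r)
    n, p = len(y_true_r), 1                        # p = number of predictors
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)  # Adjusted R2 formula
    mse = mean_squared_error(y_true_r, y_pred_r)
    print(r2, adj_r2, mse, np.sqrt(mse), mean_absolute_error(y_true_r, y_pred_r))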

Download the GitHub material from here.

If you are looking for affordable tech courses such as data science, machine learning, deep learning, cloud, and many more, you can go ahead with the iNeuron OneNeuron platform, where you will be able to get 200+ tech courses at an affordable price with lifetime access.
