Q. How does Random Forest reduce the risk of overfitting compared to a single Decision Tree?
A. By using a single tree with more depth
B. By averaging the predictions of multiple trees
C. By using only the most important features
D. By increasing the size of the training dataset
Solution
Random Forest reduces overfitting by averaging the predictions of multiple trees: each tree may overfit noise in its own way, but those individual errors largely cancel out in the aggregate, lowering the variance of the final prediction.
Correct Answer: B — By averaging the predictions of multiple trees
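The averaging idea can be sketched in plain Python. The tree predictions below are hypothetical, hard-coded for illustration: each tree is wrong on a different sample, but the majority vote is right on all of them.

```python
from collections import Counter

# Hypothetical predictions from three trees on four samples.
tree_preds = [
    [1, 0, 1, 1],   # tree 1: wrong on sample 1
    [1, 1, 1, 0],   # tree 2: wrong on sample 3
    [0, 1, 1, 1],   # tree 3: wrong on sample 0
]
true_labels = [1, 1, 1, 1]

def majority_vote(preds_per_tree):
    # For each sample, take the most common class across the trees.
    return [Counter(col).most_common(1)[0][0] for col in zip(*preds_per_tree)]

forest_pred = majority_vote(tree_preds)
# Each tree errs once, but the vote matches true_labels on every sample.
```

This is exactly why the ensemble is more robust than any single tree: an error must be shared by a majority of trees before it reaches the final prediction.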
Q. In Random Forests, what does the term 'feature randomness' refer to?
A. Randomly selecting features for each tree
B. Randomly selecting data points for training
C. Randomly assigning labels to data
D. Randomly adjusting tree depth
Solution
Feature randomness refers to the practice of randomly selecting a subset of features for each tree in the forest, which helps to create diverse models.
Correct Answer: A — Randomly selecting features for each tree
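A minimal sketch of feature subsampling, using hypothetical feature names — each tree draws its own random subset of features to split on:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical feature names for illustration.
all_features = ["age", "income", "height", "city", "score"]

def sample_features(features, k):
    # Each tree in the forest considers only a random subset of k features,
    # so different trees split on different attributes and become decorrelated.
    return random.sample(features, k)

subsets = [sample_features(all_features, 3) for _ in range(4)]
```

In scikit-learn this behavior is controlled by the `max_features` parameter of `RandomForestClassifier`.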
Q. What is a key characteristic of ensemble methods like Random Forests?
A. They use a single model for predictions
B. They combine multiple models to improve performance
C. They require less computational power
D. They are only applicable to regression tasks
Solution
Ensemble methods like Random Forests combine multiple models to improve overall performance and robustness.
Correct Answer: B — They combine multiple models to improve performance
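For regression, "combining models" usually means averaging their numeric outputs. A toy sketch with hypothetical model predictions:

```python
# Hypothetical outputs of three regression models on three samples.
model_outputs = [
    [2.0, 4.0, 6.0],  # model 1
    [2.2, 3.8, 6.4],  # model 2
    [1.8, 4.2, 5.6],  # model 3
]

def ensemble_average(outputs):
    # Combine the models by averaging their predictions per sample.
    return [sum(col) / len(col) for col in zip(*outputs)]

combined = ensemble_average(model_outputs)  # averages to [2.0, 4.0, 6.0]
```

Individual models deviate in different directions, and the average lands between them — the same variance-reduction effect that majority voting gives for classification.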
Q. What is the main disadvantage of Decision Trees?
A. They are computationally expensive
B. They can easily overfit the training data
C. They cannot handle missing values
D. They require a large amount of data
Solution
Decision Trees can easily overfit the training data, especially if they are deep and complex.
Correct Answer: B — They can easily overfit the training data
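The overfitting can be seen directly with scikit-learn on a small synthetic dataset (the dataset parameters here are illustrative; `flip_y=0.2` mislabels roughly 20% of the points):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic noisy dataset: ~20% of labels are randomly flipped.
X, y = make_classification(n_samples=200, n_features=10, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An unrestricted tree keeps splitting until every training point is
# classified perfectly, memorizing the label noise.
deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
train_acc = deep.score(X_tr, y_tr)  # 1.0: the tree memorizes the training set
test_acc = deep.score(X_te, y_te)   # noticeably lower on unseen data
```

The gap between training and test accuracy is the overfitting the question refers to.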
Q. What is the purpose of pruning in Decision Trees?
A. To increase the depth of the tree
B. To remove unnecessary branches
C. To add more features
D. To improve computational efficiency
Solution
Pruning is used to remove unnecessary branches from a Decision Tree to reduce complexity and prevent overfitting.
Correct Answer: B — To remove unnecessary branches
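One common pruning mechanism in scikit-learn is cost-complexity pruning via `ccp_alpha`. A sketch on synthetic data (the `ccp_alpha=0.02` value is an arbitrary illustrative choice):

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, flip_y=0.2, random_state=0)

# Fully grown tree vs. a cost-complexity-pruned tree: ccp_alpha > 0 removes
# branches whose impurity reduction does not justify the added complexity.
full = DecisionTreeClassifier(random_state=0).fit(X, y)
pruned = DecisionTreeClassifier(ccp_alpha=0.02, random_state=0).fit(X, y)

full_leaves = full.get_n_leaves()
pruned_leaves = pruned.get_n_leaves()  # fewer leaves than the full tree
```

Fewer leaves means a simpler decision boundary, which is exactly how pruning trades a little training accuracy for better generalization.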
Q. What is the role of the 'max_depth' parameter in Decision Trees?
A. To control the number of features used
B. To limit the number of samples at each leaf
C. To prevent the tree from growing too deep and overfitting
D. To increase the computational efficiency
Solution
The 'max_depth' parameter limits how deep the tree can grow, helping to prevent overfitting by controlling model complexity.
Correct Answer: C — To prevent the tree from growing too deep and overfitting
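The effect of `max_depth` is easy to check in scikit-learn (synthetic data and the depth limit of 3 are illustrative choices):

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, flip_y=0.2, random_state=0)

shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
deep = DecisionTreeClassifier(random_state=0).fit(X, y)  # no depth limit

shallow_depth = shallow.get_depth()  # capped at 3 by max_depth
deep_depth = deep.get_depth()        # grows much deeper to fit the noise
```

Capping the depth caps the number of decision rules the tree can stack, which directly limits model complexity.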
Q. Which evaluation metric is commonly used for classification problems with Decision Trees?
A. Mean Squared Error
B. Accuracy
C. R-squared
D. Log Loss
Solution
Accuracy is a common evaluation metric for classification problems, measuring the proportion of correct predictions.
Correct Answer: B — Accuracy
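Accuracy is simple enough to compute by hand (the labels below are made up for illustration):

```python
def accuracy(y_true, y_pred):
    # Proportion of predictions that match the true labels.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

acc = accuracy([1, 0, 1, 1, 0], [1, 0, 0, 1, 0])  # 4 of 5 correct -> 0.8
```

scikit-learn provides the same computation as `sklearn.metrics.accuracy_score`.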
Q. Which evaluation metric is commonly used for classification tasks with Decision Trees?
A. Mean Absolute Error
B. Accuracy
C. R-squared
D. Silhouette Score
Solution
Accuracy is a common evaluation metric for classification tasks, measuring the proportion of correct predictions.
Correct Answer: B — Accuracy