Q. How does Random Forest improve upon a single Decision Tree?
A. By using a single tree with more depth.
B. By averaging the predictions of multiple trees.
C. By using only the most important features.
D. By increasing the size of the training dataset.
Solution
Random Forest improves accuracy by averaging the predictions of multiple trees, which reduces overfitting.
Correct Answer: B — By averaging the predictions of multiple trees.
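A minimal sketch of the effect, assuming scikit-learn is installed (the data is synthetic, so exact scores will vary):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("single tree   test accuracy:", tree.score(X_test, y_test))
print("random forest test accuracy:", forest.score(X_test, y_test))
```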
Q. In Random Forests, what is the purpose of bootstrapping?
A. To reduce the number of features
B. To create multiple subsets of the training data
C. To increase the depth of trees
D. To improve interpretability
Solution
Bootstrapping involves creating multiple subsets of the training data by sampling with replacement, which helps in building diverse trees in Random Forests.
Correct Answer: B — To create multiple subsets of the training data
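The sampling step itself is easy to sketch with NumPy on a toy array of example indices:

```python
import numpy as np

rng = np.random.default_rng(42)
indices = np.arange(10)  # pretend these index ten training examples

# Sampling *with replacement*: each draw can repeat examples, and about
# 1/e ≈ 37% of the originals are left out of a large sample (the
# "out-of-bag" examples). Each tree in the forest sees a different sample.
for i in range(3):
    sample = rng.choice(indices, size=indices.size, replace=True)
    print(f"bootstrap sample {i}:", np.sort(sample))
```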
Q. In which scenario would you prefer using a Random Forest over a Decision Tree?
A. When interpretability is the main concern.
B. When you have a small dataset.
C. When you need high accuracy and robustness.
D. When computational resources are limited.
Solution
Random Forest is preferred for high accuracy and robustness, especially in larger datasets where overfitting is a concern.
Correct Answer: C — When you need high accuracy and robustness.
Q. What does the Gini impurity measure in Decision Trees?
A. The accuracy of the model.
B. The purity of a node in the tree.
C. The depth of the tree.
D. The number of features used.
Solution
Gini impurity measures the impurity or disorder of a node, helping to determine the best splits in the tree.
Correct Answer: B — The purity of a node in the tree.
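The underlying formula is Gini = 1 - sum(p_k^2), where p_k is the fraction of samples of class k in the node; a hand-rolled sketch:

```python
import numpy as np

def gini_impurity(labels):
    """Gini impurity of a node: 1 - sum(p_k**2) over class proportions p_k."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

print(gini_impurity([0, 0, 0, 0]))  # 0.0 -> perfectly pure node
print(gini_impurity([0, 0, 1, 1]))  # 0.5 -> maximally impure for two classes
```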
Q. What does the term 'ensemble learning' refer to in the context of Random Forests?
A. Using a single model for predictions
B. Combining multiple models to improve accuracy
C. Training models on different datasets
D. Using only linear models
Solution
Ensemble learning refers to the technique of combining multiple models, such as decision trees in Random Forests, to improve overall prediction accuracy.
Correct Answer: B — Combining multiple models to improve accuracy
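A hand-rolled sketch of the idea, bagging five decision trees and majority-voting their predictions (essentially what scikit-learn's RandomForestClassifier automates):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
rng = np.random.default_rng(0)

# Train five trees on different bootstrap samples of the data.
predictions = []
for seed in range(5):
    idx = rng.choice(len(X), size=len(X), replace=True)
    tree = DecisionTreeClassifier(random_state=seed).fit(X[idx], y[idx])
    predictions.append(tree.predict(X))

votes = np.stack(predictions)                        # shape (5, n_samples)
majority = (votes.mean(axis=0) >= 0.5).astype(int)   # per-sample majority vote
print("ensemble accuracy on training data:", (majority == y).mean())
```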
Q. What is a common application of decision trees in real-world scenarios?
A. Image recognition
B. Natural language processing
C. Credit scoring
D. Time series forecasting
Solution
Decision trees are commonly used in credit scoring to assess the risk of lending to individuals based on various features.
Correct Answer: C — Credit scoring
Q. What is the main criterion used to split nodes in a decision tree?
A. Mean Squared Error
B. Entropy or Gini Impurity
C. Cross-Entropy Loss
D. R-squared Value
Solution
Decision trees commonly use criteria like Entropy or Gini Impurity to determine the best feature for splitting nodes.
Correct Answer: B — Entropy or Gini Impurity
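In scikit-learn the criterion is a constructor argument; a quick sketch on the built-in iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# The split criterion is chosen at construction time; "gini" is the default.
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, random_state=0).fit(X, y)
    print(criterion, "-> tree depth:", tree.get_depth())
```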
Q. What is the primary purpose of a decision tree in machine learning?
A. To visualize data distributions
B. To classify or predict outcomes based on input features
C. To reduce dimensionality of data
D. To cluster similar data points
Solution
A decision tree is primarily used for classification or regression tasks, where it predicts outcomes based on input features.
Correct Answer: B — To classify or predict outcomes based on input features
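A minimal fit-and-predict sketch, assuming scikit-learn and its built-in iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit on labelled examples, then predict class labels for unseen inputs.
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("predicted classes:", clf.predict(X_test[:5]))
print("test accuracy:    ", clf.score(X_test, y_test))
```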
Q. What is the role of feature importance in Random Forest?
A. To determine the number of trees to use.
B. To identify which features contribute most to the model's predictions.
C. To select the best hyperparameters.
D. To visualize the decision boundaries.
Solution
Feature importance helps identify which features are most influential in making predictions, aiding in feature selection and model interpretation.
Correct Answer: B — To identify which features contribute most to the model's predictions.
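A short sketch using scikit-learn's feature_importances_ attribute on the iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(data.data, data.target)

# feature_importances_ sums to 1.0; larger values mean the feature
# contributed more to the forest's splits.
for name, score in sorted(zip(data.feature_names, forest.feature_importances_),
                          key=lambda pair: -pair[1]):
    print(f"{name}: {score:.3f}")
```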
Q. Which evaluation metric is commonly used for assessing the performance of a Decision Tree classifier?
A. Mean absolute error
B. F1 score
C. R-squared
D. Root mean squared error
Solution
The F1 score, the harmonic mean of precision and recall, is commonly used for classification tasks because it balances the two metrics.
Correct Answer: B — F1 score
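A toy sketch with scikit-learn's metrics, using made-up labels to show how precision and recall combine into F1:

```python
from sklearn.metrics import f1_score, precision_score, recall_score

y_true = [0, 1, 1, 1, 0, 1]
y_pred = [0, 1, 0, 1, 0, 0]

# F1 is the harmonic mean of precision and recall.
print("precision:", precision_score(y_true, y_pred))  # 1.0
print("recall:   ", recall_score(y_true, y_pred))     # 0.5
print("F1:       ", f1_score(y_true, y_pred))         # 2*1*0.5/(1+0.5) ≈ 0.667
```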
Q. Which evaluation metric is commonly used to assess the performance of a classification model like a decision tree?
A. Mean Absolute Error
B. Accuracy
C. Silhouette Score
D. Adjusted R-squared
Solution
Accuracy is a common metric used to evaluate the performance of classification models, including decision trees.
Correct Answer: B — Accuracy
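A one-line sketch with scikit-learn (the labels are made up for illustration):

```python
from sklearn.metrics import accuracy_score

y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]

# Accuracy = correct predictions / total predictions.
print(accuracy_score(y_true, y_pred))  # 4 of 5 correct -> 0.8
```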
Q. Which of the following is a key advantage of using Random Forests over a single decision tree?
A. Faster training time
B. Higher interpretability
C. Reduced risk of overfitting
D. Simpler model structure
Solution
Random Forests reduce the risk of overfitting by averaging the predictions of multiple decision trees, leading to better generalization.
Correct Answer: C — Reduced risk of overfitting
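One way to see this is to compare each model's train-test accuracy gap; a sketch on synthetic data, where the fully grown single tree usually shows the wider gap:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# A fully grown single tree tends to memorise the training set, so its
# train-test gap is typically wider than the forest's.
for model in (DecisionTreeClassifier(random_state=1),
              RandomForestClassifier(n_estimators=100, random_state=1)):
    model.fit(X_train, y_train)
    gap = model.score(X_train, y_train) - model.score(X_test, y_test)
    print(type(model).__name__, "train-test accuracy gap:", round(gap, 3))
```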
Q. Which of the following is NOT a common criterion for splitting nodes in Decision Trees?
A. Entropy
B. Gini impurity
C. Mean squared error
D. Information gain
Solution
Mean squared error is typically used in regression tasks, while the other options are used for classification tasks.
Correct Answer: C — Mean squared error
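A quick regression-tree sketch; note this assumes a recent scikit-learn, where the criterion is named "squared_error" (older releases called it "mse"):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Regression trees split to minimise squared error; entropy, Gini, and
# information gain apply only to classification trees.
reg = DecisionTreeRegressor(criterion="squared_error", max_depth=4).fit(X, y)
print(reg.predict([[2.0], [7.5]]))
```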
Q. Which of the following statements about Random Forests is true?
A. They can be used for both classification and regression tasks.
B. They are less interpretable than single decision trees.
C. They require more computational resources than a single decision tree.
D. All of the above.
Solution
Random Forests can be used for both classification and regression, are less interpretable than single trees, and require more computational resources.
Correct Answer: D — All of the above.
Q. Which of the following techniques is used to prevent overfitting in decision trees?
A. Increasing the depth of the tree
B. Pruning the tree
C. Using more features
D. Decreasing the sample size
Solution
Pruning the tree involves removing sections of the tree that provide little power in predicting target variables, thus preventing overfitting.
Correct Answer: B — Pruning the tree
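A sketch of cost-complexity pruning via scikit-learn's ccp_alpha parameter (the alpha value here is arbitrary, chosen only for illustration):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# ccp_alpha > 0 turns on cost-complexity (post-)pruning; larger values
# remove more of the tree. max_depth is a simpler pre-pruning control.
unpruned = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_train, y_train)

for name, model in (("unpruned", unpruned), ("pruned", pruned)):
    print(name, "leaves:", model.get_n_leaves(),
          "test accuracy:", round(model.score(X_test, y_test), 3))
```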