Q. How does a Random Forest improve upon a single Decision Tree?
A. By using a single model for predictions
B. By averaging the predictions of multiple trees
C. By increasing the depth of each tree
D. By using only the most important features

Solution: A Random Forest improves accuracy by averaging the predictions of multiple Decision Trees, which reduces variance.
Correct Answer: B — By averaging the predictions of multiple trees
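The averaging idea can be sketched with scikit-learn (an assumed library choice — the quiz names no implementation), comparing one tree against a forest on the same synthetic task:

```python
# Sketch (assuming scikit-learn): one tree vs. a forest that averages
# the predictions of 100 trees on the same synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# The forest averages per-tree class probabilities, which lowers variance.
print("single tree accuracy  :", tree.score(X_te, y_te))
print("random forest accuracy:", forest.score(X_te, y_te))
```

On most runs of a setup like this the forest's test accuracy matches or beats the single tree's, illustrating the variance reduction the solution describes.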
Q. In a Random Forest, what is the purpose of bootstrapping?
A. To reduce overfitting
B. To increase the number of features
C. To create multiple subsets of data for training
D. To improve model interpretability

Solution: Bootstrapping creates multiple subsets of the training data by sampling with replacement, which helps build diverse trees in the Random Forest.
Correct Answer: C — To create multiple subsets of data for training
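A minimal stdlib-only sketch of the sampling step: each bootstrap sample has the same size as the data but is drawn with replacement, so some rows repeat and others are left out entirely (the "out-of-bag" rows):

```python
# Bootstrapping sketch (stdlib only): draw a same-size sample with
# replacement; in a Random Forest, each such sample trains one tree.
import random

random.seed(0)
rows = list(range(10))                           # stand-in for 10 training rows
bootstrap = [random.choice(rows) for _ in rows]  # sample with replacement

out_of_bag = sorted(set(rows) - set(bootstrap))  # rows this tree never sees
print("bootstrap sample:", sorted(bootstrap))
print("out-of-bag rows :", out_of_bag)
```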
Q. In Random Forests, how are the trees typically constructed?
A. Using all features for each split.
B. Using a random subset of features for each split.
C. Using only the most important feature.
D. Using a fixed number of features for all trees.

Solution: Random Forests use a random subset of features for each split, which helps reduce correlation among the trees.
Correct Answer: B — Using a random subset of features for each split.
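In scikit-learn (an assumed implementation), this per-split feature subsampling is the `max_features` parameter; `"sqrt"` tries roughly the square root of the feature count at each split:

```python
# Per-split feature subsampling via max_features (scikit-learn assumed):
# with 16 features, "sqrt" considers about 4 candidate features per split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=16, random_state=0)
forest = RandomForestClassifier(n_estimators=25, max_features="sqrt",
                                random_state=0).fit(X, y)
print("features tried per split:", forest.max_features)
```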
Q. What does the Gini impurity measure in a Decision Tree?
A. The accuracy of the model.
B. The likelihood of misclassifying a randomly chosen element.
C. The depth of the tree.
D. The number of features used.

Solution: Gini impurity measures the likelihood of misclassifying a randomly chosen element from the dataset, helping to determine the best splits.
Correct Answer: B — The likelihood of misclassifying a randomly chosen element.
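The measure follows directly from its definition, Gini = 1 − Σ pₖ², where pₖ is the fraction of elements in class k — a small sketch:

```python
# Gini impurity from its definition, 1 - sum(p_k^2): the chance of
# mislabeling a random element drawn from the node's class distribution.
from collections import Counter

def gini(labels):
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

print(gini(["a", "a", "a", "a"]))  # pure node -> 0.0
print(gini(["a", "a", "b", "b"]))  # 50/50 node -> 0.5
```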
Q. What is a key characteristic of Random Forests compared to a single Decision Tree?
A. They are less prone to overfitting.
B. They require more computational resources.
C. They can only handle binary classification.
D. They are always more interpretable.

Solution: Random Forests are less prone to overfitting because they aggregate the predictions of multiple trees.
Correct Answer: A — They are less prone to overfitting.
Q. What is the Gini impurity used for in Decision Trees?
A. To measure the accuracy of the model
B. To determine the best split at each node
C. To evaluate the performance of Random Forests
D. To select features for the model

Solution: Gini impurity is used to determine the best split at each node in a Decision Tree, helping to create more homogeneous child nodes.
Correct Answer: B — To determine the best split at each node
Q. What is the main purpose of feature importance in Random Forests?
A. To reduce the number of trees in the forest.
B. To identify which features contribute most to the predictions.
C. To increase the depth of the trees.
D. To ensure all features are used equally.

Solution: Feature importance helps identify which features contribute most to the predictions made by the Random Forest model.
Correct Answer: B — To identify which features contribute most to the predictions.
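A sketch of reading impurity-based importances from a fitted forest (scikit-learn assumed); the scores sum to 1 across features, and the informative features should receive most of the weight:

```python
# Feature importances from a fitted Random Forest (scikit-learn assumed):
# only 2 of the 6 synthetic features are informative by construction.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=6, n_informative=2,
                           random_state=0)
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

for i, importance in enumerate(forest.feature_importances_):
    print(f"feature {i}: {importance:.3f}")
```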
Q. What is the main purpose of pruning a Decision Tree?
A. To increase the depth of the tree
B. To reduce the size of the tree and prevent overfitting
C. To improve the training speed
D. To enhance feature selection

Solution: Pruning reduces the size of the tree, which helps to prevent overfitting and improves generalization to unseen data.
Correct Answer: B — To reduce the size of the tree and prevent overfitting
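One concrete pruning mechanism is scikit-learn's cost-complexity pruning (an assumed choice; other implementations prune differently): a larger `ccp_alpha` removes more of the tree, shrinking its node count:

```python
# Cost-complexity pruning sketch (scikit-learn assumed): raising
# ccp_alpha trades training fit for a smaller, simpler tree.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
full = DecisionTreeClassifier(random_state=0).fit(X, y)
pruned = DecisionTreeClassifier(ccp_alpha=0.02, random_state=0).fit(X, y)

print("unpruned node count:", full.tree_.node_count)
print("pruned node count  :", pruned.tree_.node_count)
```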
Q. What is the maximum depth of a Decision Tree?
A. It is always fixed.
B. It can be controlled by hyperparameters.
C. It is determined by the number of features.
D. It is irrelevant to the model's performance.

Solution: The maximum depth of a Decision Tree can be controlled by hyperparameters, which helps manage overfitting.
Correct Answer: B — It can be controlled by hyperparameters.
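In scikit-learn (assumed here), that hyperparameter is `max_depth`; left unset, the tree grows until its leaves are pure, while a cap keeps the fitted tree shallow:

```python
# max_depth as a hyperparameter (scikit-learn assumed): the fitted
# tree never grows deeper than the cap.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("fitted depth:", shallow.get_depth())  # at most 3
```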
Q. What type of learning does a Decision Tree primarily use?
A. Unsupervised Learning
B. Reinforcement Learning
C. Supervised Learning
D. Semi-supervised Learning

Solution: Decision Trees primarily use Supervised Learning, where the model is trained on labeled data.
Correct Answer: C — Supervised Learning
Q. Which algorithm is primarily used for regression tasks in Decision Trees?
A. CART (Classification and Regression Trees)
B. ID3
C. C4.5
D. K-Means

Solution: CART (Classification and Regression Trees) handles both classification and regression tasks; ID3 and C4.5 are classification-only algorithms, and K-Means is a clustering method, not a tree algorithm.
Correct Answer: A — CART (Classification and Regression Trees)
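scikit-learn's trees are CART-based (an assumed implementation), so regression uses the same algorithm through `DecisionTreeRegressor`, splitting on squared error instead of impurity:

```python
# CART for regression (scikit-learn's DecisionTreeRegressor, assumed):
# a depth-2 tree can isolate each of these 4 points in its own leaf.
from sklearn.tree import DecisionTreeRegressor

X = [[1.0], [2.0], [3.0], [4.0]]
y = [1.5, 2.5, 3.5, 4.5]
reg = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
print("prediction at x=2.0:", reg.predict([[2.0]]))
```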
Q. Which algorithm is typically faster for making predictions, Decision Trees or Random Forests?
A. Decision Trees
B. Random Forests
C. Both are equally fast
D. It depends on the dataset size

Solution: Decision Trees are generally faster for making predictions because they involve a single tree, while Random Forests require aggregating results from multiple trees.
Correct Answer: A — Decision Trees
Q. Which algorithm is typically faster for making predictions?
A. Decision Trees
B. Random Forests
C. Support Vector Machines
D. Neural Networks

Solution: Decision Trees are typically faster for making predictions because they involve traversing a single tree structure.
Correct Answer: A — Decision Trees
Q. Which of the following is a disadvantage of Decision Trees?
A. They can handle both numerical and categorical data
B. They are prone to overfitting
C. They are easy to interpret
D. They require less data

Solution: Decision Trees are prone to overfitting, especially when they are deep and complex.
Correct Answer: B — They are prone to overfitting