Q. In a case study, which method is often used to evaluate the effectiveness of feature engineering?
A. Cross-validation
B. Data normalization
C. Hyperparameter tuning
D. Model deployment
Solution
Cross-validation helps assess how well the feature engineering has improved model performance.
Correct Answer: A — Cross-validation
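To make this concrete, here is a minimal sketch using scikit-learn: cross-validate the same model on the raw features and again after adding an engineered feature, then compare the scores. The data and the engineered feature below are invented purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy data: two raw features and a binary target (purely illustrative).
X_raw = rng.normal(size=(200, 2))
y = (X_raw[:, 0] * X_raw[:, 1] > 0).astype(int)

# Engineered feature: the product of the two raw columns.
X_eng = np.column_stack([X_raw, X_raw[:, 0] * X_raw[:, 1]])

model = LogisticRegression()

# Compare 5-fold cross-validated accuracy before and after feature engineering.
print("raw features:", cross_val_score(model, X_raw, y, cv=5).mean())
print("with engineered feature:", cross_val_score(model, X_eng, y, cv=5).mean())
```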
Q. In a case study, which method would be best for handling missing values in a dataset?
A. Drop the rows with missing values
B. Impute missing values with the mean
C. Use a neural network to predict missing values
D. All of the above
Solution
All methods can be valid depending on the context and the amount of missing data.
Correct Answer: D — All of the above
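A minimal sketch of the three strategies on a toy pandas DataFrame (the columns and values are invented; scikit-learn's IterativeImputer stands in for "use a model to predict missing values", since any predictive model, not only a neural network, can play that role):

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Toy frame with missing values (illustrative only).
df = pd.DataFrame({"age": [25, np.nan, 40, 35], "income": [50, 60, np.nan, 80]})

# Option A: drop rows containing any missing value.
dropped = df.dropna()

# Option B: impute missing values with the column mean.
mean_imputed = pd.DataFrame(SimpleImputer(strategy="mean").fit_transform(df),
                            columns=df.columns)

# Option C: model-based imputation that predicts each missing value
# from the other columns.
model_imputed = pd.DataFrame(IterativeImputer(random_state=0).fit_transform(df),
                             columns=df.columns)
```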
Q. In a feature engineering case study, what is the role of domain knowledge?
A. To automate model training
B. To inform feature selection and creation
C. To evaluate model performance
D. To visualize data
Solution
Domain knowledge helps identify relevant features that can significantly impact model performance.
Correct Answer: B — To inform feature selection and creation
Q. In feature engineering, what does 'one-hot encoding' achieve?
A. It reduces the dimensionality of the dataset
B. It converts categorical variables into a numerical format
C. It normalizes the data
D. It increases the number of features exponentially
Solution
One-hot encoding transforms categorical variables into a binary matrix, making them suitable for machine learning algorithms.
Correct Answer: B — It converts categorical variables into a numerical format
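A quick illustration with pandas (the `color` column is a made-up example): each category becomes its own 0/1 indicator column.

```python
import pandas as pd

# Toy categorical column (illustrative only).
df = pd.DataFrame({"color": ["red", "green", "blue", "green"]})

# One-hot encoding: one binary indicator column per category.
encoded = pd.get_dummies(df, columns=["color"])
print(encoded)
```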
Q. In feature engineering, what does normalization refer to?
A. Scaling features to a common range
B. Removing outliers from the dataset
C. Encoding categorical variables
D. Selecting important features
Solution
Normalization is the process of scaling features to a specific range, often [0, 1].
Correct Answer: A — Scaling features to a common range
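A minimal example with scikit-learn's MinMaxScaler, using made-up values on very different scales:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Two features on very different scales (illustrative only).
X = np.array([[1.0, 200.0], [2.0, 400.0], [3.0, 600.0]])

# Min-max normalization rescales each feature to the [0, 1] range.
X_scaled = MinMaxScaler().fit_transform(X)
print(X_scaled)
```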
Q. What is a common challenge when selecting features for a model?
A. Overfitting due to too many features
B. Underfitting due to too few features
C. Both A and B
D. None of the above
Solution
Both overfitting and underfitting can occur depending on the number of features selected.
Correct Answer: C — Both A and B
Q. What is a common pitfall in model selection?
A. Using too few features
B. Overfitting the model to the training data
C. Not validating the model
D. All of the above
Solution
All these factors can lead to poor model performance and generalization.
Correct Answer: D — All of the above
Q. What is a potential drawback of using too many features in a model?
A. Overfitting
B. Underfitting
C. Increased accuracy
D. Faster training time
Solution
Using too many features can lead to overfitting, where the model learns noise instead of the underlying pattern.
Correct Answer: A — Overfitting
Q. What is feature engineering primarily concerned with?
A. Creating new features from existing data
B. Selecting the best model for prediction
C. Evaluating model performance
D. Training neural networks
Solution
Feature engineering involves transforming raw data into meaningful features that improve model performance.
Correct Answer: A — Creating new features from existing data
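As a small illustration (the column names and values are invented), new features such as an average order value or a signup month can be derived directly from existing columns:

```python
import pandas as pd

# Toy transactional data (illustrative only).
df = pd.DataFrame({
    "total_spent": [120.0, 80.0, 300.0],
    "num_orders": [4, 2, 10],
    "signup_date": pd.to_datetime(["2023-01-15", "2023-06-01", "2022-11-20"]),
})

# New features derived from existing columns.
df["avg_order_value"] = df["total_spent"] / df["num_orders"]
df["signup_month"] = df["signup_date"].dt.month
print(df)
```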
Q. What is the main advantage of using ensemble methods in model selection?
A. They are simpler to implement
B. They combine predictions from multiple models to improve accuracy
C. They require less data
D. They are always faster than single models
Solution
Ensemble methods leverage the strengths of multiple models to enhance overall predictive performance.
Correct Answer: B — They combine predictions from multiple models to improve accuracy
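A minimal sketch with scikit-learn's VotingClassifier on a synthetic dataset: two different base models are combined, and the joint prediction is evaluated with cross-validation. The choice of base models here is arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)

# An ensemble that combines the predictions of two different base models.
ensemble = VotingClassifier(estimators=[
    ("logreg", LogisticRegression(max_iter=1000)),
    ("tree", DecisionTreeClassifier(random_state=0)),
])

print("ensemble accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```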
Q. What is the purpose of hyperparameter tuning in model selection?
A. To adjust the model's architecture
B. To select the best features
C. To improve model performance
D. To visualize results
Solution
Hyperparameter tuning searches over settings that are fixed before training (such as regularization strength or tree depth) to improve the model's predictive performance.
Correct Answer: C — To improve model performance
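A minimal sketch using scikit-learn's GridSearchCV on synthetic data: candidate values of the SVM's `C` hyperparameter are compared by cross-validation and the best one is reported. The grid shown is arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# Search over candidate hyperparameter values using cross-validation.
search = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=5)
search.fit(X, y)

print("best params:", search.best_params_, "best CV score:", search.best_score_)
```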
Q. What is the purpose of model selection in machine learning?
A. To choose the best algorithm for the data
B. To preprocess the data
C. To visualize the data
D. To deploy the model
Solution
Model selection aims to identify the algorithm that performs best on the given dataset.
Correct Answer: A — To choose the best algorithm for the data
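One common way to do this in practice, sketched below on synthetic data: cross-validate several candidate algorithms on the same dataset and keep the best scorer. The two candidates chosen here are arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, random_state=0)

# Compare candidate algorithms on the same data and keep the best performer.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in candidates.items()}
print("best model:", max(scores, key=scores.get), scores)
```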
Q. Which evaluation metric is commonly used for classification problems?
A. Mean Squared Error
B. Accuracy
C. Silhouette Score
D. R-squared
Solution
Accuracy measures the proportion of correct predictions in classification tasks.
Correct Answer: B — Accuracy
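A tiny worked example with scikit-learn's accuracy_score (labels invented): 4 of 5 predictions match the true labels, so accuracy is 0.8.

```python
from sklearn.metrics import accuracy_score

y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 0]

# Accuracy = correct predictions / total predictions (4/5 = 0.8 here).
print(accuracy_score(y_true, y_pred))
```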
Q. Which of the following is a common technique in feature selection?
A. Principal Component Analysis (PCA)
B. K-means Clustering
C. Support Vector Machines
D. Random Forest Regression
Solution
Principal Component Analysis (PCA) reduces dimensionality by projecting the data onto a small set of components that capture most of the variance, yielding a reduced feature set.
Correct Answer: A — Principal Component Analysis (PCA)
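A minimal sketch with scikit-learn, using the bundled iris dataset: the four original features are projected onto two principal components.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

# Project the 4 original features onto 2 principal components.
X_reduced = PCA(n_components=2).fit_transform(X)
print(X_reduced.shape)  # (150, 2)
```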
Q. Which of the following is NOT a method of feature extraction?
A. TF-IDF
B. Bag of Words
C. One-Hot Encoding
D. Linear Regression
Solution
Linear Regression is a modeling technique, not a feature extraction method.
Correct Answer: D — Linear Regression
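A small sketch contrasting two of the feature extraction methods listed above, using scikit-learn's text vectorizers on three made-up documents:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the cat sat", "the dog sat", "the cat and the dog"]

# Bag of Words: raw term counts per document.
bow = CountVectorizer().fit_transform(docs)

# TF-IDF: counts reweighted so that terms common to all documents count less.
tfidf = TfidfVectorizer().fit_transform(docs)

print(bow.shape, tfidf.shape)
```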
Q. Which of the following techniques can help in reducing overfitting?
A. Feature scaling
B. Regularization
C. Data augmentation
D. All of the above
Solution
All of the mentioned techniques can help mitigate overfitting in machine learning models.
Correct Answer: D — All of the above
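A minimal sketch of the regularization option using scikit-learn: with few samples and many features, plain least squares tends to overfit, while an L2 penalty (Ridge) usually generalizes better. The data are synthetic and the alpha value is arbitrary.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Few samples, many features: a setup where plain least squares overfits.
X = rng.normal(size=(40, 30))
y = X[:, 0] + 0.1 * rng.normal(size=40)

# Ridge adds an L2 penalty on the coefficients, which shrinks them and
# typically improves cross-validated performance in this setting.
print("no regularization:", cross_val_score(LinearRegression(), X, y, cv=5).mean())
print("ridge (alpha=1.0): ", cross_val_score(Ridge(alpha=1.0), X, y, cv=5).mean())
```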