Practice Questions

Q1
Which of the following is a common activation function used in hidden layers of neural networks?
  1. Softmax
  2. ReLU
  3. Mean Squared Error
  4. Cross-Entropy

Questions & Step-by-Step Solutions

Which of the following is a common activation function used in hidden layers of neural networks?
  • Step 1: Understand what an activation function is. It introduces non-linearity, which lets a neural network learn patterns that a purely linear model cannot.
  • Step 2: Review common activation functions. Typical choices include Sigmoid, Tanh, and ReLU.
  • Step 3: Eliminate the wrong options. Mean Squared Error and Cross-Entropy are loss functions, not activation functions, and Softmax is normally applied in the output layer for classification rather than in hidden layers.
  • Step 4: Identify the answer. ReLU (Rectified Linear Unit) is the standard choice for hidden layers: it is simple to compute and helps the network train faster. The answer is option 2.
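The steps above can be sketched in code. As a minimal illustration (the function name `relu` is our own, not from the question), ReLU simply outputs each input unchanged if it is positive and zero otherwise:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) element-wise — keeps positive values,
    # zeros out negatives, introducing the non-linearity from Step 1.
    return np.maximum(0, x)

# A few sample pre-activation values passing through ReLU.
values = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(values))  # negatives become 0.0; positives pass through
```

Because it is a single comparison per element, ReLU is much cheaper than Sigmoid or Tanh, which is one reason it became the default for hidden layers.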