What activation function was introduced by Kunihiko Fukushima in 1969?

Understand the Problem

The question asks which activation function Kunihiko Fukushima introduced in 1969. Answering it requires some knowledge of the history of activation functions used in neural networks.

Answer

ReLU (Rectified Linear Unit)

The activation function introduced by Kunihiko Fukushima in 1969 is the ReLU (Rectified Linear Unit).


More Information

Fukushima used this rectifying nonlinearity in his 1969 work on visual feature extraction with multilayered networks, decades before it became standard in deep learning. ReLU outputs the input directly when it is positive and zero otherwise, i.e. f(x) = max(0, x). It has gained prominence for its simplicity and effectiveness: models using it often converge faster and perform better than those using sigmoid or tanh activation functions.
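As a quick illustration of that simplicity, here is a minimal NumPy sketch comparing ReLU with sigmoid and tanh on a few sample values (the function names and inputs are illustrative, not part of the original question):

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through unchanged, maps negatives to 0
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes every input into the range (0, 1)
    return 1 / (1 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("ReLU:   ", relu(x))      # [0.  0.  0.  0.5 2. ]
print("Sigmoid:", sigmoid(x))   # values strictly between 0 and 1
print("Tanh:   ", np.tanh(x))   # values strictly between -1 and 1
```

Unlike sigmoid and tanh, ReLU does not saturate for large positive inputs, which is one reason gradients propagate more easily in deep networks.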
