ReLU
Rectified Linear Unit — an activation function that outputs zero for negative inputs and passes positive inputs unchanged, adding nonlinearity to neural networks.
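Equivalently, ReLU computes f(x) = max(0, x) element-wise. A minimal illustrative sketch in Python with NumPy (not part of the original entry):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: negative inputs become 0, positive inputs pass through unchanged.
    return np.maximum(0, x)

# Example with mixed negative and positive inputs
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```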