ReLU

Rectified Linear Unit — an activation function that outputs zero for negative inputs and passes positive inputs unchanged, adding nonlinearity to neural networks.
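
In other words, ReLU(x) = max(0, x), applied element-wise. A minimal sketch in NumPy (the function name `relu` here is illustrative, not a library import):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x), applied element-wise."""
    return np.maximum(0, x)

# Negative inputs map to zero; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```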