LayerNorm
Layer Normalization — a technique that normalizes activations across features by centering and scaling, with learnable parameters, to stabilize deep network training.
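The centering-and-scaling step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation; the function name `layer_norm` and the per-feature `gamma`/`beta` parameters are illustrative, matching the learnable scale and shift the definition mentions.

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Center and scale each sample across its feature dimension (last axis).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Apply the learnable per-feature scale (gamma) and shift (beta).
    return gamma * x_hat + beta

# Two samples with three features each; after normalization every sample
# has (approximately) zero mean and unit variance across its features.
x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
y = layer_norm(x, gamma=np.ones(3), beta=np.zeros(3))
```

Unlike batch normalization, the statistics here are computed per sample, so the result does not depend on the other examples in the batch.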