Learning Rate Transfer in Normalized Transformers
arXiv:2604.27077v1 Announce Type: new
Abstract: The Normalized Transformer, or nGPT (arXiv:2410.01131), achieves impressive training speedups and requires neither weight decay nor learning rate warmup. However, despite having hyperparameters that explic…
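The nGPT paper cited above constrains the network's vectors to lie on a unit hypersphere, which is part of why weight decay becomes unnecessary. A minimal sketch of the core operation, re-projecting weight rows to unit norm after each optimizer step (this is an illustrative NumPy fragment, not the paper's implementation; `renormalize_rows` and `eps` are names chosen here):

```python
import numpy as np

def renormalize_rows(w: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    # Project each row of a weight matrix back onto the unit hypersphere.
    # In nGPT-style training this would run after every optimizer step;
    # the full method also normalizes activations and uses learned scales.
    norms = np.linalg.norm(w, axis=-1, keepdims=True)
    return w / (norms + eps)

rng = np.random.default_rng(0)
w = renormalize_rows(rng.standard_normal((4, 8)))
print(np.allclose(np.linalg.norm(w, axis=-1), 1.0))  # every row has unit norm
```

Because every row is kept at unit length, the effective update size depends only on the rotation each step induces on the sphere, which is what makes the learning rate the dominant remaining hyperparameter.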