cs.CV, cs.LG, cs.NE, stat.ML

Deep Learning using Rectified Linear Units (ReLU)

arXiv:1803.08375v3 Announce Type: replace-cross
Abstract: The Rectified Linear Unit (ReLU) is a foundational activation function in artificial neural networks. Recent literature frequently misattributes its origin to the 2018 (initial) version of this …
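For reference, ReLU computes f(x) = max(0, x) element-wise; a minimal sketch in Python (NumPy assumed, not part of the paper itself):

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x); negative inputs map to 0,
    # non-negative inputs pass through unchanged.
    return np.maximum(0, x)

# Example: mixed negative and positive inputs
out = relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0]))
```
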