Hacker News

hervature · yesterday at 12:33 AM · 0 replies

Neural network training is harder when the input range deviates from [-1, 1]. The only reason it sometimes still works is that the first layer has a chance to learn the normalization itself.
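A minimal sketch of the remedy the comment alludes to: per-feature min-max scaling into [-1, 1] before the data ever reaches the network. The function name and the sample array are illustrative, not from the original.

```python
import numpy as np

def scale_to_unit_range(x, lo=None, hi=None):
    # Per-column min-max scaling into [-1, 1].
    # lo/hi can be passed in so the same statistics computed on the
    # training set are reused at inference time.
    lo = x.min(axis=0) if lo is None else lo
    hi = x.max(axis=0) if hi is None else hi
    return 2.0 * (x - lo) / (hi - lo) - 1.0

# Two features on very different scales (illustrative data).
x = np.array([[0.0, 100.0],
              [5.0, 300.0],
              [10.0, 500.0]])
scaled = scale_to_unit_range(x)
# each column now spans exactly [-1, 1]
```

A first linear layer y = Wx + b can in principle absorb this affine rescaling itself, which is why unnormalized inputs sometimes work anyway; doing it explicitly just removes that burden from the optimizer.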