Neural Networks and Overfitting

"Small" neural network (fewer parameters; more prone to underfitting)

Computationally cheaper.

"Large" neural network (more parameters; more prone to overfitting)

Computationally more expensive.

Use regularization (λ) to address overfitting.
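For reference, one common way this is written (a sketch in the standard notation for an L-layer network with K output units, m training examples, and layer sizes s_l; the note itself does not spell this out) is to add an L2 penalty weighted by λ to the cross-entropy cost:

$$
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\left[ y_k^{(i)} \log\big( (h_\Theta(x^{(i)}))_k \big) + \big(1 - y_k^{(i)}\big) \log\big( 1 - (h_\Theta(x^{(i)}))_k \big) \right] + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}} \big( \Theta_{ji}^{(l)} \big)^2
$$

A larger λ shrinks the weights more aggressively, trading a little training accuracy for less overfitting.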

A simple neural network (fewer parameters) tends to underfit, but it has the advantage of being computationally cheap.

A complex neural network (more parameters, a more elaborate structure) generally means better performance, but it is computationally expensive and prone to overfitting; in that case, regularization is needed to address the overfitting.
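As a rough illustration of this trade-off (a minimal sketch on synthetic data, not from the original note), the snippet below fits a small and a large MLP with scikit-learn; the larger network gets a stronger L2 penalty via the `alpha` parameter, which plays the role of λ here:

```python
# Sketch: small vs. large MLP, with L2 regularization taming the large one.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic binary-classification data (assumed setup for illustration).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Small" network: few parameters, cheap, but may underfit.
small = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)

# "Large" network: many parameters, expensive, prone to overfitting,
# so we add a stronger L2 penalty (alpha, analogous to lambda).
large = MLPClassifier(hidden_layer_sizes=(200, 200), alpha=1.0,
                      max_iter=2000, random_state=0)

for name, model in [("small", small), ("large + L2", large)]:
    model.fit(X_train, y_train)
    print(name,
          "train acc:", round(model.score(X_train, y_train), 3),
          "test acc:", round(model.score(X_test, y_test), 3))
```

The particular sizes and the value of `alpha` are arbitrary; in practice λ is chosen by validation, increasing it when the gap between training and test accuracy is large.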


Reposted from www.cnblogs.com/qkloveslife/p/9887119.html