Dropout, Part 2 (notes on "Dropout: A Simple Way to Prevent Neural Networks from Overfitting")

First, the menu:

Abstract:

Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much. During training, dropout samples from an exponential number of different thinned networks. At test time, it is easy to approximate the effect of averaging the predictions of all these thinned networks by simply using a single unthinned network that has smaller weights. This significantly reduces overfitting and gives major improvements over other regularization methods. We show that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
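The core mechanism in the abstract, dropping units at random during training and using a single network with smaller (p-scaled) weights at test time, can be sketched in a few lines. This is a minimal illustrative example, not the authors' code: the layer shapes, variable names, and the function dropout_forward are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, W, p, train=True):
    """One fully connected layer with dropout applied to its input.

    p is the probability of *retaining* a unit (the paper's notation);
    units are dropped with probability 1 - p.
    """
    if train:
        # Sample a thinned network: each unit is kept with probability p,
        # dropped (along with its connections) with probability 1 - p.
        mask = rng.binomial(1, p, size=x.shape)
        x = x * mask
    else:
        # Test time: scale by p, equivalent to using outgoing weights
        # multiplied by p, which approximates averaging the predictions
        # of all the thinned networks sampled during training.
        x = x * p
    return x @ W

# Usage: a 4-unit input into a 3-unit layer, retention probability p = 0.5.
x = rng.standard_normal(4)
W = rng.standard_normal((4, 3))
y_train = dropout_forward(x, W, p=0.5, train=True)
y_test = dropout_forward(x, W, p=0.5, train=False)
```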

Keywords: neural networks, regularization, model combination, deep learning

First, an overview of the paper's structure:

The paper is organized as follows: Section 2 describes the motivation for the idea. Section 3 reviews prior related work. Section 4 formally describes the dropout model. Section 5 gives an algorithm for training dropout networks. In Section 6, the authors present experimental results, applying dropout to problems in different domains and comparing it with other forms of regularization and model combination. Section 7 analyzes the effect of dropout on different properties of a neural network and describes how dropout interacts with the network's hyperparameters. Section 8 describes the Dropout RBM model. Section 9 explores the idea of marginalizing dropout. Appendix A provides a practical guide for training dropout networks, including a detailed analysis of the practical considerations involved in choosing hyperparameters. (Background: Sections 1-3; method: Sections 4-5; experiments and analysis: Sections 6-7; other topics: Sections 8-10; conclusion: Section 11; appendices: A-B.)

Key points from the background sections:


Reposted from www.cnblogs.com/Ann21/p/9830781.html