Coursera - Andrew Ng - Deep Learning - Neural Networks and Deep Learning - Week 3 Quiz

This article covers the quiz for Week 3 (Shallow Neural Networks) of Course 1, Neural Networks and Deep Learning, of Andrew Ng's Deep Learning specialization on Coursera: the questions, answers, and explanation notes.
 

F: correct.

C: wrong. Training examples are denoted with parentheses in the superscript, e.g. x^(i) is the i-th example.

As seen in lecture, the output of tanh is between -1 and 1, so it centers the data, which makes learning simpler for the next layer.
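A quick way to see this centering effect is to push the same zero-mean inputs through both activations and compare the output means; the sketch below (variable names are mine, not from the quiz) illustrates it:

```python
import numpy as np

np.random.seed(0)
z = np.random.randn(1000)           # zero-mean pre-activations

tanh_out = np.tanh(z)               # in (-1, 1), roughly zero-centered
sigmoid_out = 1 / (1 + np.exp(-z))  # in (0, 1), mean near 0.5

print(tanh_out.mean())     # close to 0
print(sigmoid_out.mean())  # close to 0.5
```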

Sigmoid outputs a value between 0 and 1, which makes it a very good choice for binary classification: classify as 0 if the output is less than 0.5, and as 1 if it is more than 0.5. This can be done with tanh as well, but it is less convenient because its output is between -1 and 1.
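As a minimal sketch (the values and names here are mine), thresholding the sigmoid output at 0.5 looks like:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = np.array([-2.0, -0.1, 0.3, 4.0])   # example output-layer pre-activations
y_hat = sigmoid(z)                     # probabilities in (0, 1)
predictions = (y_hat > 0.5).astype(int)
print(predictions)  # [0 0 1 1]
```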

We use keepdims=True to make sure that A.shape is (4, 1) and not (4,). It makes our code more rigorous.
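The code in this question is essentially the following; with keepdims=True, NumPy keeps the summed axis as a dimension of size 1:

```python
import numpy as np

A = np.random.randn(4, 3)

B = np.sum(A, axis=1, keepdims=True)
print(B.shape)  # (4, 1) -- a proper column vector

C = np.sum(A, axis=1)
print(C.shape)  # (4,)   -- a rank-1 array, easy to misuse in broadcasting
```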

Logistic regression doesn't have a hidden layer. If you initialize the weights to zeros, the first example x fed into logistic regression will output zero, but the derivatives of logistic regression depend on the input x (because there's no hidden layer), which is not zero. So at the second iteration, the weight values follow x's distribution and are different from each other if x is not a constant vector.
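A minimal sketch of one gradient step (my own names and values, not from the quiz) shows why zero initialization is not a problem here: the gradient dw = (a - y) * x inherits x's structure even when w starts at zero, so the weights immediately become distinct:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([1.0, 2.0, -1.0])  # one training example (not constant)
y = 1.0
w = np.zeros(3)                 # zero-initialized weights
b = 0.0

a = sigmoid(w @ x + b)          # first prediction: sigmoid(0) = 0.5
dw = (a - y) * x                # gradient of the logistic loss w.r.t. w
print(dw)                       # [-0.5 -1.  0.5] -- entries differ, symmetry broken
```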

tanh becomes flat for large values of its input; this leads its gradient to be close to zero. This slows down the optimization algorithm.
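A small check of this (using large initial weights such as np.random.randn(..,..)*1000, as in the question) shows the tanh derivative 1 - tanh(z)^2 collapsing to zero:

```python
import numpy as np

np.random.seed(1)
x = np.random.randn(3)
W = np.random.randn(4, 3) * 1000  # very large initial weights

z = W @ x                         # huge pre-activations
a = np.tanh(z)                    # saturates at -1 or +1
grad = 1 - a**2                   # tanh'(z), the local gradient
print(a)     # entries are essentially ±1
print(grad)  # entries are essentially 0 -> learning stalls
```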

 

9.

F: wrong.

H: correct. The dimension of b is (number of nodes in the current layer, 1); see the shape check after this list.

A: correct.

C: wrong.
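As a quick check of those dimensions: going by the screenshot in the original post, the network in this question has 2 input features, 4 hidden units, and 1 output unit; assuming those sizes, the parameter shapes work out as follows:

```python
import numpy as np

n_x, n_h, n_y = 2, 4, 1  # input features, hidden units, output units (my reading of the figure)

W1 = np.random.randn(n_h, n_x) * 0.01  # (4, 2)
b1 = np.zeros((n_h, 1))                # (4, 1) -- one bias per node in the layer
W2 = np.random.randn(n_y, n_h) * 0.01  # (1, 4)
b2 = np.zeros((n_y, 1))                # (1, 1)

for name, p in [("W1", W1), ("b1", b1), ("W2", W2), ("b2", b2)]:
    print(name, p.shape)
```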

Remember that Z^[1] and A^[1] are quantities computed over a batch of training examples, not only 1.
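A minimal sketch of this (layer sizes reused from the previous question, with an arbitrary choice of m) shows that Z^[1] and A^[1] have shape (4, m) rather than (4, 1):

```python
import numpy as np

m = 50                     # batch of training examples
X = np.random.randn(2, m)  # each column is one example

W1 = np.random.randn(4, 2) * 0.01
b1 = np.zeros((4, 1))

Z1 = W1 @ X + b1           # broadcasting adds b1 to every column
A1 = np.tanh(Z1)
print(Z1.shape, A1.shape)  # (4, 50) (4, 50) -- i.e. (4, m), not (4, 1)
```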


Reposted from blog.csdn.net/qq1376725255/article/details/85412176