Andrew Ng, Neural Networks and Deep Learning, Week 4, Homework 4-1

Building your Deep Neural Network: Step by Step

1. Initialize parameters
W1, b1, W2, b2 for a two-layer network; W[1], b[1], …, W[L], b[L] in the general L-layer case
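A minimal sketch of the general initialization, following the homework's naming convention (initialize_parameters_deep) but illustrative rather than the graded solution:

import numpy as np

def initialize_parameters_deep(layer_dims):
    # layer_dims: list of layer sizes, e.g. [n_x, n_h1, ..., n_y]
    parameters = {}
    L = len(layer_dims)                      # number of layers, counting the input layer
    for l in range(1, L):
        # small random weights break symmetry; biases can start at zero
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters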
2. Forward propagation
Z[l]=W[l]A[l−1]+b[l]
where A[0]=X
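A sketch of the linear step, caching its inputs because back propagation will need them later:

import numpy as np

def linear_forward(A_prev, W, b):
    # Z[l] = W[l] A[l-1] + b[l]
    Z = np.dot(W, A_prev) + b
    cache = (A_prev, W, b)                   # linear cache, reused in back propagation
    return Z, cache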
3. Apply the activation function
sigmoid/relu
A[l]=g(Z[l])=g(W[l]A[l−1]+b[l])
Output (A, cache),
where the cache stores what back propagation will need: the linear cache (A[l−1], W[l], b[l]) and the activation cache (Z[l])
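A sketch of one full layer, reusing linear_forward from the step-2 sketch and inlining simple sigmoid/relu helpers so the caches match the description above:

import numpy as np

def sigmoid(Z):
    A = 1 / (1 + np.exp(-Z))
    return A, Z                              # Z is the activation cache

def relu(Z):
    A = np.maximum(0, Z)
    return A, Z

def linear_activation_forward(A_prev, W, b, activation):
    # A[l] = g(W[l] A[l-1] + b[l]), with g = sigmoid or relu
    Z, linear_cache = linear_forward(A_prev, W, b)
    if activation == "sigmoid":
        A, activation_cache = sigmoid(Z)
    else:
        A, activation_cache = relu(Z)
    cache = (linear_cache, activation_cache)
    return A, cache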
4. Implement the full forward pass

In L_model_forward, the (L−1) hidden layers apply relu in a loop, and the output layer applies sigmoid (a full sketch follows the snippet):

AL, cache = linear_activation_forward(A, parameters['W' + str(L)], parameters['b' + str(L)], activation="sigmoid")
caches.append(cache)
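Putting the loop together, a sketch of the whole forward pass, assuming linear_activation_forward from the step-3 sketch is in scope:

def L_model_forward(X, parameters):
    caches = []
    A = X                                    # A[0] = X
    L = len(parameters) // 2                 # two entries (W, b) per layer
    for l in range(1, L):                    # (L-1) relu layers
        A, cache = linear_activation_forward(
            A, parameters['W' + str(l)], parameters['b' + str(l)], activation="relu")
        caches.append(cache)
    # final sigmoid layer (the snippet above)
    AL, cache = linear_activation_forward(
        A, parameters['W' + str(L)], parameters['b' + str(L)], activation="sigmoid")
    caches.append(cache)
    return AL, caches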

5. Compute the cost function

J = −(1/m) Σ_{i=1}^{m} [ y(i) log(a[L](i)) + (1 − y(i)) log(1 − a[L](i)) ]

Note that np.dot is matrix multiplication and does not obey the commutative law, so the order (and transposition) of its operands matters when vectorizing this sum.
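A sketch of the cost computation; the commented line shows the np.dot form, where operand order matters:

import numpy as np

def compute_cost(AL, Y):
    m = Y.shape[1]
    # elementwise form of the cross-entropy cost
    cost = -(1.0 / m) * np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL))
    # equivalent np.dot form (np.dot is not commutative, so the order matters):
    # cost = -(1.0 / m) * (np.dot(Y, np.log(AL).T) + np.dot(1 - Y, np.log(1 - AL).T))
    return np.squeeze(cost)                  # turn [[cost]] into a scalar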
6. Implement back propagation (note: the bullets below are listed in reverse of the computation order; in practice you start from dAL and work backwards)

  • From dZ[l] and the linear cache (A[l−1], W[l], b[l]), compute dW[l], db[l], dA[l−1] (linear_backward)
  • From dA[l] and the activation_cache, compute dZ[l]; for sigmoid / relu: dZ[l] = dA[l] ∗ g′(Z[l]) (linear_activation_backward)
  • From AL, Y, and caches, compute the initial gradient dAL (see the sketch after this list):

dAL = - (np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))
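A sketch of the first two bullets, with the activation derivatives inlined so the block stands on its own:

import numpy as np

def relu_backward(dA, activation_cache):
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0                           # g'(Z) = 1 for Z > 0, else 0
    return dZ

def sigmoid_backward(dA, activation_cache):
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)                  # dZ[l] = dA[l] * g'(Z[l])

def linear_backward(dZ, linear_cache):
    # from dZ[l] and (A[l-1], W[l], b[l]), compute dW[l], db[l], dA[l-1]
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = np.dot(dZ, A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = np.dot(W.T, dZ)
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    linear_cache, activation_cache = cache
    if activation == "relu":
        dZ = relu_backward(dA, activation_cache)
    else:
        dZ = sigmoid_backward(dA, activation_cache)
    return linear_backward(dZ, linear_cache)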

7. Collect the gradients in grads
grads["dW" + str(l)] = dW[l], grads["db" + str(l)] = db[l], grads["dA" + str(l)] = dA[l]

8. Update parameters (gradient descent with learning rate α)
W[l] = W[l] − α dW[l]
b[l] = b[l] − α db[l]
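A sketch of the update step, where learning_rate plays the role of α:

def update_parameters(parameters, grads, learning_rate):
    L = len(parameters) // 2                 # number of layers
    for l in range(1, L + 1):
        # W[l] = W[l] - α dW[l];  b[l] = b[l] - α db[l]
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters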
