Machine Learning Review 1

1 - Which of the following are the two common types of supervised learning? (Choose two)
A. Clustering
B. Regression
C. Classification

BC

2 - Which of the following is unsupervised learning?
A. Clustering
B. Regression
C. Classification

A

3 - For linear regression, the model is \(f_{w,b}(x) = wx + b\)
Which of the following are the inputs, or features, that are fed into the model and from which the model makes predictions?
A. \(m\)
B. \(x\)
C. \((x,y)\)
D. \(w\) and \(b\)

B
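
As a quick illustration of the model \(f_{w,b}(x) = wx + b\), here is a minimal sketch in Python; the feature values and the choices of \(w\) and \(b\) are made up for the example, not taken from the question.

```python
import numpy as np

def predict(x, w, b):
    """Linear model f_{w,b}(x) = w * x + b for a single feature x."""
    return w * x + b

# Illustrative input features x and assumed parameters w, b
x_train = np.array([1.0, 2.0, 3.0])
print(predict(x_train, w=2.0, b=0.5))  # -> [2.5 4.5 6.5]
```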

4 - For linear regression, if you find the parameters \(w\) and \(b\) such that \(J(w,b)\) is very close to zero, what conclusions can you draw?
A. The selected values of the parameters \(w\) and \(b\) make the algorithm fit the training set very well
B. The selected values of the parameters \(w\) and \(b\) make the fit to the training set very poor
C. This is impossible - there must be a bug in the code

A
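
For reference, here is a minimal sketch of the squared-error cost \(J(w,b)\) typically used with this model (the \(\frac{1}{2m}\) scaling is assumed from the standard convention). When the training data lie exactly on a line, the right \(w\) and \(b\) drive \(J\) to zero, i.e. the fit to the training set is very good.

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost J(w,b) = (1 / (2m)) * sum((f_{w,b}(x_i) - y_i)^2)."""
    m = x.shape[0]
    errors = (w * x + b) - y
    return np.sum(errors ** 2) / (2 * m)

# Perfectly linear toy data: y = 2x + 1, so w = 2, b = 1 gives J = 0
x_train = np.array([1.0, 2.0, 3.0])
y_train = 2.0 * x_train + 1.0
print(compute_cost(x_train, y_train, w=2.0, b=1.0))  # -> 0.0
```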

5 - Gradient descent is an algorithm that finds parameter values \(w\) and \(b\) such that the cost function \(J\) is minimized.
When \(\frac{\partial J(w,b)}{\partial w}\) is a negative number (less than zero), what happens to \(w\) after one update step?

A. It cannot be determined whether \(w\) will increase or decrease
B. \(w\) increases
C. \(w\) decreases
D. \(w\) remains unchanged

B
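
To see why a negative partial derivative makes \(w\) increase, here is a one-step sketch of the update \(w = w - \alpha\frac{\partial J(w,b)}{\partial w}\); the values of \(\alpha\), \(w\), and the derivative are illustrative assumptions.

```python
# One gradient-descent update for w (all numbers are illustrative)
alpha = 0.1       # learning rate (assumed)
dJ_dw = -2.0      # partial derivative of J with respect to w, negative here
w = 5.0
w = w - alpha * dJ_dw  # 5.0 - 0.1 * (-2.0) = 5.2, so w increases
print(w)
```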

6 - For linear regression, what is the update step for parameter b?

A. \(b = b - \alpha\frac{1}{m}\sum\limits_{i=1}^{m}(f_{w,b}(x^{(i)}) - y^{(i)})\)

B. \(b = b - \alpha\frac{1}{m}\sum\limits_{i=1}^{m}(f_{w,b}(x^{(i)}) - y^{(i)})x^{(i)}\)

A
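
Putting both update rules together, the following is a minimal gradient-descent sketch for linear regression; note that the \(b\)-update (option A) has no \(x^{(i)}\) factor, while the \(w\)-update does. The toy data, learning rate, and iteration count are assumptions for the example.

```python
import numpy as np

def gradient_descent_step(x, y, w, b, alpha):
    """One simultaneous update of w and b for single-feature linear regression.
    The b-update has no x^{(i)} factor; the w-update does."""
    m = x.shape[0]
    err = (w * x + b) - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return w - alpha * dj_dw, b - alpha * dj_db

# Illustrative data (y = 2x + 1) and hyperparameters, not from the question
x_train = np.array([1.0, 2.0, 3.0])
y_train = np.array([3.0, 5.0, 7.0])
w, b = 0.0, 0.0
for _ in range(1000):
    w, b = gradient_descent_step(x_train, y_train, w, b, alpha=0.1)
print(w, b)  # converges toward w ≈ 2, b ≈ 1
```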

Origin blog.csdn.net/cfy2401926342/article/details/131468268