Abstract: This article is the transcript of Lecture 28, "Multiple Features", from Chapter 5, "Linear Regression with Multiple Variables", of Andrew Ng's Machine Learning course. I transcribed it while studying the videos and lightly edited it for concision and readability, for later reference. I'm sharing it here in the hope that it helps others with their study. If there are any errors, corrections are welcome and sincerely appreciated!
In this video (article), we'll start to talk about a new version of linear regression that's more powerful: one that works with multiple variables, or with multiple features. Here's what I mean.
In the original version of linear regression that we developed, we had a single feature $x$, the size of the house, and we wanted to use that to predict the price of the house, and this was the form of our hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x$.
What if we had not only the size of the house as a feature, or as a variable with which to try to predict the price, but also knew the number of bedrooms, the number of floors, and the age of the home in years? It seems this would give us a lot more information with which to predict the price.
- I'm going to use the variables $x_1$, $x_2$, $x_3$, $x_4$ and so on to denote my four features, and I'm going to continue to use $y$ to denote the output variable, the price, that we're trying to predict.
- I'm going to use lowercase "$n$" to denote the number of features. So in this example, we have $n = 4$.
- We were using $m$ to denote the number of examples. So if you have 47 rows, "$m$" is 47, the number of rows of this table, or the number of training examples.
- I'm also going to use $x^{(i)}$ to denote the input features of the $i^{\text{th}}$ training example.
- As a concrete example, let's say $x^{(2)}$ is going to be the vector of the features for my second training example, $x^{(2)} = [1416,\ 3,\ 2,\ 40]^T$. And so here $x^{(2)}$ is going to be a four-dimensional vector, since those are my four features that I have to try to predict the price of the second house.
- Note that in this notation, the superscript $(2)$ is not $x$ to the power of $2$. Instead, it's an index into my training set which says: look at the second row of this table; this refers to my second training example.
- I'm also going to use $x_j^{(i)}$ to denote the value of feature number $j$ in the $i^{\text{th}}$ training example. So concretely, $x_3^{(2)}$ will refer to feature number $3$ in the second training example, which is equal to $2$.
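To make this indexing concrete, here's a minimal NumPy sketch. The table values are assumed to match the housing data shown on the lecture slide (size in square feet, number of bedrooms, number of floors, age in years):

```python
import numpy as np

# Assumed training set from the lecture slide: one row per training example;
# columns are size (ft^2), #bedrooms, #floors, age (years).
X = np.array([
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],
    [1534, 3, 2, 30],
    [ 852, 2, 1, 36],
])

m, n = X.shape    # m = number of training examples, n = number of features
x_2 = X[1]        # x^(2): the second training example (rows are 0-indexed)
x_3_2 = X[1, 2]   # x_3^(2): feature 3 of example 2, i.e. the number of floors

print(m, n)    # 4 4
print(x_3_2)   # 2
```

Note the mismatch between the lecture's 1-based notation ($x^{(2)}$, $x_3$) and NumPy's 0-based indexing (`X[1, 2]`), a common source of off-by-one bugs.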
Now that we have multiple features, let's talk about what the form of the hypothesis should be.
- Previously, this ($h_\theta(x) = \theta_0 + \theta_1 x$) was the form of our hypothesis, where $x$ was our single feature.
- Now that we have multiple features, the form of the hypothesis in linear regression is going to be $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_3 + \theta_4 x_4$.
- Concretely, for a particular setting of our parameters, we may have $h_\theta(x) = 80 + 0.1 x_1 + 0.01 x_2 + 3 x_3 - 2 x_4$. This would be one example of a hypothesis. And remember, the hypothesis is trying to predict the price of the house in thousands of dollars, saying that the base price of a house is maybe $80$, plus another $0.1 x_1$. So that's an extra hundred dollars per square foot, since $x_1$ is the size in square feet. The price goes up a little bit for each additional bedroom the house has, since $x_2$ is the number of bedrooms; it goes up further for each additional floor the house has, because $x_3$ is the number of floors; and the price goes down a little bit with each additional year of the age of the house, since $x_4$ is the age in years.
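As a quick sanity check, we can evaluate this example hypothesis on a hypothetical house (the feature values below are made up for illustration):

```python
# Example parameters from the hypothesis above (price in thousands of dollars).
theta0, theta1, theta2, theta3, theta4 = 80, 0.1, 0.01, 3, -2

# A hypothetical house: 1416 ft^2, 3 bedrooms, 2 floors, 40 years old.
x1, x2, x3, x4 = 1416, 3, 2, 40

# h(x) = theta0 + theta1*x1 + theta2*x2 + theta3*x3 + theta4*x4
price = theta0 + theta1 * x1 + theta2 * x2 + theta3 * x3 + theta4 * x4
print(round(price, 2))  # 147.63, i.e. a predicted price of about $147,630
```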
If we have $n$ features, here's the form of the hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n$. I'm going to introduce a little bit of notation to simplify this equation.
For convenience of notation,
- Let me define $x_0 = 1$. Concretely, this means that for every example $i$, $x_0^{(i)} = 1$. You can think of this as defining an additional zeroth feature. So whereas previously I had features $x_1, x_2, \ldots, x_n$, I'm now defining an additional sort of zeroth feature $x_0$ that always takes on the value of one. So now my feature vector $x = [x_0, x_1, \ldots, x_n]^T$ becomes an $(n+1)$-dimensional vector that is zero-indexed.
- I'm also going to think of my parameters as a vector, $\theta = [\theta_0, \theta_1, \theta_2, \ldots, \theta_n]^T$. This is another zero-indexed $(n+1)$-dimensional vector.
- My hypothesis can now be written $h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \cdots + \theta_n x_n$. And this equation is the same as the one on top because $x_0 = 1$. And the neat thing is that I can now take this form of the hypothesis and write it as $h_\theta(x) = \theta^T x$.
- If you write out what $\theta^T x$ is, this is $[\theta_0\ \theta_1\ \cdots\ \theta_n]$ times the vector $x$. So this thing here, $\theta^T$, is a $1 \times (n+1)$ matrix, which is also called a row vector. And you take that and multiply it with the vector $x = [x_0, x_1, \ldots, x_n]^T$. And so the inner product $\theta^T x$ is just equal to $\theta_0 x_0 + \theta_1 x_1 + \cdots + \theta_n x_n$. This gives us a convenient way to write the form of the hypothesis as just the inner product between our parameter vector $\theta$ and our feature vector $x$. And it is this little bit of notation that lets us write the hypothesis in this compact form. So that is the form of a hypothesis when we have multiple features.
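The compact form $h_\theta(x) = \theta^T x$ maps directly to a one-line inner product in code. A minimal NumPy sketch, reusing the example parameter values from earlier in this article:

```python
import numpy as np

# Zero-indexed, (n+1)-dimensional parameter and feature vectors,
# with x[0] = 1 by the convention defined above.
theta = np.array([80, 0.1, 0.01, 3, -2])  # theta_0 .. theta_4
x = np.array([1, 1416, 3, 2, 40])         # x_0 = 1, then the four features

# h_theta(x) = theta^T x, the inner product of the two vectors.
h = theta @ x
print(round(float(h), 2))  # 147.63
```

Writing the hypothesis this way is not just notational convenience: a vectorized inner product lets libraries like NumPy compute the prediction in optimized native code rather than a Python loop.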
And just to give this another name, this is also called multivariate linear regression. And the term "multivariate", that's just maybe a fancy term for saying that we have multiple features, or multiple variables, with which to try to predict the value $y$.
<end>