Andrew Ng’s Machine Learning Notes: Linear Regression with One Variable

2. Linear Regression with One Variable
2.1 Model representation

Our first learning algorithm is the linear regression algorithm. In this video, you will see an overview of the algorithm, and more importantly, you will understand the complete process of supervised learning.

Let’s start with an example: predicting housing prices, using a data set of housing prices in Portland, Oregon. Here, I am going to plot the data set with the price each house sold for against its size. Suppose your friend’s house is 1,250 square feet and you need to tell them how much it might sell for. One thing you can do is fit a model to this data, perhaps a straight line, and from that model you might tell your friend that the house could sell for about 220,000 (USD) or so. This is an example of a supervised learning algorithm.
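The following is a minimal sketch of the idea above: fit a straight line to housing data and use it to predict the price of a 1,250-square-foot house. The numbers here are made-up illustrative values, not the actual Portland data set used in the course.

```python
import numpy as np

# Hypothetical training set (not the course data): x = size in square feet,
# y = price in thousands of dollars.
x = np.array([852, 1043, 1416, 1534, 1940, 2104])
y = np.array([178, 199, 245, 285, 340, 400])

# Fit a straight line y = theta1 * x + theta0 by least squares.
theta1, theta0 = np.polyfit(x, y, deg=1)

# Predict the selling price of the friend's 1,250 sq ft house.
size = 1250
predicted_price = theta1 * size + theta0
print(f"Predicted price for a {size} sq ft house: about ${predicted_price * 1000:,.0f}")
```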
(Figure: housing prices in Portland, Oregon, plotted against house size.)

This is called supervised learning because for each example in the data we are given the “right answer”: the data tells us the actual price each house sold for. More specifically, it is a regression problem. The term regression means that we predict a continuous-valued output from previous data, which in this case is the price. The other most common type of supervised learning problem is classification, where we want to predict a discrete output value. For example, if we are examining cancer tumors and want to determine whether a tumor is benign or malignant, that is a problem with a discrete 0/1 output. Furthermore, in supervised learning we have a data set, which is called the training set.

Throughout this course, I will use lowercase m to denote the number of training examples.
Take the previous housing transaction problem as an example: each training example pairs the size of a house with the price it sold for. We will use x to denote the input variable (the house size), y to denote the output or target variable (the price), (x, y) to denote a single training example, and (x^(i), y^(i)) to denote the i-th training example in the training set.
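To make the notation concrete, here is a minimal sketch assuming the standard single-variable hypothesis from the course, h_θ(x) = θ₀ + θ₁x. The training pairs and parameter values below are placeholders for illustration, not learned values.

```python
# Training set as a list of (x^(i), y^(i)) pairs: (house size, price).
training_set = [(2104, 460), (1416, 232), (1534, 315), (852, 178)]
m = len(training_set)  # m = number of training examples

def h(x, theta0=0.0, theta1=0.2):
    """Hypothesis h_theta(x) = theta0 + theta1 * x: predicted price for size x."""
    return theta0 + theta1 * x

# The i-th training example (1-indexed, as in the course notes).
i = 1
x_i, y_i = training_set[i - 1]
print(f"m = {m}, (x^({i}), y^({i})) = ({x_i}, {y_i}), h(x^({i})) = {h(x_i):.1f}")
```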

Origin: blog.csdn.net/zy_dreamer/article/details/132737538