Relation Between Variables

Introduction

Ordinary least-squares (OLS) regression, also called linear regression, is one of the most commonly used modelling techniques, helping us examine the relationship between variables. OLS regression assumes that there is a linear relationship between the dependent variable and the independent variable.

Basic Features

With a single independent variable, this relationship can be represented as y = β0 + β1x, where β0 is the intercept of the model and β1 is the parameter of the regression, or the coefficient. Our estimation using the least-squares approach finds the best-fitting straight line through the data points, with equation ŷ = β̂0 + β̂1x, as shown in Figure 1. The reason why we …
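To make this concrete, here is a minimal sketch of fitting the best-fitting line ŷ = β̂0 + β̂1x using the standard closed-form least-squares estimates for a single independent variable. The synthetic data and variable names are illustrative choices for the example, not values from the original text.

```python
import numpy as np

# Illustrative data: x is the independent variable, y the dependent variable.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)  # a true line plus noise

# Closed-form least-squares estimates for a single independent variable:
#   beta1_hat = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)**2)
#   beta0_hat = y_mean - beta1_hat * x_mean
x_mean, y_mean = x.mean(), y.mean()
beta1_hat = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
beta0_hat = y_mean - beta1_hat * x_mean

y_hat = beta0_hat + beta1_hat * x  # fitted values on the best-fitting line
print(f"intercept ≈ {beta0_hat:.3f}, slope ≈ {beta1_hat:.3f}")
```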
Thus, simply adding up all the residuals from the data is not a good approach, as the negative values and the positive ones will cancel each other out. As a result, we square all the residuals and then add them up to eliminate the influence of cancellation. The sum of all the squared residuals is known as the residual sum of squares (RSS), which is a convenient measure of model fit for OLS regression. A model that fits the data poorly will deviate significantly from the data and will consequently have a relatively large RSS, while a good-fitting model will not deviate markedly from the data, resulting in a relatively small RSS (a perfectly fitting model will have an RSS of zero, as there is no deviation between the observed and predicted values).

Finding Regression Parameters with Gradient Descent

Let us define the OLS cost function as:

J(w) = (1/2) Σ_{i=1}^{m} (ŷ(i) − y(i))²

Here, ŷ(i) is the ith predicted value, y(i) is the corresponding observed value, and m is the number of observations in the data set (note that the term 1/2 is just used for convenience in the later derivation). We can represent ŷ(i) as β0 + β1x(i). Hence, we will have the following cost function:

J(β0, β1) = (1/2) Σ_{i=1}^{m} (β0 + β1x(i) − y(i))²

Now our job is to find the values of β0 and β1 that minimize this cost function.
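As an illustration of minimizing J(β0, β1) by gradient descent, here is a minimal sketch in Python. It reuses the illustrative data from the previous sketch; the starting values, learning rate, and iteration count are assumptions made for the example, not taken from the original text. The partial derivatives follow directly from differentiating J with respect to β0 and β1.

```python
import numpy as np

# Recreate the illustrative data from the previous sketch.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)

def cost(beta0, beta1, x, y):
    """J(beta0, beta1) = (1/2) * sum((beta0 + beta1*x_i - y_i)^2), i.e. the RSS scaled by 1/2."""
    residuals = beta0 + beta1 * x - y
    return 0.5 * np.sum(residuals ** 2)

# Gradient descent on J. Differentiating J gives:
#   dJ/dbeta0 = sum(beta0 + beta1*x_i - y_i)
#   dJ/dbeta1 = sum((beta0 + beta1*x_i - y_i) * x_i)
beta0, beta1 = 0.0, 0.0   # arbitrary starting values
learning_rate = 5e-4      # kept small because J sums (rather than averages) over observations
for _ in range(10_000):
    residuals = beta0 + beta1 * x - y
    grad_beta0 = np.sum(residuals)
    grad_beta1 = np.sum(residuals * x)
    beta0 -= learning_rate * grad_beta0
    beta1 -= learning_rate * grad_beta1

print(f"beta0 ≈ {beta0:.3f}, beta1 ≈ {beta1:.3f}, final cost ≈ {cost(beta0, beta1, x, y):.3f}")
```

With enough iterations, the estimates converge to (essentially) the same intercept and slope produced by the closed-form least-squares formulas above, since J is a convex function of β0 and β1.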