# Linear Regression

## Cost Function

The cost function measures the accuracy of the hypothesis function. The accuracy is given as an average of the squared differences between the hypothesis's predictions for the inputs ($$x$$’s) and the actual outputs ($$y$$’s):

$$J(\Theta_{0},\Theta_{1})=\frac{1}{2m}\sum_{i=1}^{m}(h_{\Theta}(x^{(i)}) - y^{(i)})^{2}$$

where $$m$$ is the number of training examples. This function is also known as the squared error function or mean squared error. The $$\frac{1}{2}$$ is a convenience for the cancellation of the 2 that appears when the squared term is differentiated (see gradient descent). ...
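A minimal sketch of this cost function in Python (NumPy assumed; the names `compute_cost`, `theta0`, and `theta1` are illustrative, not from the original notes):

```python
import numpy as np

def compute_cost(theta0, theta1, x, y):
    """Mean squared error cost J(theta0, theta1) for univariate linear regression."""
    m = len(x)                          # number of training examples
    predictions = theta0 + theta1 * x   # h_theta(x) for every example
    return np.sum((predictions - y) ** 2) / (2 * m)

# Toy data lying exactly on y = 1 + 2x, so the cost at (1, 2) should be 0.
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(1.0, 2.0, x, y))  # 0.0 (perfect fit)
print(compute_cost(0.0, 0.0, x, y))  # positive cost for a poor fit
```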

## Hypothesis Function

A function which maps input values $$x$$ to output values $$y$$. Historically, in ML, the hypothesis function is denoted $$h_{\Theta}(x)$$; its prediction for the $$i$$-th training example is written $$h_{\Theta}(x^{(i)})$$. For univariate linear regression, the hypothesis has the form $$h_{\Theta}(x)=\Theta_{0}+\Theta_{1}x$$.
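A small sketch of the univariate hypothesis in Python (the function name `hypothesis` is illustrative):

```python
def hypothesis(theta0, theta1, x):
    """h_theta(x) = theta0 + theta1 * x for univariate linear regression."""
    return theta0 + theta1 * x

# With theta0 = 1 and theta1 = 2, the prediction for x = 3 is 1 + 2*3 = 7.
print(hypothesis(1.0, 2.0, 3.0))  # 7.0
```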