Cost Function
The measurement of the accuracy of a hypothesis function. The accuracy is given by (one half of) the average squared difference between the hypothesis's predictions for the inputs ($x$'s) and the actual outputs ($y$'s):

$$J(\Theta_0, \Theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\Theta(x_i) - y_i\right)^2$$
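A minimal sketch of this formula in Python, assuming a one-variable linear hypothesis $h_\Theta(x) = \Theta_0 + \Theta_1 x$ and toy data invented for illustration:

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Cost J(theta0, theta1): half the mean squared error of the hypothesis."""
    m = len(x)
    predictions = theta0 + theta1 * x        # h_theta(x_i) for every training example
    return np.sum((predictions - y) ** 2) / (2 * m)

# Hypothetical toy data: y is exactly 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])

print(cost(1.0, 2.0, x, y))   # 0.0  -- the hypothesis fits the data perfectly
print(cost(0.0, 0.0, x, y))   # 20.5 -- a poor fit gives a larger cost
```

A perfect fit drives the cost to zero; the worse the hypothesis, the larger $J$ becomes.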
Gradient Descent
An optimization algorithm for finding a local minimum of a differentiable function. (In the accompanying figure, the red arrows mark the minima of the cost function $J(\Theta_0, \Theta_1)$.) To find the minimum of the cost function, we take its derivative and "move along" the tangent line in the direction of steepest (negative) descent. The size of each "step" is determined by the coefficient $\alpha$, which is called the Learning Rate:

$$\Theta_j^{\text{new}} := \Theta_j^{\text{old}} - \alpha \, \frac{\partial}{\partial \Theta_j} J(\Theta_0, \Theta_1)$$
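A minimal sketch of this update rule for the same one-variable linear hypothesis; the data, learning rate, and step count below are illustrative assumptions, not taken from the text:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.05, steps=1000):
    """Fit theta0, theta1 by batch gradient descent on the squared-error cost."""
    m = len(x)
    theta0, theta1 = 0.0, 0.0
    for _ in range(steps):
        error = (theta0 + theta1 * x) - y      # h_theta(x_i) - y_i
        grad0 = error.sum() / m                # dJ/dtheta0
        grad1 = (error * x).sum() / m          # dJ/dtheta1
        theta0 -= alpha * grad0                # simultaneous update of both parameters
        theta1 -= alpha * grad1
    return theta0, theta1

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])
print(gradient_descent(x, y))   # approaches (1.0, 2.0), the true intercept and slope
```

Note that both parameters are updated simultaneously from the same gradients; if $\alpha$ is too large the steps can overshoot and diverge, and if it is too small convergence is slow.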
Hypothesis Function
A function that maps input values $x$ to an output value $y$. Historically, in ML, hypothesis functions are denoted $h(x^{(i)})$, where $x^{(i)}$ is the $i$-th training input.
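For example, a simple linear hypothesis in Python (the parameter values are arbitrary):

```python
def h(theta0, theta1, x):
    """Linear hypothesis h_Theta(x) = Theta_0 + Theta_1 * x."""
    return theta0 + theta1 * x

print(h(1.0, 2.0, 3.0))   # 7.0: maps the input x = 3 to a predicted output y
```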
Artificial Neural Network (ANN)
Layers: All learning occurs in the layers. In the image below there are three layers, but there could be only one, or many more. In the example image, the first layer is known as the Input Layer, the second the Hidden Layer, and the third the Output Layer. In an ANN with three or more layers, any layer that is not the input or output layer is a Hidden Layer. ...
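A minimal sketch of a forward pass through such a three-layer network in Python, assuming sigmoid activations and randomly initialized weights; the layer sizes are illustrative, not taken from the image:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Forward pass: input layer -> hidden layer -> output layer."""
    hidden = sigmoid(W1 @ x + b1)        # hidden-layer activations
    output = sigmoid(W2 @ hidden + b2)   # output-layer activation
    return output

rng = np.random.default_rng(0)
x  = np.array([0.5, -1.0, 2.0])                   # 3 input units
W1 = rng.normal(size=(4, 3)); b1 = np.zeros(4)    # 4 hidden units
W2 = rng.normal(size=(1, 4)); b2 = np.zeros(1)    # 1 output unit
print(forward(x, W1, b1, W2, b2))
```

Training such a network means adjusting the weights `W1`, `b1`, `W2`, `b2` (typically by gradient descent on a cost function, as described above); the layers are where those learned parameters live.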