[MS0540] Mathematics for AI

Gradient Descent

$$Error = Y_{true} - Y_{predict}$$

Loss Function

$$COST = \sum_{i=1}^{n} (Y_{true_i} - Y_{predict_i})^2$$
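As a minimal sketch of this cost, here is the sum of squared errors computed with NumPy; the array values and the names `y_true` and `y_pred` are just illustrative choices.

```python
import numpy as np

def sse_cost(y_true, y_pred):
    """Sum of squared errors: COST = sum_i (y_true_i - y_pred_i)^2."""
    error = y_true - y_pred       # elementwise Error = Y_true - Y_predict
    return np.sum(error ** 2)     # square each error and sum over all samples

# Example values (assumed): predictions slightly off from the true values
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
print(sse_cost(y_true, y_pred))   # 0.06
```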

We want to minimize this cost function. We can re-model the problem and substitute the equation above with a simpler form below.

Essentially,

$$f(X) = X^2, \quad y = X^2$$

Minimizing the function above means finding the X value that produces the lowest y-value, i.e. the minimum point of the graph $y = X^2$.
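A minimal gradient descent sketch for minimizing $y = X^2$: starting from an arbitrary X, we repeatedly step opposite the gradient $dy/dX = 2X$ until X approaches the minimum at 0. The starting point, learning rate, and step count below are assumed values for illustration.

```python
def gradient_descent_x_squared(x0=5.0, learning_rate=0.1, steps=50):
    """Minimize y = x^2 by stepping opposite the gradient dy/dx = 2x."""
    x = x0
    for _ in range(steps):
        grad = 2 * x                   # derivative of x^2 at the current x
        x = x - learning_rate * grad   # move downhill by a small step
    return x

print(gradient_descent_x_squared())    # close to 0, the minimum of y = x^2
```

The same update rule, applied to the cost function above with respect to the model's parameters, is what drives gradient descent in training.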
