
What's the difference between objective function, cost function, loss function?


Asked by Shepherd Arroyo on Dec 04, 2021



In machine learning, people talk about the objective function, the cost function, and the loss function. Are these just different names for the same thing? When should each be used? And if they do not always refer to the same thing, what are the differences?
The terms cost function and loss function are synonymous (some people also call it the error function). The more general scenario is to define an objective function first, which we want to optimize. This objective function could be one we want to either maximize or minimize.
As such, the objective function is often referred to as a cost function or a loss function, and the value calculated by the loss function is referred to simply as the "loss." The function we want to minimize or maximize is called the objective function or criterion.
The objective function is the function you want to maximize or minimize. When people call it a "cost function" (again, it is the objective function), it is because they want only to minimize it. The cost function and the objective function can be seen as the same thing viewed from slightly different perspectives.
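To make the minimize/maximize relationship concrete, here is a minimal sketch (the Bernoulli log-likelihood model and function names are illustrative assumptions, not from the original answer): an objective we want to maximize, such as a log-likelihood, becomes a cost simply by negating it, so a generic minimizer can optimize it.

```python
import numpy as np

def log_likelihood(y_true, y_prob):
    # Objective: Bernoulli log-likelihood, higher is better.
    eps = 1e-12  # guard against log(0)
    return np.sum(y_true * np.log(y_prob + eps)
                  + (1 - y_true) * np.log(1 - y_prob + eps))

def cost(y_true, y_prob):
    # Cost: the negated objective, so lower is better.
    return -log_likelihood(y_true, y_prob)

y_true = np.array([1, 0, 1, 1])
good = np.array([0.9, 0.1, 0.8, 0.7])  # well-calibrated predictions
bad = np.array([0.3, 0.9, 0.2, 0.4])   # poorly-calibrated predictions

# The better predictions score higher on the objective
# and, equivalently, lower on the cost.
assert log_likelihood(y_true, good) > log_likelihood(y_true, bad)
assert cost(y_true, good) < cost(y_true, bad)
```

This is why "cost function" usually implies minimization: maximizing any objective is the same problem as minimizing its negation.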
When dealing with modern neural networks, almost any error function could end up being called a cost, loss, or objective function, and the criterion, all at the same time. Therefore, it is important to distinguish how each term is used in practice.
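One common way the usages are distinguished in practice can be sketched as follows (the function names and the L2 penalty are illustrative assumptions, not a standard API): the loss is the error on a single example, the cost aggregates the loss over the training set, and the objective is the full quantity actually optimized, e.g. cost plus a regularization term.

```python
import numpy as np

def squared_loss(y_true, y_pred):
    # Loss: error for a single example (elementwise here).
    return (y_true - y_pred) ** 2

def cost(y_true, y_pred):
    # Cost: mean loss over the whole dataset.
    return np.mean(squared_loss(y_true, y_pred))

def objective(y_true, y_pred, weights, lam=0.1):
    # Objective: cost plus an L2 penalty on the model weights.
    return cost(y_true, y_pred) + lam * np.sum(weights ** 2)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
w = np.array([0.5, -0.5])

print(cost(y_true, y_pred))          # mean squared error over the dataset
print(objective(y_true, y_pred, w))  # MSE plus the regularization term
```

Under this convention, "criterion" (common in PyTorch code) usually refers to the same callable as the loss/cost; the objective is whatever the optimizer ultimately descends.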