A loss function, also known as a cost function or objective function, is a mathematical function that quantifies the discrepancy between a machine learning model's predicted outputs and the true (ground truth) values. Its purpose is to provide a measurable value indicating how well the model performs with its current parameters. During training, the goal is typically to minimize this loss, usually with optimization algorithms such as gradient descent. Different tasks and models call for different loss functions; for example, mean squared error (MSE) is common for regression, while cross-entropy loss is standard for classification. Choosing an appropriate loss function is critical, as it directly shapes how the model learns and which kinds of errors it prioritizes reducing.
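To make this concrete, the sketch below computes both losses by hand with NumPy. The arrays are illustrative placeholders, not real model outputs, and the implementation is a minimal reference rather than production code.

```python
import numpy as np

# Illustrative regression example: y_true are ground-truth targets,
# y_pred are the model's predictions.
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Mean squared error: average of squared prediction errors.
mse = np.mean((y_true - y_pred) ** 2)

# Illustrative classification example: compare predicted class probabilities
# against one-hot ground-truth labels.
labels = np.array([[1, 0, 0], [0, 1, 0]])              # one-hot targets
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])   # predicted probabilities
eps = 1e-12                                             # avoid log(0)
cross_entropy = -np.mean(np.sum(labels * np.log(probs + eps), axis=1))

print(f"MSE: {mse:.3f}")                    # 0.375
print(f"Cross-entropy: {cross_entropy:.3f}")  # ~0.290
```

In practice, frameworks such as PyTorch or TensorFlow provide these losses as built-in, differentiable functions, so gradient descent can propagate the error signal back through the model's parameters.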