losses module¶
- class losses.CategoricalCrossEntropyLoss(loss_smoother)¶
Bases:
losses.Loss
Categorical cross-entropy loss. Usually preceded by a linear activation. For multi-class classification. Inherits everything from class Loss.
- cache¶
Run-time cache of attributes such as gradients.
- Type
dict
- loss_smoother¶
Loss smoother function.
- Type
LossSmoother
- name¶
The name of the loss.
- Type
str
- repr_str¶
The string representation of the loss.
- Type
str
- __init__()¶
Constructor.
- compute_loss(scores, y, layers_reg_loss)¶
Computes the loss of the classifier, including the regularization losses from previous layers.
- grad()¶
Computes the gradient of the loss function.
- compute_loss(scores, y)¶
Computes the loss of the classifier, including the regularization losses from previous layers.
- Parameters
scores (numpy.ndarray) – Scores. Usually from softmax activation. Shape is (batch size, number of classes)
y (numpy.ndarray) – True labels. Shape is (batch size, )
- Returns
loss – The overall loss of the classifier.
- Return type
float
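The module source is not shown here, so the following is a minimal NumPy sketch of what compute_loss plausibly computes for this class; categorical_cross_entropy is a hypothetical stand-in, with regularization and loss smoothing omitted:

```python
import numpy as np

def categorical_cross_entropy(scores, y):
    """Mean negative log-likelihood of the true classes.

    scores: (batch size, number of classes) class probabilities,
            e.g. from a softmax activation.
    y: (batch size,) integer true labels.
    """
    eps = 1e-12  # guard against log(0)
    true_class_probs = scores[np.arange(scores.shape[0]), y]
    return float(-np.mean(np.log(true_class_probs + eps)))

scores = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
y = np.array([0, 1])
loss = categorical_cross_entropy(scores, y)  # -(ln 0.7 + ln 0.8) / 2 ≈ 0.29
```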
- class losses.CategoricalHingeLoss(loss_smoother)¶
Bases:
losses.Loss
Categorical Hinge loss for realizing an SVM classifier. Usually preceded by a linear activation. For multi-class classification. Inherits everything from class Loss.
- cache¶
Run-time cache of attributes such as gradients.
- Type
dict
- loss_smoother¶
Loss smoother function.
- Type
LossSmoother
- name¶
The name of the loss.
- Type
str
- repr_str¶
The string representation of the loss.
- Type
str
- __init__()¶
Constructor.
- compute_loss(scores, y, layers_reg_loss)¶
Computes the loss of the classifier, including the regularization losses from previous layers.
- grad()¶
Computes the gradient of the loss function.
- compute_loss(scores, y)¶
Computes the loss of the classifier, including the regularization losses from previous layers.
- Parameters
scores (numpy.ndarray) – Scores. Usually from linear activation. Shape is (number of data points, number of classes)
y (numpy.ndarray) – True labels. Shape is (number of data points, )
- Returns
loss – The overall loss of the classifier.
- Return type
float
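As a rough illustration of the multi-class hinge loss this class presumably computes (categorical_hinge is a hypothetical stand-in, with regularization omitted and a margin of 1 assumed):

```python
import numpy as np

def categorical_hinge(scores, y, delta=1.0):
    """Multi-class SVM (hinge) loss averaged over the data points.

    scores: (number of data points, number of classes) raw scores,
            e.g. from a linear activation.
    y: (number of data points,) integer true labels.
    """
    n = scores.shape[0]
    correct = scores[np.arange(n), y][:, None]           # (n, 1) true-class scores
    margins = np.maximum(0.0, scores - correct + delta)  # per-class hinge margins
    margins[np.arange(n), y] = 0.0                       # true class contributes nothing
    return float(np.sum(margins) / n)

scores = np.array([[3.0, 1.0, 0.5],   # true class wins by more than delta: no loss
                   [1.0, 3.0, 0.0]])  # class 1 beats true class 0 by 2
y = np.array([0, 0])
loss = categorical_hinge(scores, y)   # (0 + 3) / 2 = 1.5
```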
- class losses.Loss(name, loss_smoother)¶
Bases:
object
Loss parent class.
- cache¶
Run-time cache of attributes such as gradients.
- Type
dict
- __init__()¶
Constructor.
- class losses.LossSmoother(repr_str)¶
Bases:
object
Loss smoother parent class. Smooths out abruptly varying losses.
- first_call¶
If the loss smoother is called for the first time.
- Type
bool
- repr_str¶
The string representation of the loss smoother.
- Type
str
- __init__()¶
Constructor.
- __call__(loss)¶
Computes the smoothed loss.
- class losses.LossSmootherConstant¶
Bases:
losses.LossSmoother
Constant loss smoother. Does not smooth the loss.
- first_call¶
If the loss smoother is called for the first time.
- Type
bool
- repr_str¶
The string representation of the loss smoother.
- Type
str
- cache¶
Run-time cache of variables needed for computing the smoothed loss.
- Type
dict
- __init__()¶
Constructor.
- __call__(loss)¶
Computes the smoothed loss.
- class losses.LossSmootherMovingAverage(alpha)¶
Bases:
losses.LossSmoother
Moving average loss smoother.
- first_call¶
If the loss smoother is called for the first time.
- Type
bool
- repr_str¶
The string representation of the loss smoother.
- Type
str
- cache¶
Run-time cache of variables needed for computing the smoothed loss.
- Type
dict
- __init__()¶
Constructor.
- __call__(loss)¶
Computes the smoothed loss.
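The exact update rule is not documented here; a common formulation for a moving-average smoother is an exponential moving average, sketched below as a hypothetical stand-in for this class:

```python
class MovingAverageSmoother:
    """Exponential moving average of the loss (one plausible formulation):
    smooth_t = alpha * smooth_{t-1} + (1 - alpha) * loss_t,
    with the first raw loss used as the initial value."""

    def __init__(self, alpha):
        self.alpha = alpha
        self.first_call = True
        self.cache = {}

    def __call__(self, loss):
        if self.first_call:
            # First call: no history yet, return the raw loss.
            self.cache["smooth_loss"] = loss
            self.first_call = False
        else:
            self.cache["smooth_loss"] = (self.alpha * self.cache["smooth_loss"]
                                         + (1 - self.alpha) * loss)
        return self.cache["smooth_loss"]

smoother = MovingAverageSmoother(alpha=0.9)
first = smoother(1.0)   # first call: returned as-is
second = smoother(0.0)  # 0.9 * 1.0 + 0.1 * 0.0 = 0.9
```

A higher alpha weights the history more heavily, so abrupt spikes in the raw loss are damped more strongly.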
- class losses.MeanSquaredErrorLoss(loss_smoother)¶
Bases:
losses.Loss
MSE loss.
- cache¶
Run-time cache of attributes such as gradients.
- Type
dict
- loss_smoother¶
Loss smoother function.
- Type
LossSmoother
- name¶
The name of the loss.
- Type
str
- repr_str¶
The string representation of the loss.
- Type
str
- __init__()¶
Constructor.
- compute_loss(scores, y, layers_reg_loss)¶
Computes the loss of the classifier.
- grad()¶
Computes the gradient of the loss function.
- compute_loss(scores, y)¶
Computes the loss of the classifier, including the regularization losses from previous layers.
- Parameters
scores (numpy.ndarray) – Scores. Usually from softmax activation. Shape is (batch size, )
y (numpy.ndarray) – True labels. Shape is (batch size, )
- Returns
loss – The overall loss of the classifier.
- Return type
float
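As a rough sketch of the underlying computation (mean_squared_error is a hypothetical stand-in; the actual implementation may differ, e.g. by a factor of 1/2, and regularization is omitted):

```python
import numpy as np

def mean_squared_error(scores, y):
    """Mean squared error over the batch.

    scores: (batch size,) predicted values.
    y: (batch size,) true values.
    """
    return float(np.mean((scores - y) ** 2))

scores = np.array([1.0, 2.0])
y = np.array([0.0, 2.0])
loss = mean_squared_error(scores, y)  # ((1-0)^2 + (2-2)^2) / 2 = 0.5
```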