lr_schedules module¶
- class lr_schedules.LRConstantSchedule(lr_initial)¶
Bases:
lr_schedules.LRSchedule
Constant learning rate schedule.
- lr_initial¶
Initial, or base, learning rate.
- Type
float
- lr¶
The latest learning rate.
- Type
float
- step¶
Update step counter used for applying the learning rate schedule.
- Type
int
- __init__()¶
Constructor.
- apply_schedule()¶
Applies the constant learning rate schedule.
- get_lr()¶
Returns the latest learning rate.
- apply_schedule()¶
Applies the constant learning rate schedule.
- Parameters
None –
- Returns
None
- Return type
None
- get_lr()¶
Returns the latest learning rate.
- Parameters
None –
- Returns
The latest learning rate.
- Return type
float
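The constant schedule above can be sketched as a small standalone class. This is a hypothetical reconstruction based only on the documented attributes (`lr_initial`, `lr`, `step`) and methods, not the module's actual source:

```python
class LRConstantSchedule:
    """Sketch of a constant learning rate schedule (hypothetical implementation)."""

    def __init__(self, lr_initial):
        self.lr_initial = lr_initial  # initial, or base, learning rate
        self.lr = lr_initial          # the latest learning rate
        self.step = 0                 # update step counter

    def apply_schedule(self):
        # Constant schedule: the learning rate never changes,
        # only the step counter advances.
        self.step += 1

    def get_lr(self):
        return self.lr
```

For example, after any number of update steps `get_lr()` still returns `lr_initial`.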
- class lr_schedules.LRCyclingSchedule(lr_initial, lr_max, step_size)¶
Bases:
lr_schedules.LRSchedule
Cyclical learning rate schedule.
- lr_initial¶
Initial, or base, learning rate.
- Type
float
- lr¶
The latest learning rate.
- Type
float
- step¶
Update step counter used for applying the learning rate schedule.
- Type
int
- lr_max¶
The maximum learning rate.
- Type
float
- step_size¶
The step size in number of update steps. A full cycle is 2 * step_size update steps.
- Type
int
- __init__()¶
Constructor.
- apply_schedule()¶
Applies the cyclical learning rate schedule.
- get_lr()¶
Returns the latest learning rate.
Notes
Based on: Cyclical Learning Rates for Training Neural Networks. Available at: https://arxiv.org/abs/1506.01186
The schedule starts at lr_initial, goes to lr_max in step_size update steps, and then back to lr_initial in step_size update steps. A full cycle is 2*step_size update steps.
- apply_schedule()¶
Applies the cycling learning rate schedule.
- Parameters
None –
- Returns
None
- Return type
None
Notes
Based on: https://www.datacamp.com/community/tutorials/cyclical-learning-neural-nets
- get_lr()¶
Returns the latest learning rate.
- Parameters
None –
- Returns
The latest learning rate.
- Return type
float
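The triangular cycle the Notes describe (lr_initial up to lr_max over step_size steps, then back down) can be written as a standalone function. This is a sketch of the standard triangular policy from the cited paper; the function name and signature are illustrative, not taken from the module:

```python
import math

def cyclical_lr(step, lr_initial, lr_max, step_size):
    """Triangular cyclical learning rate for a given update step.

    Hypothetical standalone version of what LRCyclingSchedule's
    apply_schedule may compute internally.
    """
    # Which cycle (of length 2 * step_size) the step falls in.
    cycle = math.floor(1 + step / (2 * step_size))
    # Distance from the cycle's peak, scaled to [0, 1].
    x = abs(step / step_size - 2 * cycle + 1)
    # Linear interpolation: lr_initial at the ends, lr_max at the peak.
    return lr_initial + (lr_max - lr_initial) * max(0.0, 1 - x)
```

For example, with `step_size=100` the rate rises from `lr_initial` at step 0 to `lr_max` at step 100, and returns to `lr_initial` at step 200.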
- class lr_schedules.LRExponentialDecaySchedule(lr_initial, decay_steps, decay_rate)¶
Bases:
lr_schedules.LRSchedule
Exponential decay learning rate schedule.
- lr_initial¶
Initial, or base, learning rate.
- Type
float
- lr¶
The latest learning rate.
- Type
float
- step¶
Update step counter used for applying the learning rate schedule.
- Type
int
- decay_steps¶
The number of decay steps. The smaller, the faster the decay.
- Type
int
- decay_rate¶
The rate of decay. The smaller the rate, the faster the decay.
- Type
float
- __init__()¶
Constructor.
- apply_schedule()¶
Applies the exponential decay learning rate schedule.
- get_lr()¶
Returns the latest learning rate.
- apply_schedule()¶
Applies the exponential decay learning rate schedule.
- Parameters
None –
- Returns
None
- Return type
None
Notes
Based on: https://keras.io/api/optimizers/learning_rate_schedules/exponential_decay/
- get_lr()¶
Returns the latest learning rate.
- Parameters
None –
- Returns
The latest learning rate.
- Return type
float
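Assuming the schedule follows the Keras ExponentialDecay formula referenced in the Notes, the decay can be sketched as a standalone function (names are illustrative, not from the module):

```python
def exponential_decay_lr(step, lr_initial, decay_steps, decay_rate):
    """Exponentially decayed learning rate at a given update step.

    The rate is multiplied by decay_rate once every decay_steps updates,
    so smaller decay_steps or smaller decay_rate both mean faster decay.
    """
    return lr_initial * decay_rate ** (step / decay_steps)
```

For example, with `lr_initial=0.1`, `decay_steps=1000`, and `decay_rate=0.5`, the rate halves every 1000 update steps.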
- class lr_schedules.LRSchedule(lr_initial, repr_str)¶
Bases:
object
Learning rate schedule parent class.
- lr_initial¶
Initial, or base, learning rate.
- Type
float
- lr¶
The latest learning rate.
- Type
float
- step¶
Update step counter used for applying the learning rate schedule.
- Type
int
- __init__()¶
Constructor.
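The parent class holds the state shared by all of the schedules above; subclasses override apply_schedule. A minimal sketch of this pattern, reconstructed from the documented attributes and constructor signature (the repr_str handling is an assumption):

```python
class LRSchedule:
    """Sketch of the learning rate schedule parent class."""

    def __init__(self, lr_initial, repr_str):
        self.lr_initial = lr_initial  # initial, or base, learning rate
        self.lr = lr_initial          # the latest learning rate
        self.step = 0                 # update step counter
        self.repr_str = repr_str      # assumed: string representation of the schedule

    def __repr__(self):
        return self.repr_str

    def apply_schedule(self):
        # Each concrete schedule defines how lr evolves with step.
        raise NotImplementedError

    def get_lr(self):
        return self.lr
```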