lingvo.tasks.car.lr_util module

Learning rate schedule utility functions.

lingvo.tasks.car.lr_util._GetTrainingStatistics(train_input_p)[source]

Get training statistics, including total batch size and steps per epoch.
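The statistics themselves are simple to derive. The sketch below is a hypothetical stand-in (the real helper reads lingvo input parameters, which are assumed away here) showing how a total batch size and steps per epoch could be computed:

```python
# Hypothetical sketch: the real _GetTrainingStatistics reads these values
# from lingvo's train_input_p; the argument names here are illustrative.
def get_training_statistics(num_samples, per_split_batch_size, num_splits):
    """Return (total batch size, steps per epoch) for a dataset."""
    # Effective batch per training step across all splits/replicas.
    total_batch_size = per_split_batch_size * num_splits
    # Number of optimizer steps needed to see the dataset once.
    steps_per_epoch = num_samples // total_batch_size
    return total_batch_size, steps_per_epoch
```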

lingvo.tasks.car.lr_util.SetExponentialLR(train_p, train_input_p, exp_start_epoch, total_epoch, warmup_epoch=0, limit_epoch=None, multiplier_min=0.01, warmup_init=0.0)[source]

Sets a linear rampup and exponential decay LR schedule on train_p.

This is a wrapper around LinearRampupExponentialDecayScaledByNumSplitSchedule that sets the steps using epochs and the training statistics.

Parameters
  • train_p – The training parameters to modify.

  • train_input_p – The training set input parameters.

  • exp_start_epoch – The start epoch of exponential annealing.

  • total_epoch – Total number of epochs to train.

  • warmup_epoch – Epoch at which the warmup ramp ends. Note that the learning rate stays fixed between the end of the warmup phase and the beginning of the exponential annealing phase.

  • limit_epoch – Epoch to end exponential annealing. If None, this will be set to 0.95 * total_epoch, that is, the last 5% of training time will be at the minimum learning rate.

  • multiplier_min – The multiplier minimum at the end of exponential decay.

  • warmup_init – Initial value for the warmup phase. Note that warmup can be disabled by either setting warmup_init to 1 or setting warmup_epoch to 0.
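The schedule this wrapper configures can be pictured as a learning-rate multiplier over training steps. The following toy function is a sketch, not the lingvo implementation: it assumes epochs are converted to steps via steps per epoch, and models the three documented phases (linear warmup, a flat plateau, then exponential decay down to multiplier_min).

```python
import math

def exponential_lr_multiplier(step, steps_per_epoch, exp_start_epoch,
                              total_epoch, warmup_epoch=0, limit_epoch=None,
                              multiplier_min=0.01, warmup_init=0.0):
    """Toy LR multiplier: linear warmup, flat plateau, exponential decay."""
    if limit_epoch is None:
        # Match the documented default: anneal ends at 95% of training.
        limit_epoch = 0.95 * total_epoch
    warmup_steps = warmup_epoch * steps_per_epoch
    decay_start = exp_start_epoch * steps_per_epoch
    decay_limit = limit_epoch * steps_per_epoch
    if warmup_steps > 0 and step < warmup_steps:
        # Linear ramp from warmup_init up to 1.0.
        frac = step / warmup_steps
        return warmup_init + frac * (1.0 - warmup_init)
    if step < decay_start:
        # Fixed between end of warmup and start of annealing.
        return 1.0
    if step >= decay_limit:
        # Last stretch of training runs at the minimum multiplier.
        return multiplier_min
    # Exponential decay: multiplier_min ** fraction_of_decay_completed.
    frac = (step - decay_start) / (decay_limit - decay_start)
    return math.exp(frac * math.log(multiplier_min))
```

For example, with steps_per_epoch=100, warmup_epoch=1, exp_start_epoch=2, and total_epoch=10, the multiplier ramps over steps 0-99, holds at 1.0 until step 200, then decays to 0.01 by step 950.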