
How Can I Print The Learning Rate At Each Epoch With Adam Optimizer In Keras?

Because online learning does not work well with Keras when you are using an adaptive optimizer (the learning rate schedule resets when calling .fit()), I want to see if I can simply track the learning rate myself and print it at the end of each epoch.

Solution 1:

I am using the following approach, which is based on @jorijnsmit's answer:

import tensorflow as tf
from tensorflow import keras

def get_lr_metric(optimizer):
    def lr(y_true, y_pred):
        # Use ._decayed_lr() instead of .lr so the decayed value is reported.
        return optimizer._decayed_lr(tf.float32)
    return lr

optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)

model.compile(
    optimizer=optimizer,
    metrics=['accuracy', lr_metric],
    loss='mean_absolute_error', 
    )

It works with Adam.
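
As a quick check, here is a hedged sketch of what this looks like during training; the toy model and random data below are made up purely to show the extra metric appearing in the epoch logs (it reuses get_lr_metric from above):

import numpy as np
from tensorflow import keras

# Toy model and random data, only to demonstrate the metric in the logs.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)
model.compile(optimizer=optimizer,
              metrics=['accuracy', lr_metric],
              loss='mean_absolute_error')

x = np.random.rand(32, 4)
y = np.random.rand(32, 1)
history = model.fit(x, y, epochs=3, verbose=2)
# Each epoch line now carries an 'lr' entry; it is also in history.history['lr'].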

Solution 2:

I found this question very helpful. A minimal workable example that answers your question would be:

from tensorflow import keras

def get_lr_metric(optimizer):
    # Expose the optimizer's learning rate variable as a Keras metric.
    def lr(y_true, y_pred):
        return optimizer.lr
    return lr

optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)

model.compile(
    optimizer=optimizer,
    metrics=['accuracy', lr_metric],
    loss='mean_absolute_error', 
    )

Solution 3:

For everyone who is still confused on this topic:

The solution from @Andrey works, but only if you set a decay on your learning rate: you have to schedule the learning rate to lower itself after n epochs, otherwise it will always print the same number (the starting learning rate). That number does not change during training, so you cannot see how the learning rate adapts. Every parameter in Adam effectively gets its own step size that adapts during training, but the variable lr itself never changes. A short sketch of the difference follows.
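
A small sketch of that difference, assuming TF 2.x where Adam accepts a LearningRateSchedule and still exposes the private _decayed_lr() helper used in Solution 1 (the schedule values below are arbitrary):

import tensorflow as tf
from tensorflow import keras

# With a schedule attached, the decayed value actually drops as iterations grow;
# without one, both .lr and _decayed_lr() just keep returning the starting rate.
schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=100, decay_rate=0.9)
optimizer = keras.optimizers.Adam(learning_rate=schedule)
print(float(optimizer._decayed_lr(tf.float32)))  # ~0.001 before any training step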

Solution 4:

from keras.callbacks import Callback
from keras import backend as K

class MyCallback(Callback):
    def on_epoch_end(self, epoch, logs=None):
        lr = self.model.optimizer.lr
        # If you want to apply decay:
        decay = self.model.optimizer.decay
        iterations = self.model.optimizer.iterations
        lr_with_decay = lr / (1. + decay * K.cast(iterations, K.dtype(decay)))
        print(K.eval(lr_with_decay))

Follow this thread.
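
For reference, a minimal sketch of plugging such a callback into training; model, x_train and y_train are placeholders for whatever you are already fitting:

# x_train / y_train stand in for your existing training data.
model.fit(x_train, y_train, epochs=10, callbacks=[MyCallback()])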

Solution 5:

This piece of code might help you. It is based on the Keras implementation of the Adam optimizer (the beta values are the Keras defaults).

from keras.callbacks import Callback
from keras import backend as K

class AdamLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs=None):
        beta_1, beta_2 = 0.9, 0.999  # Keras defaults for Adam
        optimizer = self.model.optimizer
        lr = K.eval(optimizer.lr)
        if K.eval(optimizer.decay) > 0:
            # Apply the time-based decay first, if one was set on the optimizer.
            lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay *
                                              K.cast(optimizer.iterations, K.floatx()))))
        t = K.cast(optimizer.iterations, K.floatx()) + 1
        # Bias-corrected step size from the Adam paper: lr * sqrt(1 - beta_2^t) / (1 - beta_1^t)
        lr_t = K.eval(lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t))))
        print('\nLR: {:.6f}\n'.format(lr_t))
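
To see what that last line computes, here is the bias-corrected step size from the Adam paper worked out in plain Python for a few iteration counts (assuming the default learning rate of 0.001 and no decay):

import math

lr, beta_1, beta_2 = 0.001, 0.9, 0.999
for t in (1, 10, 100, 1000):
    lr_t = lr * math.sqrt(1. - beta_2 ** t) / (1. - beta_1 ** t)
    print(t, round(lr_t, 6))
# The correction factor is well below 1 early on and approaches 1 as t grows,
# so the printed "LR" climbs towards the base learning rate.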
