class documentation

Reduce Learning Rate on Plateau callback

This callback monitors a given metric and reduces the learning rate when the metric has not improved for a set number of epochs.

Method __init__ Initialize the callback
Method run The callback method that will be called after each epoch
Instance Variable _factor Fraction to which the learning rate is lowered when the callback triggers
Instance Variable _metric_old Value of the monitored metric from the previous epoch
Instance Variable _min_val Minimum allowed learning rate
Instance Variable _monitor Name of the metric being monitored
Instance Variable _patience Epochs without improvement to tolerate before reducing the learning rate
Instance Variable _patience_ctr Count of consecutive epochs without improvement so far
Instance Variable _verbose Whether to log when the learning rate is reduced
def __init__(self, monitor: str = 'loss', patience: int = 10, factor: float = 0.9, min_val: float = 1e-10, verbose: bool = False, **kwargs):

Initialize the callback

Args
monitor:
The metric to monitor; one of loss, acc, val_loss, or val_acc.
patience:
Number of epochs without improvement to wait before reducing the learning rate.
factor:
Fraction to which the learning rate is lowered (new_rate = old_rate * factor). Note that this is not how much to reduce by, but how much to reduce to.
min_val:
Minimum allowed learning rate; the rate is never reduced below this value.
verbose:
If True, log a message when the callback reduces the learning rate.
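
For illustration, here is a minimal usage sketch. The class name ReduceLROnPlateau and the model.fit(..., callbacks=[...]) hook are assumptions based on common callback APIs; this page does not confirm either name.

    # Hypothetical usage: ReduceLROnPlateau and model.fit(callbacks=...)
    # are assumed names; adapt them to this library's actual interface.
    callback = ReduceLROnPlateau(
        monitor='val_loss',  # watch validation loss
        patience=5,          # tolerate 5 epochs without improvement
        factor=0.5,          # then halve the learning rate
        min_val=1e-6,        # never reduce below this rate
        verbose=True,        # log each reduction
    )
    model.fit(x_train, y_train, epochs=100, callbacks=[callback])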
def run(self, model: BaseModel):

The callback method that will be called after each epoch

Args
model: The model on which the callback will be called
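
The plateau logic described above can be sketched as follows. The model.history and model.learning_rate attributes are assumptions about BaseModel's interface, not documented here; the improvement direction is inferred from the metric name.

    # Sketch of the per-epoch plateau check. model.history and
    # model.learning_rate are assumed attributes of BaseModel.
    def run(self, model):
        metric = model.history[self._monitor][-1]  # latest monitored value
        # Lower is better for loss metrics; higher is better for accuracy.
        if 'loss' in self._monitor:
            improved = metric < self._metric_old
        else:
            improved = metric > self._metric_old
        if improved:
            self._patience_ctr = 0   # reset the wait counter
        else:
            self._patience_ctr += 1  # another epoch without improvement
        if self._patience_ctr >= self._patience:
            # Lower the rate to factor of its value, clamped at min_val.
            new_lr = max(model.learning_rate * self._factor, self._min_val)
            if self._verbose:
                print(f'Reducing learning rate to {new_lr}')
            model.learning_rate = new_lr
            self._patience_ctr = 0
        self._metric_old = metric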

_factor =

Fraction to which the learning rate is lowered when the callback triggers; set from the factor argument.

_metric_old =

Value of the monitored metric from the previous epoch, used to detect improvement.

_min_val =

Minimum allowed learning rate; set from the min_val argument.

_monitor =

Name of the metric being monitored; set from the monitor argument.

_patience =

Number of epochs without improvement to tolerate before reducing the learning rate; set from the patience argument.

_patience_ctr: int =

Count of consecutive epochs without improvement so far.

_verbose =

Whether to log when the learning rate is reduced; set from the verbose argument.