class documentation
class ReduceLROnPlateau(BaseCallback): (source)
Constructor: ReduceLROnPlateau(monitor, patience, factor, min_val, ...)
Reduce Learning Rate on Plateau callback
This callback monitors a given metric and reduces the learning rate if the metric doesn't improve for the given number of epochs.
Method | __init__ |
Initialize |
Method | run |
The callback method that will be called after each epoch |
Instance Variable | _factor |
Undocumented |
Instance Variable | _metric |
Undocumented |
Instance Variable | _min |
Undocumented |
Instance Variable | _monitor |
Undocumented |
Instance Variable | _patience |
Undocumented |
Instance Variable | _verbose |
Undocumented |
def __init__(self, monitor: str = 'loss', patience: int = 10, factor: float = 0.9, min_val: float = 1e-10, verbose: bool = False, **kwargs): (source)
Initialize
- Args
- monitor:
- The metric to monitor. One of: loss, acc, val_loss, val_acc
- patience:
- How many epochs to wait without improvement before reducing the learning rate
- factor:
- Fraction to which the learning rate will be lowered. Note: the learning rate is multiplied by this factor, not reduced by it.
- min_val:
- Minimum learning rate the callback will reduce to
- verbose:
- Whether to log the callback's actions
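
The behavior described above can be sketched as follows. This is a minimal, hypothetical illustration, not the library's actual implementation: the `run` signature, the `metrics` dict, and returning the new learning rate directly are all assumptions, and for simplicity it treats lower metric values as better (appropriate for loss, not accuracy).

```python
class ReduceLROnPlateau:
    """Sketch of a reduce-LR-on-plateau callback (names and API assumed)."""

    def __init__(self, monitor: str = 'loss', patience: int = 10,
                 factor: float = 0.9, min_val: float = 1e-10,
                 verbose: bool = False):
        self._monitor = monitor
        self._patience = patience
        self._factor = factor      # multiply LR by this when a plateau is hit
        self._min = min_val        # never go below this learning rate
        self._verbose = verbose
        self._best = float('inf')  # best metric value seen so far
        self._wait = 0             # epochs since the last improvement

    def run(self, metrics: dict, lr: float) -> float:
        """Called after each epoch; returns the (possibly reduced) learning rate."""
        value = metrics[self._monitor]
        if value < self._best:
            # Metric improved: reset the patience counter.
            self._best = value
            self._wait = 0
        else:
            self._wait += 1
            if self._wait >= self._patience:
                # Plateau detected: lower LR to factor * lr, clamped at min_val.
                lr = max(lr * self._factor, self._min)
                self._wait = 0
                if self._verbose:
                    print(f"Reducing learning rate to {lr}")
        return lr
```

For example, with `patience=2` and `factor=0.5`, two consecutive epochs without improvement would halve the learning rate.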