Built-in optimizer classes.
Modules
schedules module: Public API for tf.keras.optimizers.schedules namespace.
Classes
class Adadelta: Optimizer that implements the Adadelta algorithm.
class Adagrad: Optimizer that implements the Adagrad algorithm.
class Adam: Optimizer that implements the Adam algorithm.
class Adamax: Optimizer that implements the Adamax algorithm.
class Ftrl: Optimizer that implements the FTRL algorithm.
class Nadam: Optimizer that implements the NAdam algorithm.
class Optimizer: Updated base class for optimizers.
class RMSprop: Optimizer that implements the RMSprop algorithm.
class SGD: Stochastic gradient descent and momentum optimizer.
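To illustrate what these classes do, here is a minimal sketch (plain Python, not the actual Keras implementation) of the update rule behind the simplest optimizer listed above, SGD with momentum: the optimizer keeps a velocity state per parameter and uses it to smooth the raw gradient step.

```python
def sgd_momentum_step(param, grad, velocity, learning_rate=0.01, momentum=0.9):
    """One SGD-with-momentum update: a simplified sketch of the rule
    tf.keras.optimizers.SGD applies (names here are illustrative).

    velocity <- momentum * velocity - learning_rate * grad
    param    <- param + velocity
    """
    velocity = momentum * velocity - learning_rate * grad
    return param + velocity, velocity

# One step minimizing f(w) = w**2 (gradient 2*w), starting at w = 1.0:
w, v = 1.0, 0.0
w, v = sgd_momentum_step(w, grad=2.0 * w, velocity=v, learning_rate=0.1)
# w is now 0.8: the step moved against the gradient, scaled by the learning rate.
```

With `momentum=0.0` this reduces to plain stochastic gradient descent; the other classes (Adam, RMSprop, and so on) differ mainly in what per-parameter state they track and how they scale the step.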
Functions
deserialize(...): Inverse of the serialize function.
get(...): Retrieves a Keras Optimizer instance.
serialize(...)
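The relationship between these three functions can be sketched with a small registry pattern in plain Python. This is a hypothetical stand-in, not the Keras source: `_REGISTRY`, the toy `SGD` class, and the dict layout are assumptions made for illustration; the real API uses Keras's config/serialization machinery.

```python
class SGD:
    """Toy optimizer used only to demonstrate the serialize/deserialize cycle."""
    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

    def get_config(self):
        return {"learning_rate": self.learning_rate}

# Hypothetical name-to-class registry standing in for Keras's lookup table.
_REGISTRY = {"SGD": SGD}

def serialize(optimizer):
    # Reduce an optimizer instance to a plain dict of its class name and config.
    return {"class_name": type(optimizer).__name__,
            "config": optimizer.get_config()}

def deserialize(config):
    # Inverse of serialize: rebuild the instance from the dict.
    cls = _REGISTRY[config["class_name"]]
    return cls(**config["config"])

def get(identifier):
    # Accept either a string name or an existing optimizer instance.
    if isinstance(identifier, str):
        return _REGISTRY[identifier]()
    return identifier

# Round trip: serialize then deserialize recovers an equivalent optimizer.
opt = get("SGD")
restored = deserialize(serialize(opt))
```

In the real API, `get` similarly accepts either a string identifier (e.g. `"adam"`) or an `Optimizer` instance, and `deserialize(serialize(opt))` reconstructs an equivalent optimizer from its configuration.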


