NeuZephyr
Simple DL Framework
nz::opt Namespace Reference
Contains optimization algorithms for training deep learning models.
Classes

class AdaDelta
    AdaDelta optimizer for deep learning models.
class AdaGrad
    AdaGrad optimizer for deep learning models.
class Adam
    Adam optimizer for deep learning models.
class Momentum
    Momentum optimizer for deep learning models.
class NAdam
    NAdam optimizer for deep learning models.
class Optimizer
    Base class for optimization algorithms in deep learning.
class RMSprop
    RMSprop optimizer for deep learning models.
class SGD
    Stochastic Gradient Descent (SGD) optimizer for deep learning models.
Detailed Description

Contains optimization algorithms for training deep learning models.
The nz::opt namespace includes a collection of optimization algorithms designed to update model parameters during the training of deep learning models. These optimizers aim to minimize the loss function by adjusting the learning rate dynamically or incorporating momentum terms to improve convergence.
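For orientation, the standard textbook update rules behind three of these strategies are sketched below. These are the canonical formulations, not excerpts from NeuZephyr's source:

```latex
% Plain SGD: step against the gradient with learning rate \eta
\theta_{t+1} = \theta_t - \eta \, g_t, \qquad g_t = \nabla_\theta L(\theta_t)

% Momentum: accumulate a velocity v with coefficient \mu
v_{t+1} = \mu v_t + g_t, \qquad \theta_{t+1} = \theta_t - \eta \, v_{t+1}

% Adam: bias-corrected first and second moment estimates
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2

\hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t}, \qquad
\theta_{t+1} = \theta_t - \eta \, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```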
Key components in this namespace:

- Optimizer: the abstract base class defining the common interface shared by all optimizers.
- SGD: plain stochastic gradient descent.
- Momentum: SGD extended with a momentum (velocity) term.
- AdaGrad: per-parameter adaptive learning rates based on accumulated squared gradients.
- RMSprop: adaptive learning rates based on an exponential moving average of squared gradients.
- AdaDelta: an extension of AdaGrad that limits the accumulation window.
- Adam: adaptive moment estimation, combining momentum with RMSprop-style scaling.
- NAdam: Adam with Nesterov momentum.

A self-contained sketch of the base-class pattern follows this list.
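The code below is a minimal, generic sketch of that pattern: an abstract base class exposing a virtual step(), with SGD as one concrete strategy. The Params, Optimizer, and SGD definitions here are illustrative stand-ins written for this page, not NeuZephyr's actual types or signatures; see the class pages above for the real interface.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Illustrative parameter block: values plus their gradients.
// NeuZephyr operates on GPU tensors; plain vectors stand in here.
struct Params {
    std::vector<float> values;
    std::vector<float> grads;
};

// Abstract base: every optimizer applies its update rule via step().
class Optimizer {
public:
    explicit Optimizer(float lr) : lr_(lr) {}
    virtual ~Optimizer() = default;
    virtual void step(Params& p) = 0;  // apply one parameter update
protected:
    float lr_;  // learning rate
};

// Plain stochastic gradient descent: theta -= lr * grad.
class SGD : public Optimizer {
public:
    using Optimizer::Optimizer;
    void step(Params& p) override {
        for (std::size_t i = 0; i < p.values.size(); ++i)
            p.values[i] -= lr_ * p.grads[i];
    }
};

int main() {
    Params p{{1.0f, -2.0f}, {0.5f, -0.5f}};
    SGD opt(0.1f);
    opt.step(p);
    std::printf("%.2f %.2f\n", p.values[0], p.values[1]);  // 0.95 -1.95
}
```

Keeping each update rule behind a single virtual call is what lets a training loop stay agnostic to which optimizer is in use.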
These optimizers are designed for high-performance computing environments and use GPU-based tensor operations to accelerate training. The namespace can also be extended with additional optimization strategies, as the sketch below illustrates.
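As an illustration of such an extension, a momentum variant can be added by deriving from the same illustrative Optimizer base defined in the sketch above; again, these names are stand-ins, not NeuZephyr's verified API.

```cpp
// Extends the sketch above with a momentum term; a new strategy only
// needs to override step().
class Momentum : public Optimizer {
public:
    Momentum(float lr, float mu) : Optimizer(lr), mu_(mu) {}
    void step(Params& p) override {
        // Lazily size the velocity buffer to match the parameters.
        if (velocity_.size() != p.values.size())
            velocity_.assign(p.values.size(), 0.0f);
        for (std::size_t i = 0; i < p.values.size(); ++i) {
            // v = mu * v + grad;  theta -= lr * v
            velocity_[i] = mu_ * velocity_[i] + p.grads[i];
            p.values[i] -= lr_ * velocity_[i];
        }
    }
private:
    float mu_;                     // momentum coefficient
    std::vector<float> velocity_;  // running velocity per parameter
};
```

Because the training loop depends only on the Optimizer interface, a new strategy like this can be swapped in without touching any other code.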
In short, this namespace provides the tools to adaptively adjust model parameters during training, improving both the performance and the stability of the training process.