NeuZephyr
Simple DL Framework
| ▼Nnz | Top-level namespace of the NeuZephyr framework |
| ▼NcuStrm | Provides core components for CUDA stream and event lifecycle management in GPU computing environments |
| CEventPool | Internal event management system for CUDA stream synchronization (Part of StreamManager) |
| CStreamManager | Centralized CUDA stream and resource management system with automatic dependency tracking |
| ▼Ndata | Contains data structures and utilities for tensor operations in machine learning workflows |
| CDimension | Represents a multi-dimensional shape, typically used in deep learning for tensor dimensions |
| CMappedTensor | A class for representing multidimensional arrays in CUDA zero-copy memory, providing host-accessible container-like interfaces |
| CTensor | A class for representing and manipulating multidimensional arrays (tensors) in GPU memory |
| ▼Ngraph | Contains classes and functions for managing and executing computation graphs in deep learning workflows |
| CComputeGraph | Represents a computational graph, which manages nodes and the computation flow |
| ▼Nnodes | Contains classes and functionality for nodes in a neural network or computational graph |
| ▼Ncalc | Contains classes and functionality for computation nodes in a neural network or computational graph |
| CAddNode | Represents a node that performs element-wise addition between two input tensors |
| CAveragePoolingNode | Implements average pooling operation for spatial downsampling in neural networks |
| CCol2ImgNode | Reconstructs spatial tensors from column matrices generated by im2col transformation |
| CELUNode | Represents an Exponential Linear Unit (ELU) activation function node in a computational graph |
| CExpandNode | Expands tensors with batch size 1 to arbitrary batch dimensions through data replication |
| CGlobalAvgPoolNode | Performs global average pooling operation across spatial dimensions of input tensor |
| CGlobalMaxPoolNode | Performs global max pooling operation across spatial dimensions of input tensor |
| CHardSigmoidNode | Represents a Hard Sigmoid activation function node in a computational graph |
| CHardSwishNode | Represents a Hard Swish activation function node in a computational graph |
| CImg2ColNode | Implements im2col transformation for efficient convolution operations in neural networks |
| CLeakyReLUNode | Represents a Leaky Rectified Linear Unit (LeakyReLU) activation function node in a computational graph |
| CMatMulNode | Represents a matrix multiplication operation node in a computational graph |
| CMaxPoolingNode | Implements max pooling operation for spatial downsampling with feature preservation |
| CReLUNode | Represents a Rectified Linear Unit (ReLU) operation node in a computational graph |
| CReshapeNode | Implements tensor shape transformation within a neural network computational graph |
| CScalarAddNode | Represents a scalar addition operation node in a computational graph |
| CScalarDivNode | Represents a scalar division operation node in a computational graph |
| CScalarMulNode | Represents a scalar multiplication operation node in a computational graph |
| CScalarSubNode | Represents a scalar subtraction operation node in a computational graph |
| CSigmoidNode | Represents a Sigmoid activation function node in a computational graph |
| CSoftmaxNode | Implements the Softmax activation function as a node in a neural network computational graph |
| CSubNode | Represents a subtraction operation node in a computational graph |
| CSwishNode | Represents a Swish activation function node in a computational graph |
| CTanhNode | Represents a hyperbolic tangent (tanh) activation function node in a computational graph |
| ▼Nio | Contains standard input and output nodes used in computational graphs for neural networks |
| CInputNode | Represents an input node in a computational graph |
| COutputNode | Base class for loss function nodes in a computational graph |
| ▼Nloss | Contains loss function nodes for computing various loss metrics in a machine learning model |
| CBinaryCrossEntropyNode | Represents the Binary Cross-Entropy (BCE) loss function node in a computational graph |
| CMeanSquaredErrorNode | Represents the Mean Squared Error (MSE) loss function node in a computational graph |
| CNode | Base class for nodes in a neural network or computational graph |
| ▼Nopt | Contains optimization algorithms for training deep learning models |
| CAdaDelta | AdaDelta optimizer for deep learning models |
| CAdaGrad | AdaGrad optimizer for deep learning models |
| CAdam | Adam optimizer for deep learning models |
| CMomentum | Momentum optimizer for deep learning models |
| CNAdam | NAdam optimizer for deep learning models |
| COptimizer | Base class for optimization algorithms in deep learning |
| CRMSprop | RMSprop optimizer for deep learning models |
| CSGD | Stochastic Gradient Descent (SGD) optimizer for deep learning models |
| CCudaException | A final class that represents CUDA exceptions, inheriting from std::runtime_error |
| CModel | Base class for constructing neural network models with automatic computation graph management |