Module dense_nn

A minimal dense neural network implementation for educational purposes.

Layers operate on Matrix values and support ReLU and Sigmoid activations. This is not meant to be a performant deep‑learning framework but rather a small example of how the surrounding matrix utilities can be composed.

use rustframe::compute::models::dense_nn::{ActivationKind, DenseNN, DenseNNConfig, InitializerKind, LossKind};
use rustframe::matrix::Matrix;

// Tiny network with one input neuron and one output neuron.
// No hidden layers, so the only layer is the ReLU-activated output layer.
let config = DenseNNConfig {
    input_size: 1,
    hidden_layers: vec![],
    output_size: 1,
    activations: vec![ActivationKind::Relu],
    initializer: InitializerKind::Uniform(0.5),
    loss: LossKind::MSE,
    learning_rate: 0.1,
    epochs: 1,
};
let mut nn = DenseNN::new(config);

// Two training samples held in 2x1 matrices: inputs 1.0 and 2.0 with targets 2.0 and 3.0.
let x = Matrix::from_vec(vec![1.0, 2.0], 2, 1);
let y = Matrix::from_vec(vec![2.0, 3.0], 2, 1);
// Fit the network for the single configured epoch.
nn.train(&x, &y);
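
After training, the network can be applied to new inputs. The call below is a minimal sketch that assumes DenseNN exposes a predict method taking a &Matrix<f64> and returning the outputs as a matrix with one row per sample; consult the DenseNN struct documentation for the exact inference API.

// Hypothetical inference call; predict is assumed here, not confirmed above.
let test = Matrix::from_vec(vec![3.0], 1, 1);
let preds = nn.predict(&test);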

Structs

DenseNN: A multi-layer perceptron with full configurability.
DenseNNConfig: Configuration for a dense neural network.

Enums

ActivationKind: Supported activation functions.
InitializerKind: Weight initialization schemes.
LossKind: Supported losses.
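
As a sketch of how these enums combine, the configuration below builds a network with one hidden layer of four ReLU neurons followed by a Sigmoid output layer. It assumes that hidden_layers lists hidden-layer widths, that activations takes one entry per layer (each hidden layer plus the output layer), and that ActivationKind has a Sigmoid variant, as suggested by the module description; treat it as an illustration rather than a confirmed API.

use rustframe::compute::models::dense_nn::{ActivationKind, DenseNN, DenseNNConfig, InitializerKind, LossKind};
use rustframe::matrix::Matrix;

// Sketch: 1 input -> 4 hidden neurons (ReLU) -> 1 output (Sigmoid).
// Assumes hidden_layers holds layer widths and activations has one entry per layer.
let config = DenseNNConfig {
    input_size: 1,
    hidden_layers: vec![4],
    output_size: 1,
    activations: vec![ActivationKind::Relu, ActivationKind::Sigmoid],
    initializer: InitializerKind::Uniform(0.5),
    loss: LossKind::MSE,
    learning_rate: 0.05,
    epochs: 10,
};
let mut nn = DenseNN::new(config);
// Two 2x1 training matrices, as in the module example above.
let x = Matrix::from_vec(vec![1.0, 2.0], 2, 1);
let y = Matrix::from_vec(vec![0.0, 1.0], 2, 1);
nn.train(&x, &y);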