Building an Autograd engine to understand Automatic Differentiation from the ground up

Demograd is a minimal Autograd engine and neural network library built for educational purposes. It is designed to mimic the core functionalities of PyTorch, including a tensor class with automatic differentiation, a set of basic differentiable operations, activation functions, neural network layers, and optimizers. The design emphasizes clarity, modularity, and reproducibility.

Check it out here: GitHub - Nizben/demograd

Note: This automatic differentiation engine is also heavily inspired by Karpathy’s Micrograd.

Overview

Demograd provides the following core components:

- A Tensor class with automatic differentiation
- A set of basic differentiable operations
- Activation functions
- Neural network layers
- Optimizers
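
To make the core idea concrete, here is a minimal, self-contained sketch of reverse-mode automatic differentiation on scalars, in the spirit of Micrograd. This is not Demograd's implementation: the Value class and its methods are illustrative names chosen for this example only.

class Value:
    """A scalar that records how it was produced, for reverse-mode autodiff (illustrative, not Demograd's API)."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # pushes this node's gradient to its parents

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad   # d(a + b)/da = 1
            other.grad += out.grad  # d(a + b)/db = 1

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad  # d(a * b)/da = b
            other.grad += self.data * out.grad  # d(a * b)/db = a

        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the computation graph, then apply the chain rule in reverse order.
        topo, visited = [], set()

        def build(node):
            if node not in visited:
                visited.add(node)
                for parent in node._parents:
                    build(parent)
                topo.append(node)

        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# z = x * y + x  =>  dz/dx = y + 1 = 4, dz/dy = x = 2
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0

Demograd generalizes this same recipe from scalars to tensors: every operation records its inputs and knows how to route gradients back to them.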

Usage

Building Models

You can construct neural networks by composing layers defined in nn.py. For example, a simple MLP can be created as follows:

from demograd.nn import Linear, Sequential
from demograd.activations import ReLU

# Layer sizes for this example:
input_dim, hidden_dim, output_dim = 10, 32, 2

# Define an MLP with one hidden layer:
model = Sequential(
    Linear(input_dim, hidden_dim),
    ReLU.apply,
    Linear(hidden_dim, output_dim)
)
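
Running data through the model is then a single call. The snippet below is only a sketch: the Tensor import path and the assumption that Sequential instances are callable mirror PyTorch conventions and may differ from the actual module and method names in the repository.

import numpy as np
from demograd.tensor_engine import Tensor  # hypothetical import path; see the repo for the real one

# A batch of 4 random inputs (hypothetical usage, mirroring PyTorch conventions):
x = Tensor(np.random.randn(4, input_dim))

# Forward pass through the MLP defined above:
y = model(x)
print(y.data.shape)  # expected: (4, output_dim)

From there, gradients can be propagated back through the computation graph and the parameters updated with one of the provided optimizers.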