Project Overview

nanograd is a minimal Rust library for automatic differentiation (AD) targeting scalar computations. It provides a simple API for building and differentiating computation graphs, along with utilities to visualize those graphs. Developers use nanograd to learn AD concepts, prototype small machine-learning experiments, or embed a lightweight AD engine into larger Rust projects.

What Is nanograd?

  • A Rust crate (nanograd) offering a Scalar type that tracks operations in a computation graph.
  • Implements reverse-mode AD so you can compute gradients via backward().
  • Includes graph-export functions to generate DOT files for visualization.

Main Features

Scalar Automatic Differentiation

  • Wrap f64 values in Scalar to record operations.
  • Compute gradients with backward().
  • Query grad() on any input node.

Computation Graph Visualization

  • Export graphs in DOT format.
  • Integrate with Graphviz or other tools to inspect your model’s structure.

Typical Use Cases

  • Teaching and learning the mechanics of reverse-mode AD.
  • Quick prototyping of small neural networks or optimization problems.
  • Embedding a no-frills AD engine into other Rust applications.

Quick Start Example

This example computes the derivative of f(x) = x² + 3x at x = 2.

use nanograd::Scalar;

fn main() {
    // Create scalar input
    let x = Scalar::new(2.0);
    // Build computation: y = x^2 + 3x
    let y = x.clone() * x.clone() + Scalar::new(3.0) * x.clone();
    // Compute gradients
    y.backward();
    // dy/dx = 2*x + 3 = 7
    println!("f(2) = {}, f'(2) = {}", y.value(), x.grad());
}

To visualize the graph:

use nanograd::graphviz::to_dot;
let dot = to_dot(&y);
// Write `dot` to file and render with Graphviz
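
Assuming `to_dot` returns the DOT source as a `String` (as the snippet above suggests), a complete program could write the graph to disk and hand it to the Graphviz CLI, for example:

```rust
use nanograd::graphviz::to_dot;
use nanograd::Scalar;

fn main() -> std::io::Result<()> {
    // Rebuild the quick-start computation: y = x^2 + 3x at x = 2
    let x = Scalar::new(2.0);
    let y = x.clone() * x.clone() + Scalar::new(3.0) * x.clone();

    // Export the computation graph as DOT and save it
    let dot = to_dot(&y);
    std::fs::write("graph.dot", dot)?;
    Ok(())
}
```

Then render it, for example with dot -Tsvg graph.dot -o graph.svg.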

Why Choose nanograd?

  • Lightweight: no heavy dependencies beyond rand.
  • Easy embedding: small API surface and MIT-licensed.
  • Educational: exposes the core of reverse-mode AD in under 200 lines of Rust.
  • Extensible: inspect and modify computation graphs for custom operations.

Quick Start & Basic Usage

Get Rust, clone nanograd, build and run the XOR demo in minutes.

Prerequisites

  • Rust toolchain (1.60+)
    Install via:
    curl https://sh.rustup.rs -sSf | sh

Clone & Build

git clone https://github.com/e3ntity/nanograd.git
cd nanograd

Build and run the XOR demo (debug build):

cargo run

For faster execution, add --release:

cargo run --release

Minimal Scalar Usage

This example shows how to

  1. create Scalar values
  2. build a computation
  3. call .backward()
  4. inspect gradients
  5. dump the graph to HTML

use nanograd::Scalar;

fn main() {
    // 1. Create leaf nodes
    let x = Scalar::new(2.0);
    let y = Scalar::new(3.0);

    // 2. Build computation: z = x * y + 5
    let z = &x * &y + Scalar::new(5.0);

    // 3. Backpropagate
    z.backward();

    // 4. Inspect gradients
    println!("dz/dx = {}", x.grad()); // should print 3.0
    println!("dz/dy = {}", y.grad()); // should print 2.0

    // 5. Dump computation graph to HTML
    z.dump_graph("graph.html");
    println!("Graph written to graph.html");
}

Save this as src/bin/minimal.rs and run:

cargo run --bin minimal

One-Liner: Build, Run & View Graph

On macOS:

cargo run --release --bin minimal && open graph.html

On Linux:

cargo run --release --bin minimal && xdg-open graph.html

## Scalar API & Differentiation Concepts

This section describes the core Scalar type, how arithmetic and math functions build the computation graph automatically, and how reverse‐mode differentiation propagates gradients via backward(). It also lists available differentiable helpers and shows how to extend Scalar with custom operations.

### Scalar Type and Graph Nodes

`Scalar` wraps a floating‐point value, a gradient accumulator, and links to parent nodes with associated gradient‐propagation closures.

Key methods and properties:
- `Scalar::new(value: f64) → Scalar`  
- `value(&self) → f64`  
- `grad(&self) → f64`  
- `backward(&self)`  

Example:
```rust
use nanograd::Scalar;

let x = Scalar::new(2.0);
assert_eq!(x.value(), 2.0);
assert_eq!(x.grad(), 0.0);
```

Automatic Graph Construction via Operators

Scalar implements Add, Sub, Mul, Div, Neg to build the computation graph:

use nanograd::Scalar;

let a = Scalar::new(3.0);
let b = Scalar::new(4.0);
// Graph: c = a * b + a
let c = a.clone() * b.clone() + a.clone();
// Parents of c: [a*b, a]; parents of a*b: [a, b]

Under the hood, each operator creates a new Scalar node, captures its inputs, and registers a closure that computes local derivatives during backward().
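
As an illustration of that pattern, here is a minimal, self-contained sketch of a multiplication operator that captures its inputs and registers a gradient closure. The `Node` and `Inner` types are invented for this example; nanograd's actual internals may differ.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Toy node: a value, an accumulated gradient, and closures that push
// gradient back to this node's parents during the backward pass.
#[derive(Clone)]
struct Node(Rc<RefCell<Inner>>);

struct Inner {
    value: f64,
    grad: f64,
    grad_fns: Vec<Box<dyn Fn(f64)>>,
}

impl Node {
    fn new(value: f64) -> Self {
        Node(Rc::new(RefCell::new(Inner { value, grad: 0.0, grad_fns: Vec::new() })))
    }

    // c = a * b: the new node remembers how to send gradient back to a and b.
    fn mul(&self, rhs: &Node) -> Node {
        let out = Node::new(self.0.borrow().value * rhs.0.borrow().value);
        let (a, b) = (self.clone(), rhs.clone());
        out.0.borrow_mut().grad_fns.push(Box::new(move |upstream| {
            // Local derivatives: d(a*b)/da = b, d(a*b)/db = a
            let (av, bv) = (a.0.borrow().value, b.0.borrow().value);
            a.0.borrow_mut().grad += bv * upstream;
            b.0.borrow_mut().grad += av * upstream;
        }));
        out
    }
}
```

During backward(), these stored closures are what gets invoked with the upstream gradient, as described next.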

Reverse‐Mode Differentiation (backward())

Calling backward() on a target node triggers:

  1. A topological sort (DFS) from the target to leaves.
  2. Initialization: target.grad = 1.0.
  3. In reverse order, invoke each node’s stored gradient closure with its upstream gradient.
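
The following self-contained sketch walks through those three steps on a toy tape representation invented for illustration (nanograd itself links `Scalar` nodes and sorts them with a DFS; a tape recorded in construction order is already topologically sorted, so the reverse sweep is just a reverse scan):

```rust
// Toy reverse-mode sweep. Each tape entry stores (value, grad, parents),
// where `parents` pairs a parent index with the local partial derivative.
struct Tape {
    nodes: Vec<(f64, f64, Vec<(usize, f64)>)>,
}

impl Tape {
    fn leaf(&mut self, v: f64) -> usize {
        self.nodes.push((v, 0.0, Vec::new()));
        self.nodes.len() - 1
    }
    fn add(&mut self, a: usize, b: usize) -> usize {
        let v = self.nodes[a].0 + self.nodes[b].0;
        self.nodes.push((v, 0.0, vec![(a, 1.0), (b, 1.0)]));
        self.nodes.len() - 1
    }
    fn mul(&mut self, a: usize, b: usize) -> usize {
        let (va, vb) = (self.nodes[a].0, self.nodes[b].0);
        self.nodes.push((va * vb, 0.0, vec![(a, vb), (b, va)]));
        self.nodes.len() - 1
    }
    fn backward(&mut self, target: usize) {
        self.nodes[target].1 = 1.0;   // step 2: seed d(target)/d(target) = 1
        for i in (0..=target).rev() { // step 3: reverse topological order
            let upstream = self.nodes[i].1;
            for (parent, local) in self.nodes[i].2.clone() {
                self.nodes[parent].1 += local * upstream; // chain rule
            }
        }
    }
}

fn main() {
    let mut t = Tape { nodes: Vec::new() };
    let a = t.leaf(3.0);
    let b = t.leaf(4.0);
    let ab = t.mul(a, b);
    let c = t.add(ab, a); // c = a*b + a
    t.backward(c);
    assert_eq!(t.nodes[a].1, 5.0); // dc/da = b + 1
    assert_eq!(t.nodes[b].1, 3.0); // dc/db = a
}
```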

Example – computing ∂c/∂a and ∂c/∂b for c = a*b + a:

use nanograd::Scalar;

let a = Scalar::new(3.0);
let b = Scalar::new(4.0);
let c = a.clone() * b.clone() + a.clone();
c.backward();

assert_eq!(a.grad(), b.value() + 1.0); // ∂c/∂a = b + 1
assert_eq!(b.grad(), a.value());       // ∂c/∂b = a

Differentiable Helper Functions

These functions live in src/scalar/func.rs. Each computes a forward value and registers gradient edges automatically.

Function signatures and derivatives:

  • fn abs(x: &Scalar) -> Scalar
    Derivative: sign(x) (±1)

  • fn max(a: &Scalar, b: &Scalar) -> Scalar
    Derivatives:
    ∂/∂a = {1 if a > b else 0},
    ∂/∂b = {1 if b >= a else 0}

  • fn ln(x: &Scalar) -> Scalar
    Derivative: 1/x

  • fn sqrt(x: &Scalar) -> Scalar
    Derivative: 1/(2·√x)

  • fn norm2(a: &Scalar, b: &Scalar) -> Scalar
    Computes a² + b²
    Derivatives: ∂/∂a = 2a, ∂/∂b = 2b

  • fn sigmoid(x: &Scalar) -> Scalar
    Computes 1/(1+e^(−x))
    Derivative: s·(1−s), where s = sigmoid(x)

  • fn relu(x: &Scalar) -> Scalar
    Derivative: {1 if x > 0 else 0}

Usage Example

use nanograd::Scalar;
use nanograd::scalar::func::{ln, norm2, sigmoid, sqrt};

let x = Scalar::new(0.5);
let y = Scalar::new(2.0);

let z = sigmoid(&ln(&x)) + sqrt(&norm2(&x, &y));
z.backward();

println!("dz/dx = {}", x.grad());
println!("dz/dy = {}", y.grad());

Extending Scalar with Custom Operations

To add a new differentiable operation:

  1. Compute the forward value.
  2. Create an output Scalar.
  3. Register gradient closures linking back to inputs.

Pattern (pseudo‐code):

fn square(x: &Scalar) -> Scalar {
    // 1. Forward value
    let value = x.value().powi(2);

    // 2. Create output node
    let mut out = Scalar::new(value);

    // 3. Register gradient: d(x²)/dx = 2x
    let x = x.clone();
    out.add_child(x.clone(), move |upstream_grad| {
        x.borrow_mut().grad += 2.0 * x.value() * upstream_grad;
    });

    out
}

Internal notes:

  • Each Scalar node holds:
    • value: f64
    • grad: f64
    • parents: Vec<Scalar>
    • grad_fns: Vec<Box<dyn Fn(f64)>>
  • During backward(), the recorded grad_fns execute in reverse‐topological order.
  • Follow the pattern in src/scalar/func.rs for multi‐input ops: clone inputs, compute partials, register closures for each parent.

With these primitives, you can extend the library with new element‐wise or composite scalar operations and have gradients flow automatically through your custom functions.
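
For example, a two-input operation following the same pseudo-code pattern as square above might look like this (again a sketch: add_child and the direct grad access are illustrative, not a verified nanograd API):

```rust
// Pseudo-code: f(a, b) = 2a + 3b, with ∂f/∂a = 2 and ∂f/∂b = 3.
fn weighted_sum(a: &Scalar, b: &Scalar) -> Scalar {
    // 1. Forward value
    let value = 2.0 * a.value() + 3.0 * b.value();

    // 2. Create output node
    let mut out = Scalar::new(value);

    // 3. Register one gradient closure per parent
    let a = a.clone();
    out.add_child(a.clone(), move |upstream_grad| {
        a.borrow_mut().grad += 2.0 * upstream_grad;
    });
    let b = b.clone();
    out.add_child(b.clone(), move |upstream_grad| {
        b.borrow_mut().grad += 3.0 * upstream_grad;
    });

    out
}
```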

Computation-Graph Visualisation

This section shows how to turn any Scalar root into an interactive D3.js force-directed graph. Use dump_graph() to generate a self-contained HTML file, open it in a modern browser, and optionally customise node labels and colours in src/plot.rs.

Dumping a Graph

Import and call dump_graph() with a Scalar reference and an output path:

use nanograd::plot::dump_graph;
use nanograd::Scalar;

fn main() {
    // Build a simple computation
    let a = Scalar::new(5.0);
    let b = a.clone() * Scalar::new(2.0) + Scalar::new(3.0);
    // Generate graph HTML
    dump_graph(&b, "computation-graph.html");
    println!("Graph written to computation-graph.html");
}

• The file computation-graph.html appears in your project root (or at any path you specify).
• Tested on Chrome 60+, Firefox 55+, Edge 16+ (D3.js v5).

Viewing and Interacting

  1. Open the HTML file in your browser.
  2. Pan with right-drag, zoom with scroll, drag individual nodes.
  3. Hover a node to highlight incoming/outgoing edges.

Customising Node Labels and Colours

Edit src/plot.rs before rebuilding the crate.

Changing Node Labels

Locate the serialization of each node:

// in serialize_node
let label = match node.kind {
    NodeKind::Value     => format!("{:.2}", node.value),
    NodeKind::Op(ref op) => op.clone(),
};
data["name"] = json!(label);

Example: include node ID in the label

let label = match node.kind {
    NodeKind::Value     => format!("{}:{:.2}", node.id, node.value),
    NodeKind::Op(ref op) => format!("{}({})", node.id, op),
};

Customising Colours

Find the get_color helper:

fn get_color(kind: &NodeKind) -> &'static str {
    match kind {
        NodeKind::Value     => "#2ca02c", // green
        NodeKind::Op(_)     => "#1f77b4", // blue
    }
}

Add operation-specific colours:

fn get_color(kind: &NodeKind) -> &'static str {
    match kind {
        NodeKind::Value         => "#2ca02c",
        NodeKind::Op(ref op) if op == "mul" => "#d62728",
        NodeKind::Op(_)         => "#1f77b4",
    }
}

After editing, run cargo build to embed your changes into the HTML output.

Advanced Usage in CI or Tests

You can call dump_graph() in integration tests to capture evolving graphs:

Integration tests in tests/ link against the crate automatically, so no extra dev-dependency is needed.

tests/graph_debug.rs:

use nanograd::plot::dump_graph;
use nanograd::Scalar;

#[test]
fn dump_sum_graph() {
    let x = Scalar::new(1.0);
    let y = Scalar::new(2.0);
    let z = x.clone() + y.clone() + Scalar::new(5.0);
    dump_graph(&z, "/tmp/graph_debug.html");
    // Inspect /tmp/graph_debug.html after the test run
}

This workflow helps debug large computation graphs by visualising node relationships and data flow.