micrograd-rust

A tiny scalar-valued autograd engine and a neural net library on top of it, with a PyTorch-like API.

Building my own neural networks in Rust, inspired by Andrej Karpathy's micrograd and the accompanying video. The code was built piece by piece from the ground up by me, though I took inspiration from rustygrad when initially figuring out how to make the Rust borrow checker happy with a DAG; a sketch of that pattern is shown below.
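
One common way to keep the borrow checker happy with a shared, mutable DAG (the pattern rustygrad-style engines tend to use) is to put each node behind Rc<RefCell<...>>, so many downstream nodes can hold cheap handles to the same parent. The sketch below only illustrates that pattern under assumed names (Value, ValueData, prev); it is not this crate's actual internals.

use std::cell::RefCell;
use std::rc::Rc;

// Illustrative sketch only: type and field names are assumptions,
// not this crate's real internals.
#[derive(Debug)]
struct ValueData {
    data: f64,
    grad: f64,
    // Reference-counted handles to the input nodes, so several children
    // can share a parent without fighting the borrow checker.
    prev: Vec<Value>,
}

#[derive(Clone, Debug)]
struct Value(Rc<RefCell<ValueData>>);

impl Value {
    fn from(data: f64) -> Self {
        Value(Rc::new(RefCell::new(ValueData { data, grad: 0.0, prev: Vec::new() })))
    }
}

impl std::ops::Add for Value {
    type Output = Value;
    fn add(self, rhs: Value) -> Value {
        // RefCell provides interior mutability: node state can be read and
        // updated through shared Rc handles at runtime.
        let sum = self.0.borrow().data + rhs.0.borrow().data;
        Value(Rc::new(RefCell::new(ValueData { data: sum, grad: 0.0, prev: vec![self, rhs] })))
    }
}

A backward pass then walks the prev links in reverse topological order, accumulating each node's grad.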

Installation

  • Graphviz must be installed for the graph PNGs to be generated.
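    (For example: sudo apt install graphviz on Debian/Ubuntu, or brew install graphviz via Homebrew on macOS.)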

Examples

A simple equation

let a = Value::from(3.0);
let b = Value::from(4.0);
let c = Value::from(5.0);
// Each operation records its operands, building the computation graph.
let d = (a - b) * c;

// Backpropagate gradients from d through the graph.
d.backward();

// Render the computation graph to a PNG (requires Graphviz).
graph::render_graph(&d).unwrap();

[rendered graph: simple_operations]

A single neuron with 3 inputs

// A neuron with as many weights as there are inputs.
let inputs = vec![Value::from(1.0), Value::from(2.0), Value::from(3.0)];
let neuron = neural::Neuron::new(inputs.len().try_into().unwrap());

// Forward pass, then backpropagate gradients from the output.
let out = neuron.forward(inputs);
out.backward();

// Render the computation graph, using the neuron's subgraph tree to group its nodes.
graph::render_graph(&out, neuron.get_subgraph_tree().unwrap()).unwrap();

[rendered graph: neuron]

An MLP with layers [4, 4, 1]

// Input features and the target output for this toy example.
let x_inputs = vec![Value::from(1.0), Value::from(2.0), Value::from(3.0)];
let y_target = Value::from(1.0);

// An MLP with three inputs and layers of 4, 4, and 1 neurons.
let mlp = neural::MLP::new(x_inputs.len().try_into().unwrap(), vec![4, 4, 1]);

// Forward pass; the final layer has a single output.
let preds = mlp.forward(x_inputs);
let pred = preds[0].clone();

// Squared-error loss, then backpropagate gradients through the whole network.
let loss = (pred.clone() - y_target.clone()).pow(Value::from(2.0));
loss.backward();

// Render the computation graph, using the MLP's subgraph tree to group its nodes.
graph::render_graph(&loss, mlp.get_subgraph_tree().unwrap()).unwrap();

[rendered graph: mlp]
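
The example above stops after a single forward and backward pass. Training the MLP would mean repeating that pass and nudging every parameter against its gradient. The parameter and gradient accessors aren't shown in this README, so the names below (parameters, get_data, get_grad, set_data) are hypothetical placeholders for whatever the library actually exposes; the update rule itself is plain gradient descent.

// Hypothetical training loop: the accessor names (parameters, get_data,
// get_grad, set_data) are placeholders, not necessarily this crate's API.
let learning_rate = 0.05;
for _ in 0..100 {
    // Forward pass and squared-error loss, as in the example above.
    let preds = mlp.forward(x_inputs.clone());
    let loss = (preds[0].clone() - y_target.clone()).pow(Value::from(2.0));

    // Backward pass populates gradients throughout the graph.
    loss.backward();

    // Gradient-descent step on every weight and bias (a real loop would
    // also reset gradients between iterations).
    for p in mlp.parameters() {
        p.set_data(p.get_data() - learning_rate * p.get_grad());
    }
}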
