Timmygrad

Timmygrad is a scalar-valued gradient descent optimizer for Python. It is designed to be simple and easy to use, with a focus on readability and understandability. It is built for educational purposes rather than performance.
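The core building block is the Value object, which wraps a single scalar, supports arithmetic operators, and accumulates gradients when backward() is called. A minimal sketch of standalone use, assuming Value is importable from the package root and supports the same operators as the example further down:

from timmygrad import Value  # assumed import path

a = Value(2.0)
b = Value(-3.0)
d = a * b + a ** 2  # build a small expression graph
d.backward()        # backpropagate gradients to the leaf nodes

print(d.data)  # 2*(-3) + 2**2 = -2.0
print(a.grad)  # dd/da = b + 2a = 1.0
print(b.grad)  # dd/db = a = 2.0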

this is timmy btw

Here is a simple Linear Regression example using Timmygrad:

from timmygrad import Value  # assuming Value is exported at the package root

# toy data lying on the line y = 2x + 1
X = [0.0, 1.0, 2.0, 3.0, 4.0]
Y = [1.0, 3.0, 5.0, 7.0, 9.0]

m = Value(0.0)
c = Value(0.0)

alpha = 0.01  # learning rate
epochs = 200

for epoch in range(epochs):
    for x, y in zip(X, Y):
        # forward pass
        y_pred = m * x + c

        # compute the squared-error loss for this sample
        loss = (y - y_pred) ** 2

        # backward pass: fills in m.grad and c.grad
        loss.backward()

        # gradient descent update
        m.data -= alpha * m.grad
        c.data -= alpha * c.grad

        # reset gradients so they do not accumulate across samples
        m.grad = 0
        c.grad = 0

print(m.data, c.data)  # should approach 2 and 1
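For this squared-error loss, the gradients that backward() produces should match the analytic derivatives dL/dm = -2 * x * (y - y_pred) and dL/dc = -2 * (y - y_pred). A quick single-sample sanity check, assuming the same Value API as above:

x, y = 2.0, 5.0
m = Value(0.5)
c = Value(0.1)

y_pred = m * x + c
loss = (y - y_pred) ** 2
loss.backward()

residual = y - (m.data * x + c.data)  # y - y_pred as a plain float
print(m.grad, -2 * x * residual)      # both should equal dL/dm
print(c.grad, -2 * residual)          # both should equal dL/dc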