
Variance of symbolic shape #4446

Open · wants to merge 1 commit into base: master
Conversation

marcellofuschi (Contributor) commented on May 6, 2024:

Makes test_symbolic_var pass

marcellofuschi changed the title from "Implement variance of symbolic shape" to "Variance of symbolic shape" on May 6, 2024.
The changed lines in Tensor.var (old lines first, new lines after, as shown in the diff):

-  square_sum = ((self - self.mean(axis=axis, keepdim=True)).square()).sum(axis=axis, keepdim=keepdim)
-  return square_sum.div(max(0, prod(self.shape)/prod(square_sum.shape)-correction))
+  squares = (self - self.mean(axis=axis, keepdim=True)).square()
+  n = prod([si for si, so in zip(self.shape, squares.sum(axis=axis, keepdim=True).shape) if si != so])
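To illustrate the new computation of n: instead of dividing products of whole shapes (which breaks down when the shape contains symbolic variables), it multiplies together only the input dimensions that the reduction collapsed, found by comparing input and output shapes. A minimal pure-Python sketch of that idea (the helper name `reduced_count` is hypothetical, not tinygrad code):

```python
from math import prod

def reduced_count(in_shape, out_shape):
    # Product of the dimensions the reduction collapsed,
    # i.e. positions where input and output sizes differ.
    return prod(si for si, so in zip(in_shape, out_shape) if si != so)

# Summing a (4, 5, 6) tensor over axis=1 with keepdim=True gives shape (4, 1, 6):
print(reduced_count((4, 5, 6), (4, 1, 6)))   # 5
# Reducing over all axes gives (1, 1, 1):
print(reduced_count((4, 5, 6), (1, 1, 1)))   # 120
```

With a symbolic batch dimension, the symbolic size simply never enters the product unless that axis is itself reduced, which is what lets the kernel stay valid for any value of the variable.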
marcellofuschi (Contributor, Author) commented:

For context, this is the same logic used in the mean function.
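The correction term in the replaced code is Bessel's correction (correction=1 by default for sample variance). A small pure-Python illustration of the divide-by-n-minus-correction behavior (a hypothetical standalone function, not the tinygrad implementation):

```python
def var(xs, correction=1):
    # Sample variance: sum of squared deviations from the mean,
    # divided by n - correction (correction=1 is Bessel's correction).
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / max(0, n - correction)

print(var([1.0, 2.0, 3.0, 4.0]))                # 1.666... (5/3, sample variance)
print(var([1.0, 2.0, 3.0, 4.0], correction=0))  # 1.25 (population variance)
```

The `max(0, ...)` mirrors the original code's guard; in the PR the same divisor is built from n, the product of reduced dimensions, rather than from len of a list.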

chenyuxyz (Collaborator) commented:

It's likely correct, but this and the symbolic-shape mean need tests in symbolic_ops and symbolic_jit to make sure the generated kernel uses the variable input and that the JIT works. Either you can add the relevant tests in those files, or I can take a look later this week (busy with mlperf this week).

This branch is currently behind tinygrad/master. The line count difference bot is disabled.
