
Fixes error in get losses functions #2362

Open · canergen wants to merge 28 commits into main

Conversation

@canergen (Contributor) commented Dec 8, 2023

  • Closes #xxxx (Replace xxxx with the GitHub issue number)
  • Tests added and passed if fixing a bug or adding a new feature
  • All code checks passed
  • Added type annotations to new arguments/methods/functions
  • Added an entry in the latest docs/release_notes/index.md file if fixing a bug or adding a new feature
  • If the changes are patches for a version, I have added the on-merge: backport to x.x.x label


codecov bot commented Dec 8, 2023

Codecov Report

Attention: 3 lines in your changes are missing coverage. Please review.

Comparison: base (2c63d44) 89.42% vs. head (37002f9) 88.90%.
Report is 1 commit behind head on main.

Files                                  Patch %   Lines
scvi/model/base/_log_likelihood.py     88.88%    3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2362      +/-   ##
==========================================
- Coverage   89.42%   88.90%   -0.52%     
==========================================
  Files         153      153              
  Lines       12564    12582      +18     
==========================================
- Hits        11235    11186      -49     
- Misses       1329     1396      +67     


@canergen added the on-merge: backport to 1.1.x label · Dec 12, 2023
Comment on lines +612 to +613
if n_mc_samples_per_pass == 1:
    log_prob_sum = log_prob_sum.unsqueeze(0)
Contributor

Wait, so does this mean that this method was not working properly before, since the default is n_mc_samples_per_pass=1?

Contributor Author

Yes, but we only use it for DE genes, where it's called with n_mc_samples_per_pass of 100 (?). scANVI wasn't supporting importance weighting in DE before.
scANVI doesn't work with multiple samples (fixing this would be major work), and this might be an issue for multi-GPU support (see the bug report). If you want to fix this, I had a fix for broadcasting labels but dropped it, since the encoder doesn't support it either.
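
For context, a minimal sketch of the shape issue this unsqueeze(0) guards against (the shapes and the downstream logsumexp are illustrative assumptions, not the exact scvi-tools internals):

import torch

n_obs = 4
# With n_mc_samples_per_pass == 1, the per-pass log-probability sum can come back
# without a leading Monte Carlo sample dimension, i.e. shape (n_obs,):
log_prob_sum = torch.randn(n_obs)
# A later reduction over dim 0 would then collapse the observation axis instead of
# the sample axis, so the sample dimension is added back, giving shape (1, n_obs):
log_prob_sum = log_prob_sum.unsqueeze(0)
# Aggregation over MC samples now reduces the intended axis:
per_obs = torch.logsumexp(log_prob_sum, dim=0)  # shape (n_obs,)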

    elbo += kl_global
    return elbo / n_samples
else:
    return elbo + kl_global / n_samples
Contributor

Probably best to also use torch operations here, since compute_reconstruction_error doesn't use numpy. Let me know what you think.

@canergen (Contributor, Author) commented Jan 5, 2024

Fine with me. Can you push those changes?
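
For illustration, the torch-only reduction suggested above might look like the following sketch (the helper name and the return_mean flag are assumptions, not the PR's actual change):

import torch

def _reduce_elbo(elbo: torch.Tensor, kl_global: torch.Tensor, n_samples: int, return_mean: bool) -> torch.Tensor:
    # Hypothetical helper mirroring the quoted branches, kept entirely in torch.
    if return_mean:
        # quoted if-branch: fold kl_global into elbo, then average over samples
        return (elbo + kl_global) / n_samples
    # quoted else-branch: only kl_global is divided by n_samples
    return elbo + kl_global / n_samples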

@martinkim0 (Contributor)

I'll get this in by today @canergen

@martinkim0 (Contributor)

#2429 needs to be addressed before this

@martinkim0 removed the on-merge: backport to 1.1.x label · Jan 29, 2024