
WIP: implement quantile loss function #3285

Draft
wants to merge 1 commit into base: master
Conversation

trdvangraft
Contributor

Add quantile loss to fastai, solves #3279
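For context, the quantile (pinball) loss this PR adds penalizes under- and over-prediction asymmetrically according to the target quantile q. A minimal PyTorch sketch of the idea (not the PR's actual fastai implementation; the function name and mean reduction are illustrative):

```python
import torch

def quantile_loss(preds, targets, q: float = 0.5):
    "Pinball loss: under-predictions are weighted by q, over-predictions by (1 - q)."
    errors = targets - preds
    return torch.max(q * errors, (q - 1) * errors).mean()
```

With q = 0.5 this reduces to half the mean absolute error; with q = 0.9 the model is trained to predict the 90th percentile of the target distribution.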

@trdvangraft trdvangraft requested a review from jph00 as a code owner March 27, 2021 15:17
@review-notebook-app

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.



@ReinierKoops
Contributor

ReinierKoops commented Mar 27, 2021

Every file seems to have been edited; maybe open a new pull request with only the actual feature change?

@ReinierKoops
Contributor

Ah, now I see. The whole library had to be rebuilt because of the addition of your one feature.

@trdvangraft trdvangraft changed the title implement quantile loss function WIP: implement quantile loss function Mar 31, 2021
@hamelsmu hamelsmu closed this Apr 8, 2021
@hamelsmu hamelsmu reopened this Apr 8, 2021
@hamelsmu hamelsmu marked this pull request as draft April 8, 2021 05:08
@hamelsmu
Member

hamelsmu commented Apr 8, 2021

@ReinierKoops I converted this to a draft PR while you sort out the conflicts (and also because the PR title says WIP)

@jph00
Member

jph00 commented Apr 30, 2021

@trdvangraft please let me know when this is ready to review

@tcapelle
Contributor

tcapelle commented May 31, 2021

It would be cool to do something like this: https://www.angioi.com/time-series-encoder-decoder-tensorflow/ using the new GaussianNLLLoss in PyTorch 1.8. Basically, instead of doing a plain regression, you predict a mean and standard deviation and fit a normal distribution by minimizing the negative log-likelihood (NLL). I can help you with this if you want.
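As a rough illustration of this suggestion (a sketch only, with made-up tensor shapes; the actual fastai integration would differ), PyTorch 1.8's torch.nn.GaussianNLLLoss takes a predicted mean and variance per sample and returns the Gaussian negative log-likelihood:

```python
import torch
import torch.nn as nn

# Pretend the model has two heads: one predicting the mean, one a positive variance.
mean   = torch.randn(16, 1)                                       # predicted means
var    = torch.nn.functional.softplus(torch.randn(16, 1)) + 1e-6  # predicted variances, kept positive
target = torch.randn(16, 1)                                       # ground-truth values

loss_fn = nn.GaussianNLLLoss()      # available since PyTorch 1.8
loss = loss_fn(mean, target, var)   # NLL of target under N(mean, var)
```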

@trdvangraft
Contributor Author

Hey, sorry for the long radio silence, I got tangled up in some other work. I think we can package this PR together, @tcapelle, to add some more loss functions. But first we will have to figure out how to get rid of the notebook duplication and get it updated with master again.

@tcapelle
Contributor

tcapelle commented Jun 1, 2021

Just create a new PR
