
Weird Beta outputs #64

Open
Darenar opened this issue Oct 3, 2019 · 0 comments
Darenar commented Oct 3, 2019

Hello!
I am applying the WTTE-RNN to self-collected data from a company, with a censoring rate of around 50%.
There are 30,000 sequences with 500 timesteps each.
The issue is that my TTE can be very large (up to 400 days), yet the predicted beta for every timestep in train/validation/test is less than 1, which seems odd to me, because I can't even compute the mode of the resulting distribution; it is 0 everywhere (see the sketch below). Nevertheless, the loss decreases during training for roughly 15 epochs. My data is normalised, and the censoring indicator and TTE appear to be correct.
Additionally, I tried fitting the model only on non-censored sequences, and the result is still the same.
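For context, here is a minimal sketch (my own check, not the library's code) of the Weibull mode with alpha as scale and beta as shape, which is why a predicted beta < 1 forces the mode to 0:

```python
import numpy as np

def weibull_mode(alpha, beta):
    """Mode of a Weibull distribution with scale alpha and shape beta.

    For beta <= 1 the density is monotonically decreasing, so the mode
    is 0, which is what I observe with predicted beta < 1 everywhere.
    """
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    # Only evaluate the base when beta > 1, otherwise keep it at 0 to
    # avoid raising a negative number to a fractional power.
    base = np.where(beta > 1.0, (beta - 1.0) / beta, 0.0)
    return np.where(beta > 1.0, alpha * base ** (1.0 / beta), 0.0)

# With the betas I am seeing (all < 1), every mode is 0:
print(weibull_mode([120.0, 300.0], [0.8, 0.95]))  # -> [0. 0.]
print(weibull_mode([120.0, 300.0], [1.5, 2.0]))   # -> roughly [57.7, 212.1]
```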
Do you have any ideas about why this could happen?

Thanks!
