
[Curiosity] Extending word2vec to sequences with a timestamp #8

Open
octarinesun opened this issue Jan 28, 2020 · 1 comment

@octarinesun

I've heard of using sequence embedding for things like visits to a website to predict purchase behavior or, more pertinent to my work, embeddings for patient visits to predict something like a hospital admission.

How can we extend what we will cover for word2vec and word embeddings to start looking at sequence embeddings when there is a timestamp associated with each "word"? Something similar to what is covered in the "patient2vec" paper (https://arxiv.org/abs/1810.04793) or med2vec (https://arxiv.org/abs/1602.05568)?
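One simple way to bring timestamps into a word2vec-style setup (a sketch of my own, not a method from either paper): instead of generating skip-gram (center, context) pairs from a fixed token window, pair up events that occur within a chosen time window of each other. The event codes and the `window_seconds` parameter below are hypothetical.

```python
# Sketch: build skip-gram training pairs from timestamped events,
# pairing events that occur within `window_seconds` of each other
# instead of within a fixed token window.

def time_window_pairs(events, window_seconds):
    """events: list of (timestamp_seconds, code) tuples, sorted by time."""
    pairs = []
    for i, (t_i, code_i) in enumerate(events):
        for t_j, code_j in events[i + 1:]:
            if t_j - t_i > window_seconds:
                break  # events are sorted, so no later event qualifies
            pairs.append((code_i, code_j))
            pairs.append((code_j, code_i))  # skip-gram pairs go both ways
    return pairs

# Hypothetical patient timeline: (seconds since admission, event code)
visits = [(0, "ER_VISIT"), (3600, "LAB_CBC"), (90000, "DISCHARGE")]
print(time_window_pairs(visits, window_seconds=7200))
# pairs ER_VISIT with LAB_CBC (1 h apart) but not with DISCHARGE (25 h apart)
```

The resulting pairs could then feed a standard skip-gram model; the timestamp only changes which events count as "context" for one another.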

@octarinesun octarinesun changed the title [Curiosity] Extending a word2vec to sequences with a timestamp [Curiosity] Extending word2vec to sequences with a timestamp Jan 28, 2020
@dougmet

dougmet commented Jan 29, 2020

I've only scanned the abstracts there, but if time ordering is important, then bringing in an RNN or LSTM would let the model work directly on time-ordered sequences.
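To make the LSTM idea concrete, here is a minimal pure-Python sketch of a single LSTM cell processing a time-ordered sequence of per-event values, one step at a time. The scalar state, the shared weights, and the input values are all illustrative assumptions, not anything from the papers above.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM cell step with scalar input and state (illustrative only).

    x: input value; h, c: previous hidden and cell state;
    W: dict mapping gate name -> (w_x, w_h, b) weights.
    """
    i = sigmoid(W["i"][0] * x + W["i"][1] * h + W["i"][2])    # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h + W["f"][2])    # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h + W["o"][2])    # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h + W["g"][2])  # candidate
    c_new = f * c + i * g           # update cell state
    h_new = o * math.tanh(c_new)    # expose gated hidden state
    return h_new, c_new

# Feed a time-ordered sequence through the cell, one event at a time.
W = {k: (0.5, 0.5, 0.0) for k in ("i", "f", "o", "g")}
h = c = 0.0
for x in [1.0, 0.5, -0.2]:  # hypothetical per-visit features, in time order
    h, c = lstm_step(x, h, c, W)
print(h)
```

In practice you would use a library LSTM over learned event embeddings rather than hand-rolled scalars, but the loop above is the core idea: the hidden state carries order-dependent history from one timestamped event to the next.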
