GPT Usage #170

Open · kocemir opened this issue Mar 25, 2024 · 0 comments

kocemir commented Mar 25, 2024

Hi,
I would like to ask a question about generative pretraining with GPT. As far as I know, GPT also uses the transformer decoder, and one of its main training schemes is to predict the next token. In scGPT pretraining, what does the "next token" correspond to? I don't see the difference between the BERT architecture and the scGPT architecture.
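For context, the contrast the question refers to can be sketched in a few lines of PyTorch (toy dimensions and tensors; this is an illustration of the two generic objectives, not scGPT's actual code): a GPT-style objective applies a causal attention mask so that position t predicts token t+1, while a BERT-style objective masks some positions and predicts them using bidirectional attention.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
vocab_size, d_model, seq_len = 100, 32, 8  # toy sizes, illustrative only

embed = nn.Embedding(vocab_size, d_model)
layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
head = nn.Linear(d_model, vocab_size)

tokens = torch.randint(1, vocab_size, (1, seq_len))  # reserve id 0 for [MASK]

# GPT-style objective: causal mask, so position t only attends to <= t
# and is trained to predict token t+1.
causal = nn.Transformer.generate_square_subsequent_mask(seq_len)
h = layer(embed(tokens), src_mask=causal)
gpt_loss = F.cross_entropy(head(h[:, :-1]).reshape(-1, vocab_size),
                           tokens[:, 1:].reshape(-1))

# BERT-style objective: no causal mask (full bidirectional attention);
# randomly masked positions are predicted from the rest of the sequence.
MASK = 0
mask_pos = torch.rand(tokens.shape) < 0.15
mask_pos[0, 0] = True  # guarantee at least one masked position
masked = tokens.masked_fill(mask_pos, MASK)
h = layer(embed(masked))
bert_loss = F.cross_entropy(head(h)[mask_pos], tokens[mask_pos])

print(f"GPT-style loss: {gpt_loss.item():.3f}, "
      f"BERT-style loss: {bert_loss.item():.3f}")
```

The key difference is only the attention mask and the choice of prediction targets; since genes in a cell have no natural left-to-right order, it is not obvious what a "next token" would mean, which is what this question is asking the maintainers to clarify.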
