
What should I do when my data volume is very large, more than 100 million? #383

Open
smockgithub opened this issue Sep 15, 2021 · 3 comments

Comments

@smockgithub

When there is a lot of data, the relationship table grows very large as well. How should I handle this to keep performance acceptable?

@krtschmr

Throw more money at your Postgres instance?

@seuros
Member

seuros commented Sep 17, 2021

@smockgithub this question is very vague. You haven't defined what "performance" means, nor which database you are using.

If you are experiencing slowness, you may need a bigger instance or some caching mechanism.
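
To illustrate the caching idea, here is a minimal read-through memoization sketch in Python; `FAKE_TABLE` and `fetch_descendants` are hypothetical stand-ins for the real relationship table and query, not part of any library discussed here:

```python
import functools

# Tiny fake data source standing in for the huge relationship table
# (hypothetical: the real lookup would be a database query).
FAKE_TABLE = {1: [2, 3], 2: [4], 3: [], 4: []}

@functools.lru_cache(maxsize=10_000)
def fetch_descendants(node_id):
    """Return all descendant ids of node_id, memoized after the first call."""
    children = FAKE_TABLE.get(node_id, [])
    result = list(children)
    for child in children:
        result.extend(fetch_descendants(child))
    return tuple(result)  # tuples are immutable, so caching them is safe

print(fetch_descendants(1))            # computed once: (2, 3, 4)
print(fetch_descendants(1))            # served from the cache
print(fetch_descendants.cache_info())  # hits=1, misses=4
```

The same shape works with an external cache (Redis, memcached) when the working set is too big for process memory; the trade-off is stale reads until the cache entry expires or is invalidated.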

@kbrock
Contributor

kbrock commented Apr 10, 2023

Can this be closed?

Not sure there are any other answers besides "ensure your indexes meet your needs" and "ensure the Postgres instance is big enough".
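
For what it's worth, a quick way to check the first point is to look at the query plan. A minimal sketch using Python and psycopg2, assuming a Postgres relationship table named `hierarchies` with an `ancestor_id` column (both names hypothetical):

```python
import psycopg2

# Hypothetical connection string and table/column names; adapt to your schema.
conn = psycopg2.connect("dbname=app user=app")
with conn, conn.cursor() as cur:
    # EXPLAIN ANALYZE reveals whether Postgres uses an index scan or falls
    # back to a sequential scan over the large relationship table.
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT descendant_id FROM hierarchies WHERE ancestor_id = %s",
        (42,),
    )
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```

If the plan reports a sequential scan, adding an index on the filter column (or a composite index covering the whole lookup) is usually the first thing to try.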
