static:warm performance issues #8410

Open

JonKaric opened this issue Jul 5, 2023 · 11 comments

@JonKaric
Contributor

JonKaric commented Jul 5, 2023

Bug description

When running static:warm on my droplet, it maxes out the CPU. That's fine on smaller sites, where the cache builds quickly, but it becomes a problem once you get into the thousands of entries: with the CPU maxed out, the command starts producing 504 timeouts.

Things tried that failed:

  • Using 1-CPU, 2-CPU and 4-CPU droplets. All have the same issue, though it improves the more performance you have.
  • Offloading this onto a queue (MySQL) with a 1-second wait between jobs, but you just end up with a CPU spike every second when a job runs (see the sketch after this list).
  • Using warm_concurrency set to 1 without a queue.
  • Using the Cool Writings starter kit, which has the same issues.
  • Deleting everything from my templates so I've got a totally blank site.
  • stache:refresh is run beforehand.
  • The Stache watcher is off.
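
A minimal sketch of that queued setup, assuming a hypothetical WarmUrl job and a $urls list gathered elsewhere (neither is part of Statamic itself):

```php
<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Http;

// Hypothetical job: warm a single URL with one GET request, so the
// static cache can store the rendered response server-side.
class WarmUrl implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(public string $url) {}

    public function handle(): void
    {
        Http::timeout(30)->get($this->url);
    }
}

// Stagger dispatches by one second, matching the delay described above.
// Each job still boots the full application, hence a spike every second.
foreach ($urls as $i => $url) {
    WarmUrl::dispatch($url)->delay(now()->addSeconds($i));
}
```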

Extra context with conversation:
https://discord.com/channels/489818810157891584/842065697709490226/1125862143187759194

How to reproduce

  • Create a new starter kit site and duplicate entries so there are at least 500 (see the sketch after this list).
  • Push to a DO $6 droplet through Ploi.
  • SSH into the server and run htop to watch real-time CPU usage.
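
For the duplication step, one quick way to mass-create entries is from php artisan tinker. This is only a sketch; the articles collection handle and title field are assumptions based on the Cool Writings starter kit:

```php
<?php

use Statamic\Facades\Entry;

// Create 500 throwaway entries so static:warm has plenty of URLs to crawl.
foreach (range(1, 500) as $i) {
    Entry::make()
        ->collection('articles')
        ->slug("generated-entry-{$i}")
        ->data(['title' => "Generated entry {$i}"])
        ->save();
}
```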

Environment

- Digital Ocean Droplets
- Ploi
- Various envs

Antlers Parser

runtime (new)

@jackmcdade
Member

Just playing devil's advocate here, but is it necessary to warm the static cache? What if you just let it cache pages as they're requested?

@JonKaric
Contributor Author

JonKaric commented Jul 5, 2023

The site I've got is disproportionately low-traffic compared to the number of entries, so each hit to a new page is going to be waiting longer than I'd like.

I should also have noted that the CPU spikes when hitting an uncached page in the browser too, so either way it's putting significant load on the server. That's why I wanted to warm the cache: hitting an already-generated page has no slowdown at all.

@jasonvarga
Member

If you use a queue, the load will be spread out.

@JonKaric
Contributor Author

JonKaric commented Jul 6, 2023

I'm using a queue with a 5-second wait between jobs; it helps, but doesn't fully solve it.

There are still CPU spikes when hitting uncached pages from the browser, so I'm guessing the initial application boot is the problem?

@JonKaric
Contributor Author

@jackmcdade @jasonvarga I think an elegant solution could be to extend static:warm with something like a --secret option that uses the Laravel maintenance mode bypass secret.

This would give us the ability to deploy in a controlled manner: we could set a prerendered maintenance template so any hits to the server use only trace resources, and max out warm_concurrency to the fastest our server can handle. That way we wouldn't need to worry about a small influx of users crashing the server.
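
To sketch the idea (nothing like this exists in static:warm today): Laravel's maintenance mode secret sets a bypass cookie when the secret path is visited, so a warmer could carry that cookie while ordinary visitors only ever hit the cheap maintenance page. The domain, secret value, and $urls list below are all placeholders:

```php
<?php

use GuzzleHttp\Cookie\CookieJar;
use Illuminate\Support\Facades\Http;

$secret = 'warm-secret';   // matches: php artisan down --secret=warm-secret
$jar = new CookieJar();

// Visiting the secret path stores the laravel_maintenance bypass cookie.
Http::withOptions(['cookies' => $jar])->get("https://example.com/{$secret}");

// Requests carrying the cookie get real (cacheable) responses, while
// ordinary visitors still receive the lightweight 503 maintenance page.
foreach ($urls as $url) {
    Http::withOptions(['cookies' => $jar])->get($url);
}
```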

@jasonvarga
Member

Could you turn on maintenance mode, warm the cache, then disable maintenance mode?
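
For reference, that flow as a one-off script. This is a sketch, and it assumes static:warm is registered with artisan as statamic:static:warm (php please proxies to artisan):

```php
<?php

use Illuminate\Support\Facades\Artisan;

// Maintenance on, with a bypass secret in case anyone needs the site.
Artisan::call('down', ['--secret' => 'warm-secret']);

// Warm the static cache (`php please static:warm` runs the same command).
Artisan::call('statamic:static:warm');

// Back online once the cache is populated.
Artisan::call('up');
```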

@JonKaric
Contributor Author

JonKaric commented Jul 18, 2023

Sadly not; with maintenance mode on, the warm command's requests just come back as 503 errors.

@mnlmaier
Contributor

mnlmaier commented Jan 22, 2024

Is there an ETA for some sort of improvement regarding this issue?

"Not warming the cache" is simply not an answer any client with some sense of modern web performance will accept and just deal with.

And I wouldn't blame them, because I am not accepting this either.

To make it even worse: Pages do time out when uncached.

It's really affecting our projects and - to be honest - will definitely be a deal breaker for us going forward if it isn't actively worked on and improved. We've already had clients lose confidence in the system because of this.

So, to keep it short: what's the plan?

@jackmcdade
Member

We're in a heavy R&D sprint looking at new approaches to this very problem. Hang in there!

@mnlmaier
Contributor

mnlmaier commented Jan 23, 2024

That sounds great, @jackmcdade — thanks for getting back to us regarding this issue! It's good to know you guys are actively working on this and are aware of this problem.

Do you have any sort of ETA for an update?

Maybe it was communicated on Discord or another channel, but it might be a good idea to note this down somewhere on the Roadmap. Having to tell clients "we don't know…" when asked how we plan to deal with this issue is basically the worst case for us, especially when a client took our Certified Partnership into consideration when deciding on Statamic.

Our team has spent hours researching the problem and trying out various approaches. We haven't had any major success, but we'd still be glad to share our findings with the Core Team. We're already in contact with @joshuablum, and we're happy to provide more information if it would be helpful.

@ryanmitchell
Contributor

Hopefully the approach in #9396 will help a little here. The idea is that once a static cache exists for a URL, you can generate a new version without clearing the existing one.

Deploying a new site version seems like a good candidate for this: you'd leave the existing static cache in place, then queue recache calls to gradually repopulate it for the new version. Maybe the static:warm command could be extended with a recache option.
