This repository has been archived by the owner on Feb 9, 2018. It is now read-only.
Temporary workaround: limit the number of posts downloaded (d96fd90)
Consider a common scenario:
A user has a blog with hundreds of posts. She makes a small change to a draft, creates a new draft, or updates a published post. When she returns to the post list, we upload the local changes and then download and save to the database every single post again, in one gigantic request, because the post list's ETag has changed. This is horrendously inefficient: in one casual test, a single small edit on a blog with 100 fairly long posts used 250-350 KB of data. Imagine what happens when there are 5x more posts and the user makes lots of small edits across several different posts!
These should help:
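One way to avoid re-downloading the whole list would be per-post ETags: fetch a lightweight index of (post id, etag) pairs and download only the posts whose etag differs from the locally cached one. A minimal sketch of that diffing step (the `posts_to_refresh` helper and the index shape are hypothetical, not something that exists in this codebase):

```python
def posts_to_refresh(local_etags, remote_index):
    """Return the ids of posts that must be (re)downloaded.

    local_etags: {post_id: etag} for posts already cached locally.
    remote_index: {post_id: etag} from a hypothetical lightweight
                  index endpoint that returns ids and etags only.
    A post needs a fetch if it is new or its etag has changed.
    """
    return sorted(
        pid for pid, etag in remote_index.items()
        if local_etags.get(pid) != etag
    )


def posts_to_delete(local_etags, remote_index):
    """Return the ids of locally cached posts the server no longer has."""
    return sorted(pid for pid in local_etags if pid not in remote_index)
```

With this, the single-edit scenario above would cost one small index request plus one post body, instead of the full post list.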