
Stream rendering to reduce TTFB and CPU load #1209

Closed
ghost opened this issue Feb 19, 2017 · 51 comments
Labels: area: app (App directory, appDir: true), locked

@ghost

ghost commented Feb 19, 2017

I suggest stream-rendering pages larger than 50 KB (a hypothetical threshold to offset the stream overhead) to reduce TTFB and CPU load.

It would supersede #767. Preact does not support streaming (preactjs/preact#23 (comment)).

@arunoda
Contributor

arunoda commented Feb 20, 2017

One thing to mention:

Stream rendering usually won't bring much of a CPU improvement, since the amount of work to be done is the same. But it will reduce the response time.

It's a pretty good idea to provide a way to customize the SSR rendering system. But I think for now, we'll stick with React's renderToString() method by default.

This is something we could do after 2.0.

@ghost
Author

ghost commented Feb 20, 2017

Stream rendering usually won't bring much of a CPU improvement, since the amount of work to be done is the same.

From aickin/react-dom-stream:

One call to ReactDOM.renderToString can dominate the CPU and starve out other requests. This is particularly troublesome on servers that serve a mix of small and large pages.

Wouldn't streaming appreciably reduce memory allocation and CPU usage for large pages by being both asynchronous and partial?

@kojuka

kojuka commented Mar 3, 2017

I thought this was along the same lines. Has anyone tried https://github.com/FormidableLabs/rapscallion ?

It provides a streaming interface so that you can start sending content to the client immediately.

Other features from the docs:

  • Rendering is asynchronous and non-blocking.
  • Rapscallion is roughly 50% faster than renderToString.
  • It provides a streaming interface so that you can start sending content to the client immediately.
  • It provides a templating feature, so that you can wrap your component's HTML in boilerplate without giving up benefits of streaming.
  • It provides a component caching API to further speed-up your rendering.
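For readers who haven't used it, here is a minimal sketch of what that streaming interface looks like, based on Rapscallion's README at the time. The Express handler and `MyPage` component are illustrative, and the snippet assumes a JSX build step:

```js
const express = require("express");
const React = require("react");
const { render, template } = require("rapscallion");
const MyPage = require("./components/MyPage"); // hypothetical page component

const app = express();

app.get("/", (req, res) => {
  // render() returns a renderer that can be consumed as a promise or a stream.
  const componentRenderer = render(<MyPage />);

  // template`` wraps the component in document boilerplate without
  // giving up the benefits of streaming.
  const responseRenderer = template`
    <html>
      <body>
        <div id="root">${componentRenderer}</div>
        <script src="/client.js"></script>
      </body>
    </html>
  `;

  // Bytes start flowing to the client as soon as the first chunk is rendered.
  responseRenderer.toStream().pipe(res);
});

app.listen(3000);
```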

gcpantazis added a commit to gcpantazis/next.js that referenced this issue Jun 15, 2017
This has been discussed in vercel#1334 and vercel#1209. I'm submitting this as a proof
of concept for discussion, though as I'll explain below, I do believe it
triggers a need to redesign `document.js`.

The issue is that `document.js` uses `renderToString()` directly,
which makes it difficult to replace. I hacked around this as follows:

**What I did here:**

1. Creating a `renderToParts()` method, which in turn calls `doRender()` with
added configuration to allow `renderToString` to be replaced, while keeping
`renderToString` as the default.

2. Jettisoning `document.js`, and in the example `server.js` using
Rapscallion templating to effectively do the same thing.

3. 🎉🎉🎉!

**The main issue this raised:**

* The subcomponents of `Document` (`Head`, `NextScript`) rely on
context, and are thus difficult to interface with except through
`Document`. I got around this by falling back to props when context was
not defined. Hacky, of course, and probably the main thing we'd need
to refactor to do this "right".

**Results:**

The test I created for the examples generates an MD5 hash for each of the first
3000 integers. Since this is deterministic, I cached it with
Rapscallion; `renderToString` continued to render it as normal in the
baseline. The test was run with Apache Bench, `ab -n 500 -c 10`.

Baseline:

```
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.4      0       4
Processing:   239 1293 246.1   1253    2099
Waiting:      238 1291 245.8   1251    2099
Total:        240 1293 246.1   1254    2100

Percentage of the requests served within a certain time (ms)
  50%   1254
  66%   1337
  75%   1425
  80%   1444
  90%   1588
  95%   1886
  98%   1981
  99%   1981
 100%   2100 (longest request)
```

With Rapscallion:

```
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.3      0       4
Processing:    97  204  39.6    190     318
Waiting:       97  202  38.6    189     318
Total:         98  205  39.6    191     319

Percentage of the requests served within a certain time (ms)
  50%    191
  66%    205
  75%    224
  80%    235
  90%    269
  95%    307
  98%    315
  99%    315
 100%    319 (longest request)
```

... or roughly a 6-7x improvement. Not super surprising given the caching,
but for us a larger portion of each page is static; being able to
cache that, while leaving other areas dynamic, is a huge win.

I played around with Rapscallion's streaming rendering as well, but I
think that will be most useful with something a little more real-world.
Streaming seemed to carry a small perf hit that gradually lessens as the
CPU effort / payload grows. If you were emitting a lot of CSS in your
header, for example, I think streaming would greatly improve load time.

So yeah, open to suggestions here. I think this is a good stab at a
general interface for getting "parts" for a more atomic Next render,
but the API of that could certainly be better. Looking forward to
hearing what you think!

Love,
- @gcpantazis + @thumbtack ❤️
gcpantazis added a commit to gcpantazis/next.js that referenced this issue Jun 15, 2017
gcpantazis added a commit to gcpantazis/next.js that referenced this issue Jun 15, 2017
@gcpantazis
Contributor

Added an example of Rapscallion in #2279... can confirm that Rapscallion + Next is insane. Streamed/promise-based render is awesome, but Component-level caching is a game-changer for us... :godmode:
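For anyone curious what that component-level caching looks like in practice, here is a rough sketch based on Rapscallion's documented `cacheKey` prop. The `ExpensiveList` component and `buildItems` helper are hypothetical stand-ins for the MD5 benchmark, not the actual code from #2279:

```js
const crypto = require("crypto");
const React = require("react");
const { render } = require("rapscallion");

// Hypothetical stand-in for the MD5 benchmark data.
function buildItems(count) {
  const items = [];
  for (let i = 0; i < count; i++) {
    items.push(crypto.createHash("md5").update(String(i)).digest("hex"));
  }
  return items;
}

// A deliberately expensive component.
function ExpensiveList({ items }) {
  return (
    <ul>
      {items.map((item, i) => (
        <li key={i}>{item}</li>
      ))}
    </ul>
  );
}

// Rapscallion caches the rendered HTML of any component that carries a
// cacheKey prop; renders with the same key are served from the cache.
render(<ExpensiveList cacheKey="ExpensiveList:first-3000" items={buildItems(3000)} />)
  .toPromise()
  .then((html) => {
    // First call pays the full render cost; subsequent calls with the
    // same cacheKey return almost immediately.
    console.log(html.length);
  });
```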

@buzinas

buzinas commented Feb 23, 2018

Now that React 16 has its own renderToNodeStream, it would be a huge advantage for next.js to add an option to use it instead of renderToString. What do you think, @timneutkens?
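For reference, a rough sketch of what using React 16's renderToNodeStream in a custom server could look like. The Express setup and `App` component are illustrative, not Next.js internals:

```js
const express = require("express");
const React = require("react");
const { renderToNodeStream } = require("react-dom/server");
const App = require("./App"); // hypothetical application component

const app = express();

app.get("*", (req, res) => {
  res.setHeader("Content-Type", "text/html");
  // Flush the document shell immediately so the browser can start working.
  res.write('<!DOCTYPE html><html><head><title>Demo</title></head><body><div id="root">');

  const stream = renderToNodeStream(React.createElement(App));
  stream.pipe(res, { end: false });
  stream.on("end", () => {
    res.end("</div></body></html>");
  });
});

app.listen(3000);
```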

@timneutkens
Member

It's on our list of things to add already 👍

@MicroBenz

Any news about this?

@timneutkens
Member

#4074 (comment)

@gustavoteodoro

Any news?

@revskill10
Contributor

revskill10 commented Apr 15, 2019

I think Next.js needs to expose a custom render hook (with renderToString as the default renderer) so that users can plug in their own async renderer.
The lack of this feature forced me to use Razzle in order to use an async renderer :( (its DX is nowhere near Next.js, but I had to accept that to move on).

I love everything about Next.js except that it lacks two things:

  • A custom async renderer.
  • A custom Babel config for both server and client.

@bruceman

Is there any roadmap or plan for streaming rendering support? I'd really like to see this in Next.js.

@timneutkens
Member

This is pending the React team implementing React Fizz / their plan for it.

@StarpTech
Contributor

@timneutkens What's the issue or PR to track here?

@dihmeetree

dihmeetree commented Sep 9, 2019

From the React team's blog post, published on August 8th, 2019:
https://reactjs.org/blog/2019/08/08/react-v16.9.0.html

An Update on Server Rendering
We have started the work on the new Suspense-capable server renderer, but we don’t expect it to be ready for the initial release of Concurrent Mode. This release will, however, provide a temporary solution that lets the existing server renderer emit HTML for Suspense fallbacks immediately, and then render their real content on the client. This is the solution we are currently using at Facebook ourselves until the streaming renderer is ready.

For anyone still waiting on server streaming support :)

@manan25386

Is there any update, or any other way to implement renderToNodeStream in Next.js?


@pepf

pepf commented Dec 21, 2019

@StarpTech I've looked into this a bit (curious about this feature as well!) and it looks like the React team is working on something called react-flight, which will probably be the base for the streaming solution we are waiting for here :)

react-flight:

This is an experimental package for consuming custom React streaming models.

The relevant PRs that shed some light on the inner workings, as interpreted by me (not an expert in any of this 🙈):

#17285: Basic API for Flight; the server should be able to stream everything as a string, but leave placeholders for the chunks that are asynchronously resolved on the server. An incomplete, yet interesting, syntax for how React would know from a stream which data type it actually represents is over here.

#17398: A more recent PR that adds an API for Chunks, so (if you're feeling lucky) you could try that part out yourself. Not sure how everything would come together, but nevertheless I'm kinda happy to see all this work being done :)

This might be slightly off-topic, but hopefully interesting for people subscribing to this issue :)

@StarpTech
Contributor

@pepf thanks for the info!

@egemon

egemon commented Feb 8, 2020

Hm. Thank you all, interesting info. I'm just wondering why Next.js should wait for React to support SSR for Suspense and such, rather than just using streamAsString now?

@wis

wis commented Feb 20, 2020

@arunoda I think it will reduce memory consumption, which is very important for low-memory lambda functions or Cloudflare Workers.

@Kavian77

Hi, any update on this?
Is no one going to answer us? :)

@karlAlnebratt

Yes, any update?

@timneutkens
Member

timneutkens commented May 25, 2021

Still this: #1209 (comment)

Streaming rendering will eventually be added once the React team publishes the new version of the server renderer; this is currently on the React experimental channel.
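For those following along, here is a rough sketch of the shape of that new server renderer's streaming API, which React 18 exposes as renderToPipeableStream (the experimental channel has exposed it under shifting names over time). The `App` component and the plain Node handler are illustrative:

```js
const React = require("react");
const { renderToPipeableStream } = require("react-dom/server");
const App = require("./App"); // hypothetical application component

function handleRequest(req, res) {
  const { pipe, abort } = renderToPipeableStream(React.createElement(App), {
    onShellReady() {
      // The shell (everything outside Suspense boundaries) is ready:
      // start streaming HTML while suspended content renders in the background.
      res.statusCode = 200;
      res.setHeader("Content-Type", "text/html");
      pipe(res);
    },
    onShellError() {
      res.statusCode = 500;
      res.end("<!doctype html><p>Something went wrong</p>");
    },
  });

  // Bail out to client-side rendering if streaming takes too long.
  setTimeout(abort, 10000);
}

module.exports = handleRequest;
```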

@omBratteng
Contributor

If I'm not mistaken, it seems this might come in React 18?

@datlieu110

Any update on this?

@timneutkens
Member

Streaming rendering support is currently being worked on by @devknoll, @shuding, and @huozhi. We'll share when there are relevant proposals here.

@quyphan97

@timneutkens Is there any way to fix this error? I am using Next version 11.1.0. Help me!

@yulafezmesi

Any update on this?

@matthxc

matthxc commented Oct 30, 2021

With the launch of Next.js v12, I think this will come out of the box now: https://nextjs.org/blog/next-12#server-side-streaming. Next.js 12 introduces a bunch of new performance features, including Suspense and SSR streaming and much more, so I think this issue can be marked as resolved?
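For anyone trying it out: per the Next.js 12 announcement, the streaming support was opt-in behind an experimental flag and required the React 18 alpha. The option name below (`concurrentFeatures`) is taken from that release's blog post and may differ in later versions, so treat this as a sketch:

```js
// next.config.js: sketch of enabling Next.js 12's experimental streaming SSR.
module.exports = {
  experimental: {
    concurrentFeatures: true,
  },
};
```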

@omBratteng
Contributor

It is under an experimental flag and requires React 18, which is in alpha, so I'd say it is far from stable. I wouldn't close this issue until both are available as stable.

@pats

pats commented Oct 21, 2022

Any news?

@kostia1st

Any plans to add streaming to the older Pages routing?

After inspecting the new App routing (its capabilities and especially restrictions), I cannot justify the switch for the app I'm working on. But having streaming working with the Pages routing would solve a few performance issues we currently have on mobile.

@timneutkens
Member

The recently launched App Router supports streaming rendering by default; you can add Suspense boundaries to flush loading states early.

@kostia1st we're currently not planning to add streaming to the Pages Router, as we had to add specific APIs (e.g. metadata) in order to make early flushing work. Without Server Components you can't stream in data, so you wouldn't get the clear benefits in Pages that you do in the App Router.
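For anyone landing here from search, a minimal sketch of the App Router behaviour described above (the async `Comments` component and its data source are hypothetical):

```js
// app/page.js
import { Suspense } from "react";

// A Server Component that awaits data; whatever it suspends on is streamed in later.
async function Comments() {
  const comments = await fetch("https://example.com/api/comments").then((r) => r.json());
  return (
    <ul>
      {comments.map((c) => (
        <li key={c.id}>{c.text}</li>
      ))}
    </ul>
  );
}

export default function Page() {
  return (
    <main>
      <h1>Post title</h1>
      {/* The fallback is flushed immediately as part of the initial HTML;
          the real comments stream in once the fetch resolves. */}
      <Suspense fallback={<p>Loading comments…</p>}>
        <Comments />
      </Suspense>
    </main>
  );
}
```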

I'm going to close this issue now as it has been shipped 👍

@github-actions
Contributor

github-actions bot commented Aug 3, 2023

This closed issue has been automatically locked because it had no new activity for a month. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.

@github-actions github-actions bot added the locked label Aug 3, 2023
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 3, 2023