Fix top level async breaks with routes that don't stream #2280

Open
wants to merge 1 commit into base: v1.x-2022-07
5 changes: 5 additions & 0 deletions .changeset/strange-radios-deliver.md
@@ -0,0 +1,5 @@
---
'@shopify/hydrogen': patch
---

Fix a critical issue where a top level asynchronous request within `App.server.jsx` would break if subsequent routes attempt to disable streaming.
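The playground changes further down in this PR reproduce the scenario the changeset describes; condensed here into one hedged sketch (the top-level-async route name, the useUrl/useSession calls, and doNotStream come from those changes; the import paths and surrounding structure are assumed, not the exact playground code):

// App.server.jsx, sketched: a hook call that suspends at the top level of
// the app, while streaming is still considered enabled.
import renderHydrogen from '@shopify/hydrogen/entry-server';
import {Router, FileRoutes, useUrl, useSession} from '@shopify/hydrogen';
import {Suspense} from 'react';

export default renderHydrogen(({response}) => {
  const url = useUrl();
  if (url.href.includes('top-level-async')) {
    // In this setup the first useSession() call suspends, so the whole app
    // tree suspends before the matched route has run.
    useSession();
  }

  return (
    <Suspense fallback={'Loading...'}>
      <Router>
        <FileRoutes />
      </Router>
    </Suspense>
  );
});

// The matched route then disables streaming after that top-level suspend has
// already happened, which is the combination this PR fixes:
//
//   export default function topLevelAsync({response}) {
//     response.doNotStream();
//     return <div>Hello!</div>;
//   }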
25 changes: 12 additions & 13 deletions packages/hydrogen/src/entry-server.tsx
@@ -388,9 +388,11 @@ async function runSSR({
const rscResponse = createFromReadableStream(rscReadableForFizz);
const RscConsumer = () => rscResponse.readRoot();

const canStream = response.canStream();

const AppSSR = (
<Html
template={response.canStream() ? noScriptTemplate : template}
template={canStream ? noScriptTemplate : template}
hydrogenConfig={request.ctx.hydrogenConfig!}
>
<ServerRequestProvider request={request}>
@@ -417,7 +419,7 @@ async function runSSR({

log.trace('start ssr');

const rscReadable = response.canStream()
const rscReadable = canStream
Review comment from the PR author:
The problem starts here: rscReadable is set up based on the initial streaming status. If something suspends in App.server.jsx, this is always true at that point, yet further down in the logic response.canStream() suddenly returns false. To keep things consistent, the value is read once into a variable and that variable is reused everywhere (a condensed sketch of this pattern follows this file's diff).

? new ReadableStream({
start(controller) {
log.trace('rsc start chunks');
@@ -469,7 +471,7 @@ async function runSSR({
);
}

if (response.canStream()) log.trace('worker ready to stream');
if (canStream) log.trace('worker ready to stream');
ssrReadable.allReady.then(() => log.trace('worker complete ssr'));

const prepareForStreaming = () => {
@@ -509,7 +511,7 @@ async function runSSR({
return true;
};

const shouldFlushBody = response.canStream()
const shouldFlushBody = canStream
? prepareForStreaming()
: await ssrReadable.allReady.then(prepareForStreaming);

@@ -519,7 +521,7 @@

const writingSSR = bufferReadableStream(
ssrReadable.getReader(),
response.canStream()
canStream
? (chunk) => {
bufferedSsr += chunk;

@@ -542,13 +544,13 @@

const writingRSC = bufferReadableStream(
rscReadable.getReader(),
response.canStream()
canStream
? (scriptTag) => writable.write(encoder.encode(scriptTag))
: undefined
);

Promise.all([writingSSR, writingRSC]).then(([ssrHtml, rscPayload]) => {
if (!response.canStream()) {
if (!canStream) {
const html = assembleHtml({ssrHtml, rscPayload, request, template});
writable.write(encoder.encode(html));
}
@@ -579,7 +581,7 @@ async function runSSR({
);
}

if (response.canStream()) {
if (canStream) {
return new Response(transform.readable, responseOptions);
}

@@ -615,7 +617,7 @@ async function runSSR({
return nodeResponse.end();
}

if (!response.canStream()) return;
if (!canStream) return;

startWritingToNodeResponse(nodeResponse, dev ? didError() : undefined);

@@ -632,10 +634,7 @@ async function runSSR({
async onAllReady() {
log.trace('node complete ssr');

if (
!revalidate &&
(response.canStream() || nodeResponse.writableEnded)
) {
if (!revalidate && (canStream || nodeResponse.writableEnded)) {
postRequestTasks(
'str',
nodeResponse.statusCode,
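A minimal sketch of the pattern the review comment above describes, not the actual entry-server.tsx: response.canStream() can flip from true to false once a suspended component calls doNotStream(), so the value is read once and that snapshot drives every later decision.

// Hedged sketch; `ServerResponse` stands in for Hydrogen's response object
// and only the part relevant to this bug is modeled.
interface ServerResponse {
  canStream(): boolean;
}

function planResponse(response: ServerResponse) {
  // Read the flag once, before anything suspends. Every decision below uses
  // this snapshot, so an App.server.jsx suspend followed by a route calling
  // doNotStream() can no longer leave later checks disagreeing with earlier
  // ones about whether the response is being streamed.
  const canStream = response.canStream();

  return {
    // How the RSC payload is delivered.
    rscDelivery: canStream ? 'inline script chunks' : 'buffered payload',
    // When the HTML body is flushed.
    flushBody: canStream ? 'immediately' : 'after allReady',
    // What kind of Response is returned.
    responseKind: canStream ? 'streamed' : 'single document',
  };
}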
7 changes: 7 additions & 0 deletions packages/playground/server-components/src/App.server.jsx
@@ -5,6 +5,8 @@ import {
FileRoutes,
ShopifyProvider,
useRequestContext,
useUrl,
useSession,
} from '@shopify/hydrogen';
import {Suspense} from 'react';
import Custom1 from './customRoutes/custom1.server';
@@ -17,6 +19,11 @@ export default renderHydrogen(({request, response}) => {
response.doNotStream();
}

const url = useUrl();
if (url.href.includes('top-level-async')) {
useSession();
}

useRequestContext().test1 = true;
useRequestContext('scope').test2 = true;

@@ -0,0 +1,4 @@
export default function topLevelAsync({response}) {
response.doNotStream();
return <div>Hello!</div>;
}
8 changes: 8 additions & 0 deletions packages/playground/server-components/tests/e2e-test-cases.ts
@@ -155,6 +155,14 @@ export default async function testCases({
expect(body).toContain('>footer!<');
});

it('disables streaming with a top level async operation', async () => {
const response = await fetch(getServerUrl() + '/top-level-async').then(
(resp) => resp.text()
);

expect(response).toContain('<meta data-flight="S2');
Review comment from the PR author:
In the case where it breaks, <meta data-flight is actually embedded twice, so the attribute value doesn't start with S2 but with an encoded <meta data-flight (a short illustration follows at the end of this diff).

});

it('streams if server component calls doNotStream after stream has already started', async () => {
const response = await fetch(getServerUrl() + '/midwaynostream');
const streamedChunks = [];
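To make the flight-payload comment above concrete, a hedged illustration; the broken output's exact encoding is assumed rather than captured from a real run:

// In the healthy (non-streamed) case the flight meta tag carries the RSC
// payload once, so its attribute starts with "S2", as the test expects:
//
//   <meta data-flight="S2:...">
//
// Before this fix, the payload was embedded twice, so the attribute began
// with an encoded copy of the tag itself instead of "S2". That is why the
// e2e assertion only needs to check the prefix:
function looksLikeSingleFlightPayload(html: string): boolean {
  return html.includes('<meta data-flight="S2');
}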