Improve expect 100-continue handling #4488

Open · wants to merge 1 commit into base: master
24 changes: 15 additions & 9 deletions lib/route.js
@@ -408,14 +408,15 @@ internals.payload = async function (request) {
        return;
    }

-    if (request._expectContinue) {
-        request.raw.res.writeContinue();
-    }
-
    if (request.payload !== undefined) {
        return internals.drain(request);
    }

+    if (request._expectContinue) {
+        request._expectContinue = false;
+        request.raw.res.writeContinue();
+    }
+
    try {
        const { payload, mime } = await Subtext.parse(request.raw.req, request._tap(), request.route.settings.payload);

@@ -426,9 +427,7 @@
    catch (err) {
        Bounce.rethrow(err, 'system');

-        if (request._isPayloadPending) {
-            await internals.drain(request);
-        }
+        await internals.drain(request);
Contributor:
Do we have a guarantee that there's something to drain? I feel that this function is too brittle; should we use stream.finished() to make sure all scenarios are covered?

Contributor Author:
That is not a concern introduced by this PR, right?

I tend to agree. The current drain() logic requires the stream to be active in order to work; an already closed stream would stall forever.

I would be wary of relying on stream.finished(), as I have seen streams that cause issues with it in the past.
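
For context, a minimal sketch of what the stream.finished()-based guard suggested above could look like. This is illustrative only, not hapi's Streams.drain(); the safeDrain name is invented, and it assumes a plain Node.js readable such as request.raw.req:

```js
const Stream = require('stream');

// Hypothetical drain helper: resolve immediately when there is nothing left
// to consume; otherwise discard the remaining data and wait for the stream
// to finish, so an already ended or destroyed stream cannot stall the call.
const safeDrain = function (stream) {

    return new Promise((resolve) => {

        if (stream.readableEnded || stream.destroyed) {
            return resolve();                           // Already consumed or closed; nothing to drain
        }

        stream.resume();                                // Discard any remaining payload bytes
        Stream.finished(stream, () => resolve());       // Resolves on end, error, or close
    });
};
```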

Contributor:
Not from this PR, no, but removing that conditional made me look further into whether it was solid, so I hope there's something to consume there; otherwise it would indeed stall.

Contributor Author (@kanongil, Apr 10, 2024):
FYI, draining might not be the best approach, but I retained the current behaviour.

The alternative is to just .destroy() the stream, which should work fine, but for HTTP/1 it will also destroy the connection. Draining is more graceful, but can needlessly use up bandwidth. It works best for small pending payloads.
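
To make that trade-off concrete, a rough sketch (not code from this PR; `req` stands for the raw http.IncomingMessage and both helper names are invented):

```js
const Stream = require('stream');

// Graceful option: read and discard the rest of the payload so the HTTP/1
// connection remains reusable, at the cost of receiving bytes that are
// never used.
const discardGracefully = (req) => {

    req.resume();
    return new Promise((resolve) => Stream.finished(req, () => resolve()));
};

// Abrupt option: stop reading immediately. Cheap, but for HTTP/1 this also
// tears down the underlying socket, so the connection cannot be reused.
const discardAbruptly = (req) => {

    req.destroy();
};
```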


        request.mime = err.mime;
        request.payload = null;
@@ -442,8 +441,15 @@ internals.drain = async function (request) {

    // Flush out any pending request payload not consumed due to errors

-    await Streams.drain(request.raw.req);
-    request._isPayloadPending = false;
+    if (request._expectContinue) {
+        request._isPayloadPending = false;          // If we don't continue, client should not send a payload
+        request._expectContinue = false;
+    }
+
+    if (request._isPayloadPending) {
+        await Streams.drain(request.raw.req);
+        request._isPayloadPending = false;
+    }
};


118 changes: 118 additions & 0 deletions test/payload.js
@@ -7,6 +7,7 @@ const Net = require('net');
const Path = require('path');
const Zlib = require('zlib');

const Boom = require('@hapi/boom');
const Code = require('@hapi/code');
const Hapi = require('..');
const Hoek = require('@hapi/hoek');
@@ -309,6 +310,123 @@ describe('Payload', () => {
await server.stop();
});

it('does not continue on errors before payload processing', async () => {

const server = Hapi.server();
server.route({ method: 'POST', path: '/', handler: (request) => request.payload });
server.ext('onPreAuth', (request, h) => {

throw new Boom.forbidden();
});

await server.start();

const client = Net.connect(server.info.port);

await Events.once(client, 'connect');

client.write('POST / HTTP/1.1\r\nexpect: 100-continue\r\nhost: host\r\naccept-encoding: gzip\r\n' +
'content-type: application/json\r\ncontent-length: 14\r\nConnection: close\r\n\r\n');

let continued = false;
const lines = [];
client.setEncoding('ascii');
for await (const chunk of client) {

if (chunk.startsWith('HTTP/1.1 100 Continue')) {
client.write('{"hello":true}');
continued = true;
}
else {
lines.push(...chunk.split('\r\n'));
}
}

const res = lines.shift();

expect(res).to.equal('HTTP/1.1 403 Forbidden');
expect(continued).to.be.false();

await server.stop();
});

it('handles expect 100-continue on undefined routes', async () => {

const server = Hapi.server();
await server.start();

const client = Net.connect(server.info.port);

await Events.once(client, 'connect');

client.write('POST / HTTP/1.1\r\nexpect: 100-continue\r\nhost: host\r\naccept-encoding: gzip\r\n' +
'content-type: application/json\r\ncontent-length: 14\r\nConnection: close\r\n\r\n');

let continued = false;
const lines = [];
client.setEncoding('ascii');
for await (const chunk of client) {

if (chunk.startsWith('HTTP/1.1 100 Continue')) {
client.write('{"hello":true}');
continued = true;
}
else {
lines.push(...chunk.split('\r\n'));
}
}

const res = lines.shift();

expect(res).to.equal('HTTP/1.1 404 Not Found');
expect(continued).to.be.false();

await server.stop();
});

it('does not continue on custom request.payload', async () => {

const server = Hapi.server();
server.route({ method: 'POST', path: '/', handler: (request) => request.payload });
server.ext('onRequest', (request, h) => {

request.payload = { custom: true };
return h.continue;
});

await server.start();

const client = Net.connect(server.info.port);

await Events.once(client, 'connect');

client.write('POST / HTTP/1.1\r\nexpect: 100-continue\r\nhost: host\r\naccept-encoding: gzip\r\n' +
'content-type: application/json\r\ncontent-length: 14\r\nConnection: close\r\n\r\n');

let continued = false;
const lines = [];
client.setEncoding('ascii');
for await (const chunk of client) {

if (chunk.startsWith('HTTP/1.1 100 Continue')) {
client.write('{"hello":true}');
continued = true;
}
else {
lines.push(...chunk.split('\r\n'));
}
}

const res = lines.shift();
const payload = lines.pop();

expect(res).to.equal('HTTP/1.1 200 OK');
expect(payload).to.equal('{"custom":true}');
expect(continued).to.be.false();

await server.stop();
});

it('peeks at unparsed data', async () => {

let data = null;