
cleanRequestFiles doesn't clean file when file size is reached #470

Open · 2 tasks done
ThallesP opened this issue Jul 31, 2023 · 10 comments

Labels
need info More information is needed to resolve this issue

Comments
ThallesP commented Jul 31, 2023

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.21.0

Plugin version

7.7.3

Node.js version

18.16.0

Operating system

Linux

Operating system version (e.g. 20.04, 11.3, 10)

Ubuntu 22.04.2 LTS (wsl2)

Description

When cleanRequestFiles is called after the "request file too large" error occurs, it doesn't clean the temp file. As a suggestion, this file should be removed automatically when it exceeds the max size.

Steps to Reproduce

Set the fileSize limit, for example:

await request.saveRequestFiles({
  limits: {
    fileSize: 1, // 1 byte
  },
})

and after the error is thrown, call cleanRequestFiles.
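
For completeness, a minimal end-to-end sketch of the reproduction (assuming a standard @fastify/multipart registration; the route path and port are arbitrary):

const fastify = require('fastify')()
const multipart = require('@fastify/multipart')

fastify.register(multipart)

fastify.post('/upload', async (request, reply) => {
  try {
    await request.saveRequestFiles({
      limits: {
        fileSize: 1, // 1 byte, so any real upload exceeds the limit
      },
    })
  } catch (err) {
    // The "request file too large" error lands here; per this report,
    // the partially written temp file survives the cleanup call below.
    await request.cleanRequestFiles()
    throw err
  }
  return { uploaded: true }
})

fastify.listen({ port: 3000 })

Posting any multipart upload to /upload and then inspecting os.tmpdir() should show whether a temp file was left behind.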

Expected Behavior

The temp file should be removed.

ThallesP (Author) commented Jul 31, 2023

Also, it seems that the file is saved first and only then is the file size limit checked. It's a shame that I can't use this library anymore because of such a little thing.

mcollina (Member) commented Aug 1, 2023

Thanks for reporting! Would you like to send a Pull Request to address this issue? Remember to add unit tests.

mcollina added the "bug" (Confirmed bug) label Aug 1, 2023
gurgunday (Member) commented

> When calling cleanRequestFiles after the error "request file too large" happens

I haven't reproduced it yet, but isn't the requestFileTooLarge error thrown while a file is being consumed and before it's even saved? So at first glance, I can't see where the issue could be.

gurgunday (Member) commented

Can you please share more details? The files aren’t saved before they are fully consumed, so there should be no reason to clean up anything

fastify-multipart/index.js

Lines 411 to 422 in 8e4b5e5

    files = await this.files(options)
  }
  this.savedRequestFiles = []
  const tmpdir = (options && options.tmpdir) || os.tmpdir()
  this.tmpUploads = []
  for await (const file of files) {
    const filepath = path.join(tmpdir, generateId() + path.extname(file.filename))
    const target = createWriteStream(filepath)
    try {
      await pump(file.file, target)
      this.savedRequestFiles.push({ ...file, filepath })
      this.tmpUploads.push(filepath)

I don’t think there are any errors in the current implementation
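
For context, the cleanup side works off that same list. A simplified paraphrase of cleanRequestFiles (not the verbatim source): it unlinks every path recorded in this.tmpUploads, so a file that was never pushed there is invisible to it.

const { unlink } = require('node:fs/promises')

// Paraphrased sketch: remove every temp file recorded during
// saveRequestFiles. Paths that were never pushed to this.tmpUploads
// cannot be cleaned up by this loop.
async function cleanRequestFiles () {
  if (!this.tmpUploads) {
    return
  }
  for (const filepath of this.tmpUploads) {
    try {
      await unlink(filepath)
    } catch (err) {
      this.log.error(err, 'could not delete file')
    }
  }
}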

gurgunday added the "need info" (More information is needed to resolve this issue) label and removed the "bug" label Jan 28, 2024
ThallesP (Author) commented

Hi, I'll be using fastify-multipart sometime in the next month; I can provide more information then, as I don't currently have any active projects that use it.
*Just a heads up that I saw the replies.

yaneony commented Feb 23, 2024

I'm uncertain, but my experience aligns with the original poster's. After setting up an upload field with { fileSize: 1 }, I attempted to upload a debian.iso file of roughly 3.7 GB. The process lasted several minutes, depending on connection speed, before an error message appeared. It seems as though the entire file is processed before the error notification.

Uzlopak (Contributor) commented Feb 23, 2024

Because we stream the file into nirvana so that we can send the response that the file is too big?

An alternative would be to destroy the underlying socket once a specific threshold is met?
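
As a rough sketch of that idea (assuming the part's file stream is the underlying busboy stream, which emits 'limit' once fileSize bytes have been read; the limit value here is a placeholder):

const part = await request.file({ limits: { fileSize: 1024 * 1024 } }) // placeholder threshold

part.file.on('limit', () => {
  // busboy fires 'limit' when the fileSize cap is hit mid-stream;
  // destroying the raw request stops the client from sending the rest,
  // at the cost of an abrupt connection error on the client side.
  request.raw.destroy()
})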

yaneony commented Feb 23, 2024

Then this comment (#470 (comment)) is completely wrong, since the error is thrown not while the file is being consumed, but after the entire file has been consumed. Destroying the socket after a specific threshold, by contrast, would act while the file is still being consumed.

Uzlopak (Contributor) commented Feb 23, 2024

It's my assumption; I'm not claiming that the comment is wrong.

Yeah, of course destroying the socket has the drawback of a bad user experience, because you won't know why the connection was interrupted. But better to destroy the socket than to clog the server's bandwidth.

So we have to investigate further what the actual behavior is.

gurgunday (Member) commented Feb 23, 2024

> Then this comment is completely wrong

It might depend on the configuration, which is why I asked for more information...

But don't take my word for it:

We consume the stream on line 411 before iterating over the files and saving them starting on line 416; line 411 is where the error should normally be thrown.

fastify-multipart/index.js

Lines 398 to 430 in e2aaf59

async function saveRequestFiles (options) {
  // Checks if this has already been run
  if (this.savedRequestFiles) {
    return this.savedRequestFiles
  }
  let files
  if (attachFieldsToBody === true) {
    // Skip the whole process if the body is empty
    if (!this.body) {
      return []
    }
    files = filesFromFields.call(this, this.body)
  } else {
    files = await this.files(options)
  }
  this.savedRequestFiles = []
  const tmpdir = (options && options.tmpdir) || os.tmpdir()
  this.tmpUploads = []
  for await (const file of files) {
    const filepath = path.join(tmpdir, generateId() + path.extname(file.filename))
    const target = createWriteStream(filepath)
    try {
      await pump(file.file, target)
      this.savedRequestFiles.push({ ...file, filepath })
      this.tmpUploads.push(filepath)
    } catch (err) {
      this.log.error({ err }, 'save request file')
      throw err
    }
  }
  return this.savedRequestFiles
}

I will check again, though, if you can provide a reproducible example.
