Replies: 3 comments 1 reply
-
I was about to start writing about something completely different (about how it's better to just use blobs backed by the filesystem), but then I thought 64KB was nothing, so small that it surely would not matter. Then I investigated it further and tried to reproduce it (preferably without `fetch`, if possible). I managed to reproduce it with just:

```js
import { Blob as fetchBlob } from 'node-fetch'
import { Blob } from 'buffer'

const buf = Buffer.from('a'.repeat(64 * 1024 + 2))
const blob = new Blob([buf])
const file = new fetchBlob([blob])

for await (const c of file.stream()) {
  console.log(c)
}
```

(Therefore I'm going to close this, as the issue does not belong here, but feel free to keep commenting.)

This is more or less what's going on under the hood. Readable web byte streams have a step where they detach the underlying ArrayBuffer when you use:

```js
new ReadableStream({
  type: 'bytes',
  async pull (ctrl) {
    ctrl.enqueue(uint8array)
  }
})
```

And once the buffer is detached it can no longer be read from; that's the root cause of this error:

```js
import { Blob } from 'buffer'

const buf = Buffer.from('a'.repeat(64 * 1024 + 2))
const blob = new Blob([buf])

for await (const c of blob.stream()) {
  console.log(c)
}
```
all looks good ✅

```js
for await (const c of blob.stream()) {
  console.log(c.buffer) // log the ArrayBuffer
}
```

that does not look right… ❌

```
ArrayBuffer {
  [Uint8Contents]: <61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 ... 65438 more bytes>,
  byteLength: 65538
}
ArrayBuffer {
  [Uint8Contents]: <61 61>,
  byteLength: 2
}
```

I took an extra look at the NodeJS source code:

```js
stream() {
  if (!isBlob(this))
    throw new ERR_INVALID_THIS('Blob');
  const self = this;
  return new lazyReadableStream({
    async start() {
      this[kState] = await self.arrayBuffer();
      this[kIndex] = 0;
    },
    pull(controller) {
      if (this[kState].byteLength - this[kIndex] <= kMaxChunkSize) {
        controller.enqueue(new Uint8Array(this[kState], this[kIndex]));
        controller.close();
        this[kState] = undefined;
      } else {
        controller.enqueue(new Uint8Array(this[kState], this[kIndex], kMaxChunkSize));
        this[kIndex] += kMaxChunkSize;
      }
    }
  });
}
```

The issue here is that it's using:

```js
controller.enqueue(new Uint8Array(this[kState], this[kIndex], kMaxChunkSize));
```

instead of doing something such as:

```js
controller.enqueue(new Uint8Array(this[kState].slice(start, end)));
```

Reusing the same underlying ArrayBuffer for every chunk is unexpected from a web spec perspective: as soon as a byte stream detaches that buffer for the first chunk, every later chunk is corrupted. So really this isn't an issue with node-fetch.

I'm also not a fan of:

```js
return new lazyReadableStream({
  async start() {
    this[kState] = await self.arrayBuffer();
```

as that is a bad memory hog: it allocates the whole ArrayBuffer up front instead of using some more stream-friendly variant. But I totally forgot about even posting these findings to NodeJS earlier. I have always been reluctant to use NodeJS's own Blob implementation.
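To make the detaching step concrete, here is a minimal sketch of my own (not Node's code; it assumes Node ≥ 18, where `ReadableStream` is a global): enqueuing a view into a `type: 'bytes'` stream transfers its underlying ArrayBuffer, so any other view sharing that buffer — exactly the situation `Blob.prototype.stream()` creates — is left pointing at an empty, detached buffer.

```javascript
// Two views over one shared ArrayBuffer, mimicking how Blob.stream()
// enqueues chunks that all point at the same this[kState] buffer.
const buf = new ArrayBuffer(8)
const firstHalf = new Uint8Array(buf, 0, 4)
const secondHalf = new Uint8Array(buf, 4, 4)

new ReadableStream({
  type: 'bytes',
  start (ctrl) {
    // byte-stream controllers transfer (detach) the chunk's buffer on enqueue
    ctrl.enqueue(firstHalf)
    ctrl.close()
  }
})

console.log(buf.byteLength)        // 0: the shared buffer is now detached
console.log(secondHalf.byteLength) // 0: the "next chunk" view is now empty
```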
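For comparison, a sketch of the `slice()` variant suggested above (my own illustration, with a tiny hypothetical chunk size): `slice()` copies the bytes, so every chunk owns a fresh ArrayBuffer and detaching one chunk cannot corrupt the others.

```javascript
const kMaxChunkSize = 4 // hypothetical tiny chunk size; Node's is 65536

// Copy-per-chunk variant: each Uint8Array gets its own backing buffer.
function * chunksOf (arrayBuffer) {
  for (let i = 0; i < arrayBuffer.byteLength; i += kMaxChunkSize) {
    yield new Uint8Array(arrayBuffer.slice(i, i + kMaxChunkSize))
  }
}

const source = new Uint8Array([1, 2, 3, 4, 5, 6]).buffer
const chunks = [...chunksOf(source)]

console.log(chunks.length)                          // 2
console.log(chunks[1])                              // Uint8Array(2) [ 5, 6 ]
console.log(chunks.every(c => c.buffer !== source)) // true: nothing shared
```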
-
There is a very simple solution to this issue: instead of using NodeJS's own Blob implementation, use ours:

```js
import fetch, { Blob } from 'node-fetch'
```

You can even import a File class and filesystem helpers:

```js
import fetch, { Blob, File, fileFrom, fileFromSync, FormData } from 'node-fetch'

const file = fileFromSync('./readme.md')
```

This 👆 is our recommendation for creating a blob/file backed by the filesystem: no data is held in memory until you actually start reading it.
-
Been running into this issue myself. Kudos on taking the time to debug this. This is an extremely cryptic error 😅.
-
It will still throw the error on:

- OS: Debian 11 amd64
- node: 18.10.0
- node-fetch: 3.3.0