Replies: 1 comment 1 reply
-
Depends on how much you would like to do yourself, but I would start off with something like this dependency-free variant for limiting the concurrency:

```js
import fs from 'node:fs'

function getFileName (url) {
  const pathname = new URL(url).pathname
  const index = pathname.lastIndexOf('/')
  return ~index ? pathname.substring(index + 1) : pathname
}

async function download (iterator) {
  for (const url of iterator) {
    try {
      const res = await fetch(url)
      if (!res.ok) {
        // not a 2xx response code... skip this URL
        continue
      }
      const filename = getFileName(url)
      const file = fs.createWriteStream(filename)
      for await (const chunk of res.body) file.write(chunk)
      file.end()
    } catch (err) {
      // handle err
    }
  }
}

const urls = [
  'https://www.example.com/foo.png',
  'https://www.example.com/bar.gif',
  'https://www.example.com/baz.jpg'
  // ... many more (~25k)
]

const iterator = urls.values()
const workers = new Array(2).fill(iterator).map(download)
// ^--- starts two workers sharing the same iterator
Promise.allSettled(workers).then(() => console.log('done'))
```

based on this: https://stackoverflow.com/a/51020535/1008999
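The key idea above is that every worker pulls from the same iterator, so each URL is handed out exactly once and the concurrency equals the number of workers. A minimal sketch of just that mechanism, with no network I/O (the `await` here is a stand-in for the real fetch-and-write step):

```javascript
// Each worker pulls the next item from the shared iterator,
// so no two workers ever process the same item.
async function worker (iterator, results) {
  for (const item of iterator) {
    // stand-in for the real download work; yields to the event loop
    await new Promise(resolve => setImmediate(resolve))
    results.push(item)
  }
}

async function main () {
  const items = ['a', 'b', 'c', 'd', 'e']
  const iterator = items.values() // ONE shared iterator
  const results = []
  // two workers draw from the same iterator concurrently
  const workers = new Array(2).fill(iterator).map(it => worker(it, results))
  await Promise.allSettled(workers)
  console.log(results.length) // prints 5: every item handled exactly once
}

main()
```

This works because `for...of` on an array iterator reuses that same iterator object, so calls to `next()` from both workers interleave without duplicating items. To raise the concurrency, only the `new Array(2)` number changes.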
-
Hey,
I have an array with around 25,000 files I need to download from another server. I'm not sure how I can implement such functionality using node-fetch.
My questions are: how can I limit the number of concurrent downloads, and how can I console.log as soon as a download failed or succeeded? This is what I have so far:
Could anyone give a code example of how to implement this?
Thanks.
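The "log as soon as a download failed or succeeded" part can be sketched by wrapping each download in a try/catch and logging in both branches. Here `downloadOne` is a hypothetical stand-in for the actual fetch-and-save logic, not a real API:

```javascript
// Hypothetical stand-in for the real fetch-and-save logic.
async function downloadOne (url) {
  if (url.includes('bad')) throw new Error('simulated failure')
  return url
}

// Logs immediately when each individual download settles,
// and returns a result object instead of rethrowing.
async function downloadWithLogging (url) {
  try {
    await downloadOne(url)
    console.log(`succeeded: ${url}`)
    return { url, ok: true }
  } catch (err) {
    console.log(`failed: ${url} (${err.message})`)
    return { url, ok: false }
  }
}
```

Calling `downloadWithLogging` inside each worker's loop gives per-file logging without aborting the other downloads when one fails.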