
define a global maximum concurrent request limit #640

Open
mtt-artis opened this issue Oct 30, 2023 · 3 comments

Comments

@mtt-artis

Is your feature request related to a problem? Please describe.
In various use cases, I have encountered a challenge with multiple concurrent uploads initiated by the tus-js-client library. This issue has resulted in suboptimal network resource management, leading to inefficient bandwidth allocation and adversely affecting the performance of other critical network operations.

Describe the solution you'd like
I would like to propose a global maximum concurrent request feature within the tus-js-client library. This feature would allow developers to set an upper limit on the number of concurrent uploads permitted across all instances of tus.Upload within a single application. It would ensure that the library operates within the desired bandwidth constraints, preventing network congestion and improving overall network performance.

const tus = require('tus-js-client');

// Set the global maximum concurrent uploads limit.
// For simplicity 1 upload = 1 request in this example
tus.setMaxConcurrentRequests(3); // Allow a maximum of 3 concurrent uploads.

// Create tus.Upload instances and start uploads as usual.
const upload1 = new tus.Upload(file1, { endpoint: 'https://example.com/upload' });
const upload2 = new tus.Upload(file2, { endpoint: 'https://example.com/upload' });
const upload3 = new tus.Upload(file3, { endpoint: 'https://example.com/upload' });
const upload4 = new tus.Upload(file4, { endpoint: 'https://example.com/upload' });

upload1.start(); // Starts the first upload.
upload2.start(); // Starts the second upload.
upload3.start(); // Starts the third upload.

// Since the maximum concurrent uploads limit is set to 3, the fourth request
// will be queued and will start once one of the existing requests is completed.
upload4.start(); // Queues the fourth request.

Describe alternatives you've considered
An alternative approach is to upload files one at a time, initiating the next upload using the onError and onSuccess hooks. This ensures controlled, sequential uploads, preventing network congestion and providing predictable resource management.
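A minimal sketch of that sequential alternative (the `uploadSequentially` and `createUpload` names are mine, not tus-js-client API): each file's upload is started from the previous one's onSuccess/onError callback.

```javascript
// Upload files strictly one at a time by chaining the next upload in
// the onSuccess/onError callbacks of the previous one.
//
// `createUpload(file, callbacks)` must return an object with a .start()
// method and invoke callbacks.onSuccess/onError when the upload ends;
// with tus-js-client it would wrap `new tus.Upload(file, {...})`.
function uploadSequentially(files, createUpload) {
  let index = 0;
  function startNext() {
    if (index >= files.length) return; // all files handled
    const file = files[index++];
    createUpload(file, {
      onSuccess: startNext, // start the next file once this one completes
      onError: (err) => {
        console.error('upload failed:', err);
        startNext(); // keep going with the remaining files
      },
    }).start();
  }
  startNext();
}

// With tus-js-client, a createUpload factory could look like:
//
//   const createUpload = (file, callbacks) =>
//     new tus.Upload(file, {
//       endpoint: 'https://example.com/upload',
//       ...callbacks,
//     });
```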

Additional context
Applications must guarantee consistent, shared network quality of service. By preventing network congestion, it ensures an equitable distribution of resources, safeguarding the performance of both uploads and other vital network operations. I would greatly appreciate the inclusion of this feature within the tus-js-client library.

@Acconut
Member

Acconut commented Oct 31, 2023

Thank you for bringing this up and describing the problem in great detail. While I understand the problem, I think that such a concurrency limit should rather be implemented by the user. As you mentioned, one tus upload has a maximum of 1 concurrent request (unless you are using the parallelUploads option). Therefore, one can control the number of concurrent requests by managing how many uploads are started in parallel. Such a limit can easily be implemented using a queue or semaphore. The advantage of implementing a request limit outside of tus-js-client is that the user receives more feedback about the upload's state: for example, when an upload is not started because the concurrency limit is reached, the UI can be updated to reflect this fact. If we implemented a limit inside tus-js-client, the user would just see an upload that is not progressing.
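One way to sketch such a user-side queue (the `createUploadQueue` name and shape are mine, not part of tus-js-client): a tiny scheduler that keeps at most `limit` uploads running and queues the rest, so the application always knows which uploads are waiting.

```javascript
// A tiny upload queue that keeps at most `limit` uploads running.
// enqueue() takes a function that starts an upload and returns a
// Promise resolving when that upload finishes (e.g. by wrapping
// tus-js-client's onSuccess/onError callbacks in a Promise).
function createUploadQueue(limit) {
  let running = 0;
  const pending = [];
  function next() {
    if (running >= limit || pending.length === 0) return;
    running++;
    const { startUpload, resolve, reject } = pending.shift();
    startUpload()
      .then(resolve, reject)
      .finally(() => {
        running--;
        next(); // a slot freed up: start the next queued upload
      });
  }
  return {
    enqueue(startUpload) {
      return new Promise((resolve, reject) => {
        pending.push({ startUpload, resolve, reject });
        next();
      });
    },
  };
}

// With tus-js-client, enqueuing an upload could look like:
//
//   queue.enqueue(() => new Promise((resolve, reject) => {
//     const upload = new tus.Upload(file, {
//       endpoint: 'https://example.com/upload',
//       onSuccess: resolve,
//       onError: reject,
//     });
//     upload.start();
//   }));
```

Because the queue lives in application code, its `pending` list can directly drive UI state ("waiting" vs. "uploading"), which is exactly the feedback a library-internal limit would hide.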

Furthermore, if you actually want to limit the number of requests (and not the number of uploads), you can implement this right now using the onBeforeRequest option. It can also return a Promise, and by resolving that Promise only once the concurrency limit allows another request, you can implement the desired functionality.
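A sketch of that request-level approach (the `createRequestLimiter` name and its internals are mine; onBeforeRequest and onAfterResponse are the tus-js-client hooks it would plug into):

```javascript
// Global request-level limiter shared by all uploads: acquire() resolves
// immediately while fewer than `max` requests are in flight, otherwise it
// queues until release() hands the slot over.
function createRequestLimiter(max) {
  let active = 0;
  const waiting = [];
  return {
    acquire() {
      if (active < max) {
        active++;
        return Promise.resolve();
      }
      return new Promise((resolve) => waiting.push(resolve));
    },
    release() {
      const next = waiting.shift();
      if (next) next(); // pass the slot directly to the next waiter
      else active--;
    },
  };
}

const limiter = createRequestLimiter(3);

// With tus-js-client, the limiter would plug into the per-request hooks:
//
//   const upload = new tus.Upload(file, {
//     endpoint: 'https://example.com/upload',
//     onBeforeRequest: () => limiter.acquire(), // wait for a free slot
//     onAfterResponse: () => limiter.release(), // free the slot afterwards
//   });
```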

All in all, I think that there are better alternatives than having this functionality in tus-js-client.

@mtt-artis
Author

Hi @Acconut,
thanks for the reply.

As you mentioned, one tus upload has a maximum of 1 concurrent request (unless you are using the parallelUploads option).

I think I was fooled by the dev tools where I can see 4 pending requests at once for one upload.

const upload = new Upload(file, {
  endpoint: "/api/files",
  chunkSize: 1_000_000,
  retryDelays: [0, 1000, 3000, 5000],
  metadata: {
    filename: file.name,
    filetype: file.type,
    lastModified: file.lastModified.toString(),
  },
});
upload.start();

[screenshot: browser dev tools showing multiple pending requests for a single upload]

It can also return a Promise and by resolving that Promise whenever the concurrency limit allows another request.

do you have a repository with such functionality to show me?

@Acconut
Member

Acconut commented Nov 22, 2023

I think I was fooled by the dev tools where I can see 4 pending requests at once for one upload.

If you call upload.start() once, there should only ever be one PATCH request at a time. Any other behavior would indicate a bug. Can you reproduce the multiple PATCH requests? Do you call start multiple times?

do you have a repository with such functionality to show me?

No, unfortunately I do not have an example showcasing this.
