
Put.io Download Client #6029

Draft

wants to merge 11 commits into develop
Conversation

michaelfeinbier

Database Migration

NO

Description

This is a first draft implementing a Put.io download client as described in #1972. I took the initial work from #1357 and fixed a lot of compile issues (as that one was pretty old). It's still a fair way from done, but I'm posting it here to gather feedback and maybe clarify some questions and ideas, as I am not very familiar with the .NET stack (but getting used to it 😉).

So Feedback welcome!

Todos

  • Finish implementing the proxy
  • Tests
  • Wiki Updates

Issues Fixed or Closed by this PR

@michaelfeinbier
Author

Hey y'all
I'm making quite good progress on this :) However, I need your input on one fundamental piece of functionality. Put.io works in such a way that finished files end up in a remote bucket, and they can be downloaded to a local directory via a URL that we get from the Put.io API. So in order for Sonarr to import the file into the library, we need to download these files first.

In general, I do see 2 options here

1) Download "on-demand"

Basically, while iterating over all transfers in IEnumerable<DownloadClientItem> GetItems(), we could check whether downloads that are finished on Put.io have already been downloaded locally and are not yet marked as imported. If a file is finished on the remote but not yet downloaded, we could initiate the download with

 await _httpClient.DownloadFileAsync(url, localPath);

However, these downloads might take a long time, and I don't think GetItems() is really a good place to be misused as a download manager 😉
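For reference, a rough sketch of what option 1 would look like inside the client. This is only an illustration of the concern above, not the actual Sonarr API: `PutioTransfer`, `_proxy`, `GetDownloadUrl`, and `MapToItem` are assumed names, and the blocking download call is exactly the problem being described.

```csharp
// Hypothetical sketch (assumed types/members) of downloading "on-demand"
// from within GetItems(). GetItems() is polled frequently by Sonarr and
// should return quickly, which is why blocking here is a bad fit.
public IEnumerable<DownloadClientItem> GetItems()
{
    foreach (PutioTransfer transfer in _proxy.GetTransfers(Settings))
    {
        var item = MapToItem(transfer);
        var localPath = Path.Combine(Settings.DownloadDirectory, transfer.Name);

        // Finished on the remote, but not yet present locally:
        // initiate the download before reporting the item.
        if (transfer.IsFinished && !File.Exists(localPath))
        {
            var url = _proxy.GetDownloadUrl(transfer.FileId, Settings);
            _httpClient.DownloadFile(url, localPath); // long-running, blocks the poll
        }

        item.OutputPath = new OsPath(localPath);
        yield return item;
    }
}
```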

2) Blackhole / rclone approach

The other option I see is to not handle downloading finished torrents from Put.io at all, and instead rely on third-party tools like rclone, which can mount or sync the remote to a local folder. This is how most Put.io "workarounds" operate, and it was also the approach in the original PR #1357. Here we would just "fake" the item.OutputPath to the predicted path based on the client's settings.
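As a concrete illustration of option 2, a typical rclone setup might look like the following (the remote name `putio:`, the `/sonarr` remote folder, and the local paths are assumptions; rclone does ship a native Put.io backend):

```
# One-time: create a Put.io remote (triggers the OAuth flow)
rclone config create putio putio

# Either mount the remote so Sonarr can import directly from it...
rclone mount putio:/sonarr /mnt/putio --read-only

# ...or periodically move finished downloads to a local folder
# (e.g. from cron), pointing the download client's path at /downloads/putio
rclone move putio:/sonarr /downloads/putio --delete-empty-src-dirs
```

With this approach, the client's "faked" OutputPath would just need to match wherever the mount or move target lives.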

Overall, I wanted a fully working client without the need for another tool, but I'm not sure solution 1) is the best way to go here 🤔
Do you maybe have other ideas for how we could implement the downloading part? Or do we think solution 2) is good enough?

@markus101
Member

Not relying on another service to handle the downloading would be nice, but that is effectively what we do for seedboxes already, so I don't think it's a huge deal to rely on that and consider doing something completely within Sonarr in the future. Option 1 in itself would be a pretty big undertaking to ensure it's functional and supports slower connections and/or interruptions.

@sankalpsingha

Hi all! Any update on this? Option 1 would be great, tbh!

@ZerithZA

Personally, either approach works for me: using rclone to mount Put.io and having Sonarr grab files from the mounted folder, or having rclone download them and Sonarr watch for when the fetch is done. The main problem I currently have is downloads that stall, and getting Sonarr to remove them and try a different release.

@paulirish

In general, I do see 2 options here

@michaelfeinbier So glad you took this on!

Perhaps obvious, but a challenge I see here is that with Put.io/seedbox-style services, users may want to either A) download the files locally or B) ~stream them from an rclone mount. I prefer downloading because I have the storage, and I trust that more than a reliable connection. But at this point, I bet more Put.io users are used to the stream style.

My evolved putio+*arr setup uses https://gitlab.com/paulirish/krantor configured as a torrent blackhole, plus an `rclone move` command in cron. As a result, the current PR with option 2 above offers roughly the same value as krantor, though the detailed download status is real nice!

It seems like option 1 ~= running a download manager inside a recurring checkActiveItemStatus handler, and that's less attractive than running an out-of-process, long-running utility to handle downloads, resuming, etc. (Is that right? Makes sense...)

Do any other seedboxy download clients try to offer options for users that either want to use the mounted location vs the post-download location?
