
Memory overload when downloading large files. #27

Open
aplekhanov opened this issue Nov 14, 2017 · 11 comments

@aplekhanov

aplekhanov commented Nov 14, 2017

Hello,

I keep challenging your great product and recently found that when I try to download a large file (~150 MB, as in the screenshot below) from the server, the debug navigator shows a huge impact on device memory. Moreover, it multiplies with the number of simultaneous downloads.

Well, is this a bug or an architectural pitfall? I mean, is it possible to fix it?

Thanks and happy coding!

[screenshot: screen shot 2017-11-14 at 8 47 21 pm]

@aplekhanov
Author

Just start an unsecured server with .serveDirectory on an iPad and fetch a file with URLSession from 'http://localhost:9000/big.pdf'. Same picture on macOS. It seems to load the whole file into memory for every download session.
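In code, the reproduction looks roughly like this (a minimal sketch; Telegraph's `serveDirectory(_:_:)` and `start(port:)` are assumed to behave as in its README, and the served directory and file name are just placeholders):

```swift
import Telegraph

// Server side: serve a directory over plain HTTP on port 9000.
let server = Server()
server.serveDirectory(FileManager.default.temporaryDirectory, "/")
try server.start(port: 9000)

// Client side: download a large file and watch memory in the debug navigator.
// The spike described here happens on the server, which reads the whole file
// into memory at once before responding.
let url = URL(string: "http://localhost:9000/big.pdf")!
URLSession.shared.dataTask(with: url) { data, _, _ in
  print("Downloaded \(data?.count ?? 0) bytes")
}.resume()
```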

@yvbeek
Member

yvbeek commented Nov 16, 2017

Hi @aplekhanov, yes, big files are fully loaded into memory. The implementation of the file handler is pretty basic.

Unfortunately, to properly send big files to the client you need chunked transfer encoding, which I haven't implemented in the HTTP classes. I'm not sure how much work it would be to add to the framework; I haven't explored the topic in much detail yet.

Perhaps I can have a look at it over the weekend.
Or if you want to show off your coding skills, pull requests are most welcome 😉 😄
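For context, this is what chunked transfer encoding looks like on the wire (standard HTTP/1.1, nothing Telegraph-specific): the body goes out as a series of size-prefixed chunks (sizes in hex) instead of one Content-Length block, so the server never has to hold the whole file at once. The sizes and payloads below are placeholders:

```
HTTP/1.1 200 OK
Content-Type: application/pdf
Transfer-Encoding: chunked

2000
<first 8192 bytes of the file>
2000
<next 8192 bytes of the file>
0

```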

@aplekhanov
Author

aplekhanov commented Nov 16, 2017

Hi @Zyphrax,
Yeah, I already found that part of the code in HTTPFileHandler: `response.body = try Data(contentsOf: fileURL)`. I guess there would need to be an NSFileHandle to read chunked data from the file... and some special HTTP header for that case... right? Very tempting 😈
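Something along these lines, perhaps (plain Foundation FileHandle, not existing Telegraph API; the function name and chunk size are just placeholders, and how each chunk would reach the socket is exactly the open question):

```swift
import Foundation

// Read the file in fixed-size pieces instead of `Data(contentsOf:)`,
// so only one chunk is held in memory at a time.
func readInChunks(fileURL: URL, chunkSize: Int = 64 * 1024,
                  handler: (Data) -> Void) throws {
  let file = try FileHandle(forReadingFrom: fileURL)
  defer { file.closeFile() }

  while true {
    let chunk = file.readData(ofLength: chunkSize)
    if chunk.isEmpty { break }
    handler(chunk) // e.g. write this piece to the socket, then release it
  }
}
```

The matching header would be `Transfer-Encoding: chunked` instead of a `Content-Length`.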

But right now I need to finish A LOT of other interesting work. Let's say it's not high priority; the most important thing is that it could be implemented by you, by me, or by someone else in the future 🍻

@yvbeek
Member

yvbeek commented Oct 29, 2018

I've just pushed a commit that adds support for range requests on files. That might help in some cases; for example, it allows streaming video on iOS. It contains FileHandle code to read parts of files.
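To illustrate the idea (this is not the actual handler code, just the general FileHandle approach to serving a `Range: bytes=start-end` request without loading the whole file):

```swift
import Foundation

// Serve a single byte range by seeking into the file.
// The function name and parameters are placeholders for illustration.
func readRange(fileURL: URL, start: UInt64, end: UInt64) throws -> Data {
  let file = try FileHandle(forReadingFrom: fileURL)
  defer { file.closeFile() }

  file.seek(toFileOffset: start)
  return file.readData(ofLength: Int(end - start + 1))
}
```

The response then goes out as `206 Partial Content` with a matching `Content-Range` header, which is what lets a video player seek around in the file.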

However, this does not resolve the memory issue yet. To fix that we need to build in proper streaming, for example by adding something like a `bodyStream` variable to HTTPResponse.

I will probably add that later on. The problem is that the socket layer (CocoaAsyncSocket) makes it difficult to implement this. I'll probably have to replace that first.

@aplekhanov
Author

> I will probably add that later on. The problem is that the socket layer (CocoaAsyncSocket) makes it difficult to implement this. I'll probably have to replace that first.

Great to hear! Have you dug into the new Network framework?

@yvbeek
Member

yvbeek commented Oct 29, 2018

> Great to hear! Have you dug into the new Network framework?

Yeah, I'm working on a wrapper for the Apple Network framework and a DispatchIO implementation for iOS 11 and lower versions. The SSL implementation is going to be a nightmare, but I'll try 😅
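For anyone curious, the Apple Network framework side looks roughly like this (an illustrative NWListener accepting TCP connections on iOS 12+/macOS 10.14+; this is not the Telegraph wrapper being discussed, just the API it would sit on top of):

```swift
import Network

// Listen for plain TCP connections on port 9000 and report how much data arrives.
let listener = try NWListener(using: .tcp, on: 9000)
listener.newConnectionHandler = { connection in
  connection.start(queue: .global())
  connection.receive(minimumIncompleteLength: 1, maximumLength: 64 * 1024) { data, _, _, _ in
    print("Received \(data?.count ?? 0) bytes")
  }
}
listener.start(queue: .main)
```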

@JUSTINMKAUFMAN

@Zyphrax I am interested to hear whether or not you decided to pursue this. Not an urgent feature for me, but it would be nice to have 😄.

@yvbeek
Member

yvbeek commented Jul 19, 2019

Hi @JUSTINMKAUFMAN. Darwin sockets are quite complex and especially the SecureTransport SSL layer is going to be a challenge. I'm still working on it, but haven't had much time to spend on it.

As soon as I have a unified network layer, I'll dive into proper buffering for large files.

@shalom-aviv

shalom-aviv commented Apr 26, 2021

> Hi @JUSTINMKAUFMAN. Darwin sockets are quite complex and especially the SecureTransport SSL layer is going to be a challenge. I'm still working on it, but haven't had much time to spend on it.
>
> As soon as I have a unified network layer, I'll dive into proper buffering for large files.

Hey,

Have you made any progress on this issue?

PS
Telegraph is great work!
Thank you

@blaineam

Welp, I ran into a roadblock for one of my apps, thinking I could do something like this feature request to stream an on-device decrypted mp4 file. I can decrypt the file, but because the response has to wait for the whole file to be processed, the app feels super slow. The ability to write to the output as it decrypts chunks of an mp4 file would be incredible, because the video could start playing while decryption is still occurring. I'm not sure how I could write to the response piece by piece, but that functionality would be phenomenal.
