feature(api): an API to upload large in-memory data which returns a datamap address #1598

Open
happybeing (Contributor) opened this issue Apr 10, 2024 · 0 comments

I've been looking for ways to upload data other than from disk:

  • Data < 1MB: FilesApi::get_local_payment_and_upload_chunk() will pay for and store a single chunk (< ~1MB).
    I'm not sure this API is ready for external use yet, because I didn't see any validation of the data size (see the sketch after this list).

  • Data > 1MB: I'm not aware of an API that will store in-memory data larger than a single Chunk (other than by reading it from a file).
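
For the small-data case the caller currently has to do its own size check. Here is a minimal sketch of what that could look like. It is only an illustration: the MAX_CHUNK_SIZE constant (from self_encryption) and the Chunk::new() constructor (from sn_protocol) are my assumptions about the current crates and may not match them exactly.

```rust
use bytes::Bytes;
use eyre::{bail, Result};
use self_encryption::MAX_CHUNK_SIZE; // assumed constant, ~1 MiB in the crate I looked at
use sn_protocol::storage::Chunk;     // assumed constructor below

/// Sketch only: wrap an in-memory blob as a single Chunk, refusing anything
/// larger than the single-chunk limit, since I didn't see this validated
/// inside get_local_payment_and_upload_chunk() itself.
fn single_chunk_from_bytes(data: Bytes) -> Result<Chunk> {
    if data.len() > MAX_CHUNK_SIZE {
        bail!("{} bytes is too large for a single chunk", data.len());
    }
    Ok(Chunk::new(data))
}
```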

Missing APIs
I believe there's a need for new APIs that encrypt and upload in-memory data which is too large for a single chunk. For example, equivalents of the following that operate on a block of data rather than a file or file tree (see the sketch below the list):

  • FilesUploader::start_upload()
  • ChunkManager::chunk_file()
  • encrypt_large()
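
To illustrate the shape of what I'm asking for, here's a rough sketch. It is not an existing API: the function name and the way I derive chunk addresses are my guesses, and whether that matches how the network names chunks is an assumption. It is built on self_encryption::encrypt(), which as far as I can tell already works on in-memory Bytes.

```rust
use bytes::Bytes;
use eyre::Result;
use self_encryption::{encrypt, DataMap, EncryptedChunk};
use xor_name::XorName;

/// Sketch only: self-encrypt an in-memory blob and pair each encrypted chunk
/// with an address derived from its content, ready for an uploader to pay for
/// and store. A real API would also store the datamap (or a chunk wrapping it)
/// and return its address, which is what this issue is asking for.
fn encrypt_bytes_for_upload(data: Bytes) -> Result<(DataMap, Vec<(XorName, EncryptedChunk)>)> {
    // encrypt() accepts in-memory Bytes, so no temporary file is needed.
    // It rejects inputs below a small minimum size, so tiny blobs would
    // still need to go through the single-chunk path above.
    let (data_map, chunks) = encrypt(data)?;

    // Assumption: a chunk's network address is derived from its encrypted content.
    let addressed = chunks
        .into_iter()
        .map(|chunk| (XorName::from_content(&chunk.content), chunk))
        .collect();

    Ok((data_map, addressed))
}
```

An uploader equivalent to FilesUploader could then pay for and store the returned chunks, store the datamap (or keep it private), and hand its address back to the caller.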

Doing this effectively is not simple, so without such APIs developers will probably write their data to a file and upload that, which is not secure and risks leaking private information.
