
Lambda size limit #3

Closed
amitaymolko opened this issue Dec 29, 2016 · 9 comments

@amitaymolko

What can we do to handle libraries that are too big for Lambda?
For example, scipy (164 MB) and numpy (77 MB) are almost at the 250 MB size limit for an unpacked Lambda deployment.

@dschep
Collaborator

dschep commented Dec 29, 2016

We (@unitedincome) are using both of those on a project with Serverless, but aren't using this plugin with it yet. We're using a custom solution and shipping the package compressed with tar -cvJ (xz compression). Getting scipy and numpy to work well with this plugin is on my list of things to do when I get time 😄
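
For reference, a minimal sketch of that xz packaging done with Python's standard tarfile module instead of the tar CLI (the build/ directory name is just an assumption, not part of our actual setup):

```python
import tarfile

# Equivalent of `tar -cvJ`: write an xz-compressed tarball of the build
# output. "build" is a hypothetical directory name for illustration.
with tarfile.open("package.tar.xz", "w:xz") as tar:
    tar.add("build", arcname=".")
```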

@amitaymolko
Author

Hi, great job!

Well, I think we should wrap the custom variables like so:

```yaml
custom:
  python-requirements:
    dockerizePip: true
    zipImport: true
```

Also, I see some issues:
Files:
requirements.txt
requirements.zip
test.py.txt

So requirements.txt contains numpy, but numpy doesn't appear in the .requirements.zip.
I attached my test.py file, which doesn't work because it can't load numpy.
Also, I noticed that the sizes for the libs are much smaller than I thought;
even in my virtualenv site-packages the sizes are much smaller than I previously had,
so I'm not sure why scipy previously appeared as 164 MB when it is now 70 MB in site-packages.
scipy also doesn't appear in the .requirements.zip.
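
For context, a minimal sketch of how the zipImport approach is supposed to work, assuming the packed archive sits next to the handler as .requirements.zip (the handler body is hypothetical):

```python
import os
import sys

# Put the packed requirements zip on sys.path so pure-Python packages can
# be loaded straight from it via Python's zipimport machinery.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), ".requirements.zip"))

import numpy  # fails if numpy isn't in the zip, or can't be loaded from one


def handler(event, context):
    return {"mean": float(numpy.mean([1, 2, 3]))}
```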

@dschep dschep mentioned this issue Jan 4, 2017
@dschep dschep added this to the 2.0.0 milestone Jan 4, 2017
@dschep
Collaborator

dschep commented Jan 4, 2017

@amitaymolko, what platform are you running on, and are you using the dockerizePip option?

@amitaymolko
Author

I'm running on OS X El Capitan.
I ran without the dockerizePip option.
I'll test this some more when I get the chance.
I'll also pull the last few changes.

@dschep dschep closed this as completed in #4 Feb 21, 2017
@dschep
Collaborator

dschep commented Feb 22, 2017

Reopening because I realized you can't zipimport shared objects, so my solution doesn't work with numpy or scipy 😢

ZIP import of dynamic modules (.pyd, .so) is disallowed.

https://docs.python.org/2/library/zipimport.html
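
A common workaround is to extract the zip to /tmp at cold start and import from the extracted directory instead; a minimal sketch (the /tmp/requirements path is an assumption, chosen for illustration):

```python
import os
import sys
import zipfile

# zipimport refuses dynamic modules (.pyd/.so), so rather than importing
# straight from the zip, extract it once per container to /tmp (the only
# writable location on Lambda) and import from there.
REQUIREMENTS_ZIP = os.path.join(os.path.dirname(__file__), ".requirements.zip")
EXTRACT_DIR = "/tmp/requirements"  # hypothetical path

if not os.path.isdir(EXTRACT_DIR):
    with zipfile.ZipFile(REQUIREMENTS_ZIP) as zf:
        zf.extractall(EXTRACT_DIR)

sys.path.insert(0, EXTRACT_DIR)

import numpy  # now resolved from the extracted directory, .so files included
```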

@dschep dschep reopened this Feb 22, 2017
@dschep dschep closed this as completed in 2341d7f Feb 22, 2017
@marcoromelli

Really like this plugin!
Are there any plans to improve on package size? I tried to package a function depending on numpy and tensorflow (with the dockerizePip option), but I get a .zip artifact of more than 60 MB.

@dschep
Collaborator

dschep commented Jun 7, 2017

@marcoromelli

Actually, you are right, @dschep! Using the zip option, the deploy works. The size of the artifact is still bigger than the 50 MB limit, but for some reason AWS doesn't complain.

I still have a subtle problem related to tensorflow/tensorflow#3086, which prevents the import of the Google protobuf library.
The solution seems to be to create an empty __init__.py file in the google folder inside the requirements, but maybe there is a more elegant way to fix this.
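
A minimal sketch of that workaround, run after the requirements are installed (the .requirements directory name is an assumption):

```python
import os

# pip install --target can leave the google/ namespace package without an
# __init__.py, which breaks `import google.protobuf`. Touch an empty one.
init_py = os.path.join(".requirements", "google", "__init__.py")
if os.path.isdir(os.path.dirname(init_py)) and not os.path.exists(init_py):
    open(init_py, "a").close()
```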

@dschep
Collaborator

dschep commented Jun 8, 2017

Ah, yes... I've actually run into problems with namespace packages and pip install --target before on another project. I'm not sure what the best way to address that is. I'll create a separate ticket for it.
