run node-parquet in AWS Lambda #20
It should be possible to make it smaller than 400 MB |
It allows removing the build_deps directory after building the module, drastically reducing the module's size, addressing skale-me#20.
Hi, can you check again, and run |
Any luck getting it to work? I am getting the following error when trying to run in lambda:
|
@alaister, make a 'lib' folder in your Lambda function and copy that library in there; that worked for me |
@fzaffarana this is my Lambda application layout. As you can see, I just made a lib directory and copied the missing library in there. I use AWS Cloud9 for Lambda development, so I got the library from there, and it works when deployed.
Then I just include and use everything normally. I have successfully made Parquet files on S3 with this by putting a function inside a Kinesis stream as a transformation function, and then throwing away all the transformations: the Lambda function writes to S3, and the Kinesis stream does not. It almost worked, but I got a few errors where Kinesis aborted, and I couldn't really debug what was going on. Ultimately I had to abandon this method because of time constraints, but it was very close. I was able to read the resulting files from Athena.
|
@aib-nick thank you first of all for the help. I can see that we have similar Lambdas (this is good). (I'm going to borrow your trick: 'give S3 the ability to read the local file and stream it'.) But I don't know if we have the same error; this is mine (in the AWS console):
It doesn't show any specific missing lib. On the other hand, when I test this Lambda in my local environment, it works correctly. |
This would be a useful feature! |
Is there any fix? |
@aib-nick |
It's been a while since this question was originally asked, but I wanted to follow up and see if anyone has a tried-and-true way of doing the npm install / adding lib files that reliably gets node-parquet working on Lambda. I'm about to embark on this task and would love to hear others' wisdom as far as any gotchas. |
I've managed to run node-parquet on AWS Lambda with Node.js 10.x; it's worth mentioning that I couldn't build it on newer Node.js versions. You'll also need Docker installed on your machine. Run this in the root folder of your project:

```shell
docker run --rm -it -v "$PWD":/var/task lambci/lambda:build-nodejs10.x /bin/bash
```

This will give you an environment similar to the AWS Lambda one. Inside the container, run the following commands:

```shell
# First we update the cmake version, since this image ships with cmake 2
cmake_name="cmake-3.16.1-Linux-x86_64"
cmake_tar="${cmake_name}.tar.gz"
curl -L https://github.com/Kitware/CMake/releases/download/v3.16.1/${cmake_tar} -o /opt/${cmake_tar}
mkdir -p /opt/${cmake_name}
tar xf /opt/${cmake_tar} -C /opt
chmod a+x /opt/${cmake_name}/bin/cmake
mv /bin/cmake /bin/cmake.bkp
ln -s /opt/${cmake_name}/bin/cmake /bin/cmake

# Now we install the last dependencies and build the project
yum install -y boost-devel bison flex
npm install

# Clean up dependencies so we can actually deploy to AWS Lambda
rm -Rf ./node_modules/node-parquet/build_deps
```

I hope this helps! |
I have done something similar to what @paflopes describes, putting the result into a Lambda layer which the application can use. |
Hi,
I wanted to use this wonderful module in AWS Lambda. The key blocker is that when I compile the node-parquet module, the whole thing is over 400 MB; unfortunately, AWS Lambda allows uploading only ~240 MB max per Lambda function.
I was wondering whether there is any possibility to slim the whole output down, or is this what we get?
In any case, I'm looking through the make files to understand whether I can do something on my own.
Thanks for your time!