
JavaScript heap out of memory when packaging many functions #299

Open
daniel-cottone opened this issue Dec 10, 2017 · 99 comments · Fixed by #858 · May be fixed by #570

@daniel-cottone

This is a Bug Report

Description

I'm in the process of trying to upgrade the serverless-webpack version from 2.2.3, where I do not experience the following issue. Our serverless configuration has package: individually: true set, and about 40 functions. When I try to upgrade to a later version of serverless-webpack and run sls webpack, the build runs for about a minute and then I get the following error:

lambda:daniel.cottone $ npm run build

> expert-api-lambda@0.1.0 build /Users/daniel.cottone/Projects/expert-api/lambda
> sls webpack --stage dev

Serverless: Bundling with Webpack...
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json

<--- Last few GCs --->

[42611:0x104001600]    55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms  allocation failure GC in old space requested
[42611:0x104001600]    57889 ms: Mark-sweep 1405.7 (1508.8) -> 1405.5 (1487.3) MB, 1923.4 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1923 ms) last resort 
[42611:0x104001600]    59801 ms: Mark-sweep 1405.5 (1487.3) -> 1405.4 (1486.8) MB, 1903.6 / 0.0 ms  last resort 


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x37341f01ba79 <JS Object>
    1: set [native collection.js:~247] [pc=0x29d828934f21](this=0x332730f95301 <a Map with map 0x23d2df14319>,p=0x3dd499abec41 <String[11]: MediaSource>,x=0x2589b9b1c819 <a SymbolObject with map 0x399abfecde11>)
    2: /* anonymous */(aka /* anonymous */) [/Users/daniel.cottone/Projects/expert-api/lambda/node_modules/typescript/lib/typescript.js:~23166] [pc=0x29d828ba5830](this=0x37341f002241 <...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/usr/local/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/usr/local/bin/node]
 3: v8::Utils::ReportOOMFailure(char const*, bool) [/usr/local/bin/node]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/local/bin/node]
 5: v8::internal::Factory::NewFixedArray(int, v8::internal::PretenureFlag) [/usr/local/bin/node]
 6: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Allocate(v8::internal::Isolate*, int, v8::internal::PretenureFlag) [/usr/local/bin/node]
 7: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Rehash(v8::internal::Handle<v8::internal::OrderedHashMap>, int) [/usr/local/bin/node]
 8: v8::internal::Runtime_MapGrow(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
 9: 0x29d827e840bd
10: 0x29d828934f21
11: 0x29d828ba5830
12: 0x29d827e86bbb
13: 0x29d828f85beb
Abort trap: 6

If I change my serverless config to not package individually, package: individually: false, then this error goes away. I have tested this with version 3.0.0 and the latest, 4.1.0, with the same results. I don't have this issue with 2.2.3.

Additional Data

  • Serverless-Webpack Version you're using: 4.1.0
  • Webpack version you're using: 3.10.0
  • Serverless Framework Version you're using: 1.24.0
  • Operating System: macOS 10.12.6
  • Stack Trace (if available): see above
@HyperBrain
Member

Hi @daniel-cottone ,
thanks for reporting. This could be an issue with your configuration. serverless-webpack since 3.0.0 requires that you use slsw.lib.entries for your entry definitions and that the function handlers are declared correctly in your serverless.yml in case you use individual packaging.

Can you post the function definitions from your serverless.yml and the webpack config file?

@daniel-cottone
Author

Hey @HyperBrain, thanks for the quick response. Here's the webpack configuration:

var path = require('path');
var slsw = require('serverless-webpack');
var Webpack = require('webpack');

module.exports = {
    entry: slsw.lib.entries,
    resolve: {
        extensions: ['.ts', '.js', '.json']
    },
    target: 'node',
    output: {
        libraryTarget: 'commonjs',
        path: path.join(__dirname, '.webpack'),
        filename: '[name].js'
    },
    externals: [{
        'aws-sdk': 'aws-sdk',
        'mysql2': 'mysql2',
        'sqlite3': 'sqlite3',
        'tedious': 'tedious',
        'pg-native': 'pg-native'
    }],
    module: {
        loaders: [
            { test: /\.ts(x?)$/, loader: 'ts-loader' }
        ]
    }
};

The definitions for all 40 functions are too large to post, but I'll post an example:

users-list:
    handler: src/handler/UserHandler.getUsers
    events:
        - http:
            path: users
            method: get

They pretty much all look the same; I've clipped out the VPC, authorizer, and environment config. I'm pretty confident that they're all configured correctly.

@HyperBrain
Member

The handlers look good. However, version 2.x did not support individual packaging (in fact it only copied the whole artifact per function).
So as a next step you should add webpack-node-externals to your webpack configuration to let webpack determine the externals automatically, so that individual packaging can make use of it:

// webpack config
const nodeExternals = require('webpack-node-externals');
...
  externals: [ nodeExternals() ]
...

Additionally, webpack > 3.0.0 now uses a module: rules structure instead of module: loaders. You should change that too.

Please also check if you have set custom: webpackIncludeModules: true in your serverless.yml.

Then run serverless package to test if it works. You'll find the zip packages that would be uploaded in the .serverless directory.

If aws-sdk should not be packaged, you can either put it into your devDependencies or use

# serverless.yml
custom:
  webpackIncludeModules:
    forceExclude:
      - aws-sdk

to keep it outside of your packages.
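Taken together, those suggestions might look like the sketch below. This is just daniel-cottone's config reshaped with nodeExternals swapped in and the loader moved under module.rules; it is illustrative only, not a verified drop-in:

```javascript
const path = require('path');
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  entry: slsw.lib.entries,
  target: 'node',
  // Let webpack-node-externals keep all of node_modules out of the bundles
  externals: [nodeExternals()],
  resolve: {
    extensions: ['.ts', '.js', '.json']
  },
  output: {
    libraryTarget: 'commonjs',
    path: path.join(__dirname, '.webpack'),
    filename: '[name].js'
  },
  module: {
    // webpack >= 2 uses `rules` instead of the webpack 1 `loaders` key
    rules: [
      { test: /\.ts(x?)$/, loader: 'ts-loader' }
    ]
  }
};
```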

@daniel-cottone
Author

I've made your suggested changes to webpack externals and have added the webpackIncludeModules configuration to serverless custom config; I still seem to be experiencing the same problem though.

@daniel-cottone
Author

It also appears to be related to the fact that there are so many functions in this serverless project; if I comment out all but 5 then sls package works.

@HyperBrain
Member

Hmmm... that sounds like a memory leak somewhere when using individual packaging.
We also have a project with more than 30 functions which works, but I did not check the memory consumption there (i.e. whether we're about to hit a limit).

What you can try is to increase node's heap memory limit (which is at 1.7GB by default) with:
node --max-old-space-size=4096 node_modules/serverless/bin/serverless package to 4GB and check if it then passes with the full amount of functions.

If that works, we have to find out where exactly the memory leak comes from and whether it can be fixed by reusing objects.
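For reference, two ways to apply the flag (the serverless path is the one from the comment above; NODE_OPTIONS is an alternative that child node processes inherit), plus a quick way to verify node honors it:

```shell
# Run the serverless CLI with a 4 GB old-space limit:
#   node --max-old-space-size=4096 node_modules/serverless/bin/serverless package
# Or export it so every child node process inherits the limit:
#   NODE_OPTIONS=--max-old-space-size=4096 npx serverless package

# Quick check that V8 honors the flag (prints the heap limit in bytes):
node --max-old-space-size=4096 -p "require('v8').getHeapStatistics().heap_size_limit"
```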

@daniel-cottone
Author

That definitely seems to be the problem. I got much further along, looks like about 50% of the way through. If I bump it up to 12GB then the process finishes after about 8-10 minutes.

@HyperBrain
Member

Good to know - thanks for testing this 👍 . Can you adjust the title of the issue to reflect that this happens with many functions? Then it's clearer how to reproduce it and we can find a solution.

Is the workaround using the increased heap ok for you as long as there's no real fix?

@daniel-cottone
Author

Sure thing. I think the 12GB heap size is probably a bit much; in addition to that it seems to run significantly slower than our build does currently. I'll just opt to not make use of individual packaging for now. If/when this does get fixed I can turn it on then.

@daniel-cottone daniel-cottone changed the title from "JavaScript heap out of memory" to "JavaScript heap out of memory when deploying many functions" on Dec 10, 2017
@HyperBrain
Member

The slower runtime is expected, because it takes each webpack compile's output to determine the modules that are really needed for each function and assembles only these for the function package. That takes some time (when using --verbose you should see the exact steps including their timing).
The longer build is outweighed by the better startup behavior (if the lambdas are cold started), especially if some big dependencies are only used by one function.

@daniel-cottone
Author

@HyperBrain That makes sense, thanks!

@BobbieBarker

I tried rolling back versions until I found one that didn't experience this issue. I got to 2.2.2, at which point my webpack config didn't work anymore.

@HyperBrain
Member

HyperBrain commented Dec 15, 2017

@BobbieBarker Thanks for the investigation 👍
Support for individual packaging is available since 3.0.0. Versions prior to that (2.x) where just 1.x versions that I released with the most important fixes (the project was quite dead when I took it over). But these old versions did not do invidivual at all.

So I'm quite sure that the memory leak is somewhere in the individual packaging part (maybe the file copy). Did it also happen for you with a serverless package?
Does anyone here know if there is a good node performance analyzer (profiler) that can track the heap and the GC (ideally graphically), so that I can see when it starts to allocate objects?

@HyperBrain
Member

HyperBrain commented Dec 15, 2017

I did some experiments with node's internal profiler (node --trace_gc serverless package --verbose)
on a project having 20+ functions (JS project).

The outcome is that there seem to be no critical object remnants (or leaks) in the npm install or copy steps. The only step where memory consumption increases (but is always cleaned up by the GC) is the actual zipping of the function packages.

This behavior matches the log above: It crashed for you at the webpack step! And it seemed to have loaded the ts-loader multiple times. For my tested JS project, the memory showed roughly the same fill state before and after the webpack run.

So for finding the root issue, we should concentrate on the webpack step and especially typescript. Did you experience the same issue without using typescript with projects that have many functions?
It seems that the webpack compile itself runs out of memory here.

@HyperBrain
Member

HyperBrain commented Dec 15, 2017

I thought a bit about the issue. A workaround could be that the plugin runs the compiles in batches of some functions at once. However, I do not know if the webpack library frees the allocated resources after each compile. But it could be worth a try.

According to the crash trace it already happened after 7 compiles - if every ts-loader line is for one function - and was at 1500 MB.
[42611:0x104001600] 55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms allocation failure GC in old space requested

The first try should be to disable some plugins in the webpack.config and check if the ts-loader might allocate all the memory.
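As an illustration of the batching idea (the names below are hypothetical, not plugin API): the { functionName: handlerPath } map that slsw.lib.entries produces could be split into groups, with one webpack multi-compile run per group instead of one for all functions:

```javascript
// Hypothetical sketch of the batching workaround. chunkEntries splits an
// entries object into groups of `batchSize`; the plugin could then invoke
// webpack once per group, letting the GC reclaim memory between runs.
function chunkEntries(entries, batchSize) {
  const names = Object.keys(entries);
  const batches = [];
  for (let i = 0; i < names.length; i += batchSize) {
    const batch = {};
    for (const name of names.slice(i, i + batchSize)) {
      batch[name] = entries[name];
    }
    batches.push(batch);
  }
  return batches;
}

// 3 functions compiled 2 at a time -> 2 sequential webpack runs
const batches = chunkEntries(
  { 'users-list': './src/a.ts', 'users-get': './src/b.ts', 'users-del': './src/c.ts' },
  2
);
console.log(batches.length); // 2
```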

@Birowsky

Birowsky commented Jan 7, 2018

I hit this too after setting

package:
  individually: true

I don't think I can declare anything else of significance other than having only 9 functions. Do ask though; I'll check whatever's necessary. Here's my webpack config:

const {resolve} = require('path'); 
const slsWebpack = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');


module.exports = {
  target: 'node',
  devtool: 'inline-source-map',
  entry: slsWebpack.lib.entries,
  externals: [nodeExternals()],
  output: {
    libraryTarget: 'commonjs',
    path: resolve('builds/dist'),
    filename: '[name].js'
  },
  resolve: {
    extensions: ['.ts', '.js']
  },
  module: {
    rules: [
      loader({
        test: /\.ts$/,
        use: {
          loader: 'ts-loader',
          options: { }
        }
      }),
      loader({
        test: /\.graphqls$/,
        use: {
          loader: 'graphql-tag/loader',
        }
      })
    ]
  }
};



function loader(config) {
  return Object.assign(config, {
    // exclude: [/node_modules/,/builds/, /test/],
  });
}

These might be useful:

"ts-loader": "3.1.1",
"serverless": "1.25.0",
"serverless-webpack": "4.2.0",
"webpack": "3.10.0",
"serverless-plugin-cloudfront-lambda-edge": "1.0.0",

The output after running sls package:

Serverless: Bundling with Webpack...

<--- Last few GCs --->

[52295:0x103000000]    68990 ms: Mark-sweep 1407.7 (1499.7) -> 1407.6 (1477.7) MB, 2150.7 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 2151 ms) last resort 
[52295:0x103000000]    71182 ms: Mark-sweep 1407.6 (1477.7) -> 1407.6 (1477.7) MB, 2191.4 / 0.0 ms  last resort 


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x1e68da628799 <JSObject>
    1: bind(aka bind) [/Users/Birowsky/Projects/Personal/###obfuscated###/node_modules/typescript/lib/typescript.js:~21002] [pc=0x3a6d9eaeaaef](this=0x307b27782311 <undefined>,node=0x3624801cc691 <NodeObject map = 0x1a660d8c8f59>)
    2: forEachChild [/Users/Birowsky/Projects/Personal/###obfuscated###/node_modules/typescript/lib/typescript.js:~12719] [pc=0x3a6...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 4: v8::internal::Factory::NewByteArray(int, v8::internal::PretenureFlag) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 5: v8::internal::TranslationBuffer::CreateByteArray(v8::internal::Factory*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 6: v8::internal::compiler::CodeGenerator::PopulateDeoptimizationData(v8::internal::Handle<v8::internal::Code>) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 7: v8::internal::compiler::CodeGenerator::FinalizeCode() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 8: v8::internal::compiler::PipelineImpl::FinalizeCode() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
 9: v8::internal::compiler::PipelineCompilationJob::FinalizeJobImpl() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
10: v8::internal::Compiler::FinalizeCompilationJob(v8::internal::CompilationJob*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
11: v8::internal::OptimizingCompileDispatcher::InstallOptimizedFunctions() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
12: v8::internal::StackGuard::HandleInterrupts() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
13: v8::internal::Runtime_StackGuard(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
14: 0x3a6d9e7846fd
Abort trap: 6

@Birowsky

Birowsky commented Jan 7, 2018

An update: it works when I set transpileOnly: true for ts-loader.

@HyperBrain
Member

HyperBrain commented Jan 8, 2018

@Birowsky Thanks for the info 🥇 . @BobbieBarker , @daniel-cottone can you confirm, that this setting also works for you?

@daniel-cottone
Author

@HyperBrain That setting does appear to be working for me. I'll look into using fork-ts-checker-webpack-plugin to maintain type checking. Thanks!
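For anyone trying the same route, a rough sketch of that combination (assuming ts-loader and fork-ts-checker-webpack-plugin are installed; only the relevant keys are shown):

```javascript
// Sketch only: transpileOnly makes ts-loader skip type checking, and
// ForkTsCheckerWebpackPlugin runs the type check in a separate process
// so the main compile keeps its memory footprint down.
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  // ...existing entry/target/output/externals config...
  module: {
    rules: [
      {
        test: /\.ts(x?)$/,
        loader: 'ts-loader',
        options: { transpileOnly: true }
      }
    ]
  },
  plugins: [new ForkTsCheckerWebpackPlugin()]
};
```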

@Birowsky

Birowsky commented Jan 8, 2018

@daniel-cottone please share your thoughts after you succeed. I was thinking of doing a single tsc --noEmit before deploying, but maybe your approach is more rational.

@daniel-cottone
Author

@Birowsky Seems to work. The only gripe I have is that the type checking doesn't fail fast; if you would prefer to check types before you even start the build, which could take some time, then maybe tsc --noEmit is a better option. For now I'm going to stick with just using the plugin.
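For reference, the fail-fast variant is just running the compiler before packaging, e.g.:

```shell
# Type check the whole project first; abort the packaging step on any error
tsc --noEmit && serverless package --stage dev
```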

@VuBui83

VuBui83 commented Jan 8, 2018

@daniel-cottone I've been dealing with the same issue for a couple of weeks now. Using fork-ts-checker-webpack-plugin will spawn a thread per function to type check. I'm wondering whether fork-ts-checker is smart enough to type check just the specific lambda, or whether it type checks the entire project since it's based on tsconfig.json. My project has 20+ functions; fork-ts-checker spawns 20+ threads just for type checking. It works, but I don't think it's necessary.

@VuBui83

VuBui83 commented Jan 18, 2018

@HyperBrain with transpileOnly: true, it starts to crash around 30+ functions

@daniel-cottone
Author

@HyperBrain @VuBui83 I've also experienced the same problem; setting transpileOnly: true makes a huge difference but I still get crashes around 30 functions. I've also gone the route of manually type checking with tsc --noEmit rather than using fork-ts-checker-webpack-plugin.

@HyperBrain
Member

Maybe a solution would be to provide a PR for the ts-checker plugin that limits the number of spawned processes when using multi-compiles in webpack.

@j0k3r
Member

j0k3r commented Mar 26, 2021

Can someone confirm this has been improved or fixed by 5.4.0?

@miguel-a-calles-mba
Member

I have not seen improvements with 5.4.0. I was helping out a friend on his project and I had to rollback to 5.3.5 to see some stability with the out-of-memory issue.

I also had to roll back to an older webpack (4.46.0).

@coyoteecd
Contributor

coyoteecd commented Mar 27, 2021

@j0k3r I can confirm that the concurrency setting added in #681 works as intended after update to 5.4.0 (i.e. limits the number of concurrent compiles in the CI system thus effectively limiting the amount of necessary memory and avoiding the out-of-memory errors).

Note that in my case I run it with a value of 3 in the CI build; I have it configured in serverless.yml as follows:

custom:
  webpack:
    [other settings]
    concurrency: ${opt:compile-concurrency, 6}

In CI, I deploy as follows:
serverless deploy --compile-concurrency 3

@ADrejta

ADrejta commented Mar 31, 2021

@j0k3r I can also confirm that setting the concurrency option as described in #681 does the trick in update 5.4.0

@imhazige

I have not seen improvements with 5.4.0. I was helping out a friend on his project and I had to rollback to 5.3.5 to see some stability with the out-of-memory issue.

I also had to roll back to an older webpack (4.46.0).

Switching webpack back from 5 to 4 solved this problem for me.

@vicary
Copy link
Member

vicary commented May 25, 2021

I am the author of #681, my project is on-and-off dealing with 200 lambda functions.

Recent updates in minor versions introduced this again; subsequent builds in the same process show linear increases in bundle time. This is further confirmed when tested with thread-loader: the timer increases individually in each thread. Upgrading webpack from 5.11 to 5.37.1 slows down the increments, but it still increases gradually, from 70s to 700s+ by the 50th entry.

Using the serverless-layers plugin and excluding with webpack-node-externals (without the modulesFromFile option) stops the build times of subsequent entries from increasing.

My educated guess is that packages in node_modules contain side effects that webpack has no way to clean up after bundling. Try to avoid having webpack dip its toes into node_modules when Lambda function layers are available; otherwise pushing for #570 and helping rebase it may be your only choice.

EDIT: Also make sure you read webpack/webpack#6389 if you are thinking of downgrading to webpack 4.

@j0k3r
Member

j0k3r commented Jun 10, 2021

Can someone confirm this has been improved or fixed by 5.5.1?

@vicary
Member

vicary commented Jun 11, 2021

#858 surely looks interesting, I'll give it a try next week.

@bxjw

bxjw commented Jun 30, 2021

#858 seems to have resolved it for us.

@vicary
Member

vicary commented Jun 30, 2021

Yes, my team has been trying deployments over the last few weeks. I am fairly confident that the problem is at least minimized to unnoticeable, even for 200+ lambdas.

@jpascoe

jpascoe commented Jul 1, 2021

Adding --compile-concurrency 3 fixed the problem for me

@daveykane

@j0k3r I'm on 5.5.1 and still have this issue unfortunately

@jsefiani

I'm experiencing the same issue with the latest versions of both serverless-webpack (5.5.1) and webpack (5.50.0). Really annoying.

PS I'm only using 1 function (NestJS API) and I constantly run into memory issues.

@janicduplessis
Contributor

This fix will only improve memory usage when packaging many functions; with anything under ~8 functions it probably won't make a difference, since those are packaged concurrently anyway.

@omry-hay

We still get those with version 5.5.1.

@eranelbaz

Got those on v5.5.5

@vicary
Member

vicary commented Oct 19, 2021

@omry-hay and @eranelbaz, it would really help if you don't mind sharing more information, such as the number of entries, how they import node_modules, the resulting bundle sizes, your serverless.yml and, if they exist, webpack.config.js and tsconfig.json.


I am in the middle of contemplating a multi-approach, guideline/workaround thingy. I'll start by sharing related points here.

From my own experience, 90% of the time you are actually hitting TerserPlugin's optimization/compression stage (terser/terser#164), and one or more of the following should move you forward:

  1. Naively increase the memory limit via --max-old-space-size, get a spot instance of a beefy machine for the duration of the build, grab your .webpack and shut down the server as quickly as possible to save cost.
  2. Try replacing the terser plugin with another minimizer, for example using uglify, esbuild or swc as the minifier inside webpack. See the optimization options in webpack.config.
  3. Reduce custom.webpack.concurrency to a lower number in your serverless.yml; to make sure it matters in your case, start with 1.
  4. As a last resort, skip the minimize stage altogether, i.e. optimization.minimize = false, and try to depend only on packages that support tree-shaking.
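As a sketch of options 2 and 4 above (assuming terser-webpack-plugin is installed; this is illustrative, not a verified drop-in config):

```javascript
// Option A: keep terser but disable parallelism and the most expensive
// compression passes to reduce peak memory during minification.
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  // ...rest of the webpack config...
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: 1,
        terserOptions: {
          compress: { passes: 1 },
          mangle: true
        }
      })
    ]
  }
};

// Option B (last resort, point 4 above): skip minification entirely.
// optimization: { minimize: false }
```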

@eranelbaz

Sadly I can't share the rest.

tsconfig.json

{
  "compilerOptions": {
    "lib": [
      "es2019",
      "ES2020.Promise"
    ],
    "moduleResolution": "node",
    "sourceMap": true,
    "target": "es2019",
    "outDir": "out",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "baseUrl": "./",
    "resolveJsonModule": true,
    "skipLibCheck": true,
    "plugins": [
      {
        "transform": "env0-ts-transform-json-schema",
        "type": "program"
      }
    ]
  },
  "exclude": [
    "node_modules"
  ]
}

env0-ts-transform-json-schema is a fork of ts-transform-json-schema

webpack.config.js

const _ = require('lodash');
const path = require('path');
const slsw = require('serverless-webpack');
const TsconfigPathsPlugin = require('tsconfig-paths-webpack-plugin');
const webpack = require('webpack');
const CopyPlugin = require('copy-webpack-plugin');

module.exports = dirname => {
  const logzio = [{ from: 'node_modules/@env0/common-lambda/vendor/logzio', to: '.', noErrorOnMissing: true }];

  const mode = slsw.lib.webpack && slsw.lib.webpack.isLocal ? 'development' : 'production';

  return {
    mode,
    entry: slsw.lib.entries,
    devtool: mode === 'production' ? 'nosources-source-map' : 'source-map',
    externals: ['pg', 'sqlite3', 'tedious', 'pg-hstore', 'aws-sdk', 'mysql2'],
    plugins: [
      new CopyPlugin({ patterns: [...logzio] }),
      new webpack.DefinePlugin({ 'global.GENTLY': false }),
      new webpack.ContextReplacementPlugin(/moment[\/\\]locale$/, /en/)
    ],
    resolve: {
      mainFields: ['main'],
      plugins: [new TsconfigPathsPlugin({})],
      extensions: ['.js', '.jsx', '.json', '.ts', '.tsx']
    },
    output: {
      libraryTarget: 'commonjs',
      path: path.join(dirname, '.webpack'),
      filename: '[name].js',
      pathinfo: false,
      chunkLoading: 'require',
      chunkFormat: 'commonjs',
      enabledChunkLoadingTypes: ['require']
    },

    target: 'node',
    node: {
      __dirname: false,
      __filename: false
    },
    module: {
      rules: [
        {
          exclude: /node_modules/,
          test: /\.tsx?$/,
          use: [
            {
              loader: 'ts-loader',
              options: {
                compiler: 'ttypescript',
                reportFiles: ['**', '!**/__tests__/**', '!**/?(*.)(spec|test).*', '!**/node_modules/**']
              }
            }
          ]
        }
      ]
    },
    optimization: {
      minimize: false,
      splitChunks: {
        cacheGroups: {
          commons: {
            test: /[\\/]node_modules[\\/]/,
            name: 'vendors',
            chunks: 'all'
          }
        }
      }
    }
  };
};

@vicary
Member

vicary commented Oct 19, 2021

Try enabling happyPackMode or transpileOnly in ts-loader; they behave similarly. This disables type checking and uses fewer resources. If your build passes, use fork-ts-checker-webpack-plugin and fork-ts-checker-webpack-plugin-limiter to add a throttled type check back instead.

@nmlynch94

We were hitting this issue in a 100+ function project. Can confirm that @vicary 's solution of using serverless-layers to provide dependencies + webpack-node-externals to avoid parsing node_modules quartered our RAM usage during build and halved the build time (thank you!).

Outsourcing the typechecking to fork-ts-checker-webpack-plugin helped further, but using serverless-layers + node externals was by far the biggest gain in our situation.

@vicary
Member

vicary commented Apr 28, 2022

@nmlynch94 Caveats if you are using serverless-layers: serverless deploy function sometimes resets the layer version to 1, or to whatever version your last serverless deploy used. We either run serverless deploy function again, and if that still fails we go to the AWS console and manually set the correct version.

If it gets too annoying, you may want to fall back to the include/exclude config of this plugin, where serverless-webpack performs npm install in a separate directory and packs that node_modules into the Lambda bundle instead.

@nmlynch94

I actually just ran into this yesterday after a function deploy and was very confused as to why my layer version wasn't being set correctly. A full refresh of the stacks fixed it and I didn't investigate any further. Will def do that if we hit that problem a lot in the future. Thanks again!

@mbostwick

mbostwick commented May 30, 2022

I ran into an issue where webpack's config was exported as a class and it was getting cleaned up while still being used.

EDIT:
@vicary honestly I think it's an issue with the way max-old-space-size works and how webpack's config is handled in garbage collection. I don't think there is value in calling that out as its own issue; I only added a comment in case another person has their config object as a class, and swapping to a POJO helps them. (You can also tune down the chunking on certain things to help reduce overall memory consumption, but all you're doing is tuning around the root limitation.)

@vicary
Member

vicary commented Jun 1, 2022

@mbostwick Sounds like a separate issue, would you mind opening a new one and provide with enough context so we can pin it down?

@ap0h

ap0h commented Aug 9, 2022

It's worth checking if you are importing aws-sdk somewhere in code instead of a specific client, e.g. aws-sdk/clients/s3. We have 3 lambdas: one is an express app, the other two are simple cron jobs. When we added the third lambda we got the heap out of memory problem, and even 16GB on a Mac wasn't enough. So we checked whether some extensive library was imported somewhere. It was dynamoose, which had the whole aws-sdk as a dependency, resulting in a >100MB bundle size; on top of that, one other file was importing the whole aws-sdk module instead of the client. After these changes everything was good.
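To illustrate the difference (aws-sdk v2 import paths):

```javascript
// Bad: bundles the entire aws-sdk (every service client) into the lambda
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();

// Better: import only the client you actually use, so webpack bundles
// (and type checks) a fraction of the code
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3({ region: 'us-east-1' });
```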
