
Node stack traces truncated #2770

Closed
jamietre opened this issue Apr 19, 2017 · 3 comments
Labels
type: bug (a defect, confirmed by a maintainer)

Comments

@jamietre

jamietre commented Apr 19, 2017

I think this is a Mocha problem, but it's hard to say since I don't have anything to compare it to; it could also be a general environmental problem. We are getting out-of-memory errors on a large test suite. I've researched this a lot, and it seems it could be one of several things, but at this point it's very difficult to troubleshoot because I can't get a full stack trace.

Node@6.9.5, mocha@3.2.0

<--- Last few GCs --->

   82518 ms: Mark-sweep 807.1 (1039.7) -> 802.3 (1038.7) MB, 149.2 / 0.0 ms [allocation failure] [GC in old space requested].
   82668 ms: Mark-sweep 802.3 (1038.7) -> 802.3 (1036.7) MB, 150.6 / 0.0 ms [allocation failure] [GC in old space requested].
   82838 ms: Mark-sweep 802.3 (1036.7) -> 802.2 (993.7) MB, 169.7 / 0.0 ms [last resort gc].
   82989 ms: Mark-sweep 802.2 (993.7) -> 802.2 (982.7) MB, 150.6 / 0.0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0000024EE58CFB61 <JS Object>
    1: SparseJoinWithSeparatorJS(aka SparseJoinWithSeparatorJS) [native array.js:~75] [pc=000002B8298FC057] (this=0000024EE5804381 <undefined>,w=0000011715C4D061 <JS Array[7440]>,F=000003681BBC8B19 <JS Array[7440]>,x=7440,I=0000024EE58B46F1 <JS Function ConvertToString (SharedFunctionInfo 0000024EE5852DC9)>,J=000003681BBC8AD9 <String[4]\: ,\n  >)
    2: DoJoin(aka DoJoin) [native array.js:137...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

As you can see, the stack trace is truncated. I've tried everything I can think of, but none of the node command-line options have any effect, e.g.

node --stack_trace_limit=1000 --max_stack_trace_source_length=1000 ./node_modules/mocha/bin/_mocha ./test --full-trace

I looked at all the v8 options with stack here:

node --v8-options | grep -B0 -A1 stack

... and didn't see any others that might be related. Thoughts?

@drazisil drazisil added the type: bug (a defect, confirmed by a maintainer) and unconfirmed labels Apr 21, 2017
@jamietre
Author

jamietre commented Apr 26, 2017

Follow-up: this is not a mocha bug. This is a babel bug -- or at a minimum, an area that could really use some improvement in Babel.

One thing I didn't mention here is that my mocha.opts includes --compilers js:babel-core/register -- which is the source of the problem.

After some painful debugging I found the entry point of the v8 crash in babel-register/lib/cache.js: v8 blows up when trying to serialize a 200-megabyte JSON object in the save() method. Why do I have a 200-megabyte JSON object? Well...

babel-register apparently caches data about the files it compiles by default and saves it to a file. From looking at cache.js, it also appears to never worry about how big the cache gets and never expires anything. It basically just reads a JSON object from a file when it starts, adds stuff to it, and writes it back when it finishes.
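
To make that concrete, here is a rough sketch of the pattern -- my own simplification, not the actual babel-register source, and the function names are made up:

    // Rough sketch of the caching pattern described above (not babel's code).
    // The cache is one big JSON object that only ever grows.
    const fs = require('fs');
    const os = require('os');
    const path = require('path');

    // Default location per the babel-register docs (overridable via BABEL_CACHE_PATH).
    const CACHE_PATH = path.join(os.homedir(), '.babel.json');

    let data = {};

    function load() {
      if (fs.existsSync(CACHE_PATH)) {
        data = JSON.parse(fs.readFileSync(CACHE_PATH, 'utf8'));
      }
    }

    function set(cacheKey, compiled) {
      // Entries are added on every compile but never expired,
      // so the object grows without bound.
      data[cacheKey] = compiled;
    }

    function save() {
      // With a ~200 MB object, this JSON.stringify call is where v8
      // runs out of heap while joining the serialized pieces.
      fs.writeFileSync(CACHE_PATH, JSON.stringify(data, null, '  '));
    }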

My .babel.json was almost 200 megabytes. I blew it away and everything started working again. You can disable the cache entirely (see the babel-register docs), though wiping it once in a while seems to be best for performance. I had never touched this in two years, so I assume it was full of all kinds of nonsense. My tests ran about 10 times faster after I blew the cache away (on the 2nd run anyway, after the cache was rebuilt with my current state).
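
For anyone else who hits this: wiping the cache is just deleting the file, and per the babel-register docs the cache can be disabled with an environment variable, e.g.

rm ~/.babel.json
BABEL_DISABLE_CACHE=1 node ./node_modules/mocha/bin/_mocha ./test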

I still don't know why the stack traces get truncated.

Bug report: babel/babel#5667

@dasilvacontin
Contributor

Thanks for sharing your findings! :)

@openjr

openjr commented Nov 28, 2017

This was happening on our CI server and failing all the builds; thanks for sharing this.
