Segmentation Fault Crash with Specific Large Array Sizes #6234
Comments
Looks a lot like signed integer overflow (which is undefined behavior). Unfortunately, sclang still uses 32-bit signed integers (instead of 64-bit). However, 1,850,426 doubles (= 14,803,408 bytes) is way below anything that could overflow a 32-bit integer. Please share a minimal code example that triggers the crash.
It also happens with arrays of Ints. The numbers are produced with a pattern, via Array.fill(NUMBER, { STREAM.next }). Nothing else. Larger sizes do not trigger the bug: 2,279,156 works.
Is it possible that it is triggered by compiling with compiler optimizations, e.g. -DNATIVE=ON?
Again, please share a minimal code example that triggers the crash. I don't want to guess; I want to copy, paste and run.
Thanks! Unfortunately, I can't reproduce this here on Windows 10 with SC 3.13... By the way, please add the OS version and SC version to your issue description.
Updated the issue with system details: GNU/Linux - Fedora 39, kernel 6.7.6-200, Intel x86_64
@JordanHendersonMusic It's more mysterious than that. It's a specific range, unrelated to that kind of overflow. I wonder if it has to do with optimizations, because I didn't get the crash with another build using conservative flags. I understand I should not guess, but it is strange.
Ran the loop provided a handful of times and no crash on macOS 14.4, M1 Pro, 32 GB memory, 3.14.0-dev.
Crashes immediately over here... 3.14.0-dev (develop branch), Native and LTO. Oh no: it does not crash in debug mode...
Further...
Seems like it might be in PWhite embedInStream? |
Here, it crashes with a debug build with native flags. It seems related to Streams, since the original crash used a stream. In the past, I also experienced some rare crashes like this, specific to certain Linux builds.
It's happening in the garbage collector. Looks like a memory bug has been introduced somewhere, which may explain a few other issues. I'll take a look when I get a minute.
Tested with latest source, Release mode, macOS 13.5.2. Does not crash.
Debug build also does not crash |
@JordanHendersonMusic or @smoge could you do a fresh build and confirm for sanity? I'm at 28a0b12 |
Yes, same commit; Manjaro Linux. It does not crash in debug here, but my trace is a little different.
If I add this line...
It prints |
For laughs I downloaded the latest 'bleeding edge' build, which is from January 24th. That crashes, but I can't see any commits in the interim that should cause it.
Hmm. I'm using boost 1.74 from the packaged external_libraries. This did cause me a problem in hash.cpp with Xcode 15 because of std::unary_function being removed. |
After a lot of debugging... a single static_cast fixed this for me.
Okay, got it to build using -DCMAKE_CXX_FLAGS="--std=c++17 -D_LIBCPP_ENABLE_CXX17_REMOVED_FEATURES". Debug build does not crash; Release does crash.
See my comment on #6257. I think this is not a bug, but just integer overflow.
I believe this is due to the temporary function objects not being freed until the entire line has been executed, which in turn causes an overflow when trying to print the result to the post window. It actually has little to do with arrays. See #6257 (comment).
I've just got this update, I'll check that soon. |
I think we’re good actually! Thanks |
Ok, so I think the property-based tests found something interesting.
It appears to be triggered by specific array sizes (in the context of the tests, a few arrays are produced, 10 or so). Large arrays can be used to represent sound files, so this is not an extreme case in principle.
The interpreter crashes with a segmentation fault when handling arrays of certain sizes. Specifically, an array size of around 1,850,426 elements consistently triggers the crash. Interestingly, the issue does not occur with all large arrays; some larger arrays do not cause the interpreter to crash.
1. Initialize a DoubleArray with approximately 1,850,426 elements. Example data: [0.0062277317047119, 0.76690030097961, -0.56259202957153, ...] (array size: 1,850,426).
2. Observe that the interpreter crashes with a segmentation fault (exit code: 11).
Notably, the crash occurs with both 64-bit floats ("doubles") and arrays filled with simple integers, indicating that the content of the array does not affect the crash, only the size.
Affected array sizes:
Crash: array sizes around 1,850,426 and 1,824,713 elements.
No crash: larger arrays, such as one with 2,279,156 elements.
Anything smaller than those sizes is consistently correct. Even larger arrays have not been tested in the same way yet. The pattern is not yet clear.
As I understand it, large objects like this get special treatment in the language (LargeObjSizeClass).
SYSTEM DETAILS:
GNU/Linux - Fedora 39, kernel 6.7.6-200, Intel x86_64
Custom compilation of the latest develop branch with native flags (-DNATIVE=ON)