
Performance issue (out of memory) when parsing plan file that makes use of multiple modules #627

Open
ryanbratten opened this issue Jun 14, 2022 · 7 comments

@ryanbratten

Description

It looks like terraform-compliance[faster-parsing] uses a large amount of memory to handle a codebase with many Terraform modules. I've slimmed the suite down to one simple rule, and it is still unable to parse the plan file and run the test after 53 minutes of trying.

terraform-compliance exits with the following error:

/home/runner/work/_temp/806645c9-3a07-4f89-a20c-66536df48771.sh: line 5:  3268 Killed                  terraform-compliance -f ../terraform-compliance/ -p plan.out
Error: Process completed with exit code 137.

Exit code 137 corresponds to termination by SIGKILL (128 + 9); on a memory-constrained runner this typically means the kernel's OOM killer stopped the process.

To Reproduce

Large plan file using multiple modules (attached)
Running on GitHub-hosted free runners, which currently have 7 GB of RAM

Plan file:
plan.out.json.txt

Used terraform-compliance Parameters:
none

Running via Docker:
No

Error Output:
Lots of warnings about ambiguous modules like this:

❗ WARNING (mounting): The reference "module.boundaries_geolive_database_ingestion[0]" in resource module.boundaries_geolive_ingestion_job.aws_glue_trigger.crawler_trigger is ambigious. It will not be mounted.

then

/home/runner/work/_temp/806645c9-3a07-4f89-a20c-66536df48771.sh: line 5:  3268 Killed                  terraform-compliance -f ../terraform-compliance/ -p plan.out
Error: Process completed with exit code 137.

Expected Behavior:
Features to be executed

Tested Versions:

  • terraform-compliance version: 1.3.33
  • terraform version: 1.1.7
@eerkunt
Member

eerkunt commented Jun 14, 2022

Hi Ryan,

Sorry for your experience. What was the feature file you were using? Is it possible to share that as well?

@ryanbratten
Author

Yep, of course:

Feature: Subnets

  Scenario: Ensure a multi-layered network architecture
    Given I have aws_subnet defined
    When I count them
    Then I expect the result is more than 2
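
As a low-memory sanity check for a scenario like this one, the count can also be computed directly from the JSON plan without loading terraform-compliance at all. The sketch below is not terraform-compliance's implementation; it is a minimal standalone helper that assumes the standard `terraform show -json` plan layout (`planned_values` → `root_module` → `resources` / `child_modules`), and the function names are illustrative.

```python
import json

def count_resources(module: dict, rtype: str) -> int:
    """Recursively count resources of a given type in a plan's module tree."""
    n = sum(1 for r in module.get("resources", []) if r.get("type") == rtype)
    for child in module.get("child_modules", []):
        n += count_resources(child, rtype)
    return n

def check_plan(path: str, rtype: str = "aws_subnet", minimum: int = 2) -> int:
    """Assert the plan defines more than `minimum` resources of `rtype`."""
    with open(path) as f:
        plan = json.load(f)
    count = count_resources(plan["planned_values"]["root_module"], rtype)
    assert count > minimum, f"expected more than {minimum} {rtype}, got {count}"
    return count

# Usage (against a plan exported with `terraform show -json plan.out > plan.out.json`):
# check_plan("plan.out.json")
```

This only mirrors the one scenario above; it does not replace the tool, but it can help confirm whether the plan itself, rather than the feature file, is what drives the memory usage.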

@eerkunt
Member

eerkunt commented Jun 14, 2022

Yep, I can confirm it consumes around 7.96 GB of memory without the faster_parsing flag.

I must say I am super surprised by this :D Looking into it.

@eerkunt
Member

eerkunt commented Jun 14, 2022

As expected, the same happens without faster_parsing, just slower.

@ryanbratten
Author

Hey @eerkunt, do you know when there might be a fix for this issue? I have now completed a big refactoring to split our Terraform scripts into smaller chunks, but we are still experiencing high memory usage in a couple of areas.

@nmallott

nmallott commented Nov 3, 2022

Hi @ryanbratten, we had the same issue. Using only one feature file with all scenarios solved our memory problem.
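
For anyone wanting to try this workaround mechanically, the sketch below concatenates the scenarios from several `.feature` files into a single file. It is a hypothetical helper, not part of terraform-compliance; it assumes each input file has a single `Feature:` header, and the names (`merge_feature_files`, the output title) are illustrative.

```python
from pathlib import Path

def merge_feature_files(src_dir: str, out_file: str, title: str = "Combined") -> None:
    """Concatenate the scenario bodies of every .feature file under src_dir
    into one feature file with a single Feature: header."""
    lines = [f"Feature: {title}", ""]
    for path in sorted(Path(src_dir).glob("*.feature")):
        body = path.read_text().splitlines()
        # Drop everything up to and including this file's own Feature: line,
        # so the merged file keeps exactly one header.
        idx = next(
            (i for i, line in enumerate(body) if line.lstrip().startswith("Feature:")),
            -1,
        )
        lines.extend(body[idx + 1:])
        lines.append("")
    Path(out_file).write_text("\n".join(lines))

# Usage: merge_feature_files("../terraform-compliance/", "combined.feature")
```

Note this assumes the scenarios are independent; Background sections or feature-level tags in the individual files would need manual reconciliation.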

@eerkunt
Member

eerkunt commented Nov 4, 2022

I am sorry you are experiencing this problem, guys. Life has been crazy for me for the last couple of months, unfortunately. I hope to free up some time and look into this issue asap.
