grpc: received message larger than max #28
After some investigation and discussion in hashicorp/terraform#21709, I moved this here to represent a change to add a file size limit to this provider (smaller than the 4MB limit imposed by Terraform Core, so that users never hit that generic error even after accounting for protocol overhead) and to document that limit for both the …
Is this still open? I'd like to pick this up if so.
Hello, do you plan to fix this problem? If so, when?
I think the best fix will be to support files larger than 4MB.
Yes, this problem still persists.
Yes, I ran into this issue today on the local_file data source pointing at a prospective AWS Lambda archive file. |
Hello, is there any progress on this issue, or was it parked? This can become a bigger issue if we use a template file for Kubernetes and must store the file to disk, since Kubernetes YAML files can become pretty big.
Ran into this by using …
Is there any agreement on how we can move forward? We could possibly handle it locally by splitting files into 4MB chunks within the provider, but I'm not sure whether that would create its own issues. I can pursue that, but before I waste time, would that even be acceptable @apparentlymart?
Using Terraform 0.12.23 and AWS provider 2.61.0, I'm getting the same error. It looks as though the core package has been updated to allow 64MB (hashicorp/terraform#20906), and according to the Lambda limits docs, 50MB files can be uploaded. Would it not be best to set the safety check to 50MB?
Just as an FYI for anyone having this issue: if you put your zip file in an S3 bucket, you shouldn't face this problem. But remember to use the …
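To illustrate the S3 approach from the comment above (a hedged sketch only: bucket name, role, runtime, and file paths are placeholders I introduced, and the attribute the truncated comment refers to is not recoverable), deploying a Lambda package via S3 might look like this, using the current `aws_s3_object` resource (older AWS provider versions call it `aws_s3_bucket_object`):

```hcl
# Hypothetical sketch; bucket name, role, runtime, and paths are placeholders.
resource "aws_s3_object" "lambda_package" {
  bucket = "my-artifact-bucket"
  key    = "lambda/function.zip"
  source = "${path.module}/function.zip"
  etag   = filemd5("${path.module}/function.zip")
}

resource "aws_lambda_function" "example" {
  function_name = "example"
  role          = aws_iam_role.lambda.arn # assumed to be defined elsewhere
  handler       = "index.handler"
  runtime       = "nodejs18.x"

  # Reference the object in S3 instead of uploading the zip through the
  # provider plugin channel; only the small hash crosses that boundary.
  s3_bucket        = aws_s3_object.lambda_package.bucket
  s3_key           = aws_s3_object.lambda_package.key
  source_code_hash = filebase64sha256("${path.module}/function.zip")
}
```

The `source_code_hash` keeps change detection working even though the file body itself never passes through the provider RPC message.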
Another option is using an external data source. For example, given a filename in the variable …, use it as such: …, which should give you …
+1 this issue; it's causing us much pain, as we intentionally want to inline larger files into the Terraform configuration. I see that hashicorp/terraform#20906 was merged over a year ago, but the symptom described above still persists. Can the limit for gRPC transfers be increased across the project, so that downstream services which can accept such payloads work properly without workarounds?
Still happening with Terraform 0.12.24. Any workaround for the gRPC limit error?
This is still happening with Terraform 0.13.5 when using … To add more clarity, I'm using `body = file(var.body)`. The file in question is only 1.5MB in size. If I remove the … Update: I have used …
I still have this issue with … Need …
I also have this issue trying to deploy a Rust function to IBM Cloud. Similarly to @atamgp, I have a … But even if this succeeded (or the …
Faced the same issue with a Kubernetes config map:

```hcl
resource "kubernetes_config_map" "nginx" {
  metadata {
    name      = "geoip"
    namespace = "ingress"
  }
  binary_data = {
    "GeoLite2-Country.mmdb" = filebase64("${path.module}/config/GeoLite2-Country.mmdb")
  }
}
```
I've encountered the same issue; it looks like there's a limitation on how many characters are in the resource code. Using a file uploaded to a bucket (without compressing it) fixed my issue. I'm assuming that what helped is that `.body` from S3 is usually a stream, as opposed to `.rendered` (which I was using before), which generates more characters in the resource source.
@finferflu we have found the same thing; we were running into this with a 1.5MB OpenAPI JSON file. I was under the impression that it was not the actual file handle on the JSON that was causing this, but that the "body" of the REST API now contains it, which is then included in the state, and there are probably a lot of escape characters and other items in the state, so the state file exceeds 4MB. To avoid a local file for the swagger, we uploaded it to S3 and used an S3 data object in TF, and the same problem occurred, so that's a strong indicator in support of this.
Still getting this issue with v0.15.4 and Terraform Cloud. We imported some infrastructure while using Terraform Cloud and then tried a plan, but cannot get the state file out: …
My file is around 2.4MB, and I am facing this issue even today. Any workarounds for this, please?
We ran into this error when using Swagger JSON files and API Gateway. It's not a real workaround, but maybe it helps somebody who is also close to the limit.
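The concrete trick was lost from the comment above, so what follows is an assumption of a typical approach rather than the author's exact fix: re-serializing the Swagger document strips indentation and newlines, which can pull a file back under the size limit when it is only slightly over.

```hcl
# Hypothetical sketch: round-trip the OpenAPI document through
# jsondecode/jsonencode to drop whitespace before it is sent to the
# provider. The resource and file names are placeholders.
resource "aws_api_gateway_rest_api" "example" {
  name = "example"
  body = jsonencode(jsondecode(file("${path.module}/swagger.json")))
}
```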
Can this get more attention please? |
Could we get a target timeline for these fixes, or a description of any challenges in the present architecture?
Hi folks 👋 This issue, while not mentioned in the CHANGELOG, may have been addressed with some underlying dependency updates that would have been included in the (latest) v2.2.3 release of this provider. In particular, this limit should be closer to 256MB. Does upgrading to this version of the provider help prevent this error? |
Closing due to lack of response -- if this issue still exists after v2.2.3, please open a new issue and we'll investigate further. |
This issue was originally opened by @tebriel as hashicorp/terraform#21709. It was migrated here as a result of the provider split. The original body of the issue is below.
Terraform Version
Terraform Configuration Files
// Nothing exceptionally important at this time
Debug Output
https://gist.github.com/tebriel/08f699ce69555a2670884343f9609feb
Crash Output
No crash
Expected Behavior
It should've completed the plan
Actual Behavior
Steps to Reproduce
terraform plan on my medium sized project.
Additional Context
Running within make, but has same profile outside of make. This applies fine in 0.11.14.
References