Failures due to recoverable AWS errors have returned #1539

Closed · patricklucas opened this issue on Sep 27, 2014 · 44 comments
@patricklucas

Most of these were fixed up last fall after I opened #574 and #668, but a number of these transient AWS errors have returned, causing a steady stream of builds to fail.

For example, we are again seeing Packer fail when it can't find the instance immediately after launch:

11:59:39 ==> amazon-ebs: Inspecting the source AMI...
11:59:39 ==> amazon-ebs: Creating temporary keypair: packer 5427091b-3a8e-abeb-57d8-2c6d0793a9b6
11:59:40 ==> amazon-ebs: Launching a source AWS instance...
11:59:41 ==> amazon-ebs: Error finding source instance (i-d2dd6518): The instance ID 'i-d2dd6518' does not exist (InvalidInstanceID.NotFound)
11:59:41 ==> amazon-ebs: Deleting temporary keypair...
11:59:41 Build 'amazon-ebs' errored: Error finding source instance (i-d2dd6518): The instance ID 'i-d2dd6518' does not exist (InvalidInstanceID.NotFound)

That one is the most frequent, but we occasionally see I/O timeouts as well:

10:46:57 ==> amazon-ebs: Stopping the source instance...
10:46:59 ==> amazon-ebs: Waiting for the instance to stop...
10:47:30 ==> amazon-ebs: Creating the AMI: generictrusty-ami_1219
10:47:30     amazon-ebs: AMI: ami-c38f8786
10:47:30 ==> amazon-ebs: Waiting for AMI to become ready...
10:49:13 ==> amazon-ebs: Error waiting for AMI: read tcp 204.246.163.231:443: i/o timeout
10:49:13 ==> amazon-ebs: Terminating the source AWS instance...
10:49:14 ==> amazon-ebs: Deleting temporary keypair...
10:49:14 Build 'amazon-ebs' errored: Error waiting for AMI: read tcp 204.246.163.231:443: i/o timeout

IIRC, these all used to be covered by retries, so perhaps it is a regression.

@patricklucas changed the title from "Once again plagued by transient AWS errors" to "Failures due to recoverable AWS errors have returned" on Sep 27, 2014
@joegoggins

Indeed, I'm seeing these errors too. When running packer build -debug, I've seen these failure modes:

==> amazon-ebs: Pausing after run of step 'StepSecurityGroup'. Press enter to continue.
==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Error launching source instance: Get https://ec2.us-east-1.amazonaws.com/?AWSAccessKeyId=OMITTED&Action=RunInstances&ClientToken=OMMITTED&ImageId=OMITTED&InstanceType=m1.small&KeyName=packer+6&MaxCount=1&MinCount=1&....bla bla bla...Version=2&Timestamp=2014-09-29T04%3A52%3A36Z&UserData=&Version=2014-05-01: read tcp 72.21.215.33:443: i/o timeout
==> amazon-ebs: Pausing before cleanup of step 'StepSecurityGroup'. Press enter to continue.

and also this one, with no detailed error output:

==> amazon-ebs: Pausing after run of step 'StepSecurityGroup'. Press enter to continue.
==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Error launching source instance:
==> amazon-ebs: Pausing before cleanup of step 'StepSecurityGroup'. Press enter to continue.

I wonder if it could have to do with the AWS 9/25-9/30 maintenance; unlikely, but possible, I suppose.

@patricklucas
Author

I wonder if it could have to do with the AWS 9/25-9/30 maintenance; unlikely, but possible, I suppose.

This has actually been happening since we upgraded to 0.7.1 from... some older version a few weeks ago. I'll look back at where we upgraded from tomorrow and post here.

@patricklucas
Author

We upgraded from 0.5.1 to 0.7.0 on 12 September and these failures began very shortly thereafter.

@mitchellh
Contributor

Thanks, we'll take a look back at this.

@renat-sabitov

We started experiencing this error recently:

==> amazon-ebs: Inspecting the source AMI...
==> amazon-ebs: Creating temporary keypair: packer 543f5d5b-9d56-a335-6558-2fa7672d2719
==> amazon-ebs: Creating temporary security group for this instance...
==> amazon-ebs: Authorizing SSH access on the temporary security group...
==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Error launching source instance: The security group 'sg-90e22af5' does not exist in VPC 'vpc-6a4a5708' (InvalidGroup.NotFound)
==> amazon-ebs: Deleting temporary security group...
==> amazon-ebs: Deleting temporary keypair...
Build 'amazon-ebs' errored: Error launching source instance: The security group 'sg-90e22af5' does not exist in VPC 'vpc-6a4a5708' (InvalidGroup.NotFound)

It seems that sometimes the SG is not available immediately after creation, so it's worth polling to confirm it actually exists before using it.
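
A generic wait loop along these lines is what I mean; this is only a sketch, and the check function is a hypothetical stand-in that would wrap a DescribeSecurityGroups call in real code:

package main

import (
	"fmt"
	"time"
)

// waitUntilVisible polls check until the resource is reported as existing,
// or gives up once the deadline passes. In real code, check would treat
// InvalidGroup.NotFound as "not visible yet" rather than as a fatal error.
func waitUntilVisible(check func() (bool, error), timeout, interval time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		if ok, err := check(); err == nil && ok {
			return nil
		}
		time.Sleep(interval)
	}
	return fmt.Errorf("resource still not visible after %s", timeout)
}

func main() {
	start := time.Now()
	// Hypothetical stand-in: pretend the group becomes visible after 2s.
	err := waitUntilVisible(func() (bool, error) {
		return time.Since(start) > 2*time.Second, nil
	}, 30*time.Second, 500*time.Millisecond)
	fmt.Println("wait result:", err)
}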

@renat-sabitov

With logging enabled:

2014/10/27 16:40:24 packer-command-build: 2014/10/27 16:40:24 Waiting on builds to complete...
2014/10/27 16:40:24 packer-command-build: 2014/10/27 16:40:24 Starting build run: amazon-ebs
2014/10/27 16:40:24 packer-command-build: 2014/10/27 16:40:24 Running builder: amazon-ebs
==> amazon-ebs: Inspecting the source AMI...
2014/10/27 16:40:24 ui: ==> amazon-ebs: Inspecting the source AMI...
2014/10/27 16:40:25 ui: ==> amazon-ebs: Creating temporary keypair: packer 544ddac8-7371-722b-864d-c0b0b0563c37
==> amazon-ebs: Creating temporary keypair: packer 544ddac8-7371-722b-864d-c0b0b0563c37
2014/10/27 16:40:25 ui: ==> amazon-ebs: Creating temporary security group for this instance...
==> amazon-ebs: Creating temporary security group for this instance...
2014/10/27 16:40:25 packer-builder-amazon-ebs: 2014/10/27 16:40:25 Temporary group name: packer 544ddac9-4ff7-3539-f1c9-ba3a1fa58bc9
==> amazon-ebs: Authorizing SSH access on the temporary security group...
2014/10/27 16:40:25 ui: ==> amazon-ebs: Authorizing SSH access on the temporary security group...
==> amazon-ebs: Launching a source AWS instance...
2014/10/27 16:40:25 ui: ==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Error finding source instance (i-44cb718b): The instance ID 'i-44cb718b' does not exist (InvalidInstanceID.NotFound)
2014/10/27 16:40:26 ui error: ==> amazon-ebs: Error finding source instance (i-44cb718b): The instance ID 'i-44cb718b' does not exist (InvalidInstanceID.NotFound)
==> amazon-ebs: Deleting temporary security group...
2014/10/27 16:40:26 ui: ==> amazon-ebs: Deleting temporary security group...
2014/10/27 16:40:26 packer-builder-amazon-ebs: 2014/10/27 16:40:26 Error deleting security group: resource sg-57a87d32 has a dependent object (DependencyViolation)
2014/10/27 16:40:31 packer-builder-amazon-ebs: 2014/10/27 16:40:31 Error deleting security group: resource sg-57a87d32 has a dependent object (DependencyViolation)
2014/10/27 16:40:37 packer-builder-amazon-ebs: 2014/10/27 16:40:37 Error deleting security group: resource sg-57a87d32 has a dependent object (DependencyViolation)
2014/10/27 16:40:42 packer-builder-amazon-ebs: 2014/10/27 16:40:42 Error deleting security group: resource sg-57a87d32 has a dependent object (DependencyViolation)
2014/10/27 16:40:47 packer-builder-amazon-ebs: 2014/10/27 16:40:47 Error deleting security group: resource sg-57a87d32 has a dependent object (DependencyViolation)
==> amazon-ebs: Error cleaning up security group. Please delete the group manually: sg-57a87d32
2014/10/27 16:40:52 ui error: ==> amazon-ebs: Error cleaning up security group. Please delete the group manually: sg-57a87d32
==> amazon-ebs: Deleting temporary keypair...
2014/10/27 16:40:52 ui: ==> amazon-ebs: Deleting temporary keypair...
Build 'amazon-ebs' errored: Error finding source instance (i-44cb718b): The instance ID 'i-44cb718b' does not exist (InvalidInstanceID.NotFound)
2014/10/27 16:40:52 ui error: Build 'amazon-ebs' errored: Error finding source instance (i-44cb718b): The instance ID 'i-44cb718b' does not exist (InvalidInstanceID.NotFound)
2014/10/27 16:40:52 packer-command-build: 2014/10/27 16:40:52 Builds completed. Waiting on interrupt barrier...
2014/10/27 16:40:52 machine readable: error-count []string{"1"}

@patricklucas
Author

"Error waiting for AMI" after successful build:

13:10:37 ==> amazon-ebs: Stopping the source instance...
13:10:38 ==> amazon-ebs: Waiting for the instance to stop...
13:11:06 ==> amazon-ebs: Creating the AMI: generictrusty-hvm-ami_2163
13:11:07     amazon-ebs: AMI: ami-d30d1c96
13:11:07 ==> amazon-ebs: Waiting for AMI to become ready...
13:11:08 ==> amazon-ebs: Error waiting for AMI:
13:11:08 ==> amazon-ebs: Terminating the source AWS instance...
13:11:14 ==> amazon-ebs: Deleting temporary keypair...
13:11:15 Build 'amazon-ebs' errored: Error waiting for AMI: 
13:11:15 
13:11:15 ==> Some builds didn't complete successfully and had errors:
13:11:15 --> amazon-ebs: Error waiting for AMI: 
13:11:15 
13:11:15 ==> Builds finished but no artifacts were created.

@renat-sabitov

@patricklucas #1533 had a patch intended to solve the problem, but obviously it doesn't work.

To my understanding, something is wrong with error processing in goamz, because no error message is reported.

@patricklucas
Author

@renat-sabitov-sirca: actually, since I never saw an update to this issue, I didn't realize a fix had been applied. We are still running 0.7.0, so I'll upgrade to 0.7.2 and see if there is improvement.

@renat-sabitov

@patricklucas To be clear, I tried that patch and it didn't work. See the last logs in #1533.

@clstokes
Contributor

clstokes commented Dec 1, 2014

👍 We frequently run into Error finding source instance errors as the AWS API is sometimes slow to return recently created resources. I'm surprised more users aren't experiencing this.

@patricklucas
Author

Got a new one today: 'Error modify AMI attributes'.

11:30:51 ==> amazon-ebs: Stopping the source instance...
11:30:52 ==> amazon-ebs: Waiting for the instance to stop...
11:31:48 ==> amazon-ebs: Creating the AMI: generictrusty-ami_2377
11:31:48     amazon-ebs: AMI: ami-75607330
11:31:48 ==> amazon-ebs: Waiting for AMI to become ready...
11:34:00 ==> amazon-ebs: Modifying attributes on AMI (ami-75607330)...
11:34:00     amazon-ebs: Modifying: users
11:34:00 ==> amazon-ebs: Error modify AMI attributes:
11:34:00 ==> amazon-ebs: Deregistering the AMI because cancelation or error...
11:34:01 ==> amazon-ebs: Terminating the source AWS instance...
11:34:08 ==> amazon-ebs: Deleting temporary keypair...
11:34:08 Build 'amazon-ebs' errored: Error modify AMI attributes: 
11:34:08 
11:34:08 ==> Some builds didn't complete successfully and had errors:
11:34:08 --> amazon-ebs: Error modify AMI attributes: 
11:34:08 
11:34:08 ==> Builds finished but no artifacts were created.

@mmcgrana

We're seeing this "Error finding source instance" issue as well. @mitchellh is there anything I can do in terms of gathering debug info that would be helpful in diagnosing?

renat-sabitov pushed a commit to renat-sabitov/packer that referenced this issue Dec 16, 2014
Relates to hashicorp#1539

AWS is eventually consistent and an instance can be invisible for some time after creation. This fix eliminates the describe-instances call made before entering the proper wait loop.
@richid

richid commented Dec 18, 2014

Seeing this too. I have a Jenkins job that builds about a dozen AMIs concurrently using the amazon-ebs builder, and I encounter this several times with each build. I've added my own bash retry logic in the Jenkins job, but the end result is several orphaned EC2 instances that need to be manually killed.

FWIW, I've seen this behavior with 0.7.1 and 0.7.5.

@renat-sabitov

Hi Rich,

Try my last two patches to packer and goamz; we stopped seeing errors. Note that we use a stable security group; before that, we also had problems with SG eventual consistency.

@richid

richid commented Dec 19, 2014

@renat-sabitov-sirca

I saw that and it looks good to me. Just not sure I want to start running a forked version...

@ckelner

ckelner commented Jan 8, 2015

👍 I've seen this a few times lately regarding instances.

@allenluce

Seeing this several times this week (using v0.7.5).

@grayaii

grayaii commented Jan 9, 2015

We see this several times a week too (using v0.7.2). Is there anything we can do to help fix or identify the problem better?

@elasticdog

Just wanted to add another comment saying that we're seeing this on a very regular basis as well. Let us know if there is any additional information that would make tracking this down any easier.

@timurb
Contributor

timurb commented Jan 30, 2015

By chance, have you tried running packer in a different AWS region?
I think we've experienced some AWS-related issues (probably somewhat different ones, though) with vagrant-aws and packer in the US-West (Oregon) region, but they almost immediately disappeared when we moved to the EU region.

@elasticdog

By chance, have you tried running packer in a different AWS region?

@timurb we've seen this in both us-west-1 and us-west-2

@cosmopetrich

I've encountered it regularly on ap-southeast-2 as well.

@ckelner

ckelner commented Jan 31, 2015

Ours have been across us-east-1 and us-west-2; we haven't executed in other regions.

@ava-miket

We are experiencing this in us-west-1 inside a VPC with v0.7.5. This was also referenced in #878, which was closed in Feb 2014 but which users continue to report problems with.

@rhysyngsun

Assuming this is the same issue, we've been seeing this frequently with v0.7.5 in us-west-1:

amazon-ebs: Waiting for AMI to become ready...
amazon-ebs: Error waiting for AMI: Get <REDACTED DescribeImages API>: dial tcp: i/o timeout

@elasticdog

To add some specific frequency data: over the past 5 days, we've triggered jobs in our deployment pipeline 44 times, 11 of which have failed due to this "Error finding source instance" error. I'm hoping someone has a decent workaround soon.

@jacob-webber

Getting this error in ap-southeast-2; annoying, as it's sporadic.

@mgcleveland

+1, also seeing this error. We are running a custom build of packer with this branch merged in: #1764. It resolves the errors it addresses, but we have now moved on to the (occasional) error tracked by this issue.

@renat-sabitov

@mgcleveland Try also this patch: mitchellh/goamz#180

It fixes retry bugs and increases retry counts in goamz, which is the root cause of these transient AWS errors.
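
To illustrate what counts as transient here: the failures in this thread cluster around a handful of recoverable signatures. A classifier along these lines captures them; this is only a sketch that matches on message text, whereas goamz inspects structured error codes:

package main

import (
	"errors"
	"fmt"
	"strings"
)

// isTransientAWSError reports whether an error looks like one of the
// recoverable failures seen in this thread: eventual-consistency NotFound
// errors, 503s, and plain network timeouts.
func isTransientAWSError(err error) bool {
	if err == nil {
		return false
	}
	msg := err.Error()
	for _, sig := range []string{
		"InvalidInstanceID.NotFound", // instance not visible yet
		"InvalidGroup.NotFound",      // security group not visible yet
		"503",                        // service unavailable
		"i/o timeout",                // network-level failure
	} {
		if strings.Contains(msg, sig) {
			return true
		}
	}
	return false
}

func main() {
	err := errors.New("The instance ID 'i-d2dd6518' does not exist (InvalidInstanceID.NotFound)")
	fmt.Println(isTransientAWSError(err)) // true
}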

renat-sabitov pushed a commit to renat-sabitov/packer that referenced this issue Feb 25, 2015
@blalor
Contributor

blalor commented Feb 25, 2015

Yep, getting these, too. It's not just a packer problem, as Amazon spews these pretty frequently in other contexts, but this class of error really should trigger a backoff and retry. Yeah, I know that's easier said than done, since I need to build it into my own tooling… :-P

==> ami-cheffed-role-qa-testrunner: Error finding source instance (i-9a325560): The instance ID 'i-9a325560' does not exist (InvalidInstanceID.NotFound)

@jantman

jantman commented Mar 6, 2015

I just got bitten by this too. I'm not sure I'd say this "isn't a packer problem"; it's certainly an issue with either Packer or the underlying library it uses. For better or worse, yeah, 503s aren't uncommon from AWS. It's up to the tool to perform a retry (ideally with exponential backoff) before throwing an error.
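
Roughly the shape I mean, as a generic sketch; the retried operation is a hypothetical stand-in, not Packer's actual code:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// retryWithBackoff retries op on failure, sleeping 1s, 2s, 4s, ... between
// attempts, plus up to 500ms of jitter so concurrent builds don't retry in
// lockstep. It returns the last error if every attempt fails.
func retryWithBackoff(op func() error, maxAttempts int) error {
	var err error
	for attempt := 0; attempt < maxAttempts; attempt++ {
		if err = op(); err == nil {
			return nil
		}
		sleep := time.Duration(1<<uint(attempt))*time.Second +
			time.Duration(rand.Intn(500))*time.Millisecond
		time.Sleep(sleep)
	}
	return fmt.Errorf("giving up after %d attempts: %v", maxAttempts, err)
}

func main() {
	attempts := 0
	err := retryWithBackoff(func() error {
		// Hypothetical stand-in for a DescribeInstances-style call that
		// fails twice before the instance becomes visible.
		attempts++
		if attempts < 3 {
			return fmt.Errorf("InvalidInstanceID.NotFound")
		}
		return nil
	}, 5)
	fmt.Println("result:", err, "after", attempts, "attempts")
}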

@TronPaul

TronPaul commented Mar 9, 2015

I was hitting timeouts on every run of amazon-ebs even after premaking a security group. Patching goamz with mitchellh/goamz#180 made it work first try. Thanks for the fix, @renat-sabitov-sirca.

renat-sabitov pushed a commit to renat-sabitov/packer that referenced this issue Mar 10, 2015
@niknaknet

To add some observations to this, working with the really basic 'example.json' from the starter guide:
using an m3.medium in us-east-1, with no VPC, I get success;
however, working in eu-west-1, in a VPC, I consistently get a failure on the security group, which 'does not exist'.

.... emitting ....
amazon-ebs output will be in this color.

==> amazon-ebs: Inspecting the source AMI...
==> amazon-ebs: Creating temporary keypair: packer 5504b87e-cf88-d720-31d6-743468549e4b
==> amazon-ebs: Creating temporary security group for this instance...
==> amazon-ebs: Authorizing SSH access on the temporary security group...
==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Error launching source instance: The specified instance type can only be used in a VPC. A subnet ID or network interface ID is required to carry out the request. (VPCResourceNotSpecified)
==> amazon-ebs: Deleting temporary security group...
==> amazon-ebs: Deleting temporary keypair...
Build 'amazon-ebs' errored: Error launching source instance: The specified instance type can only be used in a VPC. A subnet ID or network interface ID is required to carry out the request. (VPCResourceNotSpecified)

==> Some builds didn't complete successfully and had errors:
--> amazon-ebs: Error launching source instance: The specified instance type can only be used in a VPC. A subnet ID or network interface ID is required to carry out the request. (VPCResourceNotSpecified)

==> Builds finished but no artifacts were created.

... hopefully that adds something useful

... I am using Packer v0.7.5 from homebrew on OSX
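
For what it's worth, that VPCResourceNotSpecified failure looks like a configuration issue rather than a transient one: the error is asking for a subnet. A minimal builder stanza along these lines should avoid it (all IDs here are placeholders):

{
  "type": "amazon-ebs",
  "region": "eu-west-1",
  "source_ami": "ami-XXXXXXXX",
  "instance_type": "m3.medium",
  "subnet_id": "subnet-XXXXXXXX",
  "ssh_username": "ubuntu",
  "ami_name": "packer-example {{timestamp}}"
}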

@ehershey
Contributor

+1. I'd love to see this merged and in a new release.

@healsjnr

healsjnr commented Jun 2, 2015

+1 Any word on when/if the pull request above is going to get merged in?

@geekpete

geekpete commented Jun 5, 2015

And failing is fine, but the script needs to clean up after itself.
Any created resources need to be deleted if the process has failed part way through.

@renewooller

+1 also very keen to hear if there is a plan to fix this

@mitchellh
Contributor

This should be fixed by our switch to the official AWS library, which does this automatically.
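
For anyone who wants to tune it, the retry count is exposed on the SDK config. A minimal sketch using the aws-sdk-go v1 API (which postdates most of this thread):

package main

import (
	"fmt"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/ec2"
)

func main() {
	// Raise the SDK's built-in retry limit; the SDK already applies
	// exponential backoff to throttled and transient failures.
	sess, err := session.NewSession(aws.NewConfig().
		WithRegion("us-east-1").
		WithMaxRetries(10))
	if err != nil {
		panic(err)
	}
	_ = ec2.New(sess) // the client inherits the retry configuration
	fmt.Println("max retries:", aws.IntValue(sess.Config.MaxRetries))
}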

@allenluce

@mitchellh can you add a link to the relevant issue or PR or commit?

@mitchellh
Contributor

#2034

@richard-hulm

Updated to 0.8.1 and getting an intermittent failure on some of our AMI builds that seems related to this: the launch occasionally fails because the just-created security group is not found.

See logs below

1436361832,,ui,say,amazon-ebs output will be in this color.
1436361832,,ui,say,
1436361832,,ui,say,==> amazon-ebs: Prevalidating AMI Name...
1436361832,,ui,say,==> amazon-ebs: Inspecting the source AMI...
1436361833,,ui,say,==> amazon-ebs: Creating temporary keypair: packer 559d2468-a0ce-6ef8-ddbb-f67f14cf31d9
1436361833,,ui,say,==> amazon-ebs: Creating temporary security group for this instance...
1436361833,,ui,say,==> amazon-ebs: Authorizing access to port 22 the temporary security group...
1436361833,,ui,say,==> amazon-ebs: Launching a source AWS instance...
1436361833,,ui,error,==> amazon-ebs: Error launching source instance: InvalidGroup.NotFound: The security group 'sg-78d8d51d' does not exist in VPC 'vpc-XXXXXXX'\n==> amazon-ebs:  status code: 400%!(PACKER_COMMA) request id: []
1436361833,,ui,say,==> amazon-ebs: No AMIs to cleanup
1436361833,,ui,say,==> amazon-ebs: Deleting temporary security group...
1436361833,,ui,say,==> amazon-ebs: Deleting temporary keypair...
1436361833,,ui,error,Build 'amazon-ebs' errored: Error launching source instance: InvalidGroup.NotFound: The security group 'sg-78d8d51d' does not exist in VPC 'vpc-XXXXXXX'\n status code: 400%!(PACKER_COMMA) request id: []
1436361833,,error-count,1
1436361833,,ui,error,\n==> Some builds didn't complete successfully and had errors:
1436361833,amazon-ebs,error,Error launching source instance: InvalidGroup.NotFound: The security group 'sg-78d8d51d' does not exist in VPC 'vpc-XXXXXXXX'\n status code: 400%!(PACKER_COMMA) request id: []
1436361833,,ui,error,--> amazon-ebs: Error launching source instance: InvalidGroup.NotFound: The security group 'sg-78d8d51d' does not exist in VPC 'vpc-XXXXXXXX'\n    status code: 400%!(PACKER_COMMA) request id: []

@elasticdog

@richard-hulm sounds like exponential back-off should be used for that query, in a similar fashion to this original issue with the instance IDs, since EC2's back end is eventually consistent. The switch away from goamz may not have helped in that regard, but I don't know enough about that code to confirm.

@renewooller

A workaround could be to specify your own security group.

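Concretely, that means passing an existing group via the builder's security_group_id option, so the build never depends on a just-created group becoming visible. A minimal sketch (all IDs here are placeholders):

{
  "type": "amazon-ebs",
  "region": "us-east-1",
  "source_ami": "ami-XXXXXXXX",
  "instance_type": "m3.medium",
  "security_group_id": "sg-XXXXXXXX",
  "ssh_username": "ubuntu",
  "ami_name": "packer-example {{timestamp}}"
}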

@hashicorp hashicorp locked and limited conversation to collaborators Apr 8, 2020