
Flow node connections cannot be re-mapped to admin defined connections when an environment contains a bootstrapped equivalent #266

Open
patrickcping opened this issue Feb 19, 2024 · 2 comments

@patrickcping
Contributor

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Thank you for opening an issue. Please note that we try to keep the Terraform issue tracker reserved for bug reports and feature requests. For general usage questions, please see: https://www.terraform.io/community.html.

DaVinci Terraform provider Version

v0.2.1

Terraform Version

v1.6.6

Affected Resource(s)

  • davinci_flow

Terraform Configuration Files

# Copy-paste your DaVinci related Terraform configurations here - for large Terraform configs,
# please use a service like Dropbox and share a link to the ZIP file. For
# security, you can also encrypt the files using our GPG public key.

# Remember to replace any account/customer sensitive information in the configuration before submitting the issue

# NOTE: PLEASE DO NOT share DaVinci JSON exports publicly without encrypting files first.  DaVinci JSON exports can contain environment/tenant specific information, and may also include secrets.

resource "davinci_connection" "test-http" {
  environment_id = pingone_environment.my_environment.id
  connector_id   = "httpConnector"
  name           = "abcd123-http"
}

resource "davinci_flow" "test-subflow-1" {
  environment_id = pingone_environment.my_environment.id

  flow_json = file("./flows/full-basic-subflow-1.json")

  // Http connector
  connection_link {
    id   = davinci_connection.test-http.id
    name = davinci_connection.test-http.name
  }
}

Debug Output

N/A

Panic Output

None

Expected Behavior

When imported, davinci_flow.test-subflow-1 should compare the name provided in connection_link.name with the connection name in the flow JSON, and overwrite the flow's connection ID with the ID of the connection specified in HCL.

Actual Behavior

Instead, the flow was imported with the HTTP connection mapped to the bootstrapped default connection, ignoring the configured connection_link mapping.

Steps to Reproduce

  1. Create a new PingOne environment through the console or via API, ensuring bootstrapped configuration is present
  2. In the DaVinci console, rename the Http connection to Http1
  3. In the flow JSON, ensure that a httpConnector node is present, and set the name to abcd123-http (matching davinci_connection.test-http.name in the HCL)
  4. terraform apply
  5. The provider should re-map the flow such that the connectionId in the JSON becomes davinci_connection.test-http.id
  6. In the DaVinci console, observe the node's connection ID and compare with the newly created davinci_connection.test-http.id attribute

Workaround

Remove bootstrapped DaVinci configuration from the environment prior to import

Important Factoids

References

@patrickcping patrickcping added type/bug Something isn't working status/triaged The issue/PR has completed initial triage and needs assignment status/blocked/upstream-api The issue/PR is blocked by an upstream API labels Feb 19, 2024
@patrickcping patrickcping self-assigned this Feb 19, 2024
@malenze

malenze commented Mar 25, 2024

Hi @patrickcping,
I don't know why you create extra connectors instead of using the bootstrapped or existing ones in your environment.

I have defined all connectors used in my environment in my terraform file, but before I plan or apply the configuration, I try to import all existing resources into my local terraform state.

Thus having this in my terraform file:

data "davinci_connections" "read_all" {
  environment_id = var.environment_id
  //connector_ids  = ["flowConnector", "skUserPool", "smtpConnector"]
}
resource "davinci_connection" "http_connector" {
  connector_id   = "httpConnector"
  name           = "Http"
  environment_id = var.environment_id
  depends_on     = [data.davinci_connections.read_all]
}

I use terraform import davinci_connection.http_connector <environment_id>/<connector_id> before the first terraform plan. I know it is tedious to look up all the IDs, as they are different in every environment, but that is the only import syntax the provider currently supports; nothing works by name.

This way I can use the existing connectors, don't have to create them from scratch, and don't have to replace them within the flows, which you would probably need to do in the UI rather than directly in Terraform. I guess the UI will then update the flow JSON accordingly for all mapped connectors. If you download the flow JSON afterwards, it will probably have the correct references to the connectors you created yourself, but I guess the provider will not update the imported JSON for the flow on the fly.

@patrickcping
Contributor Author

Hi @malenze,
We'll be deprecating the ability to use bootstrapped connectors as of the next release (v0.3.0) - we want the connectors to be defined and managed in your HCL explicitly.

The biggest reason is that the bootstrap routine on the service can change at any time to accommodate new features or example flows, as it is intended for demo configuration. This means you cannot guarantee that an environment built today will be built the same way a few months from now, so your HCL (and use cases) may not deploy correctly in future deployments and could error unexpectedly.

So while it saves you from updating the flows (and from using the connection_link parameters), it is not a stable or reliable approach for a production deployment. We also cannot re-map by name alone, because name is a mutable field; each connection has its own unique UUID that is used in the flow, and this is immutable. There are features coming in the v0.3.0 release that help with this mutability issue.

The alternative option you suggest of re-exporting the JSON from the target environment (after import) with the new connector mappings might work initially, but I expect would conflict with upstream changes from the source environment.

We recognise the overheads this creates when creating HCL (and importing resources), so we're creating a CLI tool that will automate a lot of this work - you're a couple of months ahead of us in this respect, but when we're ready we can give you a beta version to try.

The CLI will help with:

  • Continual export of config from the source environment to HCL + JSON export
  • Creation of the davinci_connection resources that support the extracted flows
  • Creation of the connection_link and subflow_link blocks in the davinci_flow resources to support the need to continually re-map connections in the presented flow from the source system

For your question about import, there are two different IDs, one that represents the connector class (e.g. httpConnector) and the ID that represents the instance of that connector class (e.g. 097735c983666c73892548c3c63b1760). It's the latter you'd use for the terraform import command, and the ID you need is shown in the console on the "Connectors" page under the ID heading.
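As a concrete sketch of the import syntax described above (the environment ID is a placeholder; the connection instance ID is the example value from this comment):

```shell
# Use the connection *instance* ID (shown under the "ID" heading on the
# console's "Connectors" page), not the connector class ID ("httpConnector"):
terraform import davinci_connection.http_connector \
  "<environment_id>/097735c983666c73892548c3c63b1760"
```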
