Incremental deploy feature causes payload limit issue #374

We have an instance of Data Factory with close to 1,000 objects. We tried the incremental deploy feature and the deployment itself went fine, but the pipelines could not start afterwards, failing with ErrorCode=RequestContentTooLarge. On investigation, we found that the global parameter adftools_deployment_state, which holds the MD5 digests for the deployed objects, pushes the request size past ADF's limit. Once the parameter value is reset, the pipelines can run again.

It would be good to have an option to store the deployment state externally instead of in global parameters. Even an option to supply the deployment state as an input JSON file would let us use the feature without hitting the request payload limits.
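For anyone hitting the same wall before a built-in option exists, the reset workaround can be scripted. Below is a minimal sketch, not part of adftools: the subscription, resource group, and factory names are placeholders, and it uses only Invoke-AzRestMethod from the Az.Accounts module. Test it against a non-production factory first, since the final call replaces factory properties.

```powershell
# Sketch: remove the adftools_deployment_state global parameter so that
# pipeline runs no longer exceed the payload limit. Placeholder names.
$sub  = "00000000-0000-0000-0000-000000000000"
$rg   = "my-resource-group"
$adf  = "my-data-factory"
$path = "/subscriptions/$sub/resourceGroups/$rg/providers" +
        "/Microsoft.DataFactory/factories/$adf" + "?api-version=2018-06-01"

# Fetch the current factory resource, including its global parameters.
$factory = (Invoke-AzRestMethod -Path $path -Method GET).Content |
           ConvertFrom-Json

# Drop the state parameter from the in-memory copy of the resource.
$factory.properties.globalParameters.PSObject.Properties.Remove(
    'adftools_deployment_state')

# PUT the factory back with the trimmed parameter set. Caution: this is a
# create-or-update call; confirm on your setup that it does not clear other
# factory settings (e.g. the repo configuration) before using it for real.
$body = @{
    location   = $factory.location
    properties = @{ globalParameters = $factory.properties.globalParameters }
} | ConvertTo-Json -Depth 20
Invoke-AzRestMethod -Path $path -Method PUT -Payload $body | Out-Null
```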
Comments

Thanks @anfredrick for raising this issue. I will check that and consider the suggested alternative.
Hi Kamil :) Thanks!
I tested the behaviour and I can't reproduce this error; all Global Parameters seem to be deployed successfully in the target ADF. @anfredrick, please provide more details or a script that enables me to reproduce the error.
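A quick way to gauge whether a given factory is approaching the limit is to measure the serialized size of the state parameter. A minimal sketch, again with placeholder resource names and using only Invoke-AzRestMethod from Az.Accounts:

```powershell
# Sketch: measure the serialized size of the adftools_deployment_state
# global parameter on an existing factory. Placeholder resource names.
$sub  = "00000000-0000-0000-0000-000000000000"
$rg   = "my-resource-group"
$adf  = "my-data-factory"
$path = "/subscriptions/$sub/resourceGroups/$rg/providers" +
        "/Microsoft.DataFactory/factories/$adf" + "?api-version=2018-06-01"

$factory = (Invoke-AzRestMethod -Path $path -Method GET).Content |
           ConvertFrom-Json
$state   = $factory.properties.globalParameters.adftools_deployment_state.value
$json    = $state | ConvertTo-Json -Depth 20

"adftools_deployment_state is {0:N0} bytes" -f [System.Text.Encoding]::UTF8.GetByteCount($json)
```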
Hi @NowinskiK, thanks for the update and apologies for the delayed response. There is no issue while deploying the parameters. After deployment, however, if you try to run any pipeline, it fails to start with the error below:

ErrorCode=FlowRunSizeLimitExceeded, ErrorMessage=Triggering the pipeline failed due to large run size. This could happen when a run has a large number of activities or large inputs used in some of the activities, including parameters.

If you could provide an option to write the adftools_deployment_state value to a file (the same file serving as the input for the subsequent deployment), it would be relatively simple. We could then decide to store that file in any storage (git, Azure Blob, etc.), as we need in our deployment pipelines.
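Sketched outside of adftools, the export half of that could look roughly like this (placeholder resource names; deployment_state.json is a hypothetical file name):

```powershell
# Sketch: export the adftools_deployment_state value to a file so it can
# be kept in git/blob storage instead of in the factory. Placeholder names.
$sub  = "00000000-0000-0000-0000-000000000000"
$rg   = "my-resource-group"
$adf  = "my-data-factory"
$path = "/subscriptions/$sub/resourceGroups/$rg/providers" +
        "/Microsoft.DataFactory/factories/$adf" + "?api-version=2018-06-01"

$factory = (Invoke-AzRestMethod -Path $path -Method GET).Content |
           ConvertFrom-Json

# deployment_state.json is a hypothetical file name; any path would do.
$factory.properties.globalParameters.adftools_deployment_state.value |
    ConvertTo-Json -Depth 20 |
    Set-Content -Path 'deployment_state.json' -Encoding UTF8
```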
Thanks @anfredrick. Now it's a totally different error, and from this description I know what I need to test. I will check it out.
Hello @NowinskiK, Best regards

@NowinskiK
I don't think so... Repositories (like Git) are for keeping the source of the code, not the status(es) of deployments.
@NowinskiK My concern with storage is that if someone accidentally deletes the file, the entire ADF will be redeployed. Instead, my idea is to create a JSON file in the Data Factory git repository, update the content of that file with adftools_deployment_state after every deployment, and use it as the input for the next deployment, as sketched below.
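As a sketch of that flow (the file name, branch name, and committer identity are all assumptions, and the state is presumed to have been exported to deployment_state.json as in the earlier sketch):

```powershell
# Sketch: after a successful deployment, commit the refreshed state file
# back to the ADF collaboration branch. All names are placeholders.
git config user.email 'release-pipeline@example.com'
git config user.name  'Release Pipeline'
git add deployment_state.json

# Commit only when the state actually changed, to avoid empty commits.
git diff --cached --quiet
if ($LASTEXITCODE -ne 0) {
    git commit -m 'Update adftools deployment state after release'
    git push origin HEAD:main   # collaboration branch name is an assumption
}
```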
@g20yr how large would the file be? If it's more than a few MB, I would rather put it in storage than in git.
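For a rough sense of scale, using the ~1,000 objects mentioned in the original report: each entry is essentially an object name plus a 32-character MD5 hex digest, so at roughly 100 bytes per entry the file would be on the order of 100 KB, well under a few MB. (This assumes the state is close to a flat name-to-digest map, which is an assumption about its internal format.)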