
Remove multiple tasks on same template restriction #1000

Open
wants to merge 1 commit into develop

Conversation

@quantumsheep (Contributor) commented Jul 25, 2022

Executing the same task template in parallel is very useful in many use cases. I don't know why it was restricted.

@koudi commented Nov 24, 2022

I, too, would like to remove this limit (+1).

> I don't know why it was restricted.

If I understand the code correctly, I guess it's because of the way repositories are handled. Defined repositories are checked out on disk into a folder like repository_{repoId}_{templateId} and updated before the playbook is started. Because a task can specify a commit to check out, this could cause collisions when two tasks with the same template but different commits run at the same time.

One solution I can think of is to clone a fresh repository for every task into a temp folder and remove it at the end. This would of course affect performance with big repositories, but cloning with --depth 1 might mitigate that.
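A minimal sketch of that idea in shell, not Semaphore's actual checkout code: a tiny throwaway local repo stands in for the real one, and each task gets its own shallow clone in a unique temp directory that is deleted when the task finishes.

```shell
# Hypothetical sketch of per-task checkouts; the throwaway repo and all
# paths are illustrative, not Semaphore internals.
set -eu

# Stand-in for the configured repository:
SRC="$(mktemp -d)"
git -C "$SRC" init -q
git -C "$SRC" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "initial commit"

# Per-task checkout: a unique temp dir instead of the shared
# repository_{repoId}_{templateId} folder, so parallel tasks cannot collide.
TASK_DIR="$(mktemp -d)"
git clone -q --depth 1 "file://$SRC" "$TASK_DIR/checkout"

# ... run the playbook from "$TASK_DIR/checkout" here ...

# Remove the checkout when the task finishes.
rm -rf "$TASK_DIR" "$SRC"
```

Because every task clones into its own `mktemp -d` directory, two tasks on the same template (even at different commits) never share a working tree.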

@jdhayes (Contributor) commented Apr 4, 2023

Yes, this is the main use case for me, and I had been using it this way for some time.
I use API calls to run all my systems from the same task, with --limit=HOSTNAME for each call.
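That per-host fan-out pattern might look roughly like the sketch below. The endpoint path and JSON payload are illustrative placeholders, not the verified Semaphore API schema, and the loop only prints the requests it would make:

```shell
# Hypothetical fan-out: one task per host, each limited to that host.
# Endpoint and payload shape are illustrative placeholders.
payload() {
  # $1 = template id, $2 = host to pass as the Ansible limit
  printf '{"template_id": %s, "limit": "%s"}' "$1" "$2"
}

for HOST in web1 web2 db1; do
  # Dry run: print the request instead of sending it with curl.
  echo "POST /api/project/1/tasks $(payload 42 "$HOST")"
done
```

With this shape, every host gets its own task run of the same template, which is exactly the situation that triggers the shared-checkout collisions described above.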

That said, I can confirm that when changes were pushed to the source repo, it caused collisions within the git clones during simultaneous task runs.
Usually this resulted in missing files, and the Ansible playbook would die prematurely until I ran a manual git pull to restore them.

To get around this, I thought that defining my Ansible playbook as a "file" source instead of a "git" source within Semaphore might do the trick, since there would be no more cloning and it would be treated as a static set of files.
This obviously has some limitations, since updating my playbook (git pull) moves outside of Semaphore to the command line, but it is still worth it (at least for my use case).

Is there anything I can do to help?

@jdhayes (Contributor) commented Apr 5, 2023

As stated previously, I am manually updating my "file"-defined repo outside of Semaphore, which works fine for me.
But if we are interested in adding a feature, I would suggest a way to update the repos as a one-shot (via the UI/API) or on a schedule.

Some of us run our Semaphore tasks in large batches, so triggering a repo update per task would be sufficient.
However, for those of us who run tasks in smaller batches (one task per node/system), a one-shot or scheduled update could be useful, since the "corruption due to parallel tasks" issue would then be avoided completely.

@thompsm4 commented

I am using "file"-defined repos and just running another Semaphore task template against my Semaphore host to pull from git on a schedule or manually, depending on need. My hosts "register" with Semaphore via the API and create a unique task template referencing the same repository/inventory/environment, limiting the run to the individual host. I still, however, cannot seem to get more than 10 task templates to run at the same time.

@ansibleguy (Contributor) commented

Greetings!

I too would love to see a 'scheduled-clone' feature in Semaphore.
We currently have an external script cloning the repo every 5-10 minutes so that the overhead of cloning a huge repository (1-2 GB) does not get out of hand.

Without a fix for the repo-path collisions, I don't see this parallel-execution limitation going away.
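A scheduled external update script like the one described above might look roughly like this. It is a sketch under assumptions (a tiny throwaway repo stands in for the real 1-2 GB repository, and all paths are made up): a shallow clone on the first run, then fast-forward-only pulls so a diverged checkout fails loudly instead of silently corrupting running tasks.

```shell
# Hypothetical cron script (e.g. */10 * * * *) that refreshes a checkout
# outside of Semaphore. The demo repo and paths are illustrative.
set -eu

# Stand-in for the real remote repository:
SRC="$(mktemp -d)"
git -C "$SRC" init -q
git -C "$SRC" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "initial commit"

DSTROOT="$(mktemp -d)"
DST="$DSTROOT/playbooks"
if [ ! -d "$DST/.git" ]; then
  # First run: shallow clone keeps the transfer small for huge repos.
  git clone -q --depth 1 "file://$SRC" "$DST"
else
  # Later runs: fast-forward only, never a merge into the checkout.
  git -C "$DST" pull -q --ff-only
fi

rm -rf "$SRC" "$DSTROOT"
```

The --ff-only guard means that if anyone touches the checkout by hand, the scheduled update stops with an error rather than leaving a half-merged tree for the next task run.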

5 participants