Putting this on hold for a bit, as the action is actually doing OK at the moment and we might be able to get higher priority if it starts being a problem again. I'm worried Serverless would add yet another layer of things that can go wrong.
In our cron job, we run every notebook that submits jobs:
`documentation/.github/workflows/notebook-test-cron.yml`, line 33 at 83573cc
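For context, a schedule-triggered workflow with the relevant job timeout looks roughly like this (a minimal sketch, not the actual contents of `notebook-test-cron.yml`; the schedule and step contents are placeholders):

```yaml
on:
  schedule:
    - cron: "0 5 * * *"  # hypothetical: run daily at 05:00 UTC
jobs:
  test-notebooks:
    runs-on: ubuntu-latest
    timeout-minutes: 360  # GitHub-hosted runners cap a job at 6 hours
    steps:
      - uses: actions/checkout@v4
      - run: echo "execute the notebooks here"  # placeholder step
```

The `timeout-minutes: 360` line is the ceiling at issue: GitHub Actions cannot extend a single job past 6 hours, so anything queued longer than that fails regardless of how the notebooks themselves behave.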
But the jobs might take longer than GitHub Actions' 6-hour limit when there is a long queue. Notably, we cannot batch the different jobs, because they come from distinct notebooks, so using `with Batch` is not viable. We now run the notebooks concurrently with `asyncio`, but we can still exceed the time limit.

So, instead, @frankharkins had the great idea of using Qiskit Serverless. On day 1, we'd submit all the notebooks to Qiskit Serverless. On day 2, a new cron job would run that retrieves the results from Qiskit Serverless.
Question: what workflow do we want for local runs?