Deploying MONAI in Azure #144

Open
justinhorton2003 opened this issue Oct 9, 2023 · 6 comments
Labels: deploy, guideline (MONAI Deploy guidelines and design), help wanted (Extra attention is needed)

Comments

@justinhorton2003

justinhorton2003 commented Oct 9, 2023

Hey! I've been trying to deploy MONAI Deploy Express on Azure Container Instances by modifying the available docker-compose.yml, but because of inherent issues with mounting file shares, and because I don't fully know how to enable GPU for the containers when deploying the docker-compose.yml with ACI, I'm looking for suggestions.

What is the "best" or easiest way to deploy MONAI Deploy Express on Azure (with GPU enabled, of course)? Should I be using the Helm charts instead (with Azure Kubernetes Service), or should I keep trying the Docker method? Help appreciated.

@dbericat added the help wanted (Extra attention is needed), guideline (MONAI Deploy guidelines and design), and deploy labels on Oct 9, 2023
@MMelQin
Collaborator

MMelQin commented Oct 9, 2023

@justinhorton2003 Thanks for the question. AWS offers EC2 instances with GPUs, and those have been used for testing MONAI Deploy, including MD Express.

MONAI Deploy itself does NOT require a GPU. The applications, in particular inference applications such as the example Liver Tumor and Lung Seg MAPs, do expect a GPU to accelerate inference. That said, if you can get an instance with more CPU cores, the example apps will still run (since they fall back to CPU), but far more slowly; e.g., we have seen Liver Tumor Seg take 30 minutes with two CPU cores versus less than 2 minutes with a T4 GPU.
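As a side note, before launching the example apps it can help to confirm whether a GPU is actually visible in the environment. The `nvidia-smi`-based check below is my own illustrative sketch, not part of MD Express:

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU is visible to the driver via nvidia-smi."""
    if shutil.which("nvidia-smi") is None:
        return False
    try:
        # nvidia-smi exits non-zero if no GPU/driver is present.
        return subprocess.run(["nvidia-smi"], capture_output=True).returncode == 0
    except OSError:
        return False

# The inference apps fall back to CPU when no GPU is found.
device = "cuda" if gpu_available() else "cpu"
print(f"Running inference on: {device}")
```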

Also, it looks like you are attempting to run the MD Express docker compose inside an Azure container, but the MD Express testing and target use case is that the user logs on to the host (an EC2 instance, a VM, or a container) to run the MD Express containers. Running containers inside a container comes with its own complexities, even though it is doable.
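For reference, when running on a GPU-enabled host (e.g., an Azure NC/ND-series VM), the standard Compose way to expose the GPU to a service is a device reservation. The service and image names below are placeholders, not the actual MD Express services:

```yaml
services:
  inference:               # placeholder name; adapt to the MD Express docker-compose.yml
    image: my-map:latest   # placeholder image
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```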

@MMelQin MMelQin removed their assignment Oct 9, 2023
@MMelQin
Collaborator

MMelQin commented Oct 9, 2023

@dbericat I would say it is always better to have a single assignee to follow through, though others can be mentioned/cc'ed in the issue.

@dbericat
Member

dbericat commented Oct 9, 2023

> @dbericat I would say it is always better to have a single assignee to follow through, though others can be mentioned/cc'ed in the issue.

You are right. @JHancox @woodheadio @neildsouth

@mocsharp
Collaborator

Hi @justinhorton2003, you may find more information on how to enable NVIDIA GPUs on Azure under GPU optimized virtual machine sizes.

As for Azure File Share, are you trying to set the data paths in docker-compose.yml to use Azure File Share? You may find a tutorial from Azure here on how to set up an NFS Azure file share and mount it on a Linux VM.
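For context, mounting an NFS Azure file share on a Linux VM generally looks like the sketch below (the storage account and share names are placeholders; the mount options follow Azure's NFS 4.1 requirements):

```shell
# Placeholders: replace <storage-account> and <share-name> with real names.
sudo apt-get install -y nfs-common
sudo mkdir -p /mnt/monai
sudo mount -t nfs <storage-account>.file.core.windows.net:/<storage-account>/<share-name> \
    /mnt/monai -o vers=4,minorversion=1,sec=sys
```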

@justinhorton2003
Author

I created a single file share and copied the MONAI Deploy Express repo into it (thinking that I would then have less work modifying the docker-compose.yml):

```yaml
volumes:
  simulation-volume:
    driver: azure_file
    driver_opts:
      share_name: monaivolumes
      storage_account_name:
      storage_account_key:
```
I read around and noted that you can't mount individual files or subfolders, along with all sorts of other issues that I linked above. My goal is to deploy the functionality of MONAI Deploy Express (which I can run locally with Docker and a GPU). I want something semi-manageable (and GPU enabled) like ACI or AKS (though I don't have much experience with the latter) where I can easily test new MAPs and workflows; I'd prefer not to use a Linux VM because of this. I'm looking for the best, "easiest", and actually working way of doing this on Azure. I see the Helm charts in this repo and wonder if that would be the preferred way, or if someone has somehow gotten ACI (or some other service) to work.

Is this NFS file share a way to create a VM that serves as the file system (or am I expected to run MD Express in it)? How would I mount it in docker-compose.yml? Additionally, even if I could get a file share that works, how can a GPU be specified if I'm trying to use the docker-compose.yml to build the whole thing? (I saw that maybe ARM templates or other specifications need to be used.)
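For what it's worth, ACI requests a GPU in the container group spec rather than in docker-compose.yml. A hypothetical single-container group might look like the sketch below (the names, region, image, and GPU SKU are all assumptions on my part, and GPU support on ACI is limited to certain regions and SKUs):

```yaml
# Hypothetical ACI container group spec; adapt names, region, and sizes.
apiVersion: '2021-09-01'
location: eastus
name: monai-express
properties:
  containers:
  - name: inference
    properties:
      image: my-map:latest   # placeholder image
      resources:
        requests:
          cpu: 4
          memoryInGB: 16
          gpu:
            count: 1
            sku: V100
  osType: Linux
  restartPolicy: Never
type: Microsoft.ContainerInstance/containerGroups
```

Deploying it would then be something like `az container create --resource-group <rg> --file containergroup.yaml`.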

Thank you for your help.

@justinhorton2003
Author

@JohnnyKHU may also ask questions here (since I'm working on this with him).
