Commit

Updated docs

ctuning-admin committed Mar 11, 2024
1 parent 9e2ba26 commit a1de1c4
Showing 5 changed files with 218 additions and 7 deletions.
6 changes: 3 additions & 3 deletions cm-mlops/script/app-mlperf-inference-nvidia/README.md
@@ -312,14 +312,14 @@ ___
* `_gptj_,build`
- Workflow:
1. ***Read "deps" on other CM scripts***
- * install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1-gptj
+ * install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1
- CM script: [install-pytorch-from-src](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-pytorch-from-src)
* get,cmake
- CM script: [get-cmake](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-cmake)
* `_gptj_,build_engine`
- Workflow:
1. ***Read "deps" on other CM scripts***
- * install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1-gptj
+ * install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1
- CM script: [install-pytorch-from-src](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-pytorch-from-src)
* get,cmake
- CM script: [get-cmake](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-cmake)
@@ -332,7 +332,7 @@ ___
- Workflow:
1. ***Read "deps" on other CM scripts***
* install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1-gptj
- - CM script: [install-pytorch-from-src](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-pytorch-from-src)
+ - *Warning: no scripts found*
* get,cmake
- CM script: [get-cmake](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-cmake)
* `_gpu_memory.16,3d-unet_,offline,run_harness`
2 changes: 1 addition & 1 deletion cm-mlops/script/app-mlperf-inference/README.md
@@ -178,7 +178,7 @@ ___
* `if (CM_CUDA_DEVICE_PROP_GLOBAL_MEMORY not in ['yes', 'on'])`
- CM script: [get-cuda-devices](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-cuda-devices)
1. ***Read "prehook_deps" on other CM scripts***
- * reproduce,mlperf,nvidia,inference
+ * reproduce,mlperf,nvidia,inference,_run_harness
* `if (CM_SKIP_RUN != True)`
* CM names: `--adr.['nvidia-original-mlperf-inference', 'nvidia-harness', 'mlperf-inference-implementation']...`
- CM script: [app-mlperf-inference-nvidia](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/app-mlperf-inference-nvidia)
2 changes: 1 addition & 1 deletion cm-mlops/script/install-pytorch-from-src/README.md
@@ -174,7 +174,7 @@ ___
* get,generic,conda-package,_package.libstdcxx-ng,_source.conda-forge
* CM names: `--adr.['conda-package', 'libstdcxx-ng']...`
- CM script: [install-generic-conda-package](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-generic-conda-package)
- * `_for-nvidia-mlperf-inference-v3.1-gptj`
+ * `_for-nvidia-mlperf-inference-v3.1`
- Workflow:
1. ***Read "deps" on other CM scripts***
* get,cmake
208 changes: 208 additions & 0 deletions cm-mlops/script/install-torchvision-from-src/README.md
@@ -0,0 +1,208 @@
<details>
<summary>Click here to see the table of contents.</summary>

* [About](#about)
* [Summary](#summary)
* [Reuse this script in your project](#reuse-this-script-in-your-project)
* [ Install CM automation language](#install-cm-automation-language)
* [ Check CM script flags](#check-cm-script-flags)
* [ Run this script from command line](#run-this-script-from-command-line)
* [ Run this script from Python](#run-this-script-from-python)
* [ Run this script via GUI](#run-this-script-via-gui)
* [ Run this script via Docker (beta)](#run-this-script-via-docker-(beta))
* [Customization](#customization)
* [ Variations](#variations)
* [ Default environment](#default-environment)
* [Script workflow, dependencies and native scripts](#script-workflow-dependencies-and-native-scripts)
* [Script output](#script-output)
* [New environment keys (filter)](#new-environment-keys-(filter))
* [New environment keys auto-detected from customize](#new-environment-keys-auto-detected-from-customize)
* [Maintainers](#maintainers)

</details>

*Note that this README is automatically generated - don't edit!*

### About

*Build PyTorch torchvision from source.*

#### Summary

* Category: *Compiler automation.*
* CM GitHub repository: *[mlcommons@ck](https://github.com/mlcommons/ck/tree/master/cm-mlops)*
* GitHub directory for this script: *[GitHub](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src)*
* CM meta description for this script: *[_cm.json](_cm.json)*
* CM "database" tags to find this script: *install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision*
* Output cached? *True*
___
### Reuse this script in your project

#### Install CM automation language

* [Installation guide](https://github.com/mlcommons/ck/blob/master/docs/installation.md)
* [CM intro](https://doi.org/10.5281/zenodo.8105339)

#### Pull CM repository with this automation

```cm pull repo mlcommons@ck```


#### Run this script from command line

1. `cm run script --tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision[,variations] `

2. `cmr "install get src from.src pytorchvision torchvision src-pytorchvision[ variations]" `

* `variations` can be seen [here](#variations)
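For instance, here is a minimal sketch of appending variations from the [variations](#variations) list to the tags (the tag value below is an illustrative placeholder, not a verified torchvision release):

```bash
# Select the _cuda variation and pin a checkout tag via _tag.#
# (v0.16.0 is illustrative only).
cm run script --tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision,_cuda,_tag.v0.16.0
```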

#### Run this script from Python

<details>
<summary>Click here to expand this section.</summary>

```python

import cmind

r = cmind.access({'action':'run',
                  'automation':'script',
                  'tags':'install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision',
                  'out':'con',
                  # ... (other input keys for this script) ...
                 })

if r['return']>0:
    print (r['error'])
```
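
On success (`r['return'] == 0`), the returned dictionary should also expose the script's resulting state, e.g. a `new_env` dictionary carrying the `CM_PYTORCHVISION_*` keys listed under [Script output](#script-output); this assumes the usual CM script API, so verify it against your CM version.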

</details>


#### Run this script via GUI

```cmr "cm gui" --script="install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision"```

Use this [online GUI](https://cKnowledge.org/cm-gui/?tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision) to generate CM CMD.

#### Run this script via Docker (beta)

`cm docker script "install get src from.src pytorchvision torchvision src-pytorchvision[ variations]" `

___
### Customization


#### Variations

* *No group (any variation can be selected)*
<details>
<summary>Click here to expand this section.</summary>

* `_branch.#`
- Environment variables:
- *CM_GIT_CHECKOUT*: `#`
- Workflow:
* `_cuda`
- Environment variables:
- *CUDA_HOME*: `<<<CM_CUDA_INSTALLED_PATH>>>`
- *CUDA_NVCC_EXECUTABLE*: `<<<CM_NVCC_BIN_WITH_PATH>>>`
- *CUDNN_INCLUDE_PATH*: `<<<CM_CUDA_PATH_INCLUDE_CUDNN>>>`
- *CUDNN_LIBRARY_PATH*: `<<<CM_CUDA_PATH_LIB_CUDNN>>>`
- *USE_CUDA*: `1`
- *USE_CUDNN*: `1`
- *TORCH_CUDA_ARCH_LIST*: `Ampere Ada Hopper`
- *TORCH_CXX_FLAGS*: `-D_GLIBCXX_USE_CXX11_ABI=1`
- Workflow:
1. ***Read "deps" on other CM scripts***
* get,cuda,_cudnn
* CM names: `--adr.['cuda']...`
- CM script: [get-cuda](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-cuda)
* `_for-nvidia-mlperf-inference-v3.1`
- Workflow:
1. ***Read "deps" on other CM scripts***
* install,pytorch,from.src,_for-nvidia-mlperf-inference-v3.1
- CM script: [install-pytorch-from-src](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-pytorch-from-src)
* `_sha.#`
- Environment variables:
- *CM_GIT_CHECKOUT_SHA*: `#`
- Workflow:
* `_tag.#`
- Environment variables:
- *CM_GIT_CHECKOUT_TAG*: `#`
- Workflow:

</details>


* Group "**repo**"
<details>
<summary>Click here to expand this section.</summary>

* `_repo.#`
- Environment variables:
- *CM_GIT_URL*: `#`
- Workflow:
* **`_repo.https://github.com/pytorch/vision`** (default)
- Environment variables:
- *CM_GIT_URL*: `https://github.com/pytorch/vision`
- Workflow:

</details>


#### Default variations

`_repo.https://github.com/pytorch/vision`
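
As a sketch, `_repo.#` can point the build at a different source repository than the default `pytorch/vision` (the URL below is a hypothetical fork):

```bash
# Build torchvision from a fork instead of the default repo
# (https://github.com/myorg/vision is a hypothetical placeholder).
cm run script --tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision,_repo.https://github.com/myorg/vision
```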
#### Default environment

<details>
<summary>Click here to expand this section.</summary>

These keys can be updated via `--env.KEY=VALUE` or `env` dictionary in `@input.json` or using script flags.


</details>
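
A minimal sketch of overriding an environment key directly instead of going through a variation (the key is the one set by `_branch.#` above; the value is illustrative):

```bash
# Check out a specific branch by setting CM_GIT_CHECKOUT explicitly;
# equivalent to the _branch.# variation above (branch name illustrative).
cm run script --tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision --env.CM_GIT_CHECKOUT=main
```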

___
### Script workflow, dependencies and native scripts

<details>
<summary>Click here to expand this section.</summary>

1. ***Read "deps" on other CM scripts from [meta](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/_cm.json)***
* detect,os
- CM script: [detect-os](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/detect-os)
* detect,cpu
- CM script: [detect-cpu](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/detect-cpu)
* get,python3
* `if (CM_CONDA_ENV != yes)`
* CM names: `--adr.['python', 'python3']...`
- CM script: [get-python3](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-python3)
* get,git,repo
* CM names: `--adr.['pytorchision-src-repo', 'torchision-src-repo']...`
- CM script: [get-git-repo](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-git-repo)
1. ***Run "preprocess" function from [customize.py](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/customize.py)***
1. Read "prehook_deps" on other CM scripts from [meta](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/_cm.json)
1. ***Run native script if exists***
* [run.sh](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/run.sh)
1. Read "posthook_deps" on other CM scripts from [meta](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/_cm.json)
1. ***Run "postrocess" function from [customize.py](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/customize.py)***
1. Read "post_deps" on other CM scripts from [meta](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/install-torchvision-from-src/_cm.json)
</details>
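
The CM names listed above (e.g. `python`, `torchision-src-repo`) can steer how these dependencies resolve via `--adr`; a sketch, assuming the common CM `--adr.<name>.<key>=<value>` pattern applies to this script:

```bash
# Ask the python3 dependency for a minimum version
# (version_min is a common CM --adr key; assumed, not verified here).
cm run script --tags=install,get,src,from.src,pytorchvision,torchvision,src-pytorchvision \
    --adr.python.version_min=3.9
```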

___
### Script output
`cmr "install get src from.src pytorchvision torchvision src-pytorchvision[,variations]" -j`
#### New environment keys (filter)

* `CM_PYTORCHVISION_*`
#### New environment keys auto-detected from customize

___
### Maintainers

* [Open MLCommons taskforce on automation and reproducibility](https://github.com/mlcommons/ck/blob/master/docs/taskforce.md)
7 changes: 5 additions & 2 deletions cm-mlops/script/run-mlperf-inference-app/README.md
@@ -239,7 +239,7 @@ ___
* --**device** MLPerf device {cpu,cuda,rocm,qaic} (*cpu*)
* --**model** MLPerf model {resnet50,retinanet,bert-99,bert-99.9,3d-unet-99,3d-unet-99.9,rnnt,dlrm-v2-99,dlrm-v2-99.9,gptj-99,gptj-99.9,sdxl,llama2-70b-99,llama2-70b-99.9,mobilenet,efficientnet} (*resnet50*)
* --**precision** MLPerf model precision {float32,float16,bfloat16,int8,uint8}
- * --**implementation** MLPerf implementation {mlcommons-python,mlcommons-cpp,nvidia,intel,qualcomm,ctuning-cpp-tflite} (*reference*)
+ * --**implementation** MLPerf implementation {mlcommons-python,mlcommons-cpp,nvidia,intel,qualcomm,ctuning-cpp-tflite} (*mlcommons-python*)
* --**backend** MLPerf framework (backend) {onnxruntime,tf,pytorch,deepsparse,tensorrt,glow,tvm-onnx} (*onnxruntime*)
* --**scenario** MLPerf scenario {Offline,Server,SingleStream,MultiStream} (*Offline*)
* --**mode** MLPerf benchmark mode {,accuracy,performance}
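
As a sketch, these flags combine into a single invocation (the `--tags` value is assumed from this script's name, and the flag values are illustrative; check the "Run" section of this README for the exact form):

```bash
# Hypothetical invocation of the MLPerf inference app using the flags above.
cm run script --tags=run,mlperf,inference \
    --device=cuda --model=bert-99 --implementation=nvidia \
    --backend=tensorrt --scenario=Offline --mode=performance
```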
@@ -373,10 +373,13 @@ ___

1. ***Read "deps" on other CM scripts from [meta](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/run-mlperf-inference-app/_cm.yaml)***
* detect,os
+ * `if (CM_MLPERF_USE_DOCKER != True)`
- CM script: [detect-os](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/detect-os)
* detect,cpu
+ * `if (CM_MLPERF_USE_DOCKER != True)`
- CM script: [detect-cpu](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/detect-cpu)
* get,python3
+ * `if (CM_MLPERF_USE_DOCKER != True)`
* CM names: `--adr.['python', 'python3']...`
- CM script: [get-python3](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-python3)
* get,mlcommons,inference,src
@@ -385,7 +388,7 @@ ___
* get,sut,description
- CM script: [get-mlperf-inference-sut-description](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-mlperf-inference-sut-description)
* get,mlperf,inference,results,dir
- * `if (OUTPUT_BASE_DIR != True)`
+ * `if (CM_MLPERF_USE_DOCKER == False) AND (OUTPUT_BASE_DIR != True)`
* CM names: `--adr.['get-mlperf-inference-results-dir']...`
- CM script: [get-mlperf-inference-results-dir](https://github.com/mlcommons/ck/tree/master/cm-mlops/script/get-mlperf-inference-results-dir)
* install,pip-package,for-cmind-python,_package.tabulate
