Fix images and installation wording on README (intel#21)
Signed-off-by: Deb Taylor <deb.taylor@intel.com>
Reviewed-by: Feng Tian <feng.tian@intel.com>

Co-authored-by: Deb Taylor <deb.taylor@intel.com>
ftian1 and deb-intel committed May 11, 2021
1 parent eb6d373 commit feac2ae
Showing 1 changed file with 52 additions and 24 deletions.
76 changes: 52 additions & 24 deletions README.md
@@ -7,46 +7,63 @@ The Intel® Low Precision Optimization Tool (Intel® LPOT) is an open-source Pyt
>
> GPU support is under development.
-| Infrastructure | Workflow |
-| - | - |
-| ![LPOT Infrastructure](./docs/imgs/infrastructure.png "Infrastructure") | ![LPOT Workflow](./docs/imgs/workflow.png "Workflow") |
-**Visit the Intel® LPOT online document website at: <https://intel.github.io/lpot>.**

+## Architecture
+
+Intel® LPOT features an infrastructure and workflow that aid in increasing performance and achieving faster deployments across architectures.
+
+#### Infrastructure
+
+<a target="_blank" href="docs/imgs/infrastructure.png">
+<img src="docs/imgs/infrastructure.png" alt="Infrastructure" width=800 height=360>
+</a>
+
+Click the image to enlarge it.
+
+#### Workflow
+
+<a target="_blank" href="docs/imgs/workflow.png">
+<img src="docs/imgs/workflow.png" alt="Workflow" width=800 height=360>
+</a>
+
+Click the image to enlarge it.

#### Supported Frameworks

Supported Intel-optimized DL frameworks are:
* [TensorFlow\*](https://github.com/Intel-tensorflow/tensorflow), including [1.15.0 UP2](https://github.com/Intel-tensorflow/tensorflow/tree/v1.15.0up2), [1.15.0 UP1](https://github.com/Intel-tensorflow/tensorflow/tree/v1.15.0up1), [2.1.0](https://github.com/Intel-tensorflow/tensorflow/tree/v2.1.0), [2.2.0](https://github.com/Intel-tensorflow/tensorflow/tree/v2.2.0), [2.3.0](https://github.com/Intel-tensorflow/tensorflow/tree/v2.3.0), [2.4.0](https://github.com/Intel-tensorflow/tensorflow/tree/v2.4.0)
* [PyTorch\*](https://pytorch.org/), including [1.5.0+cpu](https://download.pytorch.org/whl/torch_stable.html), [1.6.0+cpu](https://download.pytorch.org/whl/torch_stable.html)
* [Apache\* MXNet](https://mxnet.apache.org), including [1.6.0](https://github.com/apache/incubator-mxnet/tree/1.6.0), [1.7.0](https://github.com/apache/incubator-mxnet/tree/1.7.0)
* [ONNX\* Runtime](https://github.com/microsoft/onnxruntime), including [1.6.0](https://github.com/microsoft/onnxruntime/tree/v1.6.0)
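To stay inside the support matrix above, the framework can be pinned before LPOT is installed. This is a sketch only: `intel-tensorflow` is assumed here to be the PyPI name of Intel's optimized TensorFlow build, and the pinned version is just one example from the list (the command is echoed, not executed):

```shell
# Build (echo only) a pinned install command for one supported combination.
# Package name and version are illustrative; pick any version listed above.
tf_version="2.4.0"
echo "pip install intel-tensorflow==${tf_version} lpot"
```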

+**Visit the Intel® LPOT website at: <https://intel.github.io/lpot>.**

## Installation

-The Intel® LPOT library is released as part of the
-[Intel® oneAPI AI Analytics Toolkit](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit.html) (AI Kit).
-The AI Kit provides a consolidated package of Intel's latest deep learning and
-machine optimizations all in one place for ease of development. Along with
-LPOT, the AI Kit includes Intel-optimized versions of deep learning frameworks
-(such as TensorFlow and PyTorch) and high-performing Python libraries to
-streamline end-to-end data science and AI workflows on Intel architectures.
Select the installation based on your operating system.


### Linux Installation

-You can install just the LPOT library from binary or source, or you can get
-the Intel-optimized framework together with the LPOT library by installing the
-Intel® oneAPI AI Analytics Toolkit.
+You can install LPOT using one of three options: install just the LPOT library
+from binary or source, or get the Intel-optimized framework together with the
+LPOT library by installing the [Intel® oneAPI AI Analytics Toolkit](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit.html).

-#### Install from binary
+#### Option 1: Install from binary

```Shell
-# install from pip
+# install stable version from pip
pip install lpot

-# install from conda
+# install nightly version from pip
+pip install -i https://test.pypi.org/simple/ lpot

+# install stable version from conda
conda install lpot -c conda-forge -c intel
```
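The stable-versus-nightly choice above can be sketched as a small POSIX-shell helper that builds, but does not run, the matching pip command. `LPOT_CHANNEL` and `cmd` are illustrative names, not anything LPOT itself defines:

```shell
# Build (but do not run) the pip command for the requested channel.
# LPOT_CHANNEL and cmd are hypothetical names used only for illustration.
channel="${LPOT_CHANNEL:-stable}"
if [ "$channel" = "nightly" ]; then
    cmd="pip install -i https://test.pypi.org/simple/ lpot"
else
    cmd="pip install lpot"
fi
echo "$cmd"
```

Echoing the command first makes it easy to confirm which package index will be used before anything is installed.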

-#### Install from source
+#### Option 2: Install from source

```Shell
git clone https://github.com/intel/lpot.git
@@ -55,10 +72,17 @@ Intel® oneAPI AI Analytics Toolkit.
python setup.py install
```
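A quick sanity check after the clone step above can confirm the tree is ready before running the install. This is a sketch; `repo_dir` is an illustrative variable matching the default directory that `git clone` creates:

```shell
# Verify the cloned tree contains setup.py before attempting the install.
# repo_dir is illustrative; it matches the default clone directory name.
repo_dir="lpot"
if [ -f "$repo_dir/setup.py" ]; then
    echo "ready: run python setup.py install inside $repo_dir"
else
    echo "setup.py not found under $repo_dir; check the clone" >&2
fi
```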

-#### Install from AI Kit
+#### Option 3: Install from AI Kit

+The Intel® LPOT library is released as part of the
+[Intel® oneAPI AI Analytics Toolkit](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit.html) (AI Kit).
+The AI Kit provides a consolidated package of Intel's latest deep learning and
+machine learning optimizations all in one place for ease of development. Along with
+LPOT, the AI Kit includes Intel-optimized versions of deep learning frameworks
+(such as TensorFlow and PyTorch) and high-performing Python libraries to
+streamline end-to-end data science and AI workflows on Intel architectures.

-The AI Kit, which includes the LPOT
-library, is distributed through many common channels,
+The AI Kit is distributed through many common channels,
including from Intel's website, YUM, APT, Anaconda, and more.
Select and [download](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit/download.html)
the AI Kit distribution package that's best suited for you and follow the
@@ -85,18 +109,22 @@ The following prerequisites and requirements must be satisfied for a successful
conda create -n lpot python=3.7
conda activate lpot
```
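Before installing into the environment created above, it can help to confirm which interpreter is active, so it can be compared against the `python=3.7` pin. This check is a sketch and assumes `python3` is on `PATH`:

```shell
# Print the active interpreter's major.minor version so it can be compared
# against the python=3.7 pin used when the environment was created.
pyver="$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])')"
echo "active python: $pyver"
```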
+**Installation options**

-#### Install from binary
+#### Option 1: Install from binary

```Shell
-# install from pip
+# install stable version from pip
pip install lpot

+# install nightly version from pip
+pip install -i https://test.pypi.org/simple/ lpot

# install from conda
conda install lpot -c conda-forge -c intel
```

-#### Install from source
+#### Option 2: Install from source

```shell
git clone https://github.com/intel/lpot.git
