This repository was archived by the owner on Jul 1, 2024. It is now read-only.

Commit f1d883c

ck-intel and sspintel authored
Update documentation for releases 2.1.0 (#323)
* Update documentation for releases 2.1.0
  * Update Installation and Build docs
  * Update main Readme and Python Readme
  * Update all other files for TF, OVTF versions, and Links if required
* Minor cosmetic changes in Build and Install.md files
* Reorganize env var documentation order, and add details of the OPENVINO_TF_MAX_CLUSTERS var in Usage.md
* Fix Build Instructions for macOS (#330)
* Update mandarin files for main readme, docker readme, and usage.md
  * Prelim attempt as the changes were not much in these files
  * These changes would be reviewed by the team handling the Translations
* Update installation table
* Update build and install mandarin docs
* Fix review comments
* Fix a typo in mandarin install doc

Co-authored-by: Suryaprakash Shanmugam <[email protected]>
1 parent 61f53c9 commit f1d883c

18 files changed, +364 −406 lines

README.md

Lines changed: 10 additions & 12 deletions
@@ -25,7 +25,7 @@ This product delivers [OpenVINO™](https://software.intel.com/content/www/us/en
  - Ubuntu 18.04, 20.04, macOS 11.2.3 or Windows<sup>1</sup> 10 - 64 bit
  - Python* 3.7, 3.8 or 3.9
- - TensorFlow* v2.8.0
+ - TensorFlow* v2.9.1

  <sup>1</sup>Windows package supports only Python3.9

@@ -38,35 +38,33 @@ The **OpenVINO™ integration with TensorFlow** package comes with pre-built lib
  pip3 install -U pip
- pip3 install tensorflow==2.8.0
- pip3 install openvino-tensorflow==2.0.0
+ pip3 install tensorflow==2.9.1
+ pip3 install openvino-tensorflow==2.1.0

- For installation instructions on Windows please refer to [**OpenVINO™ integration with TensorFlow** for Windows](docs/INSTALL.md#InstallOpenVINOintegrationwithTensorFlowalongsideTensorFlow)
+ For installation instructions on Windows please refer to [**OpenVINO™ integration with TensorFlow** for Windows](docs/INSTALL.md#windows)

  To use Intel<sup>®</sup> integrated GPUs for inference, make sure to install the [Intel® Graphics Compute Runtime for OpenCL™ drivers](https://docs.openvino.ai/latest/openvino_docs_install_guides_installing_openvino_linux.html#install-gpu)

- To leverage Intel® Vision Accelerator Design with Movidius™ (VAD-M) for inference, install [**OpenVINO™ integration with TensorFlow** alongside the Intel® Distribution of OpenVINO™ Toolkit](docs/INSTALL.md#12-install-openvino-integration-with-tensorflow-alongside-the-intel-distribution-of-openvino-toolkit).
+ To leverage Intel® Vision Accelerator Design with Movidius™ (VAD-M) for inference, install [**OpenVINO™ integration with TensorFlow** alongside the Intel® Distribution of OpenVINO™ Toolkit](./docs/INSTALL.md#install-openvino-integration-with-tensorflow-pypi-release-alongside-the-intel-distribution-of-openvino-toolkit-for-vad-m-support).

  For more details on installation please refer to [INSTALL.md](docs/INSTALL.md), and for build-from-source options please refer to [BUILD.md](docs/BUILD.md)

  ## Configuration

  Once you've installed **OpenVINO™ integration with TensorFlow**, you can use TensorFlow* to run inference using a trained model.

- For further performance improvements, it is advised to enable [oneDNN Deep Neural Network Library (oneDNN)](https://github.com/oneapi-src/oneDNN) by setting the environment variable `TF_ENABLE_ONEDNN_OPTS=1`.
-
  To see if **OpenVINO™ integration with TensorFlow** is properly installed, run

      python3 -c "import tensorflow as tf; print('TensorFlow version: ',tf.__version__);\
      import openvino_tensorflow; print(openvino_tensorflow.__version__)"

  This should produce an output like:

- TensorFlow version: 2.8.0
- OpenVINO integration with TensorFlow version: b'2.0.0'
+ TensorFlow version: 2.9.1
+ OpenVINO integration with TensorFlow version: b'2.1.0'
  OpenVINO version used for this build: b'2022.1.0'
- TensorFlow version used for this build: v2.8.0
- CXX11_ABI flag used for this build: 0
+ TensorFlow version used for this build: v2.9.1
+ CXX11_ABI flag used for this build: 1

  By default, Intel<sup>®</sup> CPU is used to run inference. However, you can change the default option to either Intel<sup>®</sup> integrated GPU or Intel<sup>®</sup> VPU for AI inferencing. Invoke the following function to change the hardware on which inferencing is done.

@@ -77,7 +75,7 @@ Supported backends include 'CPU', 'GPU', 'GPU_FP16', 'MYRIAD', and 'VAD-M'.
  To determine what processing units are available on your system for inference, use the following function:

      openvino_tensorflow.list_backends()
- For more API calls and environment variables, see [USAGE.md](docs/USAGE.md).
+ For further performance improvements, it is advised to set the environment variable `OPENVINO_TF_CONVERT_VARIABLES_TO_CONSTANTS=1`. For more API calls and environment variables, see [USAGE.md](docs/USAGE.md).
## Examples

README_cn.md

Lines changed: 10 additions & 13 deletions
@@ -25,7 +25,7 @@
  - Ubuntu 18.04, 20.04, macOS 11.2.3 or Windows<sup>1</sup> 10 - 64 bit
  - Python* 3.7, 3.8 or 3.9
- - TensorFlow* v2.8.0
+ - TensorFlow* v2.9.1

  <sup>1</sup>The Windows package supports only Python3.9

@@ -38,35 +38,33 @@
  pip3 install -U pip
- pip3 install tensorflow==2.8.0
- pip3 install openvino-tensorflow==2.0.0
+ pip3 install tensorflow==2.9.1
+ pip3 install openvino-tensorflow==2.1.0

- For installation instructions on Windows, please refer to [**OpenVINO™ integration with TensorFlow** for Windows](docs/INSTALL_cn.md#InstallOpenVINOintegrationwithTensorFlowalongsideTensorFlow)
+ For installation instructions on Windows, please refer to [**OpenVINO™ integration with TensorFlow** for Windows](docs/INSTALL_cn.md#windows)

  To use Intel<sup>®</sup> integrated GPUs for inference, make sure to install the [Intel® Graphics Compute Runtime for OpenCL™ drivers](https://docs.openvino.ai/latest/openvino_docs_install_guides_installing_openvino_linux.html#install-gpu)

- To leverage the Intel® Vision Accelerator Design with Movidius™ (VAD-M) for inference, install [**OpenVINO™ integration with TensorFlow** alongside the Intel® Distribution of OpenVINO™ Toolkit](docs/INSTALL_cn.md#12-install-openvino-integration-with-tensorflow-alongside-the-intel-distribution-of-openvino-toolkit)
+ To leverage the Intel® Vision Accelerator Design with Movidius™ (VAD-M) for inference, install [**OpenVINO™ integration with TensorFlow** alongside the Intel® Distribution of OpenVINO™ Toolkit](docs/INSTALL_cn.md#安装-openvino-integration-with-tensorflow-pypi-发布版与独立安装intel-openvino-发布版以支持vad-m)

  For more details on installation, please refer to [INSTALL.md](docs/INSTALL_cn.md), and for build-from-source options please refer to [BUILD.md](docs/BUILD_cn.md)

  ## Configuration

  Once you've installed **OpenVINO™ integration with TensorFlow**, you can use TensorFlow* to run inference on a trained model.

- For further performance improvements, it is advised to enable the [oneDNN Deep Neural Network Library (oneDNN)](https://github.com/oneapi-src/oneDNN) by setting the environment variable `TF_ENABLE_ONEDNN_OPTS=1`
-
  To see if **OpenVINO™ integration with TensorFlow** is properly installed, run

      python3 -c "import tensorflow as tf; print('TensorFlow version: ',tf.__version__);\
      import openvino_tensorflow; print(openvino_tensorflow.__version__)"

  This should produce the following output:

- TensorFlow version: 2.8.0
- OpenVINO integration with TensorFlow version: b'2.0.0'
+ TensorFlow version: 2.9.1
+ OpenVINO integration with TensorFlow version: b'2.1.0'
  OpenVINO version used for this build: b'2022.1.0'
- TensorFlow version used for this build: v2.8.0
- CXX11_ABI flag used for this build: 0
+ TensorFlow version used for this build: v2.9.1
+ CXX11_ABI flag used for this build: 1

  By default, Intel<sup>®</sup> CPU is used to run inference. You can also change the default option to Intel<sup>®</sup> integrated GPU or Intel<sup>®</sup> VPU for AI inference. Invoke the following function to change the hardware on which inference is done.

@@ -77,8 +75,7 @@
  To determine which processing units on your system are available for inference, use the following function:

      openvino_tensorflow.list_backends()
- For more API calls and environment variables, see [USAGE.md](docs/USAGE_cn.md)
-
+ For further performance improvements, it is advised to set the environment variable `OPENVINO_TF_CONVERT_VARIABLES_TO_CONSTANTS=1`. For more API calls and environment variables, see [USAGE.md](docs/USAGE_cn.md)

  ## Examples

docker/README.md

Lines changed: 2 additions & 0 deletions
@@ -1,4 +1,6 @@
+ <p>English | <a href="./README_cn.md">简体中文</a></p>
+
  # **OpenVINO™ integration with TensorFlow Runtime** Dockerfiles for Ubuntu* 18.04 and Ubuntu* 20.04
docker/README_cn.md

Lines changed: 37 additions & 14 deletions
@@ -1,27 +1,33 @@
  [English](./README.md) | 简体中文
- # Dockerfiles for Ubuntu* 18.04 and Ubuntu* 20.04
+ # **OpenVINO™ integration with TensorFlow Runtime** Dockerfiles for Ubuntu* 18.04 and Ubuntu* 20.04

  We provide Ubuntu* 18.04 and Ubuntu* 20.04 Dockerfiles, which can be used to build runtime Docker* images for **OpenVINO™ integration with TensorFlow** on CPU, GPU, VPU, and VAD-M.
  They contain all the runtime Python packages and shared libraries needed to run TensorFlow Python applications with the OpenVINO™ backend. By default, they host a Jupyter server with Image Classification and Object Detection samples that demonstrate the performance benefits of OpenVINO™ integration with TensorFlow on CPU.

+ The following ARGs can be used to configure the docker build:
+
+ TF_VERSION: TensorFlow version to be used. Defaults to "v2.9.1"
+ OPENVINO_VERSION: OpenVINO version to be used. Defaults to "2022.1.0"
+ OVTF_BRANCH: OpenVINO™ integration with TensorFlow branch to be used. Defaults to "releases/2.1.0"
+
  Build the docker image:

- docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0 - < ubuntu20/openvino_tensorflow_cgvh_runtime_2.0.0.dockerfile
+ docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0 - < ubuntu20/openvino_tensorflow_cgvh_runtime_2.1.0.dockerfile

  Launch the Jupyter server with **CPU** access:

      docker run -it --rm \
          -p 8888:8888 \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0

  Launch the Jupyter server with **iGPU** access:

      docker run -it --rm \
          -p 8888:8888 \
          --device-cgroup-rule='c 189:* rmw' \
          --device /dev/dri:/dev/dri \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0

  Launch the Jupyter server with **MYRIAD** access:
@@ -30,7 +36,7 @@
          -p 8888:8888 \
          --device-cgroup-rule='c 189:* rmw' \
          -v /dev/bus/usb:/dev/bus/usb \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0

  Launch the Jupyter server with **VAD-M** access:
@@ -40,7 +46,7 @@
          --mount type=bind,source=/var/tmp,destination=/var/tmp \
          --device /dev/ion:/dev/ion \
          -v /dev/bus/usb:/dev/bus/usb \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0

  Launch a container with access to "all" the compute units, with shell access to the container via /bin/bash:
@@ -50,18 +56,25 @@
          --device /dev/dri:/dev/dri \
          --mount type=bind,source=/var/tmp,destination=/var/tmp \
          -v /dev/bus/usb:/dev/bus/usb \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0 /bin/bash
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0 /bin/bash

  If execution fails on the iGPU of 10th and 11th generation Intel devices, set the docker build argument INTEL_OPENCL to 20.35.17767:

- docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0 --build-arg INTEL_OPENCL=20.35.17767 - < ubuntu20/openvino_tensorflow_cgvh_runtime_2.0.0.dockerfile
+ docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0 --build-arg INTEL_OPENCL=20.35.17767 - < ubuntu20/openvino_tensorflow_cgvh_runtime_2.1.0.dockerfile

  # Dockerfiles for [TF-Serving](https://github.com/tensorflow/serving) with OpenVINO<sup>TM</sup> integration with Tensorflow

+ The TF Serving dockerfile requires the **OpenVINO™ integration with TensorFlow Runtime** image to be built. Refer to the section above for instructions on building it.
+
+ The following ARGs can be used to configure the docker build:
+
+ TF_SERVING_VERSION: Tag of the TF Serving image used to build the model server executable. Defaults to "v2.9.0"
+ OVTF_VERSION: Tag of the **OpenVINO™ integration with TensorFlow Runtime** image to be used. Defaults to "2.1.0"
+
  Build the serving docker image:
- 1. Build the runtime docker image. This dockerfile builds the OpenVINO<sup>TM</sup> integration with Tensorflow runtime image and installs the tensorflow model server binary on top of it.
+ 1. This dockerfile builds the OpenVINO<sup>TM</sup> integration with Tensorflow runtime image and installs the tensorflow model server binary on top of it.

- docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0-serving -f ubuntu20/openvino_tensorflow_cgvh_runtime_2.0.0-serving.dockerfile .
+ docker build -t openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving -f ubuntu20/openvino_tensorflow_cgvh_runtime_2.1.0-serving.dockerfile .

  Here a client script using the REST API is provided for an example that serves the Resnet50 model with OpenVINO Integration with Tensorflow.

@@ -75,7 +88,7 @@
          -p 8501:8501 \
          -v <path to resnet_v2_50_classifiation>:/models/resnet \
          -e MODEL_NAME=resnet \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0-serving
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving

  Run on **iGPU**:
8194

@@ -86,7 +99,7 @@
          -v <path to resnet_v2_50_classifiation>:/models/resnet \
          -e MODEL_NAME=resnet \
          -e OPENVINO_TF_BACKEND=GPU \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0-serving
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving

  Run on **MYRIAD**:

@@ -97,7 +110,7 @@
          -v <path to resnet_v2_50_classifiation>:/models/resnet \
          -e MODEL_NAME=resnet \
          -e OPENVINO_TF_BACKEND=MYRIAD \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0-serving
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving

  Run on **VAD-M**:

@@ -110,12 +123,22 @@
          -v <path to resnet_v2_50_classifiation>:/models/resnet \
          -e OPENVINO_TF_BACKEND=VAD-M \
          -e MODEL_NAME=resnet \
-         openvino/openvino_tensorflow_ubuntu20_runtime:2.0.0-serving
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving

  3. Run the script to send inference requests from the client and get predictions from the server.
      wget https://raw.githubusercontent.com/tensorflow/serving/master/tensorflow_serving/example/resnet_client.py
      python resnet_client.py
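resnet_client.py sends a REST request to the model server. The request body follows TensorFlow Serving's REST predict format (`{"instances": [...]}`); a minimal stdlib-only sketch of building such a request, where the URL matches the port and model name used in the `docker run` examples above and `make_predict_request` is an illustrative helper:

```python
import json
from urllib import request

def make_predict_request(instances,
                         url="http://localhost:8501/v1/models/resnet:predict"):
    # TensorFlow Serving's REST predict endpoint expects a JSON body
    # of the form {"instances": [...]} and a JSON content type.
    body = json.dumps({"instances": instances}).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"})

req = make_predict_request([[0.0, 1.0, 2.0]])
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` against a running container returns the model's predictions as JSON.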

+ All the environment variables that apply during the execution of **OpenVINO™ integration with TensorFlow** also apply when running through the containers. For example, to disable **OpenVINO™ integration with TensorFlow** while launching a TensorFlow Serving container, simply provide OPENVINO_TF_DISABLE=1 as one of the environment variables of the `docker run` command. See [USAGE.md](../docs/USAGE_cn.md) for more such environment variables.
+
+     docker run -it --rm \
+         -p 8501:8501 \
+         -v <path to resnet_v2_50_classifiation>:/models/resnet \
+         -e MODEL_NAME=resnet \
+         -e OPENVINO_TF_DISABLE=1 \
+         openvino/openvino_tensorflow_ubuntu20_runtime:2.1.0-serving

  # Prebuilt Images

  - [Ubuntu18 runtime image on Docker* Hub](https://hub.docker.com/r/openvino/openvino_tensorflow_ubuntu18_runtime)
