
Commit a3c7e15

Authored by tadamczx, msmykx-intel, sgolebiewski-intel, kblaszczak-intel, and akopytko
[DOCS] Docs file structure update with fixes (openvinotoolkit#23343)
- Updated paths across the entire documentation
- Updated the scripts used in the docs building process
- Updated the docs CMake configuration to handle the new scripts
- Fixed links
- Fixed other errors found in the docs

Co-authored-by: Maciej Smyk <maciejx.smyk@intel.com>
Co-authored-by: Sebastian Golebiewski <sebastianx.golebiewski@intel.com>
Co-authored-by: Karol Blaszczak <karol.blaszczak@intel.com>
Co-authored-by: Andrzej Kopytko <andrzejx.kopytko@intel.com>
Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>
Co-authored-by: Vishniakov Nikolai <nikolai.vishniakov@intel.com>
1 parent a78d914 commit a3c7e15

File tree: 296 files changed, +6150 −6056 lines


README.md (+3 −3)

@@ -160,9 +160,9 @@ You can also check out [Awesome OpenVINO](https://github.com/openvinotoolkit/awe
 ## System requirements
 
 The system requirements vary depending on platform and are available on dedicated pages:
-- [Linux](https://docs.openvino.ai/2024/get-started/install-openvino-overview/install-openvino-linux-header.html)
-- [Windows](https://docs.openvino.ai/2024/get-started/install-openvino-overview/install-openvino-windows-header.html)
-- [macOS](https://docs.openvino.ai/2024/get-started/install-openvino-overview/install-openvino-macos-header.html)
+- [Linux](https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-linux.html)
+- [Windows](https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-windows.html)
+- [macOS](https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-macos.html)
 
 ## How to build

docs/CMakeLists.txt (+4 −2)

@@ -22,11 +22,13 @@ function(build_docs)
 
     set(DOCS_BUILD_DIR "${CMAKE_CURRENT_BINARY_DIR}")
     set(DOCS_SOURCE_DIR "${OpenVINO_SOURCE_DIR}/docs")
+    set(ARTICLES_EN_DIR "${OpenVINO_SOURCE_DIR}/docs/articles_en")
     set(SCRIPTS_DIR "${DOCS_SOURCE_DIR}/scripts")
 
     # Preprocessing scripts
     set(REMOVE_XML_SCRIPT "${SCRIPTS_DIR}/remove_xml.py")
     set(FILE_HELPER_SCRIPT "${SCRIPTS_DIR}/filehelper.py")
+    set(ARTICLES_HELPER_SCRIPT "${SCRIPTS_DIR}/articles_helper.py")
     set(COPY_IMAGES_SCRIPT "${SCRIPTS_DIR}/copy_images.py")
     set(DOXYGEN_MAPPING_SCRIPT "${SCRIPTS_DIR}/create_mapping.py")
     set(BREATHE_APIDOC_SCRIPT "${SCRIPTS_DIR}/apidoc.py")

@@ -38,9 +40,9 @@ function(build_docs)
     set(SPHINX_OUTPUT "${DOCS_BUILD_DIR}/_build")
 
     list(APPEND commands COMMAND ${CMAKE_COMMAND} -E cmake_echo_color --green "STARTED preprocessing OpenVINO articles")
-    list(APPEND commands COMMAND ${Python3_EXECUTABLE} ${FILE_HELPER_SCRIPT}
+    list(APPEND commands COMMAND ${Python3_EXECUTABLE} ${ARTICLES_HELPER_SCRIPT}
         --filetype=rst
-        --input_dir=${OpenVINO_SOURCE_DIR}
+        --input_dir=${ARTICLES_EN_DIR}
         --output_dir=${SPHINX_SOURCE_DIR}
         --exclude_dir=${SPHINX_SOURCE_DIR})
     list(APPEND commands COMMAND ${CMAKE_COMMAND} -E cmake_echo_color --green "FINISHED preprocessing OpenVINO articles")
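
The docs build now invokes an articles helper with the --filetype, --input_dir, --output_dir, and --exclude_dir arguments shown in the CMake command above. As a rough idea of what such a preprocessing step does — copy matching article files into the Sphinx source tree while skipping the output tree itself — here is a hypothetical sketch; the actual docs/scripts/articles_helper.py may differ:

```python
# Hypothetical sketch of the article-preprocessing helper invoked above;
# the parameter names mirror the CMake command's flags (--filetype,
# --input_dir, --output_dir, --exclude_dir), but the real
# articles_helper.py may be implemented differently.
import shutil
from pathlib import Path

def copy_articles(filetype: str, input_dir: Path,
                  output_dir: Path, exclude_dir: Path) -> int:
    """Copy every *.<filetype> article into the Sphinx source tree,
    preserving the directory layout and skipping the exclude directory."""
    copied = 0
    for src in input_dir.rglob(f"*.{filetype}"):
        # skip files that already live under the excluded tree
        if exclude_dir in src.parents:
            continue
        dest = output_dir / src.relative_to(input_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
        copied += 1
    return copied
```

Pointing --input_dir at docs/articles_en instead of the repository root (the change in the second hunk) keeps the build from sweeping up stray .rst files elsewhere in the source tree.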

docs/articles_en/about-openvino.rst (+16 −16)

@@ -8,11 +8,11 @@ About OpenVINO
    :maxdepth: 1
    :hidden:
 
-   openvino_docs_performance_benchmarks
-   compatibility_and_support
-   system_requirements
-   Release Notes <openvino_release_notes>
-   Additional Resources <resources>
+   about-openvino/performance-benchmarks
+   about-openvino/compatibility-and-support
+   about-openvino/system-requirements
+   Release Notes <about-openvino/release-notes-openvino>
+   Additional Resources <about-openvino/additional-resources>
 
 OpenVINO is a toolkit for simple and efficient deployment of various deep learning models.
 In this section you will find information on the product itself, as well as the software

@@ -24,39 +24,39 @@ OpenVINO (Open Visual Inference and Neural network Optimization) is an open-sour
 Features
 ##############################################################
 
-One of the main purposes of OpenVINO is to streamline the deployment of deep learning models in user applications. It optimizes and accelerates model inference, which is crucial for such domains as Generative AI, Large Language models, and use cases like object detection, classification, segmentation, and many others.
+One of the main purposes of OpenVINO is to streamline the deployment of deep learning models in user applications. It optimizes and accelerates model inference, which is crucial for such domains as Generative AI, Large Language models, and use cases like object detection, classification, segmentation, and many others.
 
-* :doc:`Model Optimization <openvino_docs_model_optimization_guide>`
+* :doc:`Model Optimization <openvino-workflow/model-optimization>`
 
   OpenVINO provides multiple optimization methods for both the training and post-training stages, including weight compression for Large Language models and Intel Optimum integration with Hugging Face.
 
-* :doc:`Model Conversion and Framework Compatibility <openvino_docs_model_processing_introduction>`
+* :doc:`Model Conversion and Framework Compatibility <openvino-workflow/model-preparation>`
 
-  Supported models can be loaded directly or converted to the OpenVINO format to achieve better performance. Supported frameworks include ONNX, PyTorch, TensorFlow, TensorFlow Lite, Keras, and PaddlePaddle.
+  Supported models can be loaded directly or converted to the OpenVINO format to achieve better performance. Supported frameworks include ONNX, PyTorch, TensorFlow, TensorFlow Lite, Keras, and PaddlePaddle.
 
-* :doc:`Model Inference <openvino_docs_OV_UG_OV_Runtime_User_Guide>`
+* :doc:`Model Inference <openvino-workflow/running-inference>`
 
   OpenVINO accelerates deep learning models on various hardware platforms, ensuring real-time, efficient inference.
 
 * `Deployment on a server <https://github.com/openvinotoolkit/model_server>`__
 
-  A model can be deployed either locally using OpenVINO Runtime or on a model server. Runtime is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions. The model server enables quick model inference using external resources.
+  A model can be deployed either locally using OpenVINO Runtime or on a model server. Runtime is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions. The model server enables quick model inference using external about-openvino/additional-resources.
 
 Architecture
 ##############################################################
 
 To learn more about how OpenVINO works, read the Developer documentation on its `architecture <https://github.com/openvinotoolkit/openvino/blob/master/src/docs/architecture.md>`__ and `core components <https://github.com/openvinotoolkit/openvino/blob/master/src/README.md>`__.
 
-OpenVINO Ecosystem
+OpenVINO Ecosystem
 ##############################################################
 
 Along with the primary components of model optimization and runtime, the toolkit also includes:
 
 * `Neural Network Compression Framework (NNCF) <https://github.com/openvinotoolkit/nncf>`__ - a tool for enhanced OpenVINO™ inference to get performance boost with minimal accuracy drop.
-* :doc:`Openvino Notebooks <tutorials>`- Jupyter Python notebook tutorials, which demonstrate key features of the toolkit.
+* :doc:`Openvino Notebooks <learn-openvino/interactive-tutorials-python>`- Jupyter Python notebook, which demonstrate key features of the toolkit.
 * `OpenVINO Model Server <https://github.com/openvinotoolkit/model_server>`__ - a server that enables scalability via a serving microservice.
-* :doc:`OpenVINO Training Extensions <ote_documentation>` – a convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference.
-* :doc:`Dataset Management Framework (Datumaro) <datumaro_documentation>` - a tool to build, transform, and analyze datasets.
+* :doc:`OpenVINO Training Extensions <documentation/openvino-ecosystem/openvino-training-extensions>` – a convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference.
+* :doc:`Dataset Management Framework (Datumaro) <documentation/openvino-ecosystem/datumaro>` - a tool to build, transform, and analyze datasets.
 
 Community
 ##############################################################

@@ -66,7 +66,7 @@ OpenVINO community plays a vital role in the growth and development of the open-
 * `OpenVINO GitHub issues, discussions and pull requests <https://github.com/openvinotoolkit/openvino>`__
 * `OpenVINO Blog <https://blog.openvino.ai/>`__
 * `Community Forum <https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/bd-p/distribution-openvino-toolkit>`__
-* `OpenVINO video tutorials <https://www.youtube.com/watch?v=_Jnjt21ZDS8&list=PLg-UKERBljNxdIQir1wrirZJ50yTp4eHv>`__
+* `OpenVINO video <https://www.youtube.com/watch?v=_Jnjt21ZDS8&list=PLg-UKERBljNxdIQir1wrirZJ50yTp4eHv>`__
 * `Support Information <https://www.intel.com/content/www/us/en/support/products/96066/software/development-software/openvino-toolkit.html>`__
 
 Case Studies
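
Nearly every hunk in the .rst files follows one mechanical pattern: a global Sphinx label such as openvino_docs_performance_benchmarks becomes a file-system path such as about-openvino/performance-benchmarks. A migration of this size is normally scripted; the following is only an illustrative sketch of rewriting titled :doc: targets from a mapping (the mapping excerpt and function names are hypothetical, not the tooling actually used for this commit):

```python
# Illustrative sketch of a bulk :doc: target migration over *.rst files.
# The mapping below is a two-entry excerpt for demonstration; names and
# approach are hypothetical, not the commit's actual tooling.
import re
from pathlib import Path

# old global Sphinx label -> new document-relative path
DOC_TARGET_MAP = {
    "openvino_docs_performance_benchmarks": "about-openvino/performance-benchmarks",
    "openvino_docs_OV_Glossary": "additional-resources/glossary",
}

# matches the titled form :doc:`Title <target>`
_DOC_ROLE = re.compile(r"(:doc:`[^`<]*<)([^>]+)(>`)")

def rewrite_doc_targets(text: str) -> str:
    """Replace every :doc:`Title <old_label>` whose target is in the map;
    unmapped targets are left untouched."""
    def repl(m: re.Match) -> str:
        new_target = DOC_TARGET_MAP.get(m.group(2), m.group(2))
        return f"{m.group(1)}{new_target}{m.group(3)}"
    return _DOC_ROLE.sub(repl, text)

def migrate_tree(root: Path) -> None:
    """Apply the rewrite to every .rst file under root, in place."""
    for rst in root.rglob("*.rst"):
        rst.write_text(rewrite_doc_targets(rst.read_text()))
```

Bare toctree entries (no `Title <…>` wrapper) would need a second pass; a purely textual rewrite like this also explains how a find-replace slip such as the "external about-openvino/additional-resources" sentence in the diff above can slip in.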

docs/articles_en/about-openvino/additional-resources.rst (+7 −7)

@@ -13,19 +13,19 @@ Additional Resources
    :maxdepth: 1
    :hidden:
 
-   openvino_docs_OV_Glossary
-   openvino_docs_Legal_Information
-   openvino_docs_telemetry_information
+   additional-resources/glossary
+   additional-resources/legal-information
+   additional-resources/telemetry
    Case Studies <https://www.intel.com/openvino-success-stories>
 
 
-:doc:`Performance Benchmarks <openvino_docs_performance_benchmarks>` contain results from benchmarking models with OpenVINO on Intel hardware.
+:doc:`Performance Benchmarks <performance-benchmarks>` contain results from benchmarking models with OpenVINO on Intel hardware.
 
-:doc:`Glossary <openvino_docs_OV_Glossary>` contains terms used in OpenVINO.
+:doc:`Glossary <additional-resources/glossary>` contains terms used in OpenVINO.
 
-:doc:`Legal Information <openvino_docs_Legal_Information>` has trademark information and other legal statements.
+:doc:`Legal Information <additional-resources/legal-information>` has trademark information and other legal statements.
 
-:doc:`OpenVINO™ Telemetry <openvino_docs_telemetry_information>` has detailed information on the telemetry data collection.
+:doc:`OpenVINO™ Telemetry <additional-resources/telemetry>` has detailed information on the telemetry data collection.
 
 `Case Studies <https://www.intel.com/openvino-success-stories>`__ are articles about real-world examples of OpenVINO™ usage.

docs/articles_en/about-openvino/additional-resources/glossary.rst (+2 −2)

@@ -119,7 +119,7 @@ Glossary of terms used in OpenVINO™
   still find this term in some articles. Because of their role in the software,
   they are now referred to as Devices and Modes ("virtual" devices). For a detailed
   description of the concept, refer to
-  :doc:`Inference Devices and Modes <openvino_docs_Runtime_Inference_Modes_Overview>`.
+  :doc:`Inference Devices and Modes <../../openvino-workflow/running-inference/inference-devices-and-modes>`.
 
 | *Tensor*
 | A memory container used for storing inputs and outputs of the model, as well as

@@ -128,4 +128,4 @@ Glossary of terms used in OpenVINO™
 
 See Also
 #################################################
-* :doc:`Available Operations Sets <openvino_docs_ops_opset>`
+* :doc:`Available Operations Sets <../../documentation/openvino-ir-format/operation-sets/available-opsets>`
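
The new glossary links reach other sections through explicit ../ segments: a :doc: target is resolved relative to the directory of the referencing file, so a file two levels below docs/articles_en needs two ../ hops. The arithmetic can be checked with a few lines of Python (the document paths are taken from the hunks in this commit; the helper name is ours):

```python
# Computing the ../ depth of a relative :doc: target. Both arguments are
# extension-less document paths rooted at docs/articles_en; the helper
# name is illustrative, not part of the docs tooling.
import posixpath

def doc_relative_target(source_doc: str, target_doc: str) -> str:
    """Relative path from source_doc's directory to target_doc."""
    return posixpath.relpath(target_doc, start=posixpath.dirname(source_doc))
```

For glossary.rst, which sits two directories deep, this yields the ../../openvino-workflow/… target seen in the first hunk; for supported-devices.rst it yields the ../system-requirements form used in that file.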

docs/articles_en/about-openvino/compatibility-and-support.rst (+8 −8)

@@ -8,19 +8,19 @@ Compatibility and Support
    :maxdepth: 1
    :hidden:
 
-   openvino_supported_models
-   openvino_supported_devices
-   openvino_resources_supported_operations
-   openvino_resources_supported_operations_frontend
+   compatibility-and-support/supported-models
+   compatibility-and-support/supported-devices
+   compatibility-and-support/supported-operations-inference-devices
+   compatibility-and-support/supported-operations-framework-frontend
 
 
-:doc:`Supported Devices <openvino_supported_devices>` - compatibility information for supported hardware accelerators.
+:doc:`Supported Devices <compatibility-and-support/supported-devices>` - compatibility information for supported hardware accelerators.
 
-:doc:`Supported Models <openvino_supported_models>` - a table of models officially supported by OpenVINO.
+:doc:`Supported Models <compatibility-and-support/supported-models>` - a table of models officially supported by OpenVINO.
 
-:doc:`Supported Operations <openvino_resources_supported_operations>` - a listing of framework layers supported by OpenVINO.
+:doc:`Supported Operations <compatibility-and-support/supported-operations-inference-devices>` - a listing of framework layers supported by OpenVINO.
 
-:doc:`Supported Operations <openvino_resources_supported_operations_frontend>` - a listing of layers supported by OpenVINO inference devices.
+:doc:`Supported Operations <compatibility-and-support/supported-operations-framework-frontend>` - a listing of layers supported by OpenVINO inference devices.

docs/articles_en/about-openvino/compatibility-and-support/supported-devices.rst (+25 −25)

@@ -11,45 +11,45 @@ Inference Device Support
 
 The OpenVINO™ runtime enables you to use a selection of devices to run your
 deep learning models:
-:doc:`CPU <openvino_docs_OV_UG_supported_plugins_CPU>`,
-:doc:`GPU <openvino_docs_OV_UG_supported_plugins_GPU>`,
-:doc:`NPU <openvino_docs_OV_UG_supported_plugins_NPU>`.
+:doc:`CPU <../../openvino-workflow/running-inference/inference-devices-and-modes/cpu-device>`,
+:doc:`GPU <../../openvino-workflow/running-inference/inference-devices-and-modes/gpu-device>`,
+:doc:`NPU <../../openvino-workflow/running-inference/inference-devices-and-modes/npu-device>`.
 
-| For their usage guides, see :doc:`Devices and Modes <openvino_docs_Runtime_Inference_Modes_Overview>`.
-| For a detailed list of devices, see :doc:`System Requirements <system_requirements>`.
+| For their usage guides, see :doc:`Devices and Modes <../../openvino-workflow/running-inference/inference-devices-and-modes>`.
+| For a detailed list of devices, see :doc:`System Requirements <../system-requirements>`.
 
 Beside running inference with a specific device,
 OpenVINO offers the option of running automated inference with the following inference modes:
 
-* :doc:`Automatic Device Selection <openvino_docs_OV_UG_supported_plugins_AUTO>` - automatically selects the best device
+* :doc:`Automatic Device Selection <../../openvino-workflow/running-inference/inference-devices-and-modes/auto-device-selection>` - automatically selects the best device
   available for the given task. It offers many additional options and optimizations, including inference on
   multiple devices at the same time.
-* :doc:`Heterogeneous Inference <openvino_docs_OV_UG_Hetero_execution>` - enables splitting inference among several devices
+* :doc:`Heterogeneous Inference <../../openvino-workflow/running-inference/inference-devices-and-modes/hetero-execution>` - enables splitting inference among several devices
   automatically, for example, if one device doesn't support certain operations.
-* :doc:`Multi-device Inference <openvino_docs_OV_UG_Running_on_multiple_devices>` - executes inference on multiple devices.
+* :doc:`Multi-device Inference <../../openvino-workflow/running-inference/inference-devices-and-modes/multi-device>` - executes inference on multiple devices.
   Currently, this mode is considered a legacy solution. Using Automatic Device Selection is advised.
-* :doc:`Automatic Batching <openvino_docs_OV_UG_Automatic_Batching>` - automatically groups inference requests to improve
+* :doc:`Automatic Batching <../../openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>` - automatically groups inference requests to improve
   device utilization.
 
 
 Feature Support and API Coverage
 #################################
 
-================================================================================== ======= ========== ===========
- Supported Feature                                                                  CPU     GPU        NPU
-================================================================================== ======= ========== ===========
- :doc:`Heterogeneous execution <openvino_docs_OV_UG_Hetero_execution>`              Yes     Yes        No
- :doc:`Multi-device execution <openvino_docs_OV_UG_Running_on_multiple_devices>`    Yes     Yes        Partial
- :doc:`Automatic batching <openvino_docs_OV_UG_Automatic_Batching>`                 No      Yes        No
- :doc:`Multi-stream execution <openvino_docs_deployment_optimization_guide_tput>`   Yes     Yes        No
- :doc:`Models caching <openvino_docs_OV_UG_Model_caching_overview>`                 Yes     Partial    Yes
- :doc:`Dynamic shapes <openvino_docs_OV_UG_DynamicShapes>`                          Yes     Partial    No
- :doc:`Import/Export <openvino_ecosystem>`                                          Yes     No         Yes
- :doc:`Preprocessing acceleration <openvino_docs_OV_UG_Preprocessing_Overview>`     Yes     Yes        No
- :doc:`Stateful models <openvino_docs_OV_UG_stateful_models_intro>`                 Yes     No         Yes
- :doc:`Extensibility <openvino_docs_Extensibility_UG_Intro>`                        Yes     Yes        No
-================================================================================== ======= ========== ===========
+=============================================================================================================================== ======= ========== ===========
+ Supported Feature                                                                                                               CPU     GPU        NPU
+=============================================================================================================================== ======= ========== ===========
+ :doc:`Heterogeneous execution <../../openvino-workflow/running-inference/inference-devices-and-modes/hetero-execution>`         Yes     Yes        No
+ :doc:`Multi-device execution <../../openvino-workflow/running-inference/inference-devices-and-modes/multi-device>`              Yes     Yes        Partial
+ :doc:`Automatic batching <../../openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>`            No      Yes        No
+ :doc:`Multi-stream execution <../../openvino-workflow/running-inference/optimize-inference/optimizing-throughput>`              Yes     Yes        No
+ :doc:`Models caching <../../openvino-workflow/running-inference/optimize-inference/optimizing-latency/model-caching-overview>`  Yes     Partial    Yes
+ :doc:`Dynamic shapes <../../openvino-workflow/running-inference/dynamic-shapes>`                                                Yes     Partial    No
+ :doc:`Import/Export <../../documentation/openvino-ecosystem>`                                                                   Yes     No         Yes
+ :doc:`Preprocessing acceleration <../../openvino-workflow/running-inference/optimize-inference/optimize-preprocessing>`         Yes     Yes        No
+ :doc:`Stateful models <../../openvino-workflow/running-inference/stateful-models>`                                              Yes     No         Yes
+ :doc:`Extensibility <../../documentation/openvino-extensibility>`                                                               Yes     Yes        No
+=============================================================================================================================== ======= ========== ===========
 
 
 +-------------------------+-----------+------------------+-------------------+

@@ -82,11 +82,11 @@ Devices similar to the ones used for benchmarking can be accessed using
 `Intel® DevCloud for the Edge <https://devcloud.intel.com/edge/>`__,
 a remote development environment with access to Intel® hardware and the latest versions
 of the Intel® Distribution of OpenVINO™ Toolkit.
-`Learn more <https://devcloud.intel.com/edge/get_started/devcloud/>`__ or
+`Learn more <https://devcloud.intel.com/edge/../../get-started/devcloud/>`__ or
 `Register here <https://inteliot.force.com/DevcloudForEdge/s/>`__.
 
 For setting up a relevant configuration, refer to the
-:doc:`Integrate with Customer Application <openvino_docs_OV_UG_Integrate_OV_with_your_application>`
+:doc:`Integrate with Customer Application <../../openvino-workflow/running-inference/integrate-openvino-with-your-application>`
 topic (step 3 "Configure input and output").