@@ -18,18 +18,33 @@ AMD\* GPU, OpenPOWER\* Power ISA (PPC64), IBMz\* (s390x), and RISC-V.

oneDNN is intended for deep learning applications and framework
developers interested in improving application performance on CPUs and GPUs.
- Deep learning practitioners should use one of the
- [applications enabled with oneDNN](#applications-enabled-with-onednn).
+
+ Deep learning practitioners should use one of the applications enabled with oneDNN:
+
+ * [Apache SINGA](https://singa.apache.org)
+ * [DeepLearning4J\*](https://deeplearning4j.konduit.ai)
+ * [Flashlight\*](https://github.com/flashlight/flashlight)
+ * [MATLAB\* Deep Learning Toolbox](https://www.mathworks.com/help/deeplearning)
+ * [ONNX Runtime](https://onnxruntime.ai)
+ * [OpenVINO(TM) toolkit](https://github.com/openvinotoolkit/openvino)
+ * [PaddlePaddle\*](http://www.paddlepaddle.org)
+ * [PyTorch\*](https://pytorch.org). Intel GPU support and additional
+   optimizations are available with [Intel® Extension for PyTorch*].
+ * [Tensorflow\*](https://www.tensorflow.org). Intel GPU support and additional
+   optimizations are available with [Intel® Extension for TensorFlow*].
+
+ [Intel® Extension for PyTorch*]: https://github.com/intel/intel-extension-for-pytorch
+ [Intel® Extension for TensorFlow*]: https://github.com/intel/intel-extension-for-tensorflow
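For application and framework developers who call the library directly, the sketch below illustrates the basic programming model: an engine, a stream, memory objects, and a primitive. It is an illustrative addition to this overview rather than part of the README itself, and it assumes the oneDNN v3.x C++ API with a CPU engine.

```cpp
// Minimal oneDNN usage sketch: run a ReLU (eltwise) primitive on a CPU engine.
// Assumes oneDNN v3.x headers and linking against libdnnl; error handling omitted.
#include <unordered_map>
#include "oneapi/dnnl/dnnl.hpp"

int main() {
    using namespace dnnl;

    engine eng(engine::kind::cpu, 0); // CPU engine, device index 0
    stream strm(eng);                 // execution stream on that engine

    // A 1x3x13x13 f32 tensor described in NCHW layout.
    memory::desc md({1, 3, 13, 13}, memory::data_type::f32,
            memory::format_tag::nchw);
    memory src(md, eng), dst(md, eng);

    // For CPU engines the memory handle is a plain host pointer.
    float *src_data = static_cast<float *>(src.get_data_handle());
    for (size_t i = 0; i < md.get_size() / sizeof(float); ++i)
        src_data[i] = static_cast<float>(i) - 100.f;

    // Create and execute a forward-inference ReLU primitive.
    eltwise_forward::primitive_desc relu_pd(eng, prop_kind::forward_inference,
            algorithm::eltwise_relu, md, md, /*alpha=*/0.f);
    eltwise_forward relu(relu_pd);
    relu.execute(strm, {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst}});
    strm.wait();
    return 0;
}
```

Production integrations usually let the library pick memory layouts via `memory::format_tag::any` plus reorders; the annotated examples in the Developer Guide cover that workflow.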

[UXL Foundation]: http://www.uxlfoundation.org
- [oneAPI specification]: https://spec.oneapi.io
+ [oneAPI specification]: https://oneapi-spec.uxlfoundation.org/specifications/oneapi/latest/elements/onednn/source/

# Table of Contents

- [Documentation](#documentation)
- - [Installation](#installation)
- [System Requirements](#system-requirements)
- - [Applications Enabled with oneDNN](#applications-enabled-with-onednn)
+ - [Installation](#installation)
+ - [Validated Configurations](#validated-configurations)
- [Governance](#governance)
- [Support](#support)
- [Contributing](#contributing)
@@ -39,32 +54,19 @@ Deep learning practitioners should use one of the

# Documentation

- * [Developer Guide] explains the programming model, supported functionality,
-   and implementation details, and includes annotated examples.
- * [API Reference] provides a comprehensive reference of the library API.
+ * [oneDNN Developer Guide and Reference] explains the programming
+   model, supported functionality, implementation details, and includes
+   annotated examples.
+ * [API Reference] provides a comprehensive reference of the library
+   API.
+ * [Release Notes] explains the new features, performance
+   optimizations, and improvements implemented in each version of
+   oneDNN.

- [Developer Guide]: https://oneapi-src.github.io/oneDNN
+ [oneDNN Developer Guide and Reference]: https://oneapi-src.github.io/oneDNN
[API Reference]: https://oneapi-src.github.io/oneDNN/group_dnnl_api.html
+ [Release Notes]: https://github.com/oneapi-src/oneDNN/releases

- # Installation
-
- Binary distribution of this software is available in:
- * [Anaconda]
- * [Intel oneAPI]
-
- The packages do not include library dependencies and these need to be resolved
- in the application at build time. See the [System Requirements] section below
- and the [Build Options] section in the [Developer Guide] for more details on
- CPU and GPU runtimes.
-
- If the configuration you need is not available, you can
- [build the library from source][Build from Source].
-
- [Anaconda]: https://anaconda.org/conda-forge/onednn
- [Intel oneAPI]: https://www.intel.com/content/www/us/en/developer/tools/oneapi/onednn.html
- [System Requirements]: #system-requirements
- [Build Options]: https://oneapi-src.github.io/oneDNN/dev_guide_build_options.html
- [Build from Source]: https://oneapi-src.github.io/oneDNN/dev_guide_build.html

# System Requirements

@@ -239,12 +241,12 @@ is enabled:
[timeout detection and recovery]: https://learn.microsoft.com/en-us/windows-hardware/drivers/display/timeout-detection-and-recovery
[TdrDelay]: https://learn.microsoft.com/en-us/windows-hardware/drivers/display/tdr-registry-keys#tdrdelay

- ### Runtime Dependencies
+ ## Runtime Dependencies

When oneDNN is built from source, the library runtime dependencies and specific
versions are defined by the build environment.

- #### Linux
+ ### Linux

Common dependencies:
* GNU C Library (`libc.so`)
@@ -265,7 +267,7 @@ Runtime-specific dependencies:
| `DNNL_GPU_RUNTIME=OCL` | any | OpenCL loader (`libOpenCL.so`)
| `DNNL_GPU_RUNTIME=SYCL` | Intel oneAPI DPC++ Compiler | Intel oneAPI DPC++ Compiler runtime (`libsycl.so`), OpenCL loader (`libOpenCL.so`), oneAPI Level Zero loader (`libze_loader.so`)

- #### Windows
+ ### Windows

Common dependencies:
* Microsoft Visual C++ Redistributable (`msvcrt.dll`)
@@ -281,7 +283,7 @@ Runtime-specific dependencies:
| `DNNL_GPU_RUNTIME=OCL` | any | OpenCL loader (`OpenCL.dll`)
| `DNNL_GPU_RUNTIME=SYCL` | Intel oneAPI DPC++ Compiler | Intel oneAPI DPC++ Compiler runtime (`sycl.dll`), OpenCL loader (`OpenCL.dll`), oneAPI Level Zero loader (`ze_loader.dll`)

- #### macOS
+ ### macOS

Common dependencies:
* System C/C++ runtime (`libc++.dylib`, `libSystem.dylib`)
@@ -293,7 +295,28 @@ Runtime-specific dependencies:
| `DNNL_CPU_RUNTIME=OMP` | Intel C/C++ Compiler | Intel OpenMP runtime (`libiomp5.dylib`)
| `DNNL_CPU_RUNTIME=TBB` | any | TBB (`libtbb.dylib`)
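Which of these runtime libraries a given binary actually needs is fixed by the `DNNL_CPU_RUNTIME` and `DNNL_GPU_RUNTIME` values chosen at build time. As an illustrative sketch (not part of the README), an application can check the configuration of the package it links against through the macros installed in `oneapi/dnnl/dnnl_config.h`; this assumes a oneDNN 2.x/3.x package that ships that header.

```cpp
// Report which CPU and GPU runtimes this oneDNN build was configured with.
// The DNNL_CPU_RUNTIME / DNNL_GPU_RUNTIME macros mirror the build options of
// the same name and expand to one of the DNNL_RUNTIME_* constants.
#include <cstdio>
#include "oneapi/dnnl/dnnl_config.h"

int main() {
#if DNNL_CPU_RUNTIME == DNNL_RUNTIME_OMP
    std::puts("CPU runtime: OpenMP -> needs an OpenMP runtime at run time");
#elif DNNL_CPU_RUNTIME == DNNL_RUNTIME_TBB
    std::puts("CPU runtime: TBB -> needs libtbb at run time");
#else
    std::puts("CPU runtime: SEQ, THREADPOOL, or SYCL");
#endif

#if DNNL_GPU_RUNTIME == DNNL_RUNTIME_NONE
    std::puts("GPU runtime: none -> no GPU-related dependencies");
#elif DNNL_GPU_RUNTIME == DNNL_RUNTIME_OCL
    std::puts("GPU runtime: OpenCL -> needs an OpenCL loader at run time");
#else
    std::puts("GPU runtime: SYCL -> needs the DPC++ runtime plus OpenCL and Level Zero loaders");
#endif
    return 0;
}
```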

- ### Validated Configurations
+ # Installation
+
+ You can download and install the oneDNN library using one of the following options:
+
+ - Binary Distribution: You can download pre-built binary packages from
+   the following sources:
+   - [conda-forge]: If the configuration you need is not available on
+     the conda-forge channel, you can build the library using the
+     Source Distribution.
+   - Intel oneAPI:
+     - [Intel® oneAPI Base Toolkit](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html)
+     - [Intel® oneDNN standalone package](https://www.intel.com/content/www/us/en/developer/tools/oneapi/onednn-download.html)
+
+ - Source Distribution: You can build the library from source by
+   following the instructions on the [Build from Source] page.
+
+ [conda-forge]: https://anaconda.org/conda-forge/onednn
+ [System Requirements]: #system-requirements
+ [Build Options]: https://oneapi-src.github.io/oneDNN/dev_guide_build_options.html
+ [Build from Source]: https://oneapi-src.github.io/oneDNN/dev_guide_build.html
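Whichever option is used, a quick way to confirm that the installed package is visible to the compiler and linker is to build a small program against it. This is an illustrative check rather than part of the README; it uses the stable `dnnl_version()` C API and assumes headers installed under `oneapi/dnnl` (on Linux it can typically be built with something like `g++ check_onednn.cpp -ldnnl`, with exact flags depending on the installation).

```cpp
// Print the version of the oneDNN library the program was linked against.
// A successful build and run confirms that headers and the dnnl library
// are correctly installed and discoverable.
#include <cstdio>
#include "oneapi/dnnl/dnnl.h"

int main() {
    const dnnl_version_t *v = dnnl_version();
    std::printf("oneDNN %d.%d.%d (commit %s)\n",
            v->major, v->minor, v->patch, v->hash);
    return 0;
}
```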
+
+ # Validated Configurations

x86-64 CPU engine was validated on RedHat\* Enterprise Linux 8 with
* GNU Compiler Collection 8.5, 9.5, 11.1, 11.3
@@ -334,24 +357,6 @@ time of release
[Intel Arc & Iris Xe Graphics Driver]: https://www.intel.com/content/www/us/en/download/785597/intel-arc-iris-xe-graphics-windows.html
[Arm Compiler for Linux]: https://developer.arm.com/Tools%20and%20Software/Arm%20Compiler%20for%20Linux

- # Applications Enabled with oneDNN
-
- * [Apache\* MXNet](https://mxnet.apache.org)
- * [Apache SINGA](https://singa.apache.org)
- * [DeepLearning4J\*](https://deeplearning4j.konduit.ai)
- * [Flashlight\*](https://github.com/flashlight/flashlight)
- * [Korali](https://github.com/cselab/korali)
- * [MATLAB\* Deep Learning Toolbox](https://www.mathworks.com/help/deeplearning)
- * [ONNX Runtime](https://onnxruntime.ai)
- * [OpenVINO(TM) toolkit](https://github.com/openvinotoolkit/openvino)
- * [PaddlePaddle\*](http://www.paddlepaddle.org)
- * [PyTorch\*](https://pytorch.org). Intel GPU support and additional
-   optimizations are available with [Intel Extension for PyTorch].
- * [Tensorflow\*](https://www.tensorflow.org). Intel GPU support and additional
-   optimizations are available with [Intel Extension for Tensorflow].
-
- [Intel Extension for PyTorch]: https://github.com/intel/intel-extension-for-pytorch
- [Intel Extension for Tensorflow]: https://github.com/intel/intel-extension-for-tensorflow

# Support

@@ -387,21 +392,13 @@ schedule and work already in progress towards future milestones in Github's
[Milestones] section. If you are looking for a specific task to start,
consider selecting from issues that are marked with the [help wanted] label.

- If you have an idea on how to improve the library:
- * For changes impacting the public API or library overall, such as adding new
-   primitives or changes to the architecture, submit an [RFC pull request].
- * Ensure that the changes are consistent with the [code contribution guidelines]
-   and [coding standards].
- * Ensure that you can build the product and run all the examples with your
-   patch.
- * Submit a [pull request].
-
- For additional details, see [contribution guidelines](CONTRIBUTING.md). You can
- also contact oneDNN developers and maintainers via [UXL Foundation Slack] using
- [#onednn] channel.

- This project is intended to be a safe, welcoming space for collaboration, and
- contributors are expected to adhere to the
+ See [contribution guidelines](CONTRIBUTING.md) to start contributing
+ to oneDNN. You can also contact oneDNN developers and maintainers via
+ [UXL Foundation Slack] using the [#onednn] channel.
+
+ This project is intended to be a safe, welcoming space for
+ collaboration, and contributors are expected to adhere to the
[Contributor Covenant](CODE_OF_CONDUCT.md) code of conduct.

[RFC pull request]: https://github.com/oneapi-src/oneDNN/tree/rfcs
@@ -411,13 +408,15 @@ contributors are expected to adhere to the
[Milestones]: https://github.com/oneapi-src/oneDNN/milestones
[help wanted]: https://github.com/oneapi-src/oneDNN/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22

+
# License

- oneDNN is licensed under [Apache License Version 2.0](LICENSE). Refer to the
- "[LICENSE](LICENSE)" file for the full license text and copyright notice.
+ oneDNN is licensed under [Apache License Version 2.0](LICENSE). Refer
+ to the "[LICENSE](LICENSE)" file for the full license text and
+ copyright notice.

- This distribution includes third party software governed by separate license
- terms.
+ This distribution includes third party software governed by separate
+ license terms.

3-clause BSD license:
* [Xbyak](https://github.com/herumi/xbyak)
@@ -446,17 +445,17 @@ and OpenCL Driver](https://github.com/intel/compute-runtime)
Interface](https://github.com/intel/metrics-discovery)
* [spdlog](https://github.com/gabime/spdlog)

- This third party software, even if included with the distribution of
- the Intel software, may be governed by separate license terms, including
- without limitation, third party license terms, other Intel software license
- terms, and open source software license terms. These separate license terms
- govern your use of the third party programs as set forth in the
- "[THIRD-PARTY-PROGRAMS](THIRD-PARTY-PROGRAMS)" file.
+ This third party software, even if included with the distribution of
+ the Intel software, may be governed by separate license terms,
+ including without limitation, third party license terms, other Intel
+ software license terms, and open source software license terms. These
+ separate license terms govern your use of the third party programs as
+ set forth in the "[THIRD-PARTY-PROGRAMS](THIRD-PARTY-PROGRAMS)" file.

# Security

[Security Policy](SECURITY.md) outlines our guidelines and procedures
- for ensuring the highest level of Security and trust for our users
+ for ensuring the highest level of security and trust for our users
who consume oneDNN.

# Trademark Information