- Deep Neural Network Library (DNNL)
- ==================================
+ oneAPI Deep Neural Network Library (oneDNN)
+ ===========================================

> **Note**
>
@@ -11,15 +11,15 @@ Deep Neural Network Library (DNNL)
> Version 1.0 brings incompatible changes to the 0.20 version. Please read
> [Version 1.0 Transition Guide](https://oneapi-src.github.io/oneDNN/dev_guide_transition_to_v1.html).

- Deep Neural Network Library (DNNL) is an
+ oneAPI Deep Neural Network Library (oneDNN) is an
open-source performance library for deep learning applications. The library
includes basic building blocks for neural networks optimized
for Intel Architecture Processors and Intel Processor Graphics.
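To make "basic building blocks" concrete, here is a minimal sketch using the oneDNN v1.x C++ API from `dnnl.hpp`; the tensor shape and the choice of a ReLU (eltwise) primitive are illustrative assumptions, not something this README prescribes:

```cpp
// Minimal sketch: run a ReLU (eltwise) primitive in place on the CPU engine.
// Assumes a oneDNN v1.x build; the shape and values are arbitrary.
#include <vector>
#include "dnnl.hpp"

int main() {
    using namespace dnnl;

    engine eng(engine::kind::cpu, 0); // CPU engine is always available
    stream strm(eng);

    // A small f32 tensor in NCHW layout, used as both source and destination.
    memory::desc md({1, 8, 4, 4}, memory::data_type::f32, memory::format_tag::nchw);
    std::vector<float> data(1 * 8 * 4 * 4, -1.f);
    memory mem(md, eng, data.data());

    // Descriptor -> primitive descriptor -> primitive -> execute.
    eltwise_forward::desc relu_d(
            prop_kind::forward_inference, algorithm::eltwise_relu, md, 0.f);
    eltwise_forward::primitive_desc relu_pd(relu_d, eng);
    eltwise_forward(relu_pd).execute(
            strm, {{DNNL_ARG_SRC, mem}, {DNNL_ARG_DST, mem}});
    strm.wait();
    return 0;
}
```

The same descriptor / primitive-descriptor / primitive pattern applies to the other building blocks (convolution, pooling, and so on).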

- DNNL is intended for deep learning applications and framework
+ oneDNN is intended for deep learning applications and framework
developers interested in improving application performance
on Intel CPUs and GPUs. Deep learning practitioners should use one of the
- applications enabled with DNNL:
+ applications enabled with oneDNN:
* [Apache\* MXNet](https://mxnet.apache.org)
* [BigDL](https://github.com/intel-analytics/BigDL)
* [Caffe\* Optimized for Intel Architecture](https://github.com/intel/caffe)
@@ -77,7 +77,7 @@ If the configuration you need is not available, you can

# System Requirements

- DNNL supports systems based on
+ oneDNN supports systems based on
[Intel 64 or AMD64 architecture](https://en.wikipedia.org/wiki/X86-64).

The library is optimized for the following CPUs:
@@ -89,13 +89,13 @@ The library is optimized for the following CPUs:
* Intel Xeon Scalable processor (formerly Skylake and Cascade Lake)
* future Intel Xeon Scalable processor (code name Cooper Lake)

- DNNL detects instruction set architecture (ISA) in the runtime and uses
+ oneDNN detects instruction set architecture (ISA) at runtime and uses
just-in-time (JIT) code generation to deploy the code optimized
for the latest supported ISA.
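As a hedged illustration of working with this dispatch (assuming a oneDNN build that includes the CPU dispatcher control feature, which adds `dnnl::set_max_cpu_isa()` and the `DNNL_MAX_CPU_ISA` environment variable; earlier versions may lack both), an application can cap the ISA the JIT is allowed to target, for example to reproduce results across machines:

```cpp
// Sketch: cap JIT code generation at AVX2. Assumes the CPU dispatcher control
// feature is available in this oneDNN build; otherwise the call reports failure.
#include <iostream>
#include "dnnl.hpp"

int main() {
    // Must be called before the first primitive is created; later calls are ignored.
    if (dnnl::set_max_cpu_isa(dnnl::cpu_isa::avx2) != dnnl::status::success)
        std::cerr << "ISA cap not applied (unsupported build or called too late)\n";

    // Primitives created from here on are JIT-ed for at most AVX2.
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);
    (void)eng;
    return 0;
}
```

The same cap can usually be applied without code changes by setting `DNNL_MAX_CPU_ISA=AVX2` in the environment.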

> **WARNING**
>
- > On macOS, applications that use DNNL may need to request special
+ > On macOS, applications that use oneDNN may need to request special
> entitlements if they use the hardened runtime. See the
> [linking guide](https://oneapi-src.github.io/oneDNN/dev_guide_link.html)
> for more details.
@@ -107,7 +107,7 @@ The library is optimized for the following GPUs:

## Requirements for Building from Source

- DNNL supports systems meeting the following requirements:
+ oneDNN supports systems meeting the following requirements:
* Operating system with Intel 64 architecture support
* C++ compiler with C++11 standard support
* [CMake](https://cmake.org/download/) 2.8.11 or later
@@ -120,7 +120,7 @@ dependencies.
### CPU Engine

Intel Architecture Processors and compatible devices are supported by the
- DNNL CPU engine. The CPU engine is built by default and cannot
+ oneDNN CPU engine. The CPU engine is built by default and cannot
be disabled at build time. The engine can be configured to use the OpenMP or
TBB threading runtime. The following additional requirements apply:
* OpenMP runtime requires C++ compiler with OpenMP 2.0 or later standard support
@@ -133,7 +133,7 @@ the Intel C++ Compiler for the best performance results.

### GPU Engine

- Intel Processor Graphics is supported by the DNNL GPU engine. The GPU
+ Intel Processor Graphics is supported by the oneDNN GPU engine. The GPU
engine is disabled in the default build configuration. The following
additional requirements apply when GPU engine is enabled:
* OpenCL\* runtime library (OpenCL version 1.2 or later)
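To show what the GPU engine looks like from the API side, here is a minimal sketch using standard oneDNN C++ engine/stream calls; whether a GPU engine is actually reported depends on the build configuration and the OpenCL runtime listed above:

```cpp
// Sketch: prefer the GPU engine when the build and driver expose one,
// otherwise fall back to the always-available CPU engine.
#include <iostream>
#include "dnnl.hpp"

int main() {
    using namespace dnnl;

    engine::kind kind = engine::get_count(engine::kind::gpu) > 0
            ? engine::kind::gpu
            : engine::kind::cpu;

    engine eng(kind, 0);
    stream strm(eng);

    std::cout << "running on the "
              << (kind == engine::kind::gpu ? "GPU" : "CPU") << " engine\n";
    // Primitives created against `eng` now target the selected device.
    return 0;
}
```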
@@ -142,7 +142,7 @@ additional requirements apply when GPU engine is enabled:

### Runtime Dependencies

- When DNNL is built from source, the library runtime dependencies
+ When oneDNN is built from source, the library runtime dependencies
and specific versions are defined by the build environment.

#### Linux
@@ -241,7 +241,7 @@ You may reach out to project maintainers privately at dnnl.maintainers@intel.com

# Contributing

- We welcome community contributions to DNNL. If you have an idea on how
+ We welcome community contributions to oneDNN. If you have an idea on how
to improve the library:

* For changes impacting the public API, submit
@@ -261,7 +261,7 @@ contributors are expected to adhere to the

# License

- DNNL is licensed under [Apache License Version 2.0](LICENSE). Refer to the
+ oneDNN is licensed under [Apache License Version 2.0](LICENSE). Refer to the
"[LICENSE](LICENSE)" file for the full license text and copyright notice.

This distribution includes third party software governed by separate license