This repository was archived by the owner on Jan 3, 2023. It is now read-only.
Commit 0995b71
Cyphers/master to r20 (#2978)
* nbench: special handling of op names for "NNP_XXX" (#2883)
* remove throw from dtor (#2854)
* add special handling for the 'NNP_xx' op name
* style
* add special handling for the 'NNP_xx' op name
* style
* use description as suggested by Bob
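As context for the "NNP_xx" handling above: nbench aggregates per-op timing by an op-type key derived from generated node names, and names that themselves contain underscores (such as "NNP_XXX") have to be kept intact when a trailing instance id is trimmed. The helper below is a purely illustrative sketch of that kind of special-casing, not the actual nbench code.

```cpp
#include <cctype>
#include <iostream>
#include <map>
#include <string>

// Derive an aggregation key (op "type") from a generated node name.
// Ordinary names look like "Add_123"; splitting at the first '_' yields "Add".
// Names such as "NNP_ConvBias_7" contain '_' inside the op name itself, so
// they need special handling: keep the "NNP_..." part and trim only a
// trailing numeric instance id.
std::string op_type_from_name(const std::string& name)
{
    if (name.rfind("NNP_", 0) == 0) // starts with "NNP_"
    {
        auto pos = name.find_last_of('_');
        bool id_suffix = pos + 1 < name.size();
        for (size_t i = pos + 1; id_suffix && i < name.size(); ++i)
        {
            id_suffix = std::isdigit(static_cast<unsigned char>(name[i])) != 0;
        }
        return id_suffix ? name.substr(0, pos) : name;
    }
    return name.substr(0, name.find('_'));
}

int main()
{
    std::map<std::string, double> time_per_type;
    time_per_type[op_type_from_name("Add_123")] += 10.0;        // -> "Add"
    time_per_type[op_type_from_name("NNP_ConvBias_7")] += 42.0; // -> "NNP_ConvBias"
    time_per_type[op_type_from_name("NNP_XXX")] += 5.0;         // -> "NNP_XXX"

    for (const auto& entry : time_per_type)
    {
        std::cout << entry.first << ": " << entry.second << " us\n";
    }
}
```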
* Remove parent from PlaidML tensor initializer (#2923)
* Remove parent from PlaidML tensor initializer
* Remove plaidml tensor parent plumbing
* style
* Add support for move semantics to AlignedBuffer (#2956)
* Add move operations to AlignedBuffer
* unit test
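For the AlignedBuffer change, the essence of "add move operations" is transferring ownership of the raw pointer and leaving the moved-from object empty so its destructor is a no-op. A minimal standalone sketch of that pattern follows; this is not nGraph's actual AlignedBuffer, which manages alignment differently.

```cpp
#include <cstddef>
#include <cstdlib>
#include <utility>

// Minimal aligned buffer with move support: the moved-from object releases
// ownership so the memory is never freed twice.
class AlignedBuffer
{
public:
    AlignedBuffer(size_t byte_size, size_t alignment)
        : m_size(byte_size)
    {
        // aligned_alloc requires the size to be a multiple of the alignment.
        size_t padded = (byte_size + alignment - 1) / alignment * alignment;
        m_data = static_cast<char*>(std::aligned_alloc(alignment, padded));
    }

    AlignedBuffer(AlignedBuffer&& other) noexcept
        : m_data(other.m_data), m_size(other.m_size)
    {
        other.m_data = nullptr;
        other.m_size = 0;
    }

    AlignedBuffer& operator=(AlignedBuffer&& other) noexcept
    {
        if (this != &other)
        {
            std::free(m_data);
            m_data = other.m_data;
            m_size = other.m_size;
            other.m_data = nullptr;
            other.m_size = 0;
        }
        return *this;
    }

    // Copying is disabled; ownership stays unique.
    AlignedBuffer(const AlignedBuffer&) = delete;
    AlignedBuffer& operator=(const AlignedBuffer&) = delete;

    ~AlignedBuffer() { std::free(m_data); }

    char* get_ptr() const { return m_data; }
    size_t size() const { return m_size; }

private:
    char* m_data = nullptr;
    size_t m_size = 0;
};

int main()
{
    AlignedBuffer a(64, 64);
    AlignedBuffer b(std::move(a)); // b now owns the allocation, a is empty
    AlignedBuffer c(16, 64);
    c = std::move(b);
    return c.get_ptr() != nullptr ? 0 : 1;
}
```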
* Create mkldnn primitives at first iteration for codegen - part2 (#2859)
* Create mkldnn primitives at first iteration for CODEGEN.
OPs: add, lstm, and rnn.
* OPs: batchnorm.
* OPs: concat and lrn.
Remove dead code.
* Skip in-place concat, relu, reshape, and slice when building the node_primitive_string_deps_index map.
* Change NGRAPH_ASSERT to NGRAPH_CHECK.
* Address PR Feedback.
* Create mkldnn primitives at first iteration for CODEGEN.
OPs: convertlayout, relu, leakyrelu, boundedrelu, sigmoid, softmax, slice.
* Fix bugs.
* OPs: quantizedconcat.
Check if there are descriptors before emitting code to read desc_file.
* OPs: convolution backward.
Use macro to write mkldnn memory dims to generated file.
* OPs: MaxPoolWithIndices and MaxPoolWithIndicesBackprop.
Add unit tests for MaxPoolWithIndices, MaxPoolWithIndicesBackprop, and MaxPoolBackprop.
* Fix style error.
* OPs: AvgPoolBackprop and MaxPoolBackprop.
Add unit test for AvgPoolBackprop.
* OPs: DeconvolutionBias.
* OPs: Quantize and Dequantize.
* OPs: QuantizedDot and QuantizedDotBias.
* Use reference kernel for QuantizedConvolution for CODEGEN when mkldnn does not support the parameter types.
Get scales for quantization ops in cpu_emitter.
* Fix Windows build error: add CPU_BACKEND_API.
* Use template for quantization ops.
* OPs: QuantizedMatmul.
Emit reference kernel for QuantizedDot in CODEGEN.
* Remove QuantizedDot from get_scale_index.
* Address PR feedback.
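The codegen work above follows a create-once, reuse-thereafter pattern: mkldnn primitives are built during the first iteration of the compiled function and cached for later iterations. The sketch below shows that caching pattern in isolation with a stand-in primitive type; it is not the generated mkldnn code itself.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Stand-in for an expensive-to-create execution primitive (e.g. an mkldnn
// convolution primitive); here it just remembers a description.
struct Primitive
{
    explicit Primitive(std::string what) : desc(std::move(what))
    {
        std::cout << "building primitive: " << desc << "\n";
    }
    void execute() const { std::cout << "executing: " << desc << "\n"; }
    std::string desc;
};

// Emitted per-op code receives a slot in a shared cache and only constructs
// the primitive on the first iteration.
void run_conv_like_op(std::vector<std::unique_ptr<Primitive>>& cache, size_t index)
{
    if (!cache[index]) // first iteration: create and remember the primitive
    {
        cache[index] = std::make_unique<Primitive>("conv " + std::to_string(index));
    }
    cache[index]->execute(); // later iterations reuse the cached primitive
}

int main()
{
    std::vector<std::unique_ptr<Primitive>> primitive_cache(2);
    for (int iteration = 0; iteration < 3; ++iteration)
    {
        run_conv_like_op(primitive_cache, 0);
        run_conv_like_op(primitive_cache, 1);
    }
}
```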
* [FusedOps] Split (#2951)
* Split op skeleton
* Two ways to construct a fused Split to be able to use it in onnx importer
* refactor: move the util::split() helper functions to the core
* Split's decompose_op() implementation using a helper function
* Use fused Split in the onnx_importer
* Code formatting
* PR feedback
* Split helpers moved to ngraph/builder
* Basic UT - split a 1D tensor to 3 equal parts
* UT: Split 2D tensor into variable length parts
* Code formatting
* Catch the proper type of exception in the onnx_importer split()
* Initialize members in the correct order
* Type prop tests for Split
* Code formatting
* PR feedback
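The decompose_op() mentioned above reduces to turning a list of part lengths (or a number of equal parts) into per-part [begin, end) bounds along the split axis and emitting one slice per part. A small sketch of that bookkeeping, independent of the nGraph op classes:

```cpp
#include <cstddef>
#include <iostream>
#include <numeric>
#include <stdexcept>
#include <utility>
#include <vector>

// Compute the [begin, end) bounds of each part along the split axis, given
// explicit part lengths.
std::vector<std::pair<size_t, size_t>> split_bounds(size_t axis_length,
                                                    const std::vector<size_t>& part_lengths)
{
    if (std::accumulate(part_lengths.begin(), part_lengths.end(), size_t{0}) != axis_length)
    {
        throw std::invalid_argument("part lengths must sum to the axis length");
    }
    std::vector<std::pair<size_t, size_t>> bounds;
    size_t begin = 0;
    for (size_t len : part_lengths)
    {
        bounds.emplace_back(begin, begin + len);
        begin += len;
    }
    return bounds;
}

// Helper for the "split into N equal parts" form of the op.
std::vector<size_t> equal_parts(size_t axis_length, size_t num_parts)
{
    if (num_parts == 0 || axis_length % num_parts != 0)
    {
        throw std::invalid_argument("axis length must be divisible by the number of parts");
    }
    return std::vector<size_t>(num_parts, axis_length / num_parts);
}

int main()
{
    // Split a length-6 axis into 3 equal parts, then into parts of 2 and 4.
    for (const auto& spec : {equal_parts(6, 3), std::vector<size_t>{2, 4}})
    {
        for (const auto& b : split_bounds(6, spec))
        {
            std::cout << "[" << b.first << ", " << b.second << ") ";
        }
        std::cout << "\n";
    }
}
```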
* Add more infrastructure for specialization of cloned graphs (#2949)
* Virtualize some things that crash when layout descriptor is missing
* More shape specialization
* (very bare) skeleton for dyn elimination
* Miscellaneous
* Lift i32->int64-only restriction on constant folding for Convert
* Add constant folding for ShapeOf, and some tests for new constant folders
* Tests for DynElimination
* Rename specialize_shapes to specialize_function, and add a unit test for value substitution
* Roll back overeager API change in dyn slice bprop (it has to handle right-indexed axes; bummer)
* Add a test for dynamic usage of transpose op
* Fix warning/error about variable shadowing
* Strengthen checks in apply_permutation
* Propagate Constant shapes through Transpose
* Add CHANGE_DYNAMIC_STATE where appropriate
* PR feedback, and fix unit test failure
* Fix PR reference in comment
* PR comments
* Comments for helper funcs
* Remove unique_ptr indirection for the AlignedBuffers
* Fix incorrect indexing of AlignedBuffer vector (whoops!)
* Remove unnecessary CHANGE_DYNAMIC_STATEs
* De-update pass property unit test for const folding
* Replace mystery runes with all_pass_property_off
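Of the new constant folders listed above, ShapeOf is the easiest to picture: when the input's shape is fully static, the node can be replaced with a constant vector of its dimensions, otherwise it is left alone. A standalone sketch of that decision, with a dynamic dimension modelled as -1; this is not the nGraph pass itself.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// A "partial shape": dimensions >= 0 are static, -1 means dynamic/unknown.
using PartialShape = std::vector<int64_t>;

bool is_static(const PartialShape& shape)
{
    for (int64_t d : shape)
    {
        if (d < 0)
        {
            return false;
        }
    }
    return true;
}

// Constant-fold ShapeOf: if every dimension is known, produce the folded
// constant (an i64 vector); otherwise report that folding is not possible.
bool try_fold_shape_of(const PartialShape& input_shape, std::vector<int64_t>& folded)
{
    if (!is_static(input_shape))
    {
        return false; // leave the ShapeOf node in the graph
    }
    folded.assign(input_shape.begin(), input_shape.end());
    return true;
}

int main()
{
    std::vector<int64_t> folded;
    if (try_fold_shape_of({2, 3, 224, 224}, folded))
    {
        for (int64_t d : folded)
        {
            std::cout << d << " "; // prints: 2 3 224 224
        }
        std::cout << "\n";
    }
    std::cout << std::boolalpha << try_fold_shape_of({2, -1, 224, 224}, folded) << "\n"; // false
}
```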
* Change FusionType to enum class and use EnumMask (#2957)
* constexpr ctor for EnumMask
* added pass properties to core passes.
* change fusion type to have better type safety.
* refactor to use enum mask.
* remove extra code.
* added constants for FusionType backward compatibility.
* spelling.
* grammar fix.
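A cut-down illustration of the enum class plus bit-mask combination described above; nGraph's real EnumMask carries more machinery, so this only shows the type-safety gain over a plain integer of or'd flags (the flag values used here are illustrative).

```cpp
#include <cstdint>
#include <iostream>
#include <type_traits>

// Fusion kinds as an enum class with power-of-two values so they can be
// combined in a mask without losing type safety.
enum class FusionType : uint32_t
{
    DIFFERENTIABLE_FUSIONS = 0x1,
    REGULAR_FUSIONS = 0x2,
    FOP_FUSIONS = 0x4,
};

template <typename T>
class EnumMask
{
    static_assert(std::is_enum<T>::value, "EnumMask requires an enum type");

public:
    using value_type = typename std::underlying_type<T>::type;

    constexpr EnumMask() = default;
    constexpr EnumMask(T flag) : m_value(static_cast<value_type>(flag)) {}

    constexpr EnumMask operator|(EnumMask other) const
    {
        return EnumMask(m_value | other.m_value);
    }
    constexpr bool is_set(T flag) const
    {
        return (m_value & static_cast<value_type>(flag)) != 0;
    }

private:
    constexpr explicit EnumMask(value_type raw) : m_value(raw) {}
    value_type m_value = 0;
};

int main()
{
    EnumMask<FusionType> requested =
        EnumMask<FusionType>(FusionType::DIFFERENTIABLE_FUSIONS) | FusionType::REGULAR_FUSIONS;

    std::cout << std::boolalpha
              << requested.is_set(FusionType::REGULAR_FUSIONS) << "\n" // true
              << requested.is_set(FusionType::FOP_FUSIONS) << "\n";    // false
    // requested | 0x8;  // would not compile: raw integers don't convert to FusionType
}
```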
* update visualize tree file extensions and output formats (#2954)
* update visualize tree file extensions and output formats
* fix runtime error
* Update version, clean up ToC, add more detail to section on inspectin… (#2947)
* Update version, clean up ToC, add more detail to section on inspecting graphs...
* Minor adjustments to version module
* Move distributed training page to extras since there's not much there
* Fix links that break when doing that
* Consistent casing on section titles
* Orphan governance page so we don't have blank/empty links
* Update release notes with new version module structure
* PR feedback
* Allow NGRAPH_VISUALIZE_TREE_OUTPUT_SHAPES to output partial shapes (#2959)
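NGRAPH_VISUALIZE_TREE_OUTPUT_SHAPES acts as an environment-variable switch for whether node labels include shapes, now covering partial shapes too. A hypothetical sketch of that kind of toggle, rendering unknown dimensions as "?"; the -1 encoding of a dynamic dimension is an assumption, not nGraph's PartialShape class.

```cpp
#include <cstdint>
#include <cstdlib>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Render a partial shape such as {2, -1, 224, 224} as "{2,?,224,224}",
// where -1 stands for a dynamic dimension.
std::string to_string(const std::vector<int64_t>& partial_shape)
{
    std::ostringstream out;
    out << "{";
    for (size_t i = 0; i < partial_shape.size(); ++i)
    {
        if (i != 0)
        {
            out << ",";
        }
        if (partial_shape[i] < 0)
        {
            out << "?";
        }
        else
        {
            out << partial_shape[i];
        }
    }
    out << "}";
    return out.str();
}

int main()
{
    // Only decorate node labels with shapes when the env var is set,
    // mirroring how NGRAPH_VISUALIZE_TREE_OUTPUT_SHAPES is used as a switch.
    bool output_shapes = std::getenv("NGRAPH_VISUALIZE_TREE_OUTPUT_SHAPES") != nullptr;

    std::string label = "Transpose_42";
    if (output_shapes)
    {
        label += " " + to_string({2, -1, 224, 224});
    }
    std::cout << label << "\n";
}
```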
* Remove functions from cpu which were moved to core (#2962)
* Remove functions from cpu which were moved to core
* Fix a typo
* Remove unused function
* Move zero padded conv fusions from CPUFusion to CoreFusion. (#2969)
* Move zero padded conv fusions from CPUFusion to CoreFusion.
* Address PR feedback: move unit tests to core_fusion.
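The fusions being moved match a zero Pad feeding a Convolution and fold the pad amounts into the convolution's own padding attributes. The arithmetic at the heart of that rewrite, sketched without the pattern-matching machinery (the struct and function names are illustrative):

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

struct ConvPadding
{
    std::vector<size_t> below; // per spatial axis
    std::vector<size_t> above;
};

// Fold an explicit zero-padding of the input (pad_below/pad_above per spatial
// axis) into the convolution's padding attributes. Valid only when the pad
// value is zero and there is no interior padding, which is exactly what the
// zero-padded-conv patterns check for.
ConvPadding fold_pad_into_conv(const ConvPadding& conv,
                               const std::vector<size_t>& pad_below,
                               const std::vector<size_t>& pad_above)
{
    ConvPadding fused = conv;
    for (size_t axis = 0; axis < conv.below.size(); ++axis)
    {
        fused.below[axis] += pad_below[axis];
        fused.above[axis] += pad_above[axis];
    }
    return fused;
}

int main()
{
    // A 2-D convolution with no padding, fed by a Pad that adds 1 pixel on
    // each side: after fusion the convolution itself pads by 1.
    ConvPadding conv{{0, 0}, {0, 0}};
    ConvPadding fused = fold_pad_into_conv(conv, {1, 1}, {1, 1});
    std::cout << fused.below[0] << " " << fused.above[1] << "\n"; // prints: 1 1
}
```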
* Fix Convert for boolean output type in CODEGEN. (#2958)
* Create tensor for the primary backend (#2970)
* create tensor for the primary backend
* move private objects to protected
* [Fused] LeakyRelu op (#2919)
* [Fused] LeakyRelu op
* Add LeakyRelu to serializer
* Add unit tests
* Fix merge branch 'master' into mkarzyns/fused_leaky_relu
* Change broadcasting rules to NumPy style
* Remove std:: and ngraph:: prefixes
* Rename CPU Runtime LeakyRelu to CPULeakyRelu
* Style apply
* Fix cpu_fusion.fuse_leaky_relu test
* Use eigen's tanh in the fused sigmoid multiply kernel (#2946)
* Merge branch 'master' into mkarzyns/fused_leaky_relu
* Add LeakyRelu to Intel GPU backend op list
* Add LeakyRelu to Intel GPU backend op list
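Functionally, LeakyRelu keeps positive values and scales negative ones by alpha, i.e. max(x, alpha*x) for 0 < alpha < 1, which is what the fused op's decomposition amounts to. A plain reference-kernel sketch of the math (not the nGraph decomposition code):

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Elementwise LeakyRelu reference: y = x if x > 0, else alpha * x.
// Equivalent to max(x, alpha * x) when 0 < alpha < 1.
std::vector<float> leaky_relu(const std::vector<float>& x, float alpha)
{
    std::vector<float> y(x.size());
    std::transform(x.begin(), x.end(), y.begin(),
                   [alpha](float v) { return std::max(v, alpha * v); });
    return y;
}

int main()
{
    for (float v : leaky_relu({-2.0f, -0.5f, 0.0f, 3.0f}, 0.1f))
    {
        std::cout << v << " "; // prints: -0.2 -0.05 0 3
    }
    std::cout << "\n";
}
```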
* Make private members protected in hybrid classes (#2975)
* make private members protected in hybrid classes
* allow overriding the passes
* [ONNX] Unit tests for QLinearMatMul (#2706)
* [ONNX] Unit test models for QLinearMatMul
* [ONNX] Extended types support for NgraphTestCase
* [ONNX] Move the value comparators to the NgraphTestCase class
* Add test cases
* Add shape checking
* disable GPU tests
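For reference, ONNX's QLinearMatMul multiplies zero-point-shifted integer inputs and requantizes the result with the scale a_scale*b_scale/y_scale. A scalar-at-a-time sketch of that arithmetic, following the operator definition rather than the importer's actual graph construction:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

// Reference-style uint8 QLinearMatMul for row-major M x K times K x N.
// real_value = scale * (quantized_value - zero_point); the product is
// requantized with y_scale / y_zero and saturated to uint8.
std::vector<uint8_t> qlinear_matmul(const std::vector<uint8_t>& a, float a_scale, uint8_t a_zero,
                                    const std::vector<uint8_t>& b, float b_scale, uint8_t b_zero,
                                    float y_scale, uint8_t y_zero,
                                    size_t M, size_t K, size_t N)
{
    std::vector<uint8_t> y(M * N);
    const float requant_scale = a_scale * b_scale / y_scale;
    for (size_t m = 0; m < M; ++m)
    {
        for (size_t n = 0; n < N; ++n)
        {
            int32_t acc = 0; // integer accumulation of zero-point-shifted values
            for (size_t k = 0; k < K; ++k)
            {
                acc += (int32_t(a[m * K + k]) - a_zero) * (int32_t(b[k * N + n]) - b_zero);
            }
            float scaled = std::round(acc * requant_scale) + y_zero;
            y[m * N + n] = uint8_t(std::min(255.0f, std::max(0.0f, scaled)));
        }
    }
    return y;
}

int main()
{
    // 1x2 times 2x1 example.
    auto y = qlinear_matmul({10, 20}, 0.5f, 0, {30, 40}, 0.25f, 0, 2.0f, 0, 1, 2, 1);
    std::cout << int(y[0]) << "\n"; // (10*30 + 20*40) * (0.5*0.25/2) = 68.75 -> 69
}
```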
* IntelGPU backend: Switch to clDNN which is compatible with gcc4.8 (#2961)
* Added accessor methods for layer op attributes (#2964)
* Added accessor methods for layer op attributes
* style fixes and addressed PR feedback
* Add save/load API to runtime (#2955)
* API defined
* add unit test for save/load with INTERPRETER
* Update per review comments
* fix compiler error
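The log does not show the save/load signatures, so the snippet below is only a hypothetical round trip illustrating the pattern the new API enables: serialize a compiled artifact to a stream and reconstruct it later without recompiling. The CompiledBlob type and the save/load helpers are invented for illustration.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Toy stand-in for a compiled artifact: in a real backend this would be a
// serialized executable/graph, here it is just a string payload.
struct CompiledBlob
{
    std::string payload;
};

// Write the compiled artifact out to any output stream.
void save(const CompiledBlob& blob, std::ostream& out)
{
    out << blob.payload;
}

// Rebuild the artifact from a stream previously written by save().
CompiledBlob load(std::istream& in)
{
    std::ostringstream buffer;
    buffer << in.rdbuf();
    return CompiledBlob{buffer.str()};
}

int main()
{
    CompiledBlob original{"pretend-compiled-interpreter-function"};

    std::stringstream storage; // stands in for a file on disk
    save(original, storage);
    CompiledBlob restored = load(storage);

    std::cout << (restored.payload == original.payload ? "round trip ok" : "mismatch") << "\n";
}
```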
* Backport fix from #2973 (#2976)
* CTCGreedyDecoder layer op (#2965)
* Added CTCGreedyDecoder layer op
* Added comment on seq_len validation checks
* Switch some get_inputs uses to use the newer inputs (#2968)
* Switch some get_inputs uses to use the newer inputs
* Review comments
* update a few files to build on windows (#2974)
* update a few files to build on windows
* more fixes

1 parent 4751846, commit 0995b71
File tree: 138 files changed (+5495 / -3188 lines)

- doc/sphinx
  - ngraph_theme
  - source
    - core/constructing-graphs
    - distr
    - inspection
    - project
    - tutorials
- src
  - ngraph
    - builder
      - quantization
    - frontend/onnx_import
      - op
      - utils
    - op
      - experimental
        - layers
      - fused
      - util
    - pass
    - pattern
    - runtime
      - cpu
        - builder
        - kernel
        - op
        - pass
      - dynamic
      - gpu
      - hybrid
      - intelgpu
      - interpreter
  - tools/nbench
- test
  - models/onnx
  - onnx