Why register the relu operator in AutogradPrivateUse1? #48
Replies: 5 comments 3 replies
-
Good question: dlprimitives provides several functions for activation, so I tried to do something generic. After some time I reverted to a direct implementation, but this remained. So the answer is: historical reasons, and it can probably be implemented like all the other functions without a problem.
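As an illustration of "implemented as all the other functions", here is a minimal sketch of registering relu directly under PrivateUse1, the way tanh is handled. The kernel name relu_fwd and its body are placeholders rather than the actual dlprimitives code; with only the backend kernel registered, PyTorch's stock autograd formula for relu supplies the backward, provided the ops that formula dispatches to are also available on the backend.

```cpp
#include <ATen/ATen.h>
#include <torch/library.h>

namespace {

// Placeholder forward kernel; a real backend would launch its own
// (e.g. dlprimitives) kernel here instead of this ATen composite.
at::Tensor relu_fwd(const at::Tensor& self) {
  return at::clamp_min(self, 0);
}

} // namespace

// Only the forward is registered, at the backend key. PyTorch's built-in
// autograd formula for relu then drives the backward pass, the same way
// it does for tanh when only the PrivateUse1 kernel is provided.
TORCH_LIBRARY_IMPL(aten, PrivateUse1, m) {
  m.impl("relu", relu_fwd);
}
```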
-
As far as I remember, that is why a custom backward function is applied. I used AutogradPrivateUse1 any time I needed a slightly different backpropagation. I don't recall why I used it for ReLU, but I assume I did it this way for max pooling because of the different way I handle the location of the "max" item during backpropagation. I assume it can be rewritten differently, but you would likely need to change the kernels themselves in dlprimitives and run all the tests. Can you explain why it bothers you?
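For context, this is the general shape of the pattern being described: a torch::autograd::Function pairs the forward with a hand-written backward, and the wrapper is registered under the backend's autograd key. The bodies below are simplified placeholders, not the dlprimitives kernels.

```cpp
#include <torch/torch.h>
#include <torch/library.h>

namespace {

class ReluOp : public torch::autograd::Function<ReluOp> {
public:
  static torch::Tensor forward(torch::autograd::AutogradContext* ctx,
                               torch::Tensor input) {
    // Skip the autograd keys so the backend forward kernel would run;
    // clamp_min here is only a stand-in for that kernel launch.
    at::AutoDispatchBelowADInplaceOrView guard;
    auto output = at::clamp_min(input, 0);
    ctx->save_for_backward({output});
    return output;
  }

  static torch::autograd::variable_list backward(
      torch::autograd::AutogradContext* ctx,
      torch::autograd::variable_list grad_output) {
    auto output = ctx->get_saved_variables()[0];
    // Hand-written ReLU gradient: pass the gradient only where output > 0.
    auto grad_input = grad_output[0] * (output > 0);
    return {grad_input};
  }
};

torch::Tensor relu_autograd(const torch::Tensor& input) {
  return ReluOp::apply(input);
}

} // namespace

// Registering at AutogradPrivateUse1 overrides PyTorch's stock relu
// derivative with the custom backward above, for this backend only.
TORCH_LIBRARY_IMPL(aten, AutogradPrivateUse1, m) {
  m.impl("relu", relu_autograd);
}
```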
-
I assume you are replacing
-
Moved to discussions. Also note: this would not work for strided tensors. A better method is this one, which would work on strided inputs.
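The snippets this comment refers to were not carried over, so the following is only a guess at the general strided-tensor point: an elementwise kernel that walks data_ptr() linearly assumes contiguous memory and reads the wrong elements for strided views, while forcing a contiguous copy first (or iterating through at::TensorIterator) handles any layout.

```cpp
#include <ATen/ATen.h>

// Assumes float dtype and contiguous memory: walking the raw buffer
// linearly reads the wrong elements for strided views (e.g. a step-2 slice).
at::Tensor relu_contig_only(const at::Tensor& self) {
  auto out = at::empty_like(self);
  const float* src = self.data_ptr<float>();
  float* dst = out.data_ptr<float>();
  for (int64_t i = 0; i < self.numel(); ++i) {
    dst[i] = src[i] > 0.f ? src[i] : 0.f;
  }
  return out;
}

// Safer: make the input contiguous first so the linear walk is valid for
// any stride pattern (at the cost of a copy for non-contiguous inputs).
at::Tensor relu_any_stride(const at::Tensor& self) {
  return relu_contig_only(self.contiguous());
}
```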
-
Hi,
In the pointwise_ops.cpp file, you register the relu operator in AutogradPrivateUse1, which is different from the relu_ operator. Why not register the relu operator in PrivateUse1, just like the tan and tanh_ operators, which are registered in PrivateUse1?
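For anyone landing here, a compact restatement of the pattern being asked about, with placeholder kernel names: ops like tanh go under the backend key and inherit PyTorch's built-in derivative formulas, while relu is routed through the backend's autograd key so it can carry its own backward.

```cpp
#include <ATen/ATen.h>
#include <torch/library.h>

// Hypothetical kernels, assumed to be defined elsewhere in the backend.
at::Tensor tanh_fwd(const at::Tensor& self);       // plain forward kernel
at::Tensor relu_autograd(const at::Tensor& self);  // wraps a custom backward

TORCH_LIBRARY_IMPL(aten, PrivateUse1, m) {
  // Forward-only kernel: PyTorch's stock autograd formula does the rest.
  m.impl("tanh", tanh_fwd);
}

TORCH_LIBRARY_IMPL(aten, AutogradPrivateUse1, m) {
  // Forward plus hand-written backward, registered at the autograd key.
  m.impl("relu", relu_autograd);
}
```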