I noticed that some of my tensors, such as `blk.0.attn_q.weight`, are not going through the `set_tensor` function, while tensors that follow the backend's naming scheme, such as `zDNN#attn_norm-0#0`, go through correctly. This is causing quite a problem for my backend during inference on operations such as `GGML_OP_MUL_MAT`, because both of the above tensors come in as operands but only one of them has been initialised correctly via `set_tensor`.

As a sanity check, I have ensured that my backend is only advertising buffer types it can support, i.e.,
Pardon my ignorance; I have been trying to debug this for hours and still have no idea what is wrong here.
Any help would be greatly appreciated!