Description
I am having issues with `logitcrossentropy` computed on two `OneHotMatrix` variables on the GPU. Here is my code:
ŷ = onehotbatch(rand([-1, 1], 100), [-1, 1]) |> gpu
y = onehotbatch(rand([-1, 1], 100), [-1, 1]) |> gpu
Flux.logitcrossentropy(ŷ, y)
The last line gives the following error:
ERROR: This object is not a GPU array
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:33
[2] backend(#unused#::Type)
@ GPUArrays ~/.julia/packages/GPUArrays/Zecv7/src/device/execution.jl:15
[3] backend(x::Matrix{Float64})
@ GPUArrays ~/.julia/packages/GPUArrays/Zecv7/src/device/execution.jl:16
[4] _copyto!
@ ~/.julia/packages/GPUArrays/Zecv7/src/host/broadcast.jl:73 [inlined]
[5] materialize!
@ ~/.julia/packages/GPUArrays/Zecv7/src/host/broadcast.jl:51 [inlined]
[6] materialize!
@ ./broadcast.jl:868 [inlined]
[7] logsoftmax!(out::Matrix{Float64}, x::Flux.OneHotArray{UInt32, 2, 1, 2, CuArray{UInt32, 1, CUDA.Mem.DeviceBuffer}}; dims::Int64)
@ NNlib ~/.julia/packages/NNlib/SGHdM/src/softmax.jl:114
[8] #logsoftmax#166
@ ~/.julia/packages/NNlib/SGHdM/src/softmax.jl:107 [inlined]
[9] logitcrossentropy(ŷ::Flux.OneHotArray{UInt32, 2, 1, 2, CuArray{UInt32, 1, CUDA.Mem.DeviceBuffer}}, y::Flux.OneHotArray{UInt32, 2, 1, 2, CuArray{UInt32, 1, CUDA.Mem.DeviceBuffer}}; dims::Int64, agg::typeof(mean))
@ Flux.Losses ~/.julia/packages/Flux/js6mP/src/losses/functions.jl:255
[10] logitcrossentropy(ŷ::Flux.OneHotArray{UInt32, 2, 1, 2, CuArray{UInt32, 1, CUDA.Mem.DeviceBuffer}}, y::Flux.OneHotArray{UInt32, 2, 1, 2, CuArray{UInt32, 1, CUDA.Mem.DeviceBuffer}})
@ Flux.Losses ~/.julia/packages/Flux/js6mP/src/losses/functions.jl:254
[11] top-level scope
@ ~/dnn-mc/prova.jl:19
If instead I create a plain `CuArray` that is effectively the same as a one-hot-encoded matrix, for example with `ŷ = (CUDA.rand(size(y)...) .< 0.5) .* 1.`, I don't get an error.
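For reference, a workaround sketch that sidesteps the failing code path (assuming the problem is that `logsoftmax` allocates a CPU `Matrix` for its output when given a GPU-backed `OneHotArray`, as the stack trace suggests): convert the prediction to a dense float matrix on the CPU before moving it to the GPU, so the loss only ever sees ordinary `CuArray`s.

```julia
using Flux, CUDA
using Flux: onehotbatch, logitcrossentropy

# Materialize the one-hot prediction as a dense Float32 matrix on the CPU,
# then move it to the GPU; the target can stay a OneHotMatrix.
ŷ = Float32.(onehotbatch(rand([-1, 1], 100), [-1, 1])) |> gpu
y = onehotbatch(rand([-1, 1], 100), [-1, 1]) |> gpu

# Both inputs are now GPU-resident arrays that broadcast on-device,
# so logsoftmax no longer mixes a CPU output buffer with GPU inputs.
loss = logitcrossentropy(ŷ, y)
```

This is only a sketch (it requires a CUDA-capable GPU to run); it does not fix the underlying dispatch issue for `OneHotArray` inputs, it just avoids it.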