LayerNorm layer in dllib is very slow, especially in the backward phase #8854

Description

@emartinezs44

LayerNorm is a layer used in Transformer implementations. Some tests of the Scala 2.13 BigDL dllib implementation show times of 0.5 s or more in the backward phase:

Code:

// Import paths are an assumption for a recent BigDL dllib build; in some
// versions LayerNorm lives in the keras layers package instead.
import com.intel.analytics.bigdl.dllib.nn.LayerNorm
import com.intel.analytics.bigdl.dllib.tensor.Tensor
import com.intel.analytics.bigdl.dllib.utils.Shape

val layer = LayerNorm[Float]()
layer.build(Shape(1, 514, 768))

val inputTensor = Tensor[Float](Array(1, 514, 768)).rand()

val t0 = System.nanoTime
val result = layer.forward(inputTensor)
val duration0 = (System.nanoTime - t0) / 1e9d
println("Forward: " + duration0)

// Reuse the forward output as gradOutput: it already has the required shape.
val t1 = System.nanoTime
layer.backward(inputTensor, result)
val duration1 = (System.nanoTime - t1) / 1e9d
println("Backward: " + duration1)

After some fixes, the forward phase time looks like:

Forward: 0.070696203 s

And the backward phase:

Backward: 0.692437933 s, which is very slow.

The tests for the Scala 2.12 version also show very poor times, although 2.13 seems worse. Any suggestions for changing the implementation?

P.S.: The tests were run on an Intel(R) Core(TM) i7-6820HQ CPU @ 2.70GHz.
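
For reference, the backward pass of layer normalization over the last dimension has a closed form, dx = (g - mean(g) - xHat * mean(g * xHat)) / sigma with g = gamma * dy, xHat = (x - mean) / sigma and sigma = sqrt(var + eps), which can be evaluated in a single fused pass per row instead of composing many intermediate tensor operations. A self-contained sketch on plain arrays follows; the names and structure are illustrative only, not BigDL's implementation:

// Fused layer-norm backward over the last dimension, one row at a time.
// x: input row, dy: gradOutput row, gamma: scale parameter; eps as in forward.
// Returns dx. Illustrative sketch, not BigDL's API.
def layerNormBackwardRow(x: Array[Float], dy: Array[Float],
                         gamma: Array[Float], eps: Float = 1e-5f): Array[Float] = {
  val n = x.length
  // Recompute mean and variance of the row (these could be cached from forward).
  var mean = 0f
  var i = 0
  while (i < n) { mean += x(i); i += 1 }
  mean /= n
  var variance = 0f
  i = 0
  while (i < n) { val d = x(i) - mean; variance += d * d; i += 1 }
  variance /= n
  val invStd = (1.0 / math.sqrt(variance + eps)).toFloat

  // Accumulate the two row-wise means needed by the closed form:
  //   dx = invStd * (g - mean(g) - xHat * mean(g * xHat)), with g = gamma * dy.
  var sumG = 0f
  var sumGx = 0f
  i = 0
  while (i < n) {
    val xHat = (x(i) - mean) * invStd
    val g = gamma(i) * dy(i)
    sumG += g
    sumGx += g * xHat
    i += 1
  }
  val meanG = sumG / n
  val meanGx = sumGx / n

  // Single pass to produce the input gradient.
  val dx = new Array[Float](n)
  i = 0
  while (i < n) {
    val xHat = (x(i) - mean) * invStd
    dx(i) = invStd * (gamma(i) * dy(i) - meanG - xHat * meanGx)
    i += 1
  }
  dx
}

Fusing the per-row reductions like this avoids allocating the chain of intermediate tensors that an op-by-op composition produces, which is a plausible source of the slow backward times above.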
