TF2-style layer losses #1774

Open
@drdozer

Description

Is it possible to declare layer-associated losses, as in TensorFlow? In the TF2 API, layers can be declared with optional weight and activation regularization losses, and a set of standard loss functions (including L1 and L2) is provided. During training, alongside collecting the parameters, the framework also collects these losses. Could this be retrofitted to Flux, perhaps by using the functor API to walk the model graph and collect losses from any layer that provides them?
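As a rough illustration of the proposal (in plain Python rather than Julia, and with hypothetical names — `weight_loss`, `regularization_loss`, `collect_losses` are not part of any Flux or TF API), the idea is that each layer optionally carries a loss function over its own parameters, and a recursive walk over the model tree sums whatever losses the layers provide, much as Functors.jl could walk a Flux `Chain`:

```python
# Hypothetical sketch of layer-associated losses: layers may carry an
# optional per-layer loss function; a recursive walk over the model
# tree collects and sums them.

def l2(weights, strength=0.01):
    """Standard L2 penalty: strength * sum of squared weights."""
    return strength * sum(w * w for w in weights)

class Dense:
    def __init__(self, weights, weight_loss=None):
        self.weights = weights
        self.weight_loss = weight_loss  # optional loss fn, e.g. l2

    def regularization_loss(self):
        # A layer contributes a loss only if one was declared for it.
        return self.weight_loss(self.weights) if self.weight_loss else 0.0

class Chain:
    def __init__(self, *layers):
        self.layers = layers

def collect_losses(model):
    """Walk the model graph, summing losses from layers that provide them."""
    if hasattr(model, "regularization_loss"):
        return model.regularization_loss()
    if hasattr(model, "layers"):
        return sum(collect_losses(layer) for layer in model.layers)
    return 0.0  # leaf node with no associated loss

model = Chain(Dense([1.0, 2.0], weight_loss=l2),
              Dense([3.0]))  # no loss declared for this layer
total = collect_losses(model)  # ≈ 0.01 * (1 + 4) = 0.05
```

At training time this collected total would simply be added to the data loss before taking gradients; the walk itself is exactly the kind of structural recursion the functor API already performs for parameter collection.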
