This repository was archived by the owner on Jul 1, 2023. It is now read-only.

Add more optimizers and losses #127

@Shashi456

Description

Just a small-ish roadmap of optimizers and losses we could look at adding:

Optimizers:

Losses:

  • L1Loss
  • L2Loss
  • MeanSquaredError
  • SoftmaxCrossEntropy
  • SoftmaxCrossEntropyWithLogits
  • SigmoidCrossEntropy
  • MeanAbsoluteError
  • MeanAbsolutePercentageError
  • MeanSquaredLogarithmicError
  • HingeLoss
  • SquaredHinge
  • CategoricalHinge
  • Logcosh
  • CategoricalCrossEntropy
  • Kullback-Leibler Divergence
  • ~~NegativeLogLikelihood~~
  • Cosine
  • TripletMarginLoss
  • Poisson
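For anyone picking one of these up, here is a minimal NumPy sketch of a few of the losses listed above. The function names, signatures, and the mean-over-elements reduction are my assumptions for illustration, not the library's actual API.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # L2-style loss: mean of squared differences.
    return np.mean((y_true - y_pred) ** 2)

def mean_absolute_error(y_true, y_pred):
    # L1-style loss: mean of absolute differences.
    return np.mean(np.abs(y_true - y_pred))

def log_cosh(y_true, y_pred):
    # Smooth approximation to absolute error: log(cosh(pred - true)).
    return np.mean(np.log(np.cosh(y_pred - y_true)))

def hinge_loss(y_true, y_pred):
    # Assumes labels in {-1, +1}; penalizes margins below 1.
    return np.mean(np.maximum(0.0, 1.0 - y_true * y_pred))
```

Each of these reduces an elementwise penalty to a scalar with a mean; whether the real implementations should expose a reduction parameter (sum vs. mean vs. none) is a separate design question.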

Removed NegativeLogLikelihood. Check this discussion for more.

Note: If you have suggestions for losses and optimizers, please suggest them in this issue.
