
Document how to run local Inference, for the subset of models that support it #82

Open
@julien-c

Description


i.e. a how-to guide (or set of guides) on how to use TFJS or onnxruntime.js (or other alternatives) in either client-side or server-side JS.

See these Twitter threads, for instance:
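Until such a guide exists, here is a minimal sketch of the post-processing half of a local-inference pipeline. It is plain JS so it runs in both browser and Node; the session call in the comments, the model filename, the feed names, and the label list are all hypothetical placeholders, not an API this issue specifies.

```javascript
// Hedged sketch: post-processing raw logits into a label, the kind of step
// a local-inference how-to would cover. Getting the logits is
// library-specific; with onnxruntime, it would look roughly like:
//   const session = await ort.InferenceSession.create('model.onnx');
//   const output = await session.run(feeds); // 'feeds' names are hypothetical
// Everything below is dependency-free and works client- or server-side.

// Numerically stable softmax over a flat array of logits.
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = logits.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((x) => x / sum);
}

// Map raw logits to the highest-probability label.
function topLabel(logits, labels) {
  const probs = softmax(logits);
  let best = 0;
  for (let i = 1; i < probs.length; i++) {
    if (probs[i] > probs[best]) best = i;
  }
  return { label: labels[best], score: probs[best] };
}

// Example with made-up logits and labels:
console.log(topLabel([1.2, 3.4, 0.5], ['negative', 'neutral', 'positive']));
```

The softmax subtracts the max logit before exponentiating to avoid overflow; a guide would pair this with the library-specific tensor handling (e.g. reading a `Float32Array` out of an output tensor).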

Metadata


Assignees

No one assigned

    Labels

    documentation: Improvements or additions to documentation
