
Use URL and FileBlob/WebBlob to load binary data in Inference API instead of Blob #114

Open
@Aschen

Description

@Aschen

The Inference API uses Blob or ArrayBuffer as the request argument, which results in loading huge files entirely into RAM before sending them to the API.

We should instead use the new FileBlob or LazyBlob, as is done for the commit method of the Hub API.

I have already started externalizing the first version of FileBlob, and I could publish WebBlob as well as the createBlob utility method.
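To illustrate the idea, here is a minimal sketch of a lazy, file-backed Blob: only the file's metadata (size) is read eagerly, and bytes are pulled from disk on demand when `arrayBuffer()` or `stream()` is called, so a multi-gigabyte file is never fully buffered up front. The class and method names mirror the `FileBlob` discussed in this issue, but the implementation below is an illustrative assumption, not the actual `@huggingface/hub` code.

```typescript
import { open, stat } from "node:fs/promises";
import { createReadStream } from "node:fs";
import { Readable } from "node:stream";

// Hypothetical FileBlob sketch: Blob-like, but backed by a path on disk.
class FileBlob {
  private constructor(
    private path: string,
    private start: number,
    readonly size: number,
    readonly type: string = "application/octet-stream",
  ) {}

  static async create(path: string): Promise<FileBlob> {
    const { size } = await stat(path); // metadata only, no content is read
    return new FileBlob(path, 0, size);
  }

  // Slicing just adjusts offsets; no bytes are copied.
  slice(start = 0, end = this.size): FileBlob {
    return new FileBlob(this.path, this.start + start, end - start, this.type);
  }

  // Bytes are read only here, and only for the requested range.
  async arrayBuffer(): Promise<ArrayBuffer> {
    const handle = await open(this.path, "r");
    try {
      const buf = Buffer.alloc(this.size);
      await handle.read(buf, 0, this.size, this.start);
      return buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
    } finally {
      await handle.close();
    }
  }

  // A streaming body lets fetch() upload the file chunk by chunk.
  stream(): ReadableStream<Uint8Array> {
    const nodeStream = createReadStream(this.path, {
      start: this.start,
      end: this.start + this.size - 1, // createReadStream's end is inclusive
    });
    return Readable.toWeb(nodeStream) as ReadableStream<Uint8Array>;
  }
}
```

With something like this, an inference call could pass `await FileBlob.create(path)` as the request body instead of reading the whole file first, and a WebBlob counterpart could do the same over HTTP range requests.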

I would also like to know whether you prefer to:

  1. keep the package in the huggingface namespace
  2. create a shared package between the hub and inference packages in this monorepo.

Labels: hub (@huggingface/hub related), technical (Advanced stuff!)
