Adds custom inference service API docs #4852
base: main
Conversation
```typescript
/**
 * Specifies the JSON parser that is used to parse the response from the custom service.
 * Different task types require different json_parser parameters.
 * For example:
```
@jonathan-buttner Do you think we should specify a JsonParser class for each task type, or is this list sufficient?
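For illustration of the per-task-type option, here is one way the parsers could be typed. The interface and property names below (`TextEmbeddingJsonParser`, `relevance_scores`, `reranked_index`) are hypothetical sketches for this discussion, not the confirmed spec:

```typescript
// Hypothetical sketch: one parser interface per task type, as an
// alternative to a single loosely-typed json_parser object.
// All names below are assumptions, not the confirmed spec.
interface TextEmbeddingJsonParser {
  /** JSONPath to the embedding values in the upstream response. */
  text_embeddings: string
}

interface RerankJsonParser {
  /** JSONPath to the relevance scores in the upstream response. */
  relevance_scores: string
  /** Optional JSONPath to the reranked document indices. */
  reranked_index?: string
}

type CustomResponseJsonParser = TextEmbeddingJsonParser | RerankJsonParser

// Example value conforming to the text-embedding variant.
const embeddingParser: CustomResponseJsonParser = {
  text_embeddings: "$.data[*].embedding[*]"
}
```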
```typescript
}

export enum CustomServiceType {
  custom
}
```
@jonathan-buttner Should the ServiceType be custom whenever it's specified for this service type? Or can it be anything, for example custom-model?
```typescript
/**
 * Create a custom inference endpoint.
 *
 * You can create an inference endpoint to perform an inference task with a custom model that supports the HTTP format.
```
@jonathan-buttner Please suggest an alternative description if you think this is not sufficient. I tried to come up with something that is meaningful to me based on my limited knowledge.
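To make the description concrete while we wait for a confirmed example, here is a hedged sketch of what a request body for this endpoint might look like, written as a TypeScript literal. The URL, header, model name, and JSONPath values are illustrative assumptions, not an example taken from the Elasticsearch docs:

```typescript
// Hedged sketch of a PUT _inference request body for the custom service.
// The upstream URL, model name, and JSONPath below are illustrative
// assumptions, not a confirmed example.
const putCustomEndpointBody = {
  service: "custom",
  service_settings: {
    secret_parameters: {
      api_key: "<your-api-key>"
    },
    // Hypothetical upstream service that speaks the HTTP format.
    url: "https://api.example.com/v1/embeddings",
    headers: {
      Authorization: "Bearer ${api_key}",
      "Content-Type": "application/json"
    },
    // Template for the request body sent to the upstream service.
    request: '{"input": ${input}, "model": "example-embedding-model"}',
    response: {
      json_parser: {
        // JSONPath pointing at the embedding values in the upstream response.
        text_embeddings: "$.data[*].embedding[*]"
      }
    }
  }
}
```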
```typescript
 * The chunking configuration object.
 * @ext_doc_id inference-chunking
 */
chunking_settings?: InferenceChunkingSettings
```
Are chunking settings relevant for this service?
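For reference while discussing this, a minimal sketch of a chunking_settings value, assuming the shared InferenceChunkingSettings shape (strategy, max_chunk_size, sentence_overlap) would also apply to this service; the field names and values are assumptions:

```typescript
// Minimal sketch of a chunking_settings value, assuming the shared
// InferenceChunkingSettings shape also applies to the custom service.
// Field names and values are assumptions, not the confirmed spec.
const chunkingSettings = {
  strategy: "sentence",   // assumed chunking strategy name
  max_chunk_size: 250,    // assumed maximum words per chunk
  sentence_overlap: 1     // assumed overlap between adjacent chunks
}
```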
…ticsearch-specification into szabosteve/infer-put-custom
Below you can find the validation changes against the target branch for the APIs. No changes detected. You can validate these APIs yourself by using the
WIP (I'll update this comment with a bunch of examples). Here are some examples:
- OpenAI Text Embedding
- Cohere APIv2 Rerank
- Cohere APIv2 Text Embedding
- Jina AI Rerank
- Hugging Face Text Embedding for the model Qwen/Qwen3-Embedding-8B (others will be very similar)
- TODO
Overview
Related issue: https://github.com/elastic/developer-docs-team/issues/307
This PR adds documentation about the custom inference service.
@jonathan-buttner Could you please provide an example request that I can add to the docs?