[Backport 8.19] Adds input_type parameter to POST inference docs at the root level #4847

Merged 1 commit on Jul 10, 2025
13 changes: 13 additions & 0 deletions specification/inference/inference/InferenceRequest.ts
@@ -82,6 +82,19 @@ export interface Request extends RequestBase {
* > Inference endpoints for the `completion` task type currently only support a single string as input.
*/
input: string | Array<string>
/**
* Specifies the input data type for the text embedding model. The `input_type` parameter only applies to Inference Endpoints with the `text_embedding` task type. Possible values include:
* * `SEARCH`
* * `INGEST`
* * `CLASSIFICATION`
* * `CLUSTERING`
 * Not all services support all values; unsupported values trigger a validation exception.
 * Accepted values depend on the configured inference service; refer to the relevant service-specific documentation for more information.
*
* > info
* > The `input_type` parameter specified on the root level of the request body will take precedence over the `input_type` parameter specified in `task_settings`.
*/
input_type?: string
/**
* Task settings for the individual inference request.
* These settings are specific to the task type you specified and override the task settings specified when initializing the service.
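The precedence rule described in the doc comment (the root-level `input_type` overrides `task_settings.input_type`) can be sketched as a small TypeScript helper. The interface shape mirrors the documented request body, but the helper function and its name are illustrative, not part of the specification:

```typescript
// Minimal sketch of the documented precedence: a root-level input_type
// wins over one nested in task_settings. Names here are illustrative.
interface InferenceRequestBody {
  input: string | string[];
  input_type?: string;
  task_settings?: { input_type?: string; [key: string]: unknown };
}

function effectiveInputType(body: InferenceRequestBody): string | undefined {
  // Root-level input_type takes precedence over task_settings.input_type.
  return body.input_type ?? body.task_settings?.input_type;
}

const body: InferenceRequestBody = {
  input: ["first sentence", "second sentence"],
  input_type: "SEARCH",
  task_settings: { input_type: "INGEST" },
};

console.log(effectiveInputType(body)); // "SEARCH" — root level wins
```

If the root-level `input_type` is omitted, the helper falls back to the value in `task_settings`, matching the behavior the note describes.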