Empty and Nondeterministic Responses from LLM #50

@kshitij79

Description

Bug Report 🐛

Users frequently receive empty or nondeterministic responses from the LLM. Similar inputs produce inconsistent, unpredictable outputs, which undermines the reliability of the results. To improve response quality, the temperature and other sampling hyperparameters should be tuned to make the output more deterministic.

Expected Behavior

The LLM should provide consistent and predictable output. Responses should not be empty, and the results should be more deterministic.

Current Behavior

Some responses come back empty, and repeated queries with similar inputs produce noticeably different outputs.
Possible Solution

Optimize the temperature and other sampling hyperparameters to make the LLM output more deterministic.
Explore each provider's API documentation for the appropriate settings.
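One possible shape for the fix, sketched below under assumptions: the `complete` callable stands in for whatever provider call the extension actually makes, and the parameter names (`temperature`, `top_p`, `seed`) follow common provider conventions but vary by provider, so each provider's API documentation should be checked. The retry loop addresses the empty-response half of the bug.

```python
# Deterministic-leaning sampling settings. These names are common across
# providers but are an assumption here -- verify per provider.
DETERMINISTIC_PARAMS = {
    "temperature": 0.0,  # greedy decoding: always pick the most likely token
    "top_p": 1.0,        # disable nucleus-sampling truncation
    "seed": 42,          # honored only by providers that support seeding
}

def query_with_retry(complete, prompt, max_retries=3, **params):
    """Call the provider, retrying when the response comes back empty.

    `complete` is a hypothetical callable (prompt, **params) -> str that
    wraps the real provider request.
    """
    merged = {**DETERMINISTIC_PARAMS, **params}
    for _ in range(max_retries):
        response = complete(prompt, **merged)
        if response and response.strip():
            return response
    raise RuntimeError(f"Empty response after {max_retries} attempts")
```

Callers can still override any setting per request, e.g. `query_with_retry(complete, prompt, temperature=0.2)`, since explicit keyword arguments take precedence over the defaults.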

Steps to Reproduce

  1. Query the LLM through the prompt provider.
  2. Observe that some responses are empty.
  3. Note the variability in responses for similar inputs.
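The reproduction steps above can be automated with a small harness that sends the same prompt several times and counts empty and distinct responses. The `complete` callable is again a hypothetical stand-in for the real provider request.

```python
def check_consistency(complete, prompt, runs=5):
    """Send the same prompt `runs` times and summarize the variability.

    Returns how many responses were empty and how many distinct
    outputs were observed (1 distinct output == fully deterministic).
    """
    outputs = [complete(prompt) for _ in range(runs)]
    empty = [o for o in outputs if not o or not o.strip()]
    return {
        "empty_count": len(empty),
        "distinct_outputs": len(set(outputs)),
    }
```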

Context (Environment)

Application

  • VSCode (All versions)
