
Dynamic prompts (e.g. using a vector store for memory) #15

@foogunlana

Hi @mistercrunch

Thanks for building this; it's sorely needed.

I noticed it's targeted at static prompts (you can write the prompt directly into the code).

This means it's useful for unit tests, but not for e2e tests of prompts that load some of their text from a similarity search on a vector DB or another external source.

I think this could be easy to fix by adding another prompt case that can call an external function or API to build the prompt before it's sent to the LLM. Right now, after skimming the code, my guess is that PromptCase speaks directly to OpenAI, so there's no room for external calls to modify the prompt.

I think this would be a super useful feature, as a lot of applications are being built with this architecture and face the same problem you're trying to solve as you build this for Preset. I'm happy to work on it as well if you think it's something you'd add to the repo. Here's roughly what I have in mind:

# Proposed: a prompt case whose text is built at run time by an external call
class DynamicPromptCase:
    pass

# Later, in a test suite
from promptimize.prompts import DynamicPromptCase
from promptimize import evals

def enhance(prompt: str) -> str:
    # In practice this context would come from a similarity search on a
    # vector store; it's hard-coded here for illustration.
    context = "I like the following music the most: Organize by Asake (Afrobeats), Ride by Twenty-One Pilots (Band)."
    return f"Use the following context to answer my question\n{context}\n{prompt}"

simple_prompts = [
    DynamicPromptCase(
        "name my personal favourite band", enhance, lambda x: evals.all_words(x, ["Twenty-One Pilots"])
    ),
]
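
For illustration, here's a very rough sketch of what DynamicPromptCase might do internally. All names here are placeholders rather than the actual promptimize API; the real hook would depend on how PromptCase builds and sends its request.

from typing import Callable, List


class DynamicPromptCase:  # would subclass the existing prompt case in practice
    def __init__(self, prompt: str, enhance: Callable[[str], str], *evaluators: Callable):
        self.raw_prompt = prompt
        self.enhance = enhance
        self.evaluators: List[Callable] = list(evaluators)

    def render(self) -> str:
        # Apply the external call (vector-store lookup, API request, ...)
        # right before the prompt is sent to the LLM.
        return self.enhance(self.raw_prompt)

The key point is just that the prompt text is resolved at test run time via a user-supplied callable, instead of being fixed when the case is declared.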
