Curricane (Contributor)

Pull Request: Support environment variable references in config.yaml

Summary

This PR adds support for environment variable references in the configuration file using the $ENV_VAR or ${ENV_VAR} syntax. When a configuration field value starts with $, it will be treated as an environment variable reference and resolved accordingly.

Motivation

Although aichat by default reads the {ClientName}_API_KEY environment variable to get the api_key, sometimes I prefer to use an environment variable with a different name.

For example, when I use deepseek-chat, which is an OpenAI-compatible model, I keep my key uniformly in OPENAI_API_KEY, yet I still have to configure a separate DEEPSEEK_API_KEY to make it work.

Moreover, when I first configured it, I didn't name the client "deepseek", so aichat kept reporting authentication errors until I read the source code.

Many other tools support configuration values like ${MY_API_KEY}, and I think aichat should support this too. It would be friendlier to people who are new to aichat; at least when I was configuring it, I spent some time getting used to the current behavior.

Changes

Modified Files

  • src/client/macros.rs: Updated the config_get_fn! macro to support environment variable references

Implementation Details

The config_get_fn! macro has been enhanced with the following priority logic (a rough sketch follows the list):

  1. Environment Variable Reference (Highest Priority): If a configuration field value starts with $, it's treated as an environment variable reference

    • Supports both $VAR and ${VAR} syntax
    • Extracts the environment variable name by trimming $, {, and } characters
    • Returns the environment variable value or an error if not found
  2. Standard Environment Variable: If no environment variable reference is found, falls back to the standard environment variable lookup using the pattern {CLIENT_NAME}_{FIELD_NAME}

  3. Configuration Value: If standard environment variable is not found, uses the original configuration value
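
In rough terms, the resolution order amounts to something like the following stand-alone sketch. The function resolve_value, its signature, and the use of Option are illustrative assumptions rather than the actual macro internals:

use std::env;

// Illustrative only: the real logic lives inside the config_get_fn! macro in
// src/client/macros.rs, and it reports an error when a referenced variable is unset.
fn resolve_value(client_name: &str, field_name: &str, config_value: Option<&str>) -> Option<String> {
    // 1. `$VAR` / `${VAR}` reference in the config value itself (highest priority).
    if let Some(value) = config_value {
        if let Some(reference) = value.strip_prefix('$') {
            let var_name = reference.trim_start_matches('{').trim_end_matches('}');
            return env::var(var_name).ok();
        }
    }
    // 2. Standard `{CLIENT_NAME}_{FIELD_NAME}` environment variable, e.g. DEEPSEEK_API_KEY.
    if let Ok(value) = env::var(format!("{}_{}", client_name, field_name).to_uppercase()) {
        return Some(value);
    }
    // 3. Fall back to the literal configuration value.
    config_value.map(str::to_string)
}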

Example Usage

Before (config.yaml)

# LLM clients
clients:
  # Use the generic openai-compatible client type
  - type: openai-compatible
    # Give this client a name, which we refer to in the 'model' key above
    name: deepseek
    # The 'models' key expects a list of objects (structs).
    models:
      - name: deepseek-chat
    # The API key, hardcoded directly in the config file
    api_key: sk-xxxxxxx
    # The base URL for the Deepseek API
    api_base: https://api.deepseek.com

After (config.yaml)

# LLM clients
clients:
  # Use the generic openai-compatible client type
  - type: openai-compatible
    # Give this client a name, which we refer to in the 'model' key above
    name: deepseek
    # The 'models' key expects a list of objects (structs).
    models:
      - name: deepseek-chat
    # The API key, loaded from an environment variable that doesn't have to be DEEPSEEK_API_KEY
    api_key: $OPENAI_API_KEY
    # The base URL for the Deepseek API
    api_base: https://api.deepseek.com
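
For illustration, feeding the new value into the hypothetical resolve_value sketch from the Implementation Details section shows the precedence in action (the helper is assumed to be in scope):

fn main() {
    // With `export OPENAI_API_KEY=sk-xxxxxxx` in the shell, the $-reference wins
    // (priority 1), so DEEPSEEK_API_KEY is never consulted for this client.
    let key = resolve_value("deepseek", "api_key", Some("$OPENAI_API_KEY"));
    println!("{:?}", key); // Some("sk-xxxxxxx")
}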

@sempervictus

Thank you - I've got a toy search/research Rust lib growing into something useful, and env-var search provider/configuration details are handy for quickly pointing to a searx/duckduckgo/etc. provider as well as configuring how scraping and search query composition are handled. I think the last part is probably best suited to agent-level config, but the first two are often quick-n-dirty per-session affairs meriting env var handling.

cori commented Sep 24, 2025

I thought I was already using this feature, having set VERTEXAI_PROJECT in a shared-and-sourced .env file, but now I'm getting

Error: Failed to call chat-completions api

Caused by:
    Permission denied on resource project $VERTEX_PROJECT. (status: PERMISSION_DENIED)

so I must have made some sort of change along the way.

Beyond that, I've tried symlinking my existing .env file to the default location, which is being used (per .info in an aichat session):

 .env -> /Users/cori/code/personal/dotfiles/.env

and that file has my value set:

❯ cat "/Users/cori/Library/Application Support/aichat/.env"
...
export VERTEX_PROJECT=IAMGROOT

but that still doesn't work, which makes me wonder if I'm misunderstanding this bit:

AIChat supports env file (/.env) for managing environment variables.

You can put all your secret environment variables in the .env file.

In any case, if this is what's necessary to use secrets from the env, I am looking forward to its inclusion.
