Agent Configuration
The Envoy Agent is the central orchestration service, responsible for handling job searches, AI processing, and managing application workflows. To function correctly, it requires specific configuration, supplied primarily through environment variables.
This section details the essential environment variables you'll need to set up your Agent instance, covering database settings, API keys for AI services, and paths for profile management.
General Configuration Principles
The Agent service reads its configuration from environment variables. This approach allows for flexible deployment, whether you're running it locally, in a Docker container, or on a server.
To configure your Agent, you typically create a `.env` file in the root directory where you run the Agent, or set these variables directly in your shell environment.
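To make the "read configuration from environment variables" idea concrete, here is a minimal sketch of a `.env` loader. This is an illustrative stand-in, not Envoy's actual loading code (many projects use a library such as `python-dotenv` instead); the function name `load_env_file` is hypothetical.

```python
import os

def load_env_file(path: str) -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments.

    Variables already set in the shell environment take precedence,
    which is the usual convention for .env files.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines, comments, and lines without '='.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Drop an inline comment after the value, if present.
            value = value.split(" #", 1)[0].strip()
            os.environ.setdefault(key.strip(), value)
```

After calling `load_env_file(".env")`, the variables described below are available via `os.environ` exactly as if they had been exported in the shell.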
Database Configuration
The Agent uses SQLite for persistence, storing job listings, application states, and profile interview data.
- `SQLITE_PATH`: Specifies the path to the SQLite database file.
  - Description: Determines where the Agent stores its data. If the file does not exist, it will be created.
  - Default (testing): `:memory:` (an in-memory database; data is lost on restart).
  - Recommended for production: a file path, e.g. `/path/to/envoy.sqlite` or `./data/envoy.sqlite`. Ensure the Agent process has write permission to this path.

  ```
  SQLITE_PATH=./data/envoy.sqlite
  ```
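A short sketch of how a service can honor this variable, including the `:memory:` testing fallback described above. The `jobs` table is purely illustrative and does not reflect Envoy's real schema.

```python
import os
import sqlite3

# Resolve the database path; ":memory:" is the documented testing fallback.
db_path = os.environ.get("SQLITE_PATH", ":memory:")

# For a file-backed database, ensure the parent directory exists so the
# process can create the file (this is where write permission matters).
if db_path != ":memory:":
    os.makedirs(os.path.dirname(db_path) or ".", exist_ok=True)

conn = sqlite3.connect(db_path)
# Illustrative table only; Envoy's actual schema is managed by the Agent.
conn.execute("CREATE TABLE IF NOT EXISTS jobs (id INTEGER PRIMARY KEY, title TEXT)")
```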
AI Service Configuration
The Agent leverages large language models (LLMs) for tasks like generating cover letters, assessing job fit, and conducting profile interviews. Envoy is designed to work with OpenAI-compatible APIs.
- `OPENAI_API_KEY`: Your API key for the LLM service.
  - Description: Required to authenticate with the chosen LLM provider (e.g. OpenAI, or a self-hosted compatible service).
  - Example: `sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx`

  ```
  OPENAI_API_KEY=your_openai_api_key_here
  ```

- `OPENAI_BASE_URL`: The base URL for the LLM API endpoint.
  - Description: Use this to point at a different OpenAI-compatible API endpoint, such as a local LLM or another provider. If not set, it defaults to OpenAI's official API endpoint.
  - Example (local LLM via LiteLLM/Ollama): `http://localhost:8000/v1`

  ```
  OPENAI_BASE_URL=https://api.openai.com/v1  # Default, can be omitted
  # OR for a local LLM:
  # OPENAI_BASE_URL=http://localhost:8000/v1
  ```

- `OPENAI_MODEL`: The specific LLM model to use.
  - Description: Specifies which model the Agent should use for its AI tasks.
  - Example (OpenAI): `gpt-4o-mini`, `gpt-3.5-turbo`
  - Example (local Ollama): `llama3`, `mistral`

  ```
  OPENAI_MODEL=gpt-4o-mini
  ```
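The three AI variables above fit together as a single client configuration. A minimal sketch of resolving them, with the documented fallback for `OPENAI_BASE_URL`; the function name `resolve_llm_config` and the model default shown here are illustrative assumptions, not part of Envoy.

```python
import os

def resolve_llm_config() -> dict:
    """Resolve LLM settings from the environment.

    OPENAI_API_KEY has no default and will raise KeyError if unset;
    OPENAI_BASE_URL falls back to OpenAI's official endpoint, as documented.
    The model default here is only an illustrative choice.
    """
    return {
        "api_key": os.environ["OPENAI_API_KEY"],
        "base_url": os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
        "model": os.environ.get("OPENAI_MODEL", "gpt-4o-mini"),
    }
```

The resulting dictionary can be handed to whichever OpenAI-compatible client you use; because only the base URL and key differ, switching between OpenAI and a local LLM is a pure configuration change.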
Inter-Service Communication
Envoy consists of multiple services (Portal, Agent, Tools) that communicate with each other. A shared secret ensures secure communication between the Agent and the Tools service.
- `INTERNAL_AUTH_SECRET`: A shared secret for internal API authentication.
  - Description: Used to authenticate requests between the Agent (Python) and the Tools (Node/Playwright) service. Both services must be configured with the same secret.
  - Recommendation: Generate a strong, random string for this.
  - Example: `super-secret-string-12345`

  ```
  INTERNAL_AUTH_SECRET=a_long_random_string_for_internal_auth
  ```

- `TOOLS_BASE_URL`: The base URL of the Tools service.
  - Description: The Agent uses this URL to send requests to the Tools service for browser-automation tasks (e.g. job searching, applying).
  - Default: `http://127.0.0.1:4320` (assuming the Tools service runs locally on its default port).

  ```
  TOOLS_BASE_URL=http://127.0.0.1:4320
  ```
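A sketch of the shared-secret idea: generating a strong value and checking an incoming secret in constant time. How Envoy actually transmits the secret (header name, wire format) is not specified here, so `check_internal_auth` is a hypothetical helper illustrating the pattern, not Envoy's real code.

```python
import hmac
import os

# Generate a strong random secret once, when writing your .env, e.g.:
#   python -c "import secrets; print(secrets.token_urlsafe(32))"

def check_internal_auth(request_secret: str) -> bool:
    """Compare a request's secret against INTERNAL_AUTH_SECRET.

    hmac.compare_digest is constant-time, which avoids leaking how many
    leading characters of the secret matched. An unset/empty configured
    secret always fails closed.
    """
    expected = os.environ.get("INTERNAL_AUTH_SECRET", "")
    return bool(expected) and hmac.compare_digest(request_secret, expected)
```

Both the Agent and the Tools service must load the same value for this check to ever succeed, which is why the documentation stresses configuring the secret identically on both sides.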
Profile Management Paths
The Agent manages your professional profile, which is used to personalize applications and answer interview questions.
- `PROFILE_PATH`: Path to your main canonical profile JSON file.
  - Description: This file stores your structured professional profile. The Agent reads from and writes to this file during profile interviews.
  - Example: `./data/my_profile.json`

  ```
  PROFILE_PATH=./data/my_profile.json
  ```

- `RAW_PROFILE_PATH`: Path for raw profile data and snapshots.
  - Description: Used internally by the Agent to store intermediate or raw profile data during processes such as profile interviews.
  - Example: `./data/raw_profile.json`

  ```
  RAW_PROFILE_PATH=./data/raw_profile.json
  ```
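Since the profile is plain JSON at a configurable path, reading and writing it is straightforward. A minimal sketch, assuming the file holds a JSON object; the helper names and the example fields are illustrative, and the real structure of Envoy's profile file may differ.

```python
import json
import os

def load_profile() -> dict:
    """Read the canonical profile; an absent file yields an empty profile."""
    path = os.environ.get("PROFILE_PATH", "./data/my_profile.json")
    if not os.path.exists(path):
        return {}
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

def save_profile(profile: dict) -> None:
    """Write the profile back, creating the data directory if needed."""
    path = os.environ.get("PROFILE_PATH", "./data/my_profile.json")
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(profile, fh, indent=2)
```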
Example .env File
Here's an example of a `.env` file that you might use to configure your Agent:
```
# Database Configuration
SQLITE_PATH=./data/envoy.sqlite

# AI Service Configuration (using OpenAI)
OPENAI_API_KEY=sk-your-openai-api-key-here
OPENAI_BASE_URL=https://api.openai.com/v1  # Can be omitted for the default OpenAI endpoint
OPENAI_MODEL=gpt-4o-mini

# Inter-Service Communication
INTERNAL_AUTH_SECRET=a_long_and_random_secret_string_for_envoy
TOOLS_BASE_URL=http://127.0.0.1:4320

# Profile Management
PROFILE_PATH=./data/my_profile.json
RAW_PROFILE_PATH=./data/raw_profile.json
```