Core Concepts

Prompts, messages, tools, versions, and variables

Prompts

A prompt is a reusable template for AI interactions. Each prompt can contain static text and Echo DSL expressions for dynamic content -- variables, conditionals, sections, and AI gates. Prompts are the core unit of work in EchoStash.

Messages

Prompts are structured as role-based messages: system, user, and assistant. Each message has a role and content blocks that can include text, images, and tool calls. This mirrors how modern LLM APIs expect conversations to be structured.
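The structure above can be sketched as plain data. The type names and content-block shapes below are illustrative, not the EchoStash wire format -- they only show how one message list can mix text, images, and tool calls across roles:

```typescript
// Illustrative role-based message list in the shape modern chat APIs expect.
// Content is an array of blocks so a single message can combine kinds.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "image"; url: string }
  | { type: "tool_call"; name: string; arguments: Record<string, unknown> };

interface Message {
  role: "system" | "user" | "assistant";
  content: ContentBlock[];
}

const messages: Message[] = [
  {
    role: "system",
    content: [{ type: "text", text: "You are a helpful assistant." }],
  },
  {
    role: "user",
    content: [
      { type: "text", text: "Summarize this image." },
      { type: "image", url: "https://example.com/chart.png" },
    ],
  },
];
```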

Tools

Function-calling tool definitions can be attached to prompts. Each tool has a name, description, and JSON Schema parameters. Tools are stored in OpenAI function-calling format and automatically translated to each provider's native format by the corresponding converter.
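For reference, here is what a tool definition looks like in OpenAI function-calling format. The tool itself (`get_weather` and its parameters) is a made-up example; only the surrounding envelope follows the OpenAI format:

```typescript
// A tool definition in OpenAI function-calling format: a name, a
// description, and parameters expressed as JSON Schema.
const weatherTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
} as const;
```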

Schema

Define a JSON Schema for structured LLM output. When a schema is attached, the model is constrained to respond in that exact shape -- ensuring type-safe, parseable responses from the LLM.
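A minimal sketch of the idea: a JSON Schema describing the desired output, plus a toy conformance check. The schema fields here are hypothetical, and real validation would use a library such as Ajv -- this only checks that the required keys are present:

```typescript
// Hypothetical output schema for a sentiment-analysis prompt.
const responseSchema = {
  type: "object",
  properties: {
    sentiment: { type: "string", enum: ["positive", "neutral", "negative"] },
    confidence: { type: "number" },
  },
  required: ["sentiment", "confidence"],
} as const;

// Toy check: does a parsed response contain every required key?
function hasRequiredKeys(value: Record<string, unknown>): boolean {
  return responseSchema.required.every((key) => key in value);
}
```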

Meta / Model Config

Each prompt carries model settings: provider, model name, temperature, and maxTokens. Meta can also be a dynamic template with Echo DSL conditions -- for example, choosing a different model based on a variable value. The resolved config is returned alongside messages when you use the SDK.
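A sketch of what a resolved config might look like, with a condition on a variable choosing the model -- mirroring the dynamic-meta idea above. The field values, variable name (`tier`), and model names are all illustrative assumptions, not EchoStash's actual resolution logic:

```typescript
// Resolved model settings returned alongside messages.
interface ModelConfig {
  provider: string;
  model: string;
  temperature: number;
  maxTokens: number;
}

// Toy dynamic meta: pick a model based on a variable value.
function resolveMeta(vars: { tier: "free" | "pro" }): ModelConfig {
  return {
    provider: "openai",
    model: vars.tier === "pro" ? "gpt-4o" : "gpt-4o-mini",
    temperature: 0.7,
    maxTokens: 1024,
  };
}
```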

Variables

Variables are dynamic values substituted at render time. They support defaults, nested access (e.g., {{user.name}}), and typed values. Pass variables via the SDK or the Playground's Variables tab to personalize prompt output for each request.
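Render-time substitution with nested access and defaults can be sketched as follows. This is an illustration of the concept, not Echo DSL itself -- in particular, the `{{name | default}}` fallback syntax is a hypothetical stand-in:

```typescript
// Minimal template renderer: replaces {{path}} placeholders, supporting
// nested access (user.name) and an optional "| default" fallback.
function render(template: string, vars: Record<string, unknown>): string {
  return template.replace(
    /\{\{\s*([\w.]+)(?:\s*\|\s*([^}\s][^}]*?))?\s*\}\}/g,
    (_match, path: string, fallback?: string) => {
      // Walk the dotted path through the variables object.
      const value = path
        .split(".")
        .reduce<unknown>(
          (acc, key) => (acc as Record<string, unknown> | undefined)?.[key],
          vars,
        );
      return value !== undefined ? String(value) : (fallback ?? "");
    },
  );
}

const out = render("Hello {{user.name}}, plan: {{plan | free}}", {
  user: { name: "Ada" },
});
```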

Projects

Projects organize related prompts together. Use projects to group prompts by application, team, or use case. Each project has its own prompt list and settings.

Versions

Every save creates an immutable version. Versions are numbered sequentially. You can commit with a changelog message, tag versions, and deploy specific versions to named targets (dev, staging, production). The SDK can pin to any version or target.
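The versioning model above can be illustrated with a toy in-memory store: each save appends an immutable, sequentially numbered version, and named targets point at a specific version number. This is purely illustrative -- class and method names are assumptions, not the EchoStash SDK:

```typescript
// Toy version history: immutable saves, sequential numbering,
// and named deploy targets pinned to specific versions.
interface Version {
  number: number;
  content: string;
  changelog?: string;
}

class PromptHistory {
  private versions: Version[] = [];
  private targets = new Map<string, number>();

  // Every save appends a new version; earlier versions never change.
  save(content: string, changelog?: string): Version {
    const v: Version = { number: this.versions.length + 1, content, changelog };
    this.versions.push(v);
    return v;
  }

  // Point a named target (dev, staging, production) at a version.
  deploy(target: string, versionNumber: number): void {
    this.targets.set(target, versionNumber);
  }

  // Resolve a target to the version it is pinned to.
  resolve(target: string): Version | undefined {
    const n = this.targets.get(target);
    return this.versions.find((v) => v.number === n);
  }
}

const history = new PromptHistory();
history.save("v1 text", "initial");
history.save("v2 text", "tweak tone");
history.deploy("production", 1); // production stays pinned to version 1
```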