Open Source · MIT License

Your AI agents need secrets.
Keep them secret.

Store API keys in a local AES-256-GCM encrypted vault. Reference them as {{PLACEHOLDER}} symbols. Your LLM never sees the real values.

EnigmAgent Vault
> add OPENAI_KEY @openai.com sk-proj-abc123...
✓ Stored OPENAI_KEY bound to openai.com
 
> add GITHUB_TOKEN @github.com ghp_xyz789...
✓ Stored GITHUB_TOKEN bound to github.com
 
# In your agent code:
client = OpenAI(api_key="{{OPENAI_KEY}}")
# → resolved at runtime, never in plaintext

How it works

Four steps. No accounts. No cloud. Fully local.

1

Store your secret

Add any credential to your local encrypted vault via chat or CLI.

add API_KEY @openai.com sk-...
2

Use a placeholder

Reference it symbolically in your code, prompts, or configs.

api_key = "{{API_KEY}}"
3

EnigmAgent resolves it

At runtime, the vault injects the real value — locally, instantly.

→ "sk-proj-abc123..."
4

LLM stays clean

Your AI agent runs with the real credential. No secrets in context windows, logs, or cloud services.
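The four steps above boil down to one substitution pass at runtime. A minimal sketch of the idea (not EnigmAgent's actual implementation; the plain dict stands in for the encrypted vault):

```python
import re

# Stand-in for the decrypted vault. In EnigmAgent the store on disk
# is AES-256-GCM encrypted; this dict is purely illustrative.
VAULT = {"OPENAI_KEY": "sk-proj-abc123", "GITHUB_TOKEN": "ghp_xyz789"}

def resolve(text: str) -> str:
    """Replace every {{NAME}} placeholder with its vault value at runtime."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: VAULT[m.group(1)], text)

print(resolve('api_key = "{{OPENAI_KEY}}"'))  # → api_key = "sk-proj-abc123"
```

Everything upstream of `resolve` — your code, your prompts, your LLM's context window — only ever contains the `{{OPENAI_KEY}}` symbol.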

The problem EnigmAgent solves

Without EnigmAgent, every credential ends up where it shouldn't be.

WITHOUT EnigmAgent
You: hardcode sk-proj-abc123 in code
Prompt / Code: api_key="sk-proj-abc" (key exposed)
LLM / AI: sees sk-proj-abc in the context window
Cloud APIs & Logs: API calls logged with the key; it lingers in Git history and crash reports
Result: KEY LEAKED

WITH EnigmAgent
You: add KEY @site.com once, that's it
EnigmAgent Vault: API_KEY = ••••••••, GITHUB_TOKEN = •••• (AES-256-GCM encrypted)
Prompt / Code: {{API_KEY}} ✓ placeholder only
LLM / AI: sees {{API_KEY}} ✓ no real secret
Runtime: vault injects the real value ✓ locally only

1. Chat Interface

Open the vault in your browser. Type one line to store any secret. That's it.

2. Encrypted Locally

AES-256-GCM + Argon2id. The vault file on disk is unreadable without your password.

3. Use Placeholders

Write {{SECRET_NAME}} anywhere — code, prompts, configs, HTTP requests.

4. Instant Resolution

EnigmAgent resolves placeholders at runtime. Real values never appear in logs or LLM context.

5. Stays Local

Everything runs on 127.0.0.1. No server, no cloud, no account. Your machine only.

6. Domain Binding

Each secret is locked to a domain. OPENAI_KEY only resolves on openai.com.
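Domain binding can be pictured as a guard in front of resolution: each secret remembers the domain it was registered with, and lookups from anywhere else are refused. The names and structure below are hypothetical, a sketch of the concept rather than EnigmAgent's code:

```python
# Hypothetical vault entries: each secret carries its bound domain.
VAULT = {
    "OPENAI_KEY": {"value": "sk-proj-abc123", "domain": "openai.com"},
}

def resolve_for(name: str, request_domain: str) -> str:
    """Resolve a secret only if the requesting domain matches its binding."""
    entry = VAULT[name]
    if request_domain != entry["domain"]:
        raise PermissionError(
            f"{name} is bound to {entry['domain']}, not {request_domain}"
        )
    return entry["value"]

resolve_for("OPENAI_KEY", "openai.com")  # OK: returns the real key
# resolve_for("OPENAI_KEY", "evil.com") would raise PermissionError
```

The real matching rules (subdomains, ports, etc.) may be richer; the point is that a phishing page on the wrong domain never receives the value.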

Why EnigmAgent

Built for AI developers who take security seriously.

AES-256-GCM Encryption

Authenticated AES-256-GCM encryption with Argon2id key derivation. Your vault is unreadable without your password.

100% Local

Zero cloud sync. Zero telemetry. Your secrets never leave your machine. Works fully offline.

{{PLACEHOLDER}} System

Universal reference syntax that works across LangChain, n8n, CrewAI, HTTP clients, and shell scripts.

Domain Binding

Each secret is bound to a specific domain. Prevents accidental injection on wrong sites.

Chrome Extension

Vault chat interface lives in your browser. Add secrets with a single line. No setup wizard.

Open Source (MIT)

Audit the code. Fork it. Self-host it. No vendor lock-in, no subscription, no account required.

Get Started

Multiple ways to run EnigmAgent — pick what fits your workflow.

Chrome Extension

Install from the Chrome Web Store. Open the vault chat and start adding secrets immediately.

Chrome Web Store → search "EnigmAgent"

Python SDK

Use EnigmAgent in any Python AI agent with LangChain, CrewAI, AutoGen, and more.

pip install enigmagent

CLI Tool

Manage your vault from the terminal. Use enigmagent run to inject secrets into any command.

npm install -g enigmagent-cli
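Conceptually, an `enigmagent run`-style wrapper resolves placeholders in the environment and then launches the command with the real values injected. A hedged stdlib sketch of that pattern (the dict stands in for the vault; this is not the CLI's actual code):

```python
import os
import re
import subprocess
import sys

# Illustrative stand-in for the decrypted local vault.
VAULT = {"OPENAI_KEY": "sk-proj-abc123"}

def run_with_secrets(cmd: list[str]) -> int:
    """Resolve {{NAME}} placeholders in env vars, then run cmd with them."""
    env = {
        k: re.sub(r"\{\{(\w+)\}\}", lambda m: VAULT[m.group(1)], v)
        for k, v in os.environ.items()
    }
    return subprocess.run(cmd, env=env).returncode

# Only the placeholder ever lives in your shell or config:
os.environ["OPENAI_API_KEY"] = "{{OPENAI_KEY}}"
run_with_secrets([sys.executable, "-c",
                  "import os; print(os.environ['OPENAI_API_KEY'])"])
```

The child process sees the real credential; your shell history, configs, and any LLM reading them see only `{{OPENAI_KEY}}`.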

n8n Community Node

Resolve {{PLACEHOLDER}} references inside your n8n automation workflows.

npm install n8n-nodes-enigmagent

MCP Server

Connect Claude, Cursor, or any MCP-compatible AI assistant directly to your vault.

npm install -g enigmagent-mcp

Docker

Run the vault as a local REST service inside a container for your CI/CD pipelines.

docker run -p 3737:3737 enigmagent/vault

Works with your stack

EnigmAgent integrates with 40+ AI frameworks and tools out of the box.

LangChain · CrewAI · AutoGen · LlamaIndex · n8n · OpenAI SDK · Anthropic SDK · LangGraph · Haystack · Semantic Kernel · smolagents · Agno · Mem0 · Phidata · Claude · Cursor · VS Code · JetBrains · Docker · GitHub Actions · AWS Lambda · Zapier · Make + many more