Self-hosting overview
You can run Foundry on your own infrastructure for full data control and air-gapped or on-prem deployments.
What you run
- API server — Same REST API as the hosted product (https://api.withfoundry.ai). Auth via API keys; same envelope and endpoints.
- Memory backend — PostgreSQL with pgvector for embeddings and semantic search.
- Security scanner workers — Scan jobs and optional queue (e.g. Redis + workers).
- Embeddings — Optional local embedding model or your own embedding API to avoid sending data out.
Architecture (high level)
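At a high level, you run the four components above side by side: the API server in front, PostgreSQL (with pgvector) as the memory backend, and scanner workers fed by an optional queue. One way to sketch that layout is a docker-compose fragment — note the image names, ports, and environment variables here are illustrative assumptions, not the actual artifacts; use the images and manifests provided under your agreement:

```yaml
# Illustrative sketch only — substitute the images and versions from
# your Foundry self-hosted artifacts.
services:
  api:
    image: foundry/api:latest              # hypothetical image name
    environment:
      DATABASE_URL: postgres://foundry:secret@db:5432/foundry
    ports:
      - "8080:8080"
  worker:
    image: foundry/scanner-worker:latest   # hypothetical image name
    environment:
      REDIS_URL: redis://queue:6379
  db:
    image: pgvector/pgvector:pg16          # PostgreSQL with pgvector preinstalled
  queue:
    image: redis:7                         # optional scan-job queue
```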
Requirements
- PostgreSQL with pgvector extension.
- Node (or container runtime) for the API and workers.
- Storage for scanner clones and artifacts (or ephemeral clones).
- API keys — Generated and stored by your instance; same key_prefix and Authorization: Bearer usage.
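For the PostgreSQL requirement, the pgvector extension must be installed on the database host and then enabled in the database your instance uses (the extension's SQL name is `vector`):

```sql
-- Run once per database; requires the pgvector package to be
-- installed on the PostgreSQL host.
CREATE EXTENSION IF NOT EXISTS vector;
```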
Configuration
- Base URL — Set your instance URL so the API and workers use the same host.
- Database — Connection string and pool settings.
- Embeddings — Point to your embedding service or local model; same interface as hosted.
- Secrets — Store API key signing secrets and any third-party keys in env or secret manager.
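Taken together, the settings above typically reduce to a handful of environment variables. A sketch of such an env file follows — apart from FOUNDRY_BASE_URL, the variable names here are assumptions; check the self-hosted package docs for the exact names:

```
# Illustrative env file — variable names other than FOUNDRY_BASE_URL
# are assumptions.
FOUNDRY_BASE_URL=https://foundry.internal.example.com
DATABASE_URL=postgres://foundry:secret@db:5432/foundry
DATABASE_POOL_SIZE=10
EMBEDDINGS_URL=http://embeddings:8000    # your embedding service or local model
API_KEY_SIGNING_SECRET=change-me         # load from your secret manager
```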
Using the SDK and MCP against self-hosted
Point the client at your instance's base URL. For the SDK, set the FOUNDRY_BASE_URL environment variable (or pass baseUrl in the client config); for MCP, configure per the self-hosted package docs.
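Because the self-hosted API keeps the same REST surface and Bearer auth, any client only needs the base URL swapped. A minimal sketch of that, using only the Python standard library (the `/v1/memories` path and the env-variable fallbacks are hypothetical placeholders):

```python
import os
import urllib.request

# Read the instance URL and key from the environment; the fallback
# values here are placeholders for illustration.
base_url = os.environ.get("FOUNDRY_BASE_URL", "https://foundry.internal.example.com")
api_key = os.environ.get("FOUNDRY_API_KEY", "fk_example_key")

# Same Authorization: Bearer usage as the hosted product; only the
# host changes. "/v1/memories" is a hypothetical endpoint path.
req = urllib.request.Request(
    f"{base_url}/v1/memories",
    headers={"Authorization": f"Bearer {api_key}"},
)

# The request targets your instance, not api.withfoundry.ai.
print(req.get_full_url())
print(req.get_header("Authorization"))
```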
Licensing and artifacts
Self-hosted installers, Docker images, or deployment manifests are provided per your agreement with Foundry. Check your contract or contact support for artifact locations and versioning.
Summary
| Topic | Note |
|---|---|
| API | Same REST surface; your base URL. |
| Auth | Same API keys and Bearer header. |
| Memory | PostgreSQL + pgvector. |
| Security | Workers + optional queue. |
| SDK/MCP | Set baseUrl or env to your instance. |