Open source
We are building this in public because this needs to exist.
Archē is free and open source because we believe there should be a tool anyone can use without being trapped inside a single provider’s product roadmap, pricing, or API terms.
The goal is simple: make these technologies available on your own terms. Self-host it, choose the models you want, keep your workflows portable, and make sure the system stays useful even as providers change.
No dependency on a single provider to access core workflows.
A tool any team or individual can run, study, and adapt.
A shared foundation the community can improve together.
Frequently asked questions
Yes — Archē is fully open-source. There are no hidden tiers, usage limits, or premium features behind a paywall. You only pay for the AI model providers you choose to connect.
ChatGPT and Claude are individual chat interfaces tied to a single provider. Archē adds a layer on top: it's multi-provider, multi-user, and can run in the cloud or locally. It's designed to be an agnostic hub where your team works with the best models available — sharing processes, agent configurations, skills, and knowledge in one place.
Obsidian and Notion are great for writing and organizing notes. Archē goes further: your documentation becomes a living, version-controlled knowledge base that AI agents actively use to execute tasks — not just store information.
Not yet, but we're actively working on adding Ollama compatibility to enable fully offline usage. Stay tuned.
Absolutely. Self-hosting is the default. Deploy on your local machine, on-prem servers, or a private cloud with a single command.
Archē supports OpenAI, Anthropic, Fireworks, OpenRouter, and OpenCode as providers. You can route different experts to different models instead of locking the whole workspace into one vendor.
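As a sketch of what per-expert routing could look like, here is a hypothetical configuration. The file layout, keys, and model identifiers are illustrative assumptions, not Archē's actual schema:

```yaml
# Hypothetical per-agent model routing (illustrative only).
# Each agent is pinned to the provider and model that suit its workload,
# so swapping one expert's model never touches the rest of the workspace.
agents:
  support:
    provider: anthropic      # assumed provider key
    model: claude-sonnet     # assumed model identifier
  research:
    provider: openrouter
    model: deepseek-r1
  drafting:
    provider: openai
    model: gpt-4o
```

The design point is that routing lives per agent, not per workspace, so no single vendor becomes a hard dependency.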
Archē is designed to follow security best practices whether deployed remotely or locally. That said, the actual security of your deployment depends on how it's configured: for remote setups in particular, make sure you have the expertise to deploy it correctly and avoid leaving gaps. For local use with Archē Desktop, your data stays on your machine with no added exposure.
To try the Desktop version — no. Just download the DMG and start using it. To self-host on your own infrastructure, some technical knowledge is recommended, though following the guides should get you there without major issues.
The Desktop version is single-user. The self-hosted remote version supports multiple users, each with their own isolated workspace, so teams can work in parallel without conflicts.
Archē is open-source. Your instance, your data, and the entire codebase are yours. Even if active development stops, everything keeps running on your infrastructure.
What's broken in your company
Fragmented operations
Teams improvise workflows across disconnected tools. Institutional knowledge disappears into silos no one maintains.
Static knowledge
Critical processes live in documents no one reads. Every new hire, every handoff starts from zero.
Vendor lock-in
AI providers own your data pipeline and complicate compliance. Your data sits on infrastructure you don't control.
Archē replaces that chaos with a single operating system for work.
Agents
Not one chatbot — a coordinated team of agents.
Assign roles, default models, and tool permissions per agent. Every department gets a specialist built for its workflows.
Knowledge Base
Your documentation, alive and version-controlled.
Sync external sources, edit inline, review diffs, and publish — with full traceability. Static documents become operational intelligence.
Connectors
Plug in your stack without giving up control.
Connect Linear, Notion, Zendesk, and custom MCP endpoints with real-time status, connection testing, and per-agent access control.
Models
Route to the right model for every task.
Assign the best model per agent and use case. Control provider access at the user level. Supports OpenAI, Anthropic, and OpenRouter.
How it works
One system. Four pillars.
Specialized agents
A primary agent coordinates the team. Define roles, capabilities, and tool access per agent.
Living knowledge base
Version-controlled and publishable. Sync, edit, review diffs, and ship updates with full traceability.
System connectors
Plug in Linear, Notion, Zendesk, or custom MCP endpoints. Real-time status, connection testing, and explicit access controls built in.
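To make the connector idea concrete, registering a custom MCP endpoint alongside built-in integrations might look roughly like this. The keys, URL, and values are hypothetical, not Archē's real configuration format:

```yaml
# Hypothetical connector definitions (illustrative only).
connectors:
  - name: linear
    type: builtin                    # assumed: first-party integration
  - name: internal-wiki
    type: mcp                        # assumed: custom MCP endpoint
    url: https://mcp.example.internal/sse
    access:
      agents: [support, research]    # per-agent access control
```

Scoping `access` per agent reflects the explicit access controls described above: an agent only sees the tools it has been granted.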
Isolated workspaces
Each user gets their own containerized workspace. Deploy anywhere with a single command.