Think in code. Own your tools.
Built for the tools you actually own.
Every other option asks you to give something up — your data, your backend choice, your license freedom. Idep gives those back.
| Feature | Windsurf / Cursor | Google Antigravity | Zed | ◆idep |
|---|---|---|---|---|
| Runtime | Electron (VS Code fork) | Electron | Native (GPUI) | Native (Rust) |
| License | Proprietary | Proprietary | AGPL-3 | Apache 2.0 |
| AI paradigm | Inline assist | Agent orchestration | Inline assist | Precise + RAG |
| AI backends | BYOK / cloud-locked | Gemini-first | Anthropic, OpenAI, Ollama | Any backend |
| Codebase RAG | Cloud-indexed | ✗ | ✗ | ✓ Local |
| Cloud dependency | Moderate | Hard (Google acct) | Low | None |
| RAM floor | ~8GB | 16GB recommended | ~4GB | ~2GB target |
| WSL2 / Linux | Good | Okay | Good | First-class |
| Open source | ✗ | ✗ | AGPL-3 | Apache 2.0 |
Agent orchestration. You become the architect. Delegate to parallel agents. Approve artifacts. Let Gemini run the show.
Powerful — if you trust the cloud with your codebase. Requires: Google account · 16GB RAM · internet
Thought-level control. You remain the thinker. AI completes your thought, locally, in-process. Your codebase never leaves your machine.
No account. No RAM floor. No vendor. Apache 2.0 — fork it, own it, ship it.
Set up in minutes.
Never leave your machine.
Configure your backend
Point Idep at any AI — Anthropic, Ollama, HuggingFace, or any OpenAI-compatible endpoint. One file, done.
# ~/.config/idep/config.toml
[ai]
backend = "ollama"
model = "codellama:13b"
endpoint = "http://localhost:11434"
# Or use Anthropic:
# backend = "anthropic"
# model = "claude-haiku-4-5-20251001"
Open your project
Idep indexes your codebase in-process. No cloud upload. No waiting for a remote sync. RAG over your actual source.
$ idep .
✓ Indexed 1,847 files (3.2s)
✓ LSP: rust-analyzer attached
✓ AI: ollama · codellama:13b ready
✓ RAG: 94,221 chunks indexed
idep ready — src/main.rs
Think in code
Inline completions, context-aware edits, and codebase-wide answers — all running against code that never left your machine.
fn calculate_checksum(data: &[u8]) -> u32 {
    let mut sum = 0u32;
    for chunk in data.chunks(4) {
        // idep: expand with proper CRC32
        sum ^= u32::from_le_bytes(chunk.try_into().unwrap_or([0; 4]));
    }
    sum
}
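As a concrete illustration of the `// idep: expand with proper CRC32` prompt above, here is one plausible expansion a completion could produce: a minimal bitwise CRC-32 (IEEE, reflected polynomial `0xEDB88320`). This is a generic sketch, not Idep's actual output.

```rust
/// Bitwise CRC-32 (IEEE, reflected polynomial 0xEDB88320).
/// Table-free and slow, but easy to verify against the standard
/// check value: crc32(b"123456789") == 0xCBF43926.
fn crc32(data: &[u8]) -> u32 {
    let mut crc = 0xFFFF_FFFFu32;
    for &byte in data {
        crc ^= byte as u32;
        for _ in 0..8 {
            // Shift one bit out; fold in the polynomial when the bit was set.
            crc = if crc & 1 != 0 { (crc >> 1) ^ 0xEDB8_8320 } else { crc >> 1 };
        }
    }
    !crc // final XOR with 0xFFFFFFFF
}
```

A real completion would likely use a precomputed lookup table for speed; the bitwise form is shown because it is short enough to review at a glance.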
Your AI.
Your rules.
No vendor lock-in. No mandatory subscription. Swap backends in one line of config — or run entirely offline with Ollama.
# Switch backends in one line
[ai]
backend = "ollama" # ← change this
model = "codellama:13b"
anthropic · ollama · huggingface · openai · custom
One line of config. No platform lock. Swap at runtime.
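Behind that one-line swap sits a backend-agnostic completion interface. A hedged sketch of what such an abstraction could look like in Rust follows; the trait and type names here are hypothetical, not Idep's real API.

```rust
/// Hypothetical interface: any provider that can turn a prompt into text.
trait CompletionBackend {
    fn name(&self) -> &str;
    fn complete(&self, prompt: &str) -> String;
}

/// Stub Ollama backend. A real implementation would POST to the local
/// endpoint from config; stubbed here so the sketch stays self-contained.
struct Ollama {
    model: String,
}

impl CompletionBackend for Ollama {
    fn name(&self) -> &str {
        "ollama"
    }
    fn complete(&self, prompt: &str) -> String {
        format!("[{}:{}] {}", self.name(), self.model, prompt)
    }
}

/// The editor core only ever sees the trait, so swapping providers is
/// just constructing a different concrete backend from the config file.
fn run(backend: &dyn CompletionBackend, prompt: &str) -> String {
    backend.complete(prompt)
}
```

Because callers depend only on the trait object, changing `backend = "ollama"` to another provider never touches editor code.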
idep
Thought. Mind. Consciousness.
One of the three forces that animate living beings.
Tools should disappear so thought can flow.
That Idep was conceived in Bali is not a marketing detail. It is a statement about where serious software can be built, and what values get encoded when you choose what to name a thing.
Active development.
Idep is pre-release. Core editing and AI integration are being built now. Track progress and contribute on GitHub.
| Module | Description | Status |
|---|---|---|
| idep-ai (completion) | CompletionEngine via llm-ls bridge; debounce, stop-sequences, FIM | Complete (v0.0.2) |
| idep-ai (chat) | Streaming chat callback; debounce propagation | Complete (v0.0.2) |
| idep-core (buffer) | Insert, delete, lines iterator, cursor tracking | Complete (v0.0.2) |
| idep-core (workspace) | Open/save file, file watcher triggers incremental reindex | Complete (v0.0.2) |
| idep-lsp (bridge) | llm-ls bridge for completions | Complete (v0.0.2) |
| idep-lsp (client) | LSP client lifecycle, JSON-RPC transport, rust-analyzer integration | Complete (v0.0.3) |
| idep-lsp (completions) | Document sync, completion requests, textEdit handling, ranking | Complete (v0.0.4) |
| idep-lsp (diagnostics) | Diagnostics, hover, goto-definition; WSL URI normalization, tests | Complete (v0.0.5) |
| idep-index (ast chunking) | Tree-sitter AST chunking; Rust/TS/Python; fallback + deterministic naming | Complete (v0.0.6) |
| idep-index (embeddings) | Local embeddings pipeline with fastembed; progress callbacks; cache + tests | Complete (v0.0.7) |
| idep-index (vector store) | Vector store + chunk store; similarity search; persistent project indexer | Complete (v0.0.8) |
| idep-ai (RAG context) | RAG Context Engine; intelligent codebase context gathering; token budget management | Complete (v0.0.9) |
| idep-tui (editor) | Terminal UI Editor; vim-like navigation; undo/redo; mouse support; WSL2 verified | Complete (v0.1.0) |
| idep-tui (syntax) | Tree-sitter syntax highlighting; Rust/TS/Python/TOML/Markdown; cached for 10k+ lines | Complete (v0.1.1) |
| GitHub Sponsors | Quiet launch; sponsorship page live | Pending |
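To make the idep-index rows above concrete: the heart of a local vector store is a similarity search over embedded chunks. A minimal cosine-similarity top-k sketch follows, with illustrative function names rather than idep-index's actual API.

```rust
/// Cosine similarity between two embedding vectors (0.0 when either is zero).
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Return the ids of the k chunks whose embeddings best match the query.
/// Brute-force scan: score every chunk, sort descending, take the top k.
fn top_k<'a>(query: &[f32], chunks: &'a [(&'a str, Vec<f32>)], k: usize) -> Vec<&'a str> {
    let mut scored: Vec<(f32, &'a str)> = chunks
        .iter()
        .map(|(id, emb)| (cosine(query, emb), *id))
        .collect();
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap());
    scored.into_iter().take(k).map(|(_, id)| id).collect()
}
```

Production stores add persistence and approximate-nearest-neighbor indexes, but a brute-force scan like this is often adequate at codebase-scale chunk counts.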
Backlog

| Item | Description | Status |
|---|---|---|
| Competitive benchmark | Idep vs Antigravity on RAM, startup time, latency | Backlog |
| "Why not Antigravity?" | Honest comparison doc for developers evaluating both | Backlog |
Support Idep
Open-source, independently funded, no VC.
Idep is Apache 2.0 and will never gate core features behind a subscription. Funding goes directly to contributors. Every dollar in and out is public.
- 01 The editor is free, always. No feature gating, no license keys, no telemetry.
- 02 Apache 2.0 is permanent. No relicense, ever.
- 03 Revenue funds contributors, not founders.
- 04 Any paid layer is optional and privacy-preserving.
The IDE that thinks with you.
Follow the build. Contribute a crate. Open an issue. Apache 2.0 — own your stack.