local-agent
AI-powered code analysis tool — scan, analyze, and chat with your codebase using local Ollama models
Recursively scans your project with intelligent filtering. Supports source files, configs, PDFs, PCAP captures, and more. Respects .gitignore and custom deny/allow patterns.
Full terminal UI and web UI to chat with your codebase. Live rescan capability, focus mode to target a single file, and session logging built in.
Process large codebases fast with configurable parallel LLM requests. Tune AGENT_CONCURRENT_FILES to match your Ollama setup.
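For example, the concurrency can be raised before a scan to match a beefier Ollama setup (the value of 8 here is purely illustrative):

```shell
# Allow up to 8 files to be analyzed in parallel (example value — tune to your hardware)
export AGENT_CONCURRENT_FILES=8
local-agent
```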
100% local processing via Ollama — no data ever leaves your machine. Also supports remote Ollama instances via --host for team setups.
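To point the tool at a shared Ollama server instead of a local one, pass --host (the address and port below are placeholders; 11434 is Ollama's default port):

```shell
# Use a remote Ollama instance on the team's GPU box (placeholder address)
local-agent --host http://ollama.internal:11434
```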
Parse and analyze network traffic captures (.pcap, .pcapng) and extract text from PDF files up to 10MB — all within the same familiar interface.
Single binary with all assets embedded. No runtime dependencies beyond Ollama. Ships for macOS and Linux (arm64 + amd64).
Use the -task flag for one-shot analysis, or drop into interactive mode for multi-turn conversations
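The two modes might be invoked like this (the task prompt is illustrative):

```shell
# One-shot: run a single analysis task and exit
local-agent -task "summarize the error handling in this project"

# Interactive: omit -task to enter multi-turn chat mode
local-agent
```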
Download local-agent and start chatting with your code in minutes
Get local-agent Now