Documentation
Nova is a self-directed autonomous AI platform that runs on your own hardware. Define a goal, and Nova breaks it into subtasks, executes them through a coordinated agent pipeline, and re-plans as needed, all within a Docker Compose stack you fully control. Local AI is bundled by default (Ollama with starter models). You can switch modes (hybrid, local-only, or cloud-only) or point Nova at an external Ollama or vLLM instance from the dashboard at any time; no scripts required.
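To make the "Docker Compose stack" concrete, here is a minimal sketch of what such a stack could look like. The service names, image tags, ports, and the `LLM_MODE` variable are assumptions for illustration only, not Nova's actual compose file:

```yaml
# Illustrative sketch only: images, ports, and variables are assumptions,
# not Nova's real compose configuration.
services:
  ollama:                               # bundled local inference backend
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama
  orchestrator:                         # breaks goals into subtasks, runs the pipeline
    image: nova/orchestrator:latest     # hypothetical image name
    environment:
      - LLM_MODE=hybrid                 # hybrid / local-only / cloud-only (assumed variable)
    depends_on:
      - ollama
  dashboard:                            # React admin UI
    image: nova/dashboard:latest        # hypothetical image name
    ports:
      - "3000:3000"                     # assumed port
volumes:
  ollama-data:
```

In a layout like this, switching inference modes or pointing Nova at an external Ollama or vLLM endpoint would be a dashboard setting rather than an edit to the compose file.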
Getting Started
Quick Start: Install Nova and run your first task in minutes.
Core Concepts
Architecture: Services, ports, inter-service communication, and data flow.
Pipeline: The 5-stage Quartet agent chain (Context → Task → Guardrail → Code Review → Decision).
Configuration: Environment variables, model routing, and context budgets.
Services
Orchestrator: Agent lifecycle, task queue, pipeline execution, and MCP tool dispatch.
LLM Gateway: Multi-provider model routing via LiteLLM.
Memory Service: Embedding and hybrid semantic/keyword retrieval via pgvector.
Chat API: WebSocket streaming bridge for external clients.
Dashboard: React admin UI for managing Nova.
Recovery: Backup/restore, factory reset, and service management.
Chat Bridge: Multi-platform chat integration (Telegram, Slack).
Guides
Inference Backends: Local and cloud LLM providers, including Ollama, Anthropic, and OpenAI.
Deployment: Docker Compose setup, GPU overlays, and production configuration.
Remote Access: Tailscale and Cloudflare tunnels for accessing Nova remotely.
IDE Integration: Using Nova with Continue, Cursor, and Aider.
MCP Tools: Model Context Protocol tool integration and dispatch.
Skills & Rules: Skills framework and .claude/ configuration.
Reference
API Reference: REST and streaming API endpoints across all services.
Security: Authentication, secrets management, and access control.
Roadmap: Planned features and development priorities.