Point ox4docs at any repo — GitHub, GitLab, or Bitbucket. It reads your code, screenshots your UI, and ships a complete docs site with your choice of LLM — then auto-regenerates on every PR merge.
Free to start · No credit card · GitHub · GitLab · Bitbucket · 9 LLM providers
Connect once. ox4docs handles generation, hosting, and staying current.
Link any repo — GitHub, GitLab, or Bitbucket. ox4docs installs a webhook that watches for PR merges — no CI config needed.
GitHub · GitLab · Bitbucket · Any language
Your choice of LLM reads your code and writes sections adapted to your project type. Playwright screenshots your live UI.
9 LLMs · Adaptive sections · Screenshot tours
A complete Astro Starlight site — SEO-ready, full-text searchable with Pagefind, live in under 90 seconds.
Astro Starlight · Deploy anywhere
Every PR merged to main triggers automatic regeneration. Docs stay in sync with your latest code.
Documentation CI · Always in sync
Built for developers who care about quality docs but have actual code to ship.
Every PR merged to main triggers automatic regeneration. Your docs are always in sync with your latest code — no manual updates, ever.
Detects your project type and generates the right sections — API Reference for libraries, CLI Reference for CLIs, Data Models for Prisma/Drizzle apps, and more.
Claude, OpenAI, Groq, Mistral, Gemini, Together AI, Cohere, Azure, and Ollama. Bring your own key or run 100% locally — zero data leaves your machine.
Playwright captures your live UI across multiple pages and auth flows. Multi-session support for complex apps with different user roles.
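ox4docs drives these captures through Playwright. As a standalone illustration of the underlying primitive, the stock Playwright CLI can take an equivalent full-page screenshot of a running app (the URL and filename here are examples, not ox4docs defaults):

```shell
# Capture a full-page screenshot of a locally running UI — the same
# primitive ox4docs automates across multiple pages and auth sessions.
npx playwright screenshot --full-page http://localhost:3000/dashboard dashboard.png
```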
Pagefind search built in. Every doc site is fully searchable out of the box — no Algolia account, no extra config, no API bill.
npx ox4docs auto-detects your stack, selects the right sections, picks the right LLM from your env, and ships a production Astro Starlight site.
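The zero-config flow above reduces to a single command; a minimal sketch (the deploy targets named in the comment are illustrative):

```shell
# Run from the repo root; ox4docs detects the stack and an LLM key on its own.
npx ox4docs
# The generated Astro Starlight site is a static build, deployable to any
# static host (e.g. Vercel, Netlify, or GitHub Pages).
```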
No YAML to configure. No sidebar to hand-write. ox4docs reads your codebase and writes documentation that reflects what your code actually does — not a generic template.
No vendor lock-in. Bring your own API key. Run entirely locally with Ollama — zero API costs, full privacy.
Public & private repos. Webhook auto-install. PR merge triggers regeneration.
✓ Supported
Self-managed or GitLab.com. OAuth2 login. Merge request webhook support.
✓ Supported
Atlassian Bitbucket Cloud. OAuth2 login. Works with any Bitbucket repo.
✓ Supported
Claude Sonnet & Haiku. Set ANTHROPIC_API_KEY.
GPT-4o and GPT-4 Turbo. Set OPENAI_API_KEY.
Ultra-fast inference. Llama 3, Mixtral. Set GROQ_API_KEY.
Mistral Large & Codestral. Set MISTRAL_API_KEY.
Google Gemini 1.5 Pro & Flash. Set GEMINI_API_KEY.
70+ open models. Set TOGETHER_API_KEY.
Command R+ for RAG-heavy docs. Set COHERE_API_KEY.
Enterprise-grade. Bring your deployment endpoint.
100% local. Llama 3, Mistral, any model. Zero API costs.
Provider is auto-detected from environment variables — no configuration file needed.
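A sketch of how that detection plays out in practice (key values are placeholders; the local-fallback behavior is an assumption based on the Ollama card above):

```shell
# Cloud provider: export the matching key before running — nothing else changes.
export ANTHROPIC_API_KEY=sk-ant-placeholder   # placeholder, not a real key
npx ox4docs                                   # detects Anthropic from the env

# Fully local: serve a model with Ollama and set no cloud keys, and
# generation stays on your machine (assumed fallback, per the Ollama card).
ollama pull llama3
npx ox4docs
```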
“We had been putting off documentation for six months. I ran ox4docs on our API repo before lunch and had a complete docs site deployed to Vercel by 3pm. It's genuinely good — not generic filler.”
“The Ollama integration is a game changer. Our internal tools are proprietary — I can't send source code to external APIs. ox4docs with a local Llama 3 model gives us everything without the security concern.”
“I open-sourced a library and had zero docs. Pointed ox4docs at the repo and had a proper Starlight site — with an actual API reference — in about four minutes. I've made it a required pre-release step now.”
No seats. No per-page billing. Just straightforward pricing that grows with your team.
Connect your repo, pick your LLM, and ship. Documentation CI handles the rest — on every PR merge.
Generate my docs free →