AI Tooling

Nirvana Labs provides machine-readable documentation formats for AI integration:

  • llms.txt - A lightweight index of all documentation pages with titles and URLs
  • llms-full.txt - Full documentation content in a single file, optimized for LLM context windows
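
For example, you can download these files and pass them to an LLM as context. The URLs below assume the files are served from the documentation root, following the llms.txt convention:

Terminal window
# Assumed paths, per the llms.txt convention of serving from the site root
curl -s https://docs.nirvanalabs.io/llms.txt
curl -s https://docs.nirvanalabs.io/llms-full.txt -o nirvana-docs-full.txt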

Per-category documentation files are also available for more focused context.

You can also append /index.md to any documentation page URL to get the raw Markdown content. For example:

  • HTML: https://docs.nirvanalabs.io/cloud/introduction/
  • Markdown: https://docs.nirvanalabs.io/cloud/introduction/index.md
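
For instance, you can fetch the raw Markdown for the Cloud introduction page with curl:

Terminal window
curl -s https://docs.nirvanalabs.io/cloud/introduction/index.md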

Every page also has a Copy page button in the top-right corner that copies the current page as Markdown, ready to paste into your LLM of choice.

Nirvana Labs provides an MCP (Model Context Protocol) server that enables AI assistants to interact with the Nirvana Labs API directly.

You can run the MCP server directly via npx:

Terminal window
export NIRVANA_LABS_API_KEY="My API Key"
npx -y @nirvana-labs/nirvana-mcp@latest

For clients with a configuration JSON, add the following to your MCP configuration:

{
  "mcpServers": {
    "nirvana_labs_api": {
      "command": "npx",
      "args": ["-y", "@nirvana-labs/nirvana-mcp"],
      "env": {
        "NIRVANA_LABS_API_KEY": "My API Key"
      }
    }
  }
}

Install the MCP server in Cursor using the button below. Set your environment variables in Cursor’s mcp.json (Cursor Settings > Tools & MCP > New MCP Server).

Add to Cursor
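
If you prefer to configure it manually, Cursor reads MCP servers from an mcp.json file (typically ~/.cursor/mcp.json for user-level configuration) using the same mcpServers format shown above, for example:

{
  "mcpServers": {
    "nirvana_labs_api": {
      "command": "npx",
      "args": ["-y", "@nirvana-labs/nirvana-mcp"],
      "env": {
        "NIRVANA_LABS_API_KEY": "Your API Key"
      }
    }
  }
}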

Install the MCP server in VS Code by clicking the link below. Set your environment variables in VS Code’s mcp.json (Command Palette > MCP: Open User Configuration).

Install in VS Code
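
If you would rather edit the configuration by hand, VS Code keeps MCP servers in its own mcp.json, which in recent releases uses a servers key rather than mcpServers; a minimal sketch, assuming that layout:

{
  "servers": {
    "nirvana_labs_api": {
      "command": "npx",
      "args": ["-y", "@nirvana-labs/nirvana-mcp"],
      "env": {
        "NIRVANA_LABS_API_KEY": "Your API Key"
      }
    }
  }
}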

Install the MCP server in Claude Code by running the following command in your terminal:

Terminal window
claude mcp add nirvana_labs_api --env NIRVANA_LABS_API_KEY="Your API Key" -- npx -y @nirvana-labs/nirvana-mcp
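
You can then verify that the server was added by listing your configured MCP servers:

Terminal window
claude mcp list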

For more details about the MCP server, see the npm package or the GitHub repository.