Get your application deployed to the cloud through a simple conversation with your AI coding assistant.

Prerequisites

Before you begin, make sure you have:
  • Python 3.13+ installed
  • uv - A fast Python package manager
  • Docker - For building your application container
  • An AI coding assistant that supports the Model Context Protocol (MCP): e.g. Cursor, Claude Code, or Windsurf
On macOS or Linux:
# Install uv
brew install uv

# Or via curl
curl -LsSf https://astral.sh/uv/install.sh | sh
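
To confirm the installation, check that uv is on your PATH:
uv --version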

Step 1: Install the Neptune MCP

Add Neptune to your AI coding assistant. Find your tool below:
Tested models: Neptune works well with Sonnet 4.5, Opus 4.5, GPT 5.1 (including Codex/Codex-mini), Gemini 3 Pro, Cursor Compose 1, and Grok Code. Other models may work but haven’t been extensively tested.

Cursor
  1. Open Cursor Settings → Tools & MCP → Add new MCP server
  2. Add this configuration:
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
  3. Restart Cursor

Claude Code
Run this command in your terminal:
claude mcp add --transport stdio neptune -- uvx --from git+https://github.com/shuttle-hq/neptune-cli-python.git neptune mcp
Verify it was added:
claude mcp list
Use --scope user to make Neptune available across all projects, or --scope project to share with your team via .mcp.json.
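For example, to make Neptune available in every project on your machine, add the scope flag to the same command:
claude mcp add --scope user --transport stdio neptune -- uvx --from git+https://github.com/shuttle-hq/neptune-cli-python.git neptune mcp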

Windsurf
  1. Open Windsurf Settings → Cascade → Plugins → Manage plugins
  2. Edit ~/.codeium/windsurf/mcp_config.json and add:
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
  3. Press the refresh button in the Plugins panel, then reload Windsurf
Official Windsurf MCP docs →

Cline
  1. Install the Cline extension
  2. Click the MCP Servers icon in the Cline panel → Configure tab → Configure MCP Servers
  3. Add this to your cline_mcp_settings.json:
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
  4. Reload VS Code
Official Cline MCP docs →

VS Code
  1. Run MCP: Open User Configuration from the Command Palette (Cmd+Shift+P / Ctrl+Shift+P)
  2. Add this to your mcp.json:
{
  "servers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
Use MCP: Open Remote User Configuration if you want the server to run on a remote machine when connected remotely.
  3. Reload VS Code
Official VS Code MCP docs →

JetBrains IDEs
  1. Open Settings → Tools → AI Assistant → Model Context Protocol (MCP)
  2. Click Add and select As JSON, then paste:
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
  3. Set Level to “Global” to use Neptune across all projects, then click Apply
Already use Claude Desktop? Click Import from Claude to reuse your existing MCP configs.
Official JetBrains MCP docs →

Warp
  1. Open Settings → MCP Servers (or via Warp Drive → Personal → MCP Servers)
  2. Click + Add and paste this configuration:
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}
  3. Click Start on the Neptune server to enable it
Official Warp MCP docs →

Other MCP clients
For any MCP client that supports STDIO transport, use this command:
uvx --from git+https://github.com/shuttle-hq/neptune-cli-python.git neptune mcp
Configure your client to run this command and communicate via stdin/stdout.
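If your client uses a JSON config file, the entry usually has the same shape as the tool-specific examples above (the top-level key, "mcpServers" here, varies by client):
{
  "mcpServers": {
    "neptune": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/shuttle-hq/neptune-cli-python.git", "neptune", "mcp"]
    }
  }
}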
If your client doesn’t support STDIO or you prefer HTTP, start the MCP server in HTTP mode:
uvx --from git+https://github.com/shuttle-hq/neptune-cli-python.git neptune mcp --transport=http
The server runs at http://localhost:8001/mcp. Configure your client to connect to this URL.
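If your client takes a URL-based server entry instead of a command, the configuration will look roughly like this; the exact key name ("url" here) and any required "type" field depend on the client, so check its MCP docs:
{
  "mcpServers": {
    "neptune": {
      "url": "http://localhost:8001/mcp"
    }
  }
}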

Step 2: Authenticate

Once the MCP is installed, ask your AI assistant:
“Log me in to Neptune”
A browser window will open for GitHub OAuth authentication. Sign in with your GitHub account and you’re ready to deploy.

Step 3: Deploy Your App

Simply ask your AI assistant:
“Deploy this app to Neptune”
Your AI assistant will:
  1. Create an optimized Dockerfile for your project (if you don’t have one)
  2. Generate a neptune.json configuration
  3. Provision any required cloud resources
  4. Build and deploy your application
No Dockerfile? No problem. Neptune provides context to your AI assistant so it can generate one automatically based on your project structure.
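For reference, a generated Dockerfile for a uv-managed Python web app might look roughly like the sketch below. The base image, port, and start command are assumptions for illustration; the actual file depends on your project.
# Illustrative Dockerfile for a uv-managed Python app
FROM python:3.13-slim

# Install uv inside the image
RUN pip install --no-cache-dir uv

WORKDIR /app
COPY . .

# Install dependencies from the lockfile
RUN uv sync --frozen --no-dev

# Assumed entry point: a uvicorn app in main.py listening on port 8000
EXPOSE 8000
CMD ["uv", "run", "uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]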

Step 4: Wait for Your App to Go Live

Deployments take a moment to start. Ask your AI assistant:
“Wait for my deployment to complete”
Your AI assistant will monitor the deployment status and share your public IP address once the app is ready. This typically takes 1-2 minutes.
Don’t skip this step! Your app won’t be accessible until the deployment has finished starting up.

What Can You Deploy?

Neptune can deploy any containerized application. If it runs in Docker, it runs on Neptune.
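If you want to sanity-check the container locally before deploying, a plain Docker build and run is enough (the image name and port below are placeholders):
docker build -t my-app .
docker run --rm -p 8000:8000 my-app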

Next Steps

Need help? Join our Discord community for support and to connect with other developers.