MCP Server

The Tower CLI includes an MCP (Model Context Protocol) server that allows AI assistants and editors to interact with your Tower account directly.

Without MCP, your AI assistant can only suggest Tower commands for you to run - and it has to guess at things like Towerfile syntax, which it'll often get wrong. With MCP, it can generate Towerfiles from your actual pyproject.toml, execute commands directly, and see the results - so it can deploy your code, notice an error, and fix it without you copy-pasting anything.

Prerequisites

Before using the MCP server, ensure you're logged into Tower:

tower login

The MCP server uses your existing Tower CLI authentication.

Transport Modes

The Tower MCP server supports three transport modes:

Transport              Best For                      Notes
stdio (recommended)    Claude Code, most AI tools    Simplest setup, no background process needed
http                   Streamable HTTP clients       Real-time streaming with logging notifications
sse                    Legacy clients, Cursor        Server-Sent Events, requires background process

stdio (Recommended)

The stdio transport is the simplest to configure and doesn't require running a background server. The AI tool spawns the MCP server process directly.

tower mcp-server --transport stdio

HTTP Streaming

For clients that support streamable HTTP with real-time logging notifications:

tower mcp-server --transport http --port 34567

The server runs on http://127.0.0.1:34567/mcp.
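How you register the HTTP endpoint depends on the client; most take a URL rather than a command. As a sketch, with Claude Code it would look something like the following (assuming your installed version supports the --transport http flag of claude mcp add):

claude mcp add --transport http tower http://127.0.0.1:34567/mcp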

SSE (Server-Sent Events)

For clients that require SSE transport:

tower mcp-server --transport sse --port 34567

The server runs on http://127.0.0.1:34567/sse. Keep the terminal open while using the MCP server, or run it in the background with &.

Client Configuration

Claude Code

Add the Tower MCP server using stdio transport:

claude mcp add tower -- tower mcp-server

Or add directly to your Claude Code configuration (.claude.json or ~/.claude.json):

{
  "mcpServers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}
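
Once added, you can check that Claude Code has registered the server (output format varies by version):

claude mcp list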

Cursor

Use Cursor's one-click "Install MCP Server" option, or manually add the server to your Cursor MCP settings (mcp.json):

{
  "mcpServers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}

VS Code

First, enable MCP integrations by setting Chat > MCP: Enabled to true in your settings.

Use the one-click "Install Tower MCP Server in VS Code" option, or add the server to your mcp.json:

{
  "servers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}

Gemini CLI

Add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}

Or access via Windsurf Settings → Cascade → Manage MCPs → View raw config.

Logs: ~/.codeium/windsurf/logs

Zed

Add to your Zed settings.json:

{
  "context_servers": {
    "tower": {
      "command": "tower",
      "args": ["mcp-server"]
    }
  }
}

SSE Configuration (Legacy)

If your client doesn't support stdio transport, you can use SSE. First start the server:

tower mcp-server --transport sse --port 34567 &

Then configure your client to connect to http://127.0.0.1:34567/sse with SSE transport.
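
The exact configuration keys depend on the client. As an illustrative sketch, Cursor-style mcp.json configurations typically point at the URL instead of a command (the url key here is an assumption; consult your client's MCP documentation):

{
  "mcpServers": {
    "tower": {
      "url": "http://127.0.0.1:34567/sse"
    }
  }
}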

Available Tools

The MCP server exposes the following tools:

Apps

Tool                Description
tower_apps_list     List all Tower apps in your account
tower_apps_create   Create a new Tower app
tower_apps_show     Show details for a Tower app and its recent runs
tower_apps_logs     Get logs for a specific Tower app run
tower_apps_delete   Delete a Tower app

Deployment & Execution

Tool                Description
tower_deploy        Deploy your app to Tower cloud
tower_run_local     Run your app locally using the Towerfile
tower_run_remote    Run your app remotely on Tower cloud

Towerfile Management

Tool                       Description
tower_file_generate        Generate a Towerfile from pyproject.toml
tower_file_read            Read and parse the current Towerfile configuration
tower_file_update          Update Towerfile app configuration
tower_file_add_parameter   Add a new parameter to the Towerfile
tower_file_validate        Validate the current Towerfile configuration
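
For orientation, the Towerfile that tower_file_generate produces is a small TOML file describing your app. The sketch below is illustrative only; the [app] table and the name, script, and source fields are assumptions based on typical Tower examples, so rely on tower_file_read and tower_file_validate for the authoritative schema:

[app]
name = "my-app"
script = "./main.py"
source = ["./**/*.py"]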

Schedules

Tool                     Description
tower_schedules_list     List all schedules for apps
tower_schedules_create   Create a schedule to run an app on a cron schedule
tower_schedules_update   Update an existing schedule
tower_schedules_delete   Delete a schedule
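
tower_schedules_create takes a standard five-field cron expression. For example, this expression (illustrative) runs an app once a day at 06:00:

0 6 * * *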

Secrets

Tool                   Description
tower_secrets_list     List secrets in your Tower account (previews only)
tower_secrets_create   Create a new secret in Tower
tower_secrets_delete   Delete a secret from Tower

Teams

Tool                 Description
tower_teams_list     List teams you belong to
tower_teams_switch   Switch context to a different team

Help

Tool                  Description
tower_workflow_help   Show the recommended workflow for Tower app development

Workflow

The typical workflow when using the MCP server:

  1. Create a Towerfile - Use tower_file_generate to create from an existing pyproject.toml
  2. Test locally - Use tower_run_local to verify your app works
  3. Create the app - Use tower_apps_create to register your app with Tower
  4. Deploy - Use tower_deploy to push your code to Tower cloud
  5. Run remotely - Use tower_run_remote to execute on Tower infrastructure
  6. Schedule - Use tower_schedules_create for recurring execution

For a hands-on tutorial walking through this workflow, see the MCP Walkthrough.

Note: The walkthrough uses SSE transport, which requires running the MCP server as a background process. With stdio transport, you don't need to start the server manually - your AI tool can spawn it automatically.