MCP Server
The Tower CLI includes an MCP (Model Context Protocol) server that allows AI assistants and editors to interact with your Tower account directly.
Without MCP, your AI assistant can only suggest Tower commands for you to run - and it has to guess at things like Towerfile syntax, which it'll often get wrong. With MCP, it can generate Towerfiles from your actual pyproject.toml, execute commands directly, and see the results - so it can deploy your code, notice an error, and fix it without you copy-pasting anything.
Prerequisites
Before using the MCP server, ensure you're logged into Tower:
tower login
The MCP server uses your existing Tower CLI authentication.
Transport Modes
The Tower MCP server supports three transport modes:
| Transport | Best For | Notes |
|---|---|---|
| stdio (recommended) | Claude Code, most AI tools | Simplest setup, no background process needed |
| http | Streamable HTTP clients | Real-time streaming with logging notifications |
| sse | Legacy clients, Cursor | Server-Sent Events, requires background process |
stdio (Recommended)
The stdio transport is the simplest to configure and doesn't require running a background server. The AI tool spawns the MCP server process directly.
tower mcp-server --transport stdio
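Under the hood, a stdio client speaks newline-delimited JSON-RPC 2.0 over the spawned process's stdin and stdout. A minimal sketch of what a client like Claude Code does (this assumes `tower` is on your PATH; the protocol version string is one published MCP revision and is negotiated during the handshake):

```python
import json
import shutil
import subprocess

def initialize_message(client_name: str) -> str:
    """Build the JSON-RPC `initialize` request an MCP client sends first."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1.0"},
        },
    })

# Only attempt to spawn the server if the Tower CLI is actually installed.
if shutil.which("tower"):
    proc = subprocess.Popen(
        ["tower", "mcp-server", "--transport", "stdio"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    # MCP stdio framing: one JSON-RPC message per line.
    proc.stdin.write(initialize_message("example-client") + "\n")
    proc.stdin.flush()
    print(proc.stdout.readline())  # server's initialize result
    proc.terminate()
```

In practice you never write this yourself; your AI tool manages the process lifecycle and the handshake for you.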
HTTP Streaming
For clients that support streamable HTTP with real-time logging notifications:
tower mcp-server --transport http --port 34567
The server runs on http://127.0.0.1:34567/mcp.
SSE (Server-Sent Events)
For clients that require SSE transport:
tower mcp-server --transport sse --port 34567
The server runs on http://127.0.0.1:34567/sse. Keep the terminal open while using the MCP server, or run it in the background with &.
Client Configuration
Claude Code (stdio - recommended)
Add the Tower MCP server using stdio transport:
claude mcp add tower -- tower mcp-server
Or add directly to your Claude Code configuration (.claude.json or ~/.claude.json):
{
"mcpServers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
Cursor
Add to your Cursor MCP settings (mcp.json):
{
"mcpServers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
VS Code
First, enable MCP integrations by setting Chat > MCP: Enabled to true in your settings.
Add to your mcp.json:
{
"servers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
Gemini CLI
Add to ~/.gemini/settings.json:
{
"mcpServers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
Or access via Windsurf Settings → Cascade → Manage MCPs → View raw config.
Logs: ~/.codeium/windsurf/logs
Zed
Add to your Zed settings.json:
{
"context_servers": {
"tower": {
"command": "tower",
"args": ["mcp-server"]
}
}
}
SSE Configuration (Legacy)
If your client doesn't support stdio transport, you can use SSE. First start the server:
tower mcp-server --transport sse --port 34567 &
Then configure your client to connect to http://127.0.0.1:34567/sse with SSE transport.
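As a rough sketch, an SSE entry in a client's MCP config might look like the following; the exact key names (`url`, `type`) vary by client, so check your client's documentation:

```json
{
  "mcpServers": {
    "tower": {
      "url": "http://127.0.0.1:34567/sse",
      "type": "sse"
    }
  }
}
```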
Available Tools
The MCP server exposes the following tools:
Apps
| Tool | Description |
|---|---|
| `tower_apps_list` | List all Tower apps in your account |
| `tower_apps_create` | Create a new Tower app |
| `tower_apps_show` | Show details for a Tower app and its recent runs |
| `tower_apps_logs` | Get logs for a specific Tower app run |
| `tower_apps_delete` | Delete a Tower app |
Deployment & Execution
| Tool | Description |
|---|---|
| `tower_deploy` | Deploy your app to Tower cloud |
| `tower_run_local` | Run your app locally using the Towerfile |
| `tower_run_remote` | Run your app remotely on Tower cloud |
Towerfile Management
| Tool | Description |
|---|---|
| `tower_file_generate` | Generate a Towerfile from pyproject.toml |
| `tower_file_read` | Read and parse the current Towerfile configuration |
| `tower_file_update` | Update Towerfile app configuration |
| `tower_file_add_parameter` | Add a new parameter to the Towerfile |
| `tower_file_validate` | Validate the current Towerfile configuration |
Schedules
| Tool | Description |
|---|---|
| `tower_schedules_list` | List all schedules for apps |
| `tower_schedules_create` | Create a schedule to run an app on a cron schedule |
| `tower_schedules_update` | Update an existing schedule |
| `tower_schedules_delete` | Delete a schedule |
Secrets
| Tool | Description |
|---|---|
| `tower_secrets_list` | List secrets in your Tower account (previews only) |
| `tower_secrets_create` | Create a new secret in Tower |
| `tower_secrets_delete` | Delete a secret from Tower |
Teams
| Tool | Description |
|---|---|
| `tower_teams_list` | List teams you belong to |
| `tower_teams_switch` | Switch context to a different team |
Help
| Tool | Description |
|---|---|
| `tower_workflow_help` | Show the recommended workflow for Tower app development |
Workflow
The typical workflow when using the MCP server:
- Create a Towerfile - Use `tower_file_generate` to create one from an existing `pyproject.toml`
- Test locally - Use `tower_run_local` to verify your app works
- Create the app - Use `tower_apps_create` to register your app with Tower
- Deploy - Use `tower_deploy` to push your code to Tower cloud
- Run remotely - Use `tower_run_remote` to execute on Tower infrastructure
- Schedule - Use `tower_schedules_create` for recurring execution
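At the protocol level, each workflow step is an MCP `tools/call` request issued by your AI tool. A sketch of that sequence (tool arguments are omitted rather than guessed, since each tool's input schema is defined by the server):

```python
import json

def tool_call(request_id: int, name: str, arguments: dict) -> dict:
    """Build an MCP tools/call JSON-RPC request for one workflow step."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# The workflow above, as the sequence of tool invocations a client would make.
workflow = [
    tool_call(1, "tower_file_generate", {}),
    tool_call(2, "tower_run_local", {}),
    tool_call(3, "tower_apps_create", {}),
    tool_call(4, "tower_deploy", {}),
    tool_call(5, "tower_run_remote", {}),
    tool_call(6, "tower_schedules_create", {}),
]
print(json.dumps(workflow[0], indent=2))
```

Your assistant drives this loop for you, inspecting each tool's result (for example, `tower_run_local` output) before deciding on the next call.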
For a hands-on tutorial walking through this workflow, see the MCP Walkthrough.
The walkthrough uses SSE transport, which requires running the MCP server as a background process. With stdio transport, you don't need to start the server manually - your AI tool can spawn it automatically.