DigitalOcean MCP Servers
Configure your MCP client to connect to hosted MCP servers.
Validated on 5 Dec 2025 • Last edited on 5 Dec 2025
You can integrate DigitalOcean services with AI development tools using DigitalOcean’s MCP servers and llms.txt. Together, these components standardize how context is delivered to large language models (LLMs):
- Model Context Protocol (MCP) is an open standard for providing structured context to LLMs. DigitalOcean MCP servers bridge your development tools and the DigitalOcean API, exposing tools and context that allow MCP clients to manage resources such as App Platform, Droplets, Kubernetes clusters, and more.
- llms.txt is a standardized Markdown file format that supplies LLMs with relevant product documentation and environment context. When included in a DigitalOcean service, such as an App Platform app or a Kubernetes repository, it helps LLMs better understand the structure, configuration, and workflows associated with your resources.
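As a sketch, an llms.txt file follows the llms.txt convention of an H1 title, a blockquote summary, and H2 sections containing annotated links. The app name, section names, and URLs below are illustrative, not taken from a real deployment:

```markdown
# example-app

> A Node.js web service deployed on DigitalOcean App Platform.

## Docs

- [App spec reference](https://docs.digitalocean.com/products/app-platform/reference/app-spec/): Fields available in the app's .do/app.yaml
- [Deployment workflow](https://docs.digitalocean.com/products/app-platform/how-to/): How this app is built and deployed
```

Placed at the root of the app or repository, a file like this gives an LLM a concise, link-annotated map of the resource's documentation and configuration.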
MCP clients like Windsurf, Cursor, Claude, or VS Code Copilot can connect to MCP servers to access tools and context. When connected, these clients can reason about your infrastructure and perform API-backed operations on your behalf.
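Most of these clients read a JSON configuration that registers MCP servers. The sketch below shows the common `mcpServers` shape used by clients such as Claude Desktop and Cursor; the server package name and environment variable are assumptions for illustration, so check your client's and the server's documentation for the exact values:

```json
{
  "mcpServers": {
    "digitalocean": {
      "command": "npx",
      "args": ["-y", "@digitalocean/mcp"],
      "env": {
        "DIGITALOCEAN_API_TOKEN": "<your-api-token>"
      }
    }
  }
}
```

After adding an entry like this and restarting the client, the DigitalOcean server's tools appear alongside the client's built-in capabilities, authenticated with the API token you supply.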
Getting Started
DigitalOcean MCP servers provide tools that let your MCP client interact with the DigitalOcean API and perform various operations on your resources.
Use effective prompts to manage DigitalOcean services from any MCP-compatible client.
Resources
DigitalOcean product documentation for LLMs.