GenAI Data Privacy
Validated on 12 Feb 2025 • Last edited on 2 Apr 2025
The DigitalOcean GenAI Platform lets you work with popular foundation models and build GPU-powered AI agents with fully managed deployment, or send direct requests to models using serverless inference. Create agents that incorporate guardrails, functions, agent routing, and retrieval-augmented generation (RAG) pipelines with knowledge bases.
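For example, a direct serverless inference request is an ordinary HTTPS call to your model's inference endpoint. The following is a minimal TypeScript sketch assuming an OpenAI-style chat completions interface; the endpoint URL, model name, and key variable are placeholders to replace with the values from your own GenAI Platform configuration, not confirmed values.

```typescript
// Minimal sketch of a direct serverless inference request.
// INFERENCE_URL, the model name, and MODEL_ACCESS_KEY are placeholders;
// substitute the endpoint, model, and key from your own configuration.
const INFERENCE_URL = "https://your-inference-endpoint.example.com/v1/chat/completions";

const response = await fetch(INFERENCE_URL, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.MODEL_ACCESS_KEY}`,
  },
  body: JSON.stringify({
    model: "your-model-name",
    messages: [{ role: "user", content: "Hello, agent." }],
  }),
});

const completion = await response.json();
console.log(completion.choices?.[0]?.message?.content);
```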
We do not store agent inputs or outputs on DigitalOcean infrastructure for any model.
DigitalOcean Hosted Models
For the Llama, Mistral, and DeepSeek models, input is stored in the browser’s local storage and sent to an agent’s model for inference on DigitalOcean’s infrastructure. The returned output is then stored in the browser’s local storage and displayed in the agent’s interface. If you’ve configured your agent to use prior parts of the conversation as additional context, the agent reads that context back from browser storage as needed. For custom interfaces and applications you develop to use the GenAI Platform, where this data is stored is up to you.
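As an illustration of that client-side flow, here is a rough TypeScript sketch of a custom interface that keeps conversation history in the browser’s localStorage and passes prior turns along as context. The storage key, message shape, and sendToAgent function are illustrative assumptions, not platform APIs.

```typescript
// Rough sketch of a custom interface keeping conversation context
// client-side. The storage key, Message type, and sendToAgent()
// function are assumptions for illustration only.
type Message = { role: "user" | "assistant"; content: string };

const STORAGE_KEY = "agent-conversation"; // where to keep this history is up to you

function loadHistory(): Message[] {
  return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
}

function saveHistory(history: Message[]): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(history));
}

// sendToAgent is a placeholder for however your application calls the agent.
async function chat(
  userInput: string,
  sendToAgent: (messages: Message[]) => Promise<string>,
): Promise<string> {
  const history = loadHistory();
  history.push({ role: "user", content: userInput });

  // Prior turns are included so the agent can use them as additional context.
  const output = await sendToAgent(history);

  history.push({ role: "assistant", content: output });
  saveHistory(history); // input and output stay in the browser, not on DigitalOcean infrastructure

  return output;
}
```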
Third-Party Models
While we do not store input or output for third-party model providers like Anthropic, the data sent to a third-party model is stored in accordance with that model provider’s policies. Refer to your model provider’s policies for more information.
DigitalOcean Security
For more information about DigitalOcean’s security practices, see our security page.