# Memori

Generated on 21 Dec 2025 from [the Memori catalog page](https://marketplace.digitalocean.com/apps/memori)

Memori adds persistent, intelligent memory to your LLM apps or AI agents with one click.

Memori is an open-source, SQL-native framework that gives your AI agents structured, persistent memory and Advanced Augmentation—an intelligent layer that automatically captures context, extracts meaningful facts, and enhances agent reasoning over time. Memori makes memories instantly searchable across entities, processes, and sessions, enabling stateful, self-improving AI without manual RAG or custom memory engineering. Powered by DigitalOcean’s Gradient AI Platform.

### Key Features

- **Advanced Augmentation** — AI-powered memory augmentation with no latency impact.
- **Data Residency** — Memories are stored in YOUR Droplet’s Postgres database.
- **Knowledge Graph Visualization** — Interactive dashboard showing entity relationships across memories.
- **Free 5000 memories per month** — Sign up at [memorilabs.ai](https://memorilabs.ai/) to get an API key on the Free plan. You can still use Memori without an API key, but you are rate-limited to 100 memories per day.
- **Gradient AI Agents** — Memori integrates seamlessly with the DigitalOcean Gradient AI Agents Platform.

### How It Works

1. Deploy the 1-Click Droplet.
2. Configure your Gradient AI credentials in the dashboard.
3. Point your app to `http://your_droplet_public_ipv4/v1/chat/completions`.
4. Your AI now remembers your users automatically.

### Use Cases

- **Customer Support** — AI remembers customer history, tickets, and preferences.
- **Sales & GTM** — Memory helps AI track every touchpoint, so follow-ups feel smarter and deals close faster.
- **Personal Assistants** — Build assistants that retain long-term user context.
- **Educational Apps** — Track student progress and adapt learning responses.
- **E-commerce Platforms** — Remember user preferences, past purchases, and browsing intent to power personalized recommendations, follow-ups, and support experiences.
- **Healthcare Applications** — Maintain patient context across interactions, remembering medical history (non-diagnostic) and preferences to support continuity in care coordination, triage bots, and health assistants.
- **Internal Company Chatbots** — Retain context for internal processes, knowledge-base Q&A, and employee support.
- **Any AI Application** — Add persistent memory to any LLM-powered app or agent.

### Dashboard Features

- AI chatbot demo with Memori
- Memory usage monitoring
- Knowledge graph visualization
- Extracted facts viewer

**Powered by [Memori](https://github.com/MemoriLabs/Memori) and [DigitalOcean Gradient AI](https://docs.digitalocean.com/products/gradient-ai-platform/index.html.md)**

* * *

**Note**: PostgreSQL is the default database and is automatically configured during first boot.

* * *

## Software Included

| Package | Version | License |
|---|---|---|
| [Memori](https://github.com/MemoriLabs/Memori) | 3.1.0 | Apache 2.0 |

## Creating an App using the Control Panel

Click the **Deploy to DigitalOcean** button to create a Droplet based on this 1-Click App. If you aren’t logged in, this link will prompt you to log in with your DigitalOcean account.

[![Deploy to DO](https://www.deploytodo.com/do-btn-blue.svg)](https://cloud.digitalocean.com/droplets/new?image=gibsonai-memori)

## Creating an App using the API

In addition to creating a Droplet from the Memori 1-Click App using the control panel, you can also use the [DigitalOcean API](https://docs.digitalocean.com/reference/api).
As an example, to create a 4GB Memori Droplet in the SFO2 region, you can use the following `curl` command. You need to either save your [API access token](https://docs.digitalocean.com/reference/api/create-personal-access-token/index.html.md) to an environment variable or substitute it into the command below.

```shell
curl -X POST -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $TOKEN" -d \
  '{"name":"choose_a_name","region":"sfo2","size":"s-2vcpu-4gb","image":"gibsonai-memori"}' \
  "https://api.digitalocean.com/v2/droplets"
```

## Getting Started After Deploying Memori

### Connecting to Your Droplet

After creating your Droplet, connect via SSH:

```shell
ssh root@your_droplet_public_ipv4
```

### First Boot Setup

On first boot, the Droplet will:

1. Initialize the PostgreSQL database
2. Generate secure encryption keys
3. Start the Memori API service

This takes approximately 1-2 minutes.

### Accessing the Dashboard

Open your browser and navigate to:

```
http://your_droplet_public_ipv4/dashboard
```

### Configure Gradient AI

1. Click **Settings** in the dashboard.
2. Enter your Gradient AI credentials:
   - **API Key**: Get it from [DigitalOcean API Settings](https://cloud.digitalocean.com/account/api)
   - **Agent Endpoint**: Your Gradient AI agent URL
3. Click **Save Configuration**.

### Test the API

```shell
curl -X POST http://your_droplet_public_ipv4/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "My name is Alex"}],
    "user": "user-123"
  }'
```

### Example Using the OpenAI SDK

**Python:**

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://your_droplet_public_ipv4/v1/",
    api_key="not-needed"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's my name?"}],
    user="user-123"
)
print(response.choices[0].message.content)
```

**Node.js:**

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://your_droplet_public_ipv4/v1/',
  apiKey: 'not-needed'
});

const response = await client.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: "What's my name?" }],
  user: 'user-123'
});
```

### Service Management

```shell
# Check service status
systemctl status memori

# View logs
journalctl -u memori -f

# Restart service
systemctl restart memori
```

### API Endpoints

| Method | Endpoint | Description |
|---|---|---|
| POST | `/v1/chat/completions` | Chat with memory (OpenAI-compatible) |
| GET | `/v1/memories/{user_id}` | Retrieve user memories |
| DELETE | `/v1/memories/{user_id}` | Clear user memories |
| GET | `/v1/usage` | Memory usage statistics |
| GET | `/health` | Health check |

* * *

Full Memori documentation: [https://memorilabs.ai/docs](https://memorilabs.ai/docs)
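The memory endpoints above can be called from any HTTP client, not just the OpenAI SDK. The following is a minimal sketch using only Python's standard library; the `build_request` helper name and the hard-coded base URL are illustrative placeholders (replace the URL with your Droplet's public IPv4), not part of the Memori API itself.

```python
import json
import urllib.request

# Placeholder: replace with your Droplet's public IPv4 address.
BASE_URL = "http://your_droplet_public_ipv4"


def build_request(method, path, payload=None):
    """Build an HTTP request for a Memori endpoint (helper name is illustrative)."""
    data = json.dumps(payload).encode("utf-8") if payload is not None else None
    return urllib.request.Request(
        f"{BASE_URL}{path}",
        data=data,
        method=method,
        headers={"Content-Type": "application/json"},
    )


def chat(user_id, text, model="gpt-4"):
    """POST to the OpenAI-compatible chat endpoint; memory is keyed by the `user` field."""
    req = build_request("POST", "/v1/chat/completions", {
        "model": model,
        "messages": [{"role": "user", "content": text}],
        "user": user_id,
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def get_memories(user_id):
    """GET the memories stored for a user."""
    req = build_request("GET", f"/v1/memories/{user_id}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def clear_memories(user_id):
    """DELETE all memories for a user; returns the HTTP status code."""
    req = build_request("DELETE", f"/v1/memories/{user_id}")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Under these assumptions, calling `chat("user-123", "My name is Alex")` and later `get_memories("user-123")` should show the facts extracted for that user accumulating across sessions.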