Memori


Memori adds persistent memory to any AI application. Your AI will automatically remember facts, preferences, and context for each user across conversations - no code changes required. Just point your app to the Memori endpoint and get intelligent, personalized responses powered by DigitalOcean Gradient AI.

Key Features

  • Advanced Augmentation: AI-powered memory augmentation with no latency impact.
  • Data Residency: Memories are stored in your own Droplet's PostgreSQL database.
  • Knowledge Graph Visualization: Interactive dashboard showing entity relationships across memories.
  • Free 5,000 memories per month: Sign up at memorilabs.ai to get an API key on the Free plan. You can still use Memori without an API key, but you are rate-limited to 100 memories per day.
  • Gradient AI Agents: Memori integrates seamlessly with the DigitalOcean Gradient AI Agents Platform.

How It Works

  1. Deploy the 1-Click Droplet
  2. Configure your Gradient AI credentials in the dashboard
  3. Point your app to http://your_droplet_public_ipv4/v1/chat/completions
  4. AI now remembers your users automatically

Use Cases

  • Customer Support Bots - AI remembers customer history and preferences
  • Personal Assistants - Build assistants that know user context
  • Educational Apps - Track student progress and adapt responses
  • Any AI Application - Add memory to any LLM-powered app or agent

Dashboard Features

  • AI chatbot demo with Memori
  • Memory usage monitoring
  • Knowledge graph visualization
  • Extracted facts viewer

Powered by Memori and DigitalOcean Gradient AI


Note: PostgreSQL is installed by default and automatically configured during first boot.


Software Included

Package   Version   License
Memori    3.1.0     Apache 2.0

Creating an App using the Control Panel

Click the Deploy to DigitalOcean button to create a Droplet based on this 1-Click App. If you aren’t logged in, this link will prompt you to log in with your DigitalOcean account.


Creating an App using the API

In addition to creating a Droplet from the Memori 1-Click App using the control panel, you can also use the DigitalOcean API. As an example, to create a 4GB Memori Droplet in the SFO2 region, you can use the following curl command. You need to either save your API access token to an environment variable or substitute it in the command below.

curl -X POST -H "Content-Type: application/json" \
    -H "Authorization: Bearer $TOKEN" \
    -d '{"name":"choose_a_name","region":"sfo2","size":"s-2vcpu-4gb","image":"gibsonai-memori"}' \
    "https://api.digitalocean.com/v2/droplets"

Getting Started After Deploying Memori

Connecting to Your Droplet

After creating your Droplet, connect via SSH:

ssh root@your_droplet_public_ipv4

First Boot Setup

On first boot, the Droplet will:

  1. Initialize the PostgreSQL database
  2. Generate secure encryption keys
  3. Start the Memori API service

This takes approximately 1-2 minutes.
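
If you are scripting the deployment, one way to wait for first boot to finish is to poll the /health endpoint (listed under API Endpoints below). This is a minimal sketch, assuming Python with the requests library and that /health returns HTTP 200 once the service is up:

import time
import requests

BASE_URL = "http://your_droplet_public_ipv4"

# Poll /health until the Memori API service responds (first boot takes ~1-2 minutes)
for attempt in range(60):
    try:
        if requests.get(f"{BASE_URL}/health", timeout=5).status_code == 200:
            print("Memori is ready")
            break
    except requests.ConnectionError:
        pass
    time.sleep(10)
else:
    print("Memori did not respond within 10 minutes")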

Accessing the Dashboard

Open your browser and navigate to:

http://your_droplet_public_ipv4/dashboard

Configure Gradient AI

  1. Click Settings in the dashboard
  2. Enter your Gradient AI credentials:
     • API Key: Get from DigitalOcean API Settings
     • Agent Endpoint: Your Gradient AI agent URL
  3. Click Save Configuration

Test the API

curl -X POST http://your_droplet_public_ipv4/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "My name is Alex"}],
    "user": "user-123"
  }'

Example: Using the OpenAI SDK

Python:

from openai import OpenAI

# Point the OpenAI client at your Memori endpoint instead of api.openai.com
client = OpenAI(
    base_url="http://your_droplet_public_ipv4/v1/",
    api_key="not-needed"
)

# The "user" field tells Memori which user's memories to load and update
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's my name?"}],
    user="user-123"
)

print(response.choices[0].message.content)

Node.js:

import OpenAI from 'openai';

const client = new OpenAI({
    baseURL: 'http://your_droplet_public_ipv4/v1/',
    apiKey: 'not-needed'
});

const response = await client.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: "What's my name?" }],
    user: 'user-123'
});

Service Management

# Check service status
systemctl status memori

# View logs
journalctl -u memori -f

# Restart service
systemctl restart memori

API Endpoints

Method   Endpoint                  Description
POST     /v1/chat/completions      Chat with memory (OpenAI-compatible)
GET      /v1/memories/{user_id}    Retrieve user memories
DELETE   /v1/memories/{user_id}    Clear user memories
GET      /v1/usage                 Memory usage statistics
GET      /health                   Health check
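
As an example of the memory endpoints, the following sketch reads back the memories stored for a single user and shows how to clear them. It assumes Python with the requests library; the exact shape of the JSON responses is not documented here, so it simply prints them.

import requests

BASE_URL = "http://your_droplet_public_ipv4"
USER_ID = "user-123"

# Retrieve the memories Memori has extracted for this user
memories = requests.get(f"{BASE_URL}/v1/memories/{USER_ID}")
print(memories.json())

# Memory usage statistics for this deployment
usage = requests.get(f"{BASE_URL}/v1/usage")
print(usage.json())

# Uncomment to clear this user's memories
# requests.delete(f"{BASE_URL}/v1/memories/{USER_ID}")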

Full Memori documentation: https://memorilabs.ai/docs
