
Commit 6b5582f

Feat: Mem0 vector store backend integration for Neptune Analytics (#3453)
Signed-off-by: Andy Kwok <[email protected]>
1 parent d38e3f1 commit 6b5582f

File tree

10 files changed: +866 −40 lines changed

Lines changed: 42 additions & 0 deletions
# Neptune Analytics Vector Store

[Neptune Analytics](https://docs.aws.amazon.com/neptune-analytics/latest/userguide/what-is-neptune-analytics.html) is a memory-optimized graph database engine for analytics. With Neptune Analytics, you can get insights and find trends by processing large amounts of graph data in seconds, including vector search.

## Installation

```bash
pip install "mem0ai[vector_stores]"
```

## Usage

```python
from mem0 import Memory

config = {
    "vector_store": {
        "provider": "neptune",
        "config": {
            "collection_name": "mem0",
            "endpoint": "neptune-graph://my-graph-identifier",
        },
    },
}

m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
m.add(messages, user_id="alice", metadata={"category": "movies"})
```

## Parameters

Let's see the available parameters for the `neptune` config:

| Parameter | Description | Default Value |
| --- | --- | --- |
| `collection_name` | The name of the collection in which to store the vectors | `mem0` |
| `endpoint` | Connection URL for the Neptune Analytics service, in the form `neptune-graph://<graph-identifier>` | `neptune-graph://my-graph-identifier` |
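The `endpoint` value is simply the fixed `neptune-graph://` scheme followed by the graph identifier. Composing it programmatically can be sketched as follows; the `neptune_endpoint` helper is illustrative only, not part of mem0:

```python
def neptune_endpoint(graph_id: str) -> str:
    """Build an endpoint string for the mem0 `neptune` provider (illustrative helper)."""
    if not graph_id:
        raise ValueError("graph_id must be non-empty")
    return f"neptune-graph://{graph_id}"

config = {
    "vector_store": {
        "provider": "neptune",
        "config": {
            "collection_name": "mem0",
            "endpoint": neptune_endpoint("my-graph-identifier"),
        },
    },
}
```

This keeps the graph identifier in one place (e.g. an environment variable) when the same value also feeds the graph store config shown later.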

docs/docs.json

Lines changed: 3 additions & 1 deletion
```diff
@@ -164,7 +164,8 @@
         "components/vectordbs/dbs/langchain",
         "components/vectordbs/dbs/baidu",
         "components/vectordbs/dbs/s3_vectors",
-        "components/vectordbs/dbs/databricks"
+        "components/vectordbs/dbs/databricks",
+        "components/vectordbs/dbs/neptune_analytics"
       ]
     }
   ]
@@ -223,6 +224,7 @@
       "pages": [
         "examples",
         "examples/aws_example",
+        "examples/aws_neptune_analytics_hybrid_store",
         "examples/mem0-demo",
         "examples/ai_companion_js",
         "examples/collaborative-task-agent",
```
Lines changed: 120 additions & 0 deletions
---
title: "Amazon Stack - Neptune Analytics Hybrid Store: AWS Bedrock and Neptune Analytics"
---

This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock** and **AWS Neptune Analytics** for persistent memory capabilities in Python.

## Installation

Install the required dependencies for the Amazon data stack, including **boto3** and **langchain-aws**:

```bash
pip install "mem0ai[graph,extras]"
```

## Environment Setup

Set your AWS environment variables:

```python
import os

# Set these in your environment or notebook
os.environ['AWS_REGION'] = 'us-west-2'
os.environ['AWS_ACCESS_KEY_ID'] = 'AK00000000000000000'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'AS00000000000000000'

# Confirm the region is set (avoid printing secret values)
print(os.environ['AWS_REGION'])
```
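A missing credential tends to surface later as an opaque SDK error, so it can help to fail fast. A small pre-flight check — our own helper, not part of mem0 or boto3:

```python
import os

REQUIRED_AWS_VARS = ("AWS_REGION", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def missing_aws_vars():
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_AWS_VARS if not os.environ.get(name)]

missing = missing_aws_vars()
if missing:
    print(f"Set these AWS variables before continuing: {missing}")
```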
## Configuration and Usage

This sets up Mem0 with:

- [AWS Bedrock for the LLM](https://docs.mem0.ai/components/llms/models/aws_bedrock)
- [AWS Bedrock for embeddings](https://docs.mem0.ai/components/embedders/models/aws_bedrock#aws-bedrock)
- [Neptune Analytics as the vector store](https://docs.mem0.ai/components/vectordbs/dbs/neptune_analytics)
- [Neptune Analytics as the graph store](https://docs.mem0.ai/open-source/graph_memory/overview#initialize-neptune-analytics)

```python
from mem0.memory.main import Memory

region = 'us-west-2'
neptune_analytics_endpoint = 'neptune-graph://my-graph-identifier'

config = {
    "embedder": {
        "provider": "aws_bedrock",
        "config": {
            "model": "amazon.titan-embed-text-v2:0"
        }
    },
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
            "temperature": 0.1,
            "max_tokens": 2000
        }
    },
    "vector_store": {
        "provider": "neptune",
        "config": {
            "collection_name": "mem0",
            "endpoint": neptune_analytics_endpoint,
        },
    },
    "graph_store": {
        "provider": "neptune",
        "config": {
            "endpoint": neptune_analytics_endpoint,
        },
    },
}

# Initialize the memory system
m = Memory.from_config(config)
```

## Usage

See the full walkthrough in the [notebook example](https://github.com/mem0ai/mem0/blob/main/examples/graph-db-demo/neptune-example.ipynb).

#### Add a memory:

```python
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]

# Store inferred memories (default behavior)
result = m.add(messages, user_id="alice", metadata={"category": "movie_recommendations"})
```

#### Search a memory:

```python
query = "What kind of movies does Alice like?"
relevant_memories = m.search(query, user_id="alice")
```

#### Get all memories:

```python
all_memories = m.get_all(user_id="alice")
```

#### Get a specific memory:

```python
# memory_id comes from a previous add/search/get_all result
memory = m.get(memory_id)
```
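Retrieved memories are typically folded back into the next prompt. A sketch of flattening search results into a context block — this assumes the results-dict shape returned by recent mem0 OSS versions (a top-level `"results"` list of entries with a `"memory"` string), so verify against your version:

```python
def memories_to_context(search_results):
    """Render retrieved memories as a bulleted context string for prompting."""
    entries = search_results.get("results", [])
    return "\n".join(f"- {entry['memory']}" for entry in entries)

# Hypothetical result shape for illustration
sample = {"results": [{"memory": "Loves sci-fi movies", "score": 0.92}]}
print(memories_to_context(sample))
```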
---

## Conclusion

With Mem0 and AWS services like Bedrock and Neptune Analytics, you can build intelligent AI companions that remember, adapt, and personalize their responses over time. This makes them ideal for long-term assistants, tutors, or support bots with persistent memory and natural conversation abilities.

examples/graph-db-demo/neptune-example.ipynb

Lines changed: 18 additions & 39 deletions
```diff
@@ -31,18 +31,21 @@
     "\n",
     "### 2. Connect to Amazon services\n",
     "\n",
-    "For this sample notebook, configure `mem0ai` with [Amazon Neptune Analytics](https://docs.aws.amazon.com/neptune-analytics/latest/userguide/what-is-neptune-analytics.html) as the graph store, [Amazon OpenSearch Serverless](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/serverless-overview.html) as the vector store, and [Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html) for generating embeddings.\n",
+    "For this sample notebook, configure `mem0ai` with [Amazon Neptune Analytics](https://docs.aws.amazon.com/neptune-analytics/latest/userguide/what-is-neptune-analytics.html) as the vector and graph store, and [Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html) for generating embeddings.\n",
     "\n",
     "Use the following guide for setup details: [Setup AWS Bedrock, AOSS, and Neptune](https://docs.mem0.ai/examples/aws_example#aws-bedrock-and-aoss)\n",
     "\n",
+    "The Neptune Analytics instance must be created using the same vector dimensions as the embedding model creates. See: https://docs.aws.amazon.com/neptune-analytics/latest/userguide/vector-index.html\n",
+    "\n",
     "Your configuration should look similar to:\n",
     "\n",
     "```python\n",
     "config = {\n",
     "    \"embedder\": {\n",
     "        \"provider\": \"aws_bedrock\",\n",
     "        \"config\": {\n",
-    "            \"model\": \"amazon.titan-embed-text-v2:0\"\n",
+    "            \"model\": \"amazon.titan-embed-text-v2:0\",\n",
+    "            \"embedding_dims\": 1024\n",
     "        }\n",
     "    },\n",
     "    \"llm\": {\n",
@@ -54,18 +57,10 @@
     "        }\n",
     "    },\n",
     "    \"vector_store\": {\n",
-    "        \"provider\": \"opensearch\",\n",
+    "        \"provider\": \"neptune\",\n",
     "        \"config\": {\n",
-    "            \"collection_name\": \"mem0\",\n",
-    "            \"host\": \"your-opensearch-domain.us-west-2.es.amazonaws.com\",\n",
-    "            \"port\": 443,\n",
-    "            \"http_auth\": auth,\n",
-    "            \"connection_class\": RequestsHttpConnection,\n",
-    "            \"pool_maxsize\": 20,\n",
-    "            \"use_ssl\": True,\n",
-    "            \"verify_certs\": True,\n",
-    "            \"embedding_model_dims\": 1024,\n",
-    "        }\n",
+    "            \"endpoint\": f\"neptune-graph://my-graph-identifier\",\n",
+    "        },\n",
     "    },\n",
     "    \"graph_store\": {\n",
     "        \"provider\": \"neptune\",\n",
@@ -96,14 +91,12 @@
     "import os\n",
     "import logging\n",
     "import sys\n",
-    "import boto3\n",
-    "from opensearchpy import RequestsHttpConnection, AWSV4SignerAuth\n",
     "from dotenv import load_dotenv\n",
     "\n",
     "load_dotenv()\n",
     "\n",
-    "logging.getLogger(\"mem0.graphs.neptune.main\").setLevel(logging.DEBUG)\n",
-    "logging.getLogger(\"mem0.graphs.neptune.base\").setLevel(logging.DEBUG)\n",
+    "logging.getLogger(\"mem0.graphs.neptune.main\").setLevel(logging.INFO)\n",
+    "logging.getLogger(\"mem0.graphs.neptune.base\").setLevel(logging.INFO)\n",
     "logger = logging.getLogger(__name__)\n",
     "logger.setLevel(logging.DEBUG)\n",
     "\n",
@@ -120,8 +113,7 @@
    "source": [
     "Setup the Mem0 configuration using:\n",
     "- Amazon Bedrock as the embedder\n",
-    "- Amazon Neptune Analytics instance as a graph store\n",
-    "- OpenSearch as the vector store"
+    "- Amazon Neptune Analytics instance as a vector / graph store"
    ]
   },
   {
@@ -136,18 +128,12 @@
     "\n",
     "graph_identifier = os.environ.get(\"GRAPH_ID\")\n",
     "\n",
-    "opensearch_host = os.environ.get(\"OS_HOST\")\n",
-    "opensearch_post = os.environ.get(\"OS_PORT\")\n",
-    "\n",
-    "credentials = boto3.Session().get_credentials()\n",
-    "region = os.environ.get(\"AWS_REGION\")\n",
-    "auth = AWSV4SignerAuth(credentials, region)\n",
-    "\n",
     "config = {\n",
     "    \"embedder\": {\n",
     "        \"provider\": \"aws_bedrock\",\n",
     "        \"config\": {\n",
     "            \"model\": bedrock_embedder_model,\n",
+    "            \"embedding_dims\": embedding_model_dims\n",
     "        }\n",
     "    },\n",
     "    \"llm\": {\n",
@@ -159,16 +145,9 @@
     "        }\n",
     "    },\n",
     "    \"vector_store\": {\n",
-    "        \"provider\": \"opensearch\",\n",
+    "        \"provider\": \"neptune\",\n",
     "        \"config\": {\n",
-    "            \"collection_name\": \"mem0ai_vector_store\",\n",
-    "            \"host\": opensearch_host,\n",
-    "            \"port\": opensearch_post,\n",
-    "            \"http_auth\": auth,\n",
-    "            \"embedding_model_dims\": embedding_model_dims,\n",
-    "            \"use_ssl\": True,\n",
-    "            \"verify_certs\": True,\n",
-    "            \"connection_class\": RequestsHttpConnection,\n",
+    "            \"endpoint\": f\"neptune-graph://{graph_identifier}\",\n",
     "        },\n",
     "    },\n",
     "    \"graph_store\": {\n",
@@ -431,13 +410,13 @@
    "source": [
     "## Conclusion\n",
     "\n",
-    "In this example we demonstrated how an AWS tech stack can be used to store and retrieve memory context. Bedrock LLM models can be used to interpret given conversations. OpenSearch can store text chunks with vector embeddings. Neptune Analytics can store the text chunks in a graph format with relationship entities."
+    "In this example we demonstrated how an AWS tech stack can be used to store and retrieve memory context. Bedrock LLM models can be used to interpret given conversations. Neptune Analytics can store the text chunks in a graph format with relationship entities."
    ]
   }
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": ".venv",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
    "name": "python3"
   },
@@ -451,9 +430,9 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.13.2"
+   "version": "3.13.5"
  }
 },
 "nbformat": 4,
-"nbformat_minor": 2
+"nbformat_minor": 4
 }
```
Lines changed: 27 additions & 0 deletions
```python
"""
Configuration for Amazon Neptune Analytics vector store.

This module provides configuration settings for integrating with Amazon Neptune Analytics
as a vector store backend for Mem0's memory layer.
"""

from pydantic import BaseModel, Field


class NeptuneAnalyticsConfig(BaseModel):
    """
    Configuration class for Amazon Neptune Analytics vector store.

    Amazon Neptune Analytics is a graph analytics engine that can be used as a vector store
    for storing and retrieving memory embeddings in Mem0.

    Attributes:
        collection_name (str): Name of the collection to store vectors. Defaults to "mem0".
        endpoint (str): Neptune Analytics graph endpoint URL or Graph ID for the runtime.
    """

    collection_name: str = Field("mem0", description="Default name for the collection")
    endpoint: str = Field("endpoint", description="Graph ID for the runtime")

    model_config = {
        "arbitrary_types_allowed": False,
    }
```

mem0/utils/factory.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -176,6 +176,7 @@ class VectorStoreFactory:
         "langchain": "mem0.vector_stores.langchain.Langchain",
         "s3_vectors": "mem0.vector_stores.s3_vectors.S3Vectors",
         "baidu": "mem0.vector_stores.baidu.BaiduDB",
+        "neptune": "mem0.vector_stores.neptune_analytics.NeptuneAnalyticsVector",
     }

     @classmethod
```
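The factory above maps a provider key to a dotted class path and loads the class lazily. The general mechanism can be sketched with `importlib` — a generic illustration of the pattern, not mem0's exact loader:

```python
import importlib

def load_class(dotted_path):
    """Resolve 'package.module.ClassName' to the class object."""
    module_path, _, class_name = dotted_path.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# Stdlib demonstration of the same mechanism
OrderedDict = load_class("collections.OrderedDict")
```

Lazy loading means a provider's dependencies (here, the Neptune client libraries) are only imported when that provider is actually configured.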

mem0/vector_stores/configs.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -18,6 +18,7 @@ class VectorStoreConfig(BaseModel):
         "mongodb": "MongoDBConfig",
         "milvus": "MilvusDBConfig",
         "baidu": "BaiduDBConfig",
+        "neptune": "NeptuneAnalyticsConfig",
         "upstash_vector": "UpstashVectorConfig",
         "azure_ai_search": "AzureAISearchConfig",
         "redis": "RedisDBConfig",
```
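Provider dispatch here is a plain dict lookup from provider key to config class name. A minimal sketch of that pattern, with an explicit error for unknown providers (the function name and the truncated mapping are illustrative only):

```python
PROVIDER_CONFIGS = {
    "neptune": "NeptuneAnalyticsConfig",
    "baidu": "BaiduDBConfig",
    "redis": "RedisDBConfig",
}

def config_class_name(provider):
    """Look up the config class name registered for a vector store provider."""
    try:
        return PROVIDER_CONFIGS[provider]
    except KeyError:
        raise ValueError(f"Unsupported vector store provider: {provider}")
```

Raising a named error at lookup time gives users a clear message when a provider string is misspelled, instead of a bare `KeyError` deep inside config validation.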
