Replies: 5 comments
---
Graph memory is a natural evolution for Mem0. Here's an architecture that could work:

**Graph Memory Extension Design**

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, List


@dataclass
class MemoryNode:
    id: str
    content: str
    embedding: List[float]
    metadata: Dict
    node_type: str  # "fact", "event", "entity", "preference"


@dataclass
class MemoryEdge:
    source_id: str
    target_id: str
    relation_type: str  # "causes", "related_to", "contradicts", "updates"
    weight: float
    timestamp: str


class GraphMemoryStore:
    def __init__(self, vector_store, graph_db):
        self.vectors = vector_store  # Existing Mem0 store
        self.graph = graph_db        # Neo4j/NetworkX

    def add_memory(self, content: str, user_id: str):
        # 1. Create node with embedding (existing Mem0 flow)
        node = self.vectors.add(content, user_id=user_id)
        # 2. Extract entities and relationships
        entities = self.extract_entities(content)
        # 3. Link to existing memories
        similar = self.vectors.search(content, limit=5)
        for mem in similar:
            relation = self.infer_relation(content, mem.content)
            if relation:
                self.graph.add_edge(MemoryEdge(
                    source_id=node.id,
                    target_id=mem.id,
                    relation_type=relation,
                    weight=mem.score,
                    timestamp=datetime.now(timezone.utc).isoformat(),
                ))
        return node

    def search_with_context(self, query: str, depth: int = 2):
        # Vector search for initial matches
        matches = self.vectors.search(query)
        # Expand via graph traversal
        expanded = set()
        for match in matches:
            neighbors = self.graph.get_neighbors(match.id, depth=depth)
            expanded.update(neighbors)
        # Rank by combined vector + graph score
        return self.rank_results(matches, expanded)
```

**Key Benefits**

**Integration Points**

```python
# Could extend the existing Mem0 Memory class
from mem0 import Memory


class GraphMemory(Memory):
    def __init__(self, config):
        super().__init__(config)
        self.graph = GraphStore(config.graph_config)
```

More on state-based memory: https://github.com/KeepALifeUS/autonomous-agents
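Before committing to Neo4j, the `graph_db` side of this sketch could start as a plain in-process store. Here is a dependency-free sketch of what the assumed `get_neighbors(node_id, depth=...)` contract might look like (the class name and method signatures are illustrative, not part of Mem0):

```python
from collections import defaultdict, deque


class InMemoryGraphStore:
    """Hypothetical in-process stand-in for the graph_db used above."""

    def __init__(self):
        # Undirected adjacency map: memory id -> neighboring memory ids
        self.adj = defaultdict(set)

    def add_edge(self, edge):
        # `edge` mirrors the MemoryEdge dataclass fields
        self.adj[edge.source_id].add(edge.target_id)
        self.adj[edge.target_id].add(edge.source_id)

    def get_neighbors(self, node_id, depth=2):
        # Breadth-first search out to `depth` hops from node_id
        seen, frontier = {node_id}, deque([(node_id, 0)])
        while frontier:
            current, dist = frontier.popleft()
            if dist == depth:
                continue
            for nxt in self.adj[current]:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, dist + 1))
        seen.discard(node_id)  # the start node is context, not a neighbor
        return seen
```

Swapping this for a Neo4j- or NetworkX-backed implementation later only requires keeping the same `add_edge`/`get_neighbors` surface.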
---
We built an open-source OpenClaw plugin that talks to a self-hosted Mem0 REST server (not the cloud platform). It currently uses the standard vector-memory endpoints, and we'd love to add graph-memory support once that's exposed via the REST API. The plugin is a thin HTTP client, so adding graph-memory queries would be straightforward once Mem0 exposes them.
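To make "thin HTTP client" concrete, here is a minimal sketch. The endpoint paths (`/memories`, `/search`, and especially the not-yet-existing `/graph/search`) are assumptions for illustration; check the routes your self-hosted server actually exposes. The `post` callable is injected so the class stays transport-agnostic:

```python
class Mem0RestClient:
    """Sketch of a thin client for a self-hosted Mem0 REST server.

    Paths below are hypothetical examples, not a documented API.
    Pass e.g. requests.post as `post`.
    """

    def __init__(self, base_url, post):
        self.base_url = base_url.rstrip("/")
        self.post = post

    def add(self, text, user_id):
        # Assumed vector-memory endpoint
        return self.post(f"{self.base_url}/memories",
                         json={"messages": [{"role": "user", "content": text}],
                               "user_id": user_id})

    def search(self, query, user_id):
        # Assumed vector-search endpoint
        return self.post(f"{self.base_url}/search",
                         json={"query": query, "user_id": user_id})

    def graph_search(self, query, user_id, depth=2):
        # Hypothetical future graph-memory endpoint; does not exist today
        return self.post(f"{self.base_url}/graph/search",
                         json={"query": query, "user_id": user_id,
                               "depth": depth})
```

If Mem0 ships graph endpoints with a different shape, only the three small methods need to change.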
---
Graph memory for OSS would be huge! The relationship tracking between memories is where a lot of the value is.

What graph memory enables:

**DIY approach while waiting:** You could implement a lightweight graph layer on top of the current mem0:

```python
from neo4j import GraphDatabase


class GraphMemoryWrapper:
    def __init__(self, mem0_client, neo4j_uri):
        self.mem0 = mem0_client
        self.graph = GraphDatabase.driver(neo4j_uri)

    def add_with_relations(self, memory, relations: list):
        # Add to mem0 for vector search
        mem_id = self.mem0.add(memory)
        # Add to neo4j for graph traversal
        with self.graph.session() as session:
            session.run(
                "MERGE (m:Memory {id: $id, text: $text})",
                id=mem_id, text=memory,
            )
            for rel in relations:
                # Cypher cannot parameterize relationship types, so the
                # type is interpolated into the query (validate it first!)
                session.run(
                    "MATCH (a:Memory {id: $from}), (b:Memory {id: $to}) "
                    f"MERGE (a)-[:{rel['type']}]->(b)",
                    {"from": rel["from"], "to": rel["to"]},
                )
```

Not as clean as native support, but it works for now. We've built similar hybrid memory systems at Revolution AI: vector for semantic recall, graph for relationships. Would love to see this native in mem0!
---
Would also love to see graph memory in OSS mode!

**Workaround until official support:** You can build a lightweight graph layer on top of Mem0:

```python
from mem0 import Memory
from neo4j import GraphDatabase


class GraphMemory:
    def __init__(self):
        self.mem0 = Memory()
        self.neo4j = GraphDatabase.driver("bolt://localhost:7687")

    def add(self, text, user_id, entities=None):
        # Store in Mem0
        mem_id = self.mem0.add(text, user_id=user_id)
        # Extract and store relationships
        if entities:
            with self.neo4j.session() as session:
                for entity in entities:
                    session.run(
                        """
                        MERGE (e:Entity {name: $name})
                        MERGE (m:Memory {id: $mem_id})
                        MERGE (e)-[:MENTIONED_IN]->(m)
                        """,
                        name=entity, mem_id=mem_id,
                    )
        return mem_id

    def search_with_graph(self, query, user_id):
        # Get vector results
        memories = self.mem0.search(query, user_id=user_id)
        # Enrich with graph context
        # ...
```

Why graph memory matters:

We build graph-enhanced memory systems at Revolution AI; the Neo4j + Mem0 combo works well. Would definitely use native support if available!
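The `search_with_graph` sketch above trails off at the enrichment step. One way it could be finished (the result shape and merge policy are assumptions, not Mem0's actual API) is to look up each vector hit's linked entities and attach them as context, keeping the graph lookup injectable so it can be backed by the `MENTIONED_IN` query later:

```python
def enrich_with_entities(memories, fetch_entities):
    """Attach graph context to vector-search hits.

    `memories` is a list of dicts assumed to carry at least an "id" key;
    `fetch_entities` maps a memory id to the entity names linked to it
    in the graph (e.g. via the MENTIONED_IN relation sketched above).
    """
    enriched = []
    for mem in memories:
        mem = dict(mem)  # copy so the caller's results are not mutated
        mem["entities"] = sorted(fetch_entities(mem["id"]))
        enriched.append(mem)
    return enriched
```

With a Neo4j backend, `fetch_entities` would run a `MATCH (e:Entity)-[:MENTIONED_IN]->(m:Memory {id: $id}) RETURN e.name` query per hit (or one batched query for all ids).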
---
+1 for Graph Memory in OSS mode!

**Current workaround:** Integrate Neo4j/FalkorDB directly:

```python
from mem0 import Memory
from neo4j import GraphDatabase


class GraphEnhancedMemory:
    def __init__(self):
        self.mem0 = Memory()
        self.graph_db = GraphDatabase.driver("bolt://localhost:7687")

    def add(self, text, user_id, **kwargs):
        # Store in Mem0 (vector)
        mem_result = self.mem0.add(text, user_id=user_id)
        # Extract entities and store them in the graph
        entities = self.extract_entities(text)
        with self.graph_db.session() as session:
            for entity in entities:
                session.run(
                    "MERGE (e:Entity {name: $name, user_id: $user_id}) "
                    "SET e.last_seen = timestamp()",
                    name=entity, user_id=user_id,
                )
        return mem_result

    def search(self, query, user_id):
        # Vector search
        vector_results = self.mem0.search(query, user_id=user_id)
        # Graph traversal for related entities
        with self.graph_db.session() as session:
            graph_results = session.run(
                "MATCH (e:Entity {user_id: $user_id})-[r]-(related) "
                "WHERE e.name IN $entities RETURN related",
                user_id=user_id,
                entities=self.extract_entities(query),
            ).data()  # materialize before the session closes
        return self.merge_results(vector_results, graph_results)
```

Why graph memory matters:

OSS alternatives:

We build knowledge graphs with Mem0 at Revolution AI; would love to see official graph support in OSS mode.
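The snippet above calls `self.merge_results(...)` without defining it. A minimal sketch of one reasonable policy (the result shapes and the `graph_boost` parameter are assumptions, not part of mem0): ids that also surface via the graph get a small score boost, and graph-only ids enter the ranking with the boost as their base score:

```python
def merge_results(vector_results, graph_hits, graph_boost=0.1):
    """Combine vector-search hits with graph-derived hits.

    Assumes `vector_results` is a list of dicts with "id" and "score",
    and `graph_hits` is an iterable of bare memory/entity ids.
    """
    ranked = {r["id"]: dict(r) for r in vector_results}
    for hit_id in graph_hits:
        if hit_id in ranked:
            # Graph corroborates a vector hit: boost it
            ranked[hit_id]["score"] += graph_boost
        else:
            # Graph-only hit: include it with a low base score
            ranked[hit_id] = {"id": hit_id, "score": graph_boost}
    return sorted(ranked.values(), key=lambda r: r["score"], reverse=True)
```

An additive boost is the simplest choice; a weighted blend of vector similarity and graph distance would be the natural next refinement.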
---
Hey guys,

thanks a bunch for providing the openclaw extension!

I wonder whether we can expect graph-memory support for the open-source mode soon? Currently, `enableGraph` is only available for platform mode.