Learn how to build persistent memory for LLMs using SurrealDB's graph and vector capabilities. Transform your AI agents from simple search to intelligent reasoning with structured, long-term memory.
Most retrieval-augmented systems stop at search. But agents and copilots need structured, persistent memory to reason, adapt, and evolve over time.
In this session, we’ll show how to turn SurrealDB into a long-term memory layer for your LLM apps, combining graph and vector data to power richer context and better decisions.
We’ll walk through practical patterns and show how SurrealDB collapses graph, vector, and relational data into a single memory substrate for next-gen AI.
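As a flavour of those patterns, here is a minimal sketch of the storage side: a memory table with a vector index, plus graph-linked facts created with RELATE. Everything specific here is an illustrative assumption rather than the webinar’s actual material — the surrealdb Python SDK (whose connect/signin signatures vary by version), a local instance at ws://localhost:8000/rpc, a 384-dimension embedding model, and names like memory, entity, and the about edge.

```python
# Sketch only: store persistent memories and link them to entities via graph edges.
# Assumes the `surrealdb` Python SDK and a local SurrealDB instance; names and
# dimensions are illustrative, not the presenters' schema.
from surrealdb import Surreal

with Surreal("ws://localhost:8000/rpc") as db:
    db.signin({"username": "root", "password": "root"})
    db.use("demo", "agent_memory")

    # Schema: each memory holds raw text plus its embedding; an HNSW index
    # makes the embedding field searchable with the KNN operator.
    db.query("""
        DEFINE TABLE memory SCHEMAFULL;
        DEFINE FIELD content   ON memory TYPE string;
        DEFINE FIELD embedding ON memory TYPE array<float>;
        DEFINE INDEX memory_embedding ON memory
            FIELDS embedding HNSW DIMENSION 384 DIST COSINE;
    """)

    # Store one memory and link it to the entity it mentions.
    # RELATE creates a graph edge: memory -> about -> entity.
    db.query(
        """
        CREATE entity:alice SET name = "Alice";
        CREATE memory:first_pref SET content = $content, embedding = $embedding;
        RELATE memory:first_pref->about->entity:alice;
        """,
        {
            "content": "Alice prefers concise answers with code samples.",
            "embedding": [0.0] * 384,  # placeholder; use a real embedding model
        },
    )
```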
Senior Product Marketing Manager at SurrealDB
Store persistent memories with graph-linked facts
Perform similarity search and structured reasoning in one query
Use vector embeddings and graph hops inside SurrealDB (see the sketch below)
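A minimal sketch of what those last two points can look like in practice, continuing the illustrative schema above (the memory table, about edge, entity records, and 384-dimension embeddings are assumptions, not the presenters’ actual code): one SurrealQL statement that runs a KNN vector search and follows a graph hop in the same pass.

```python
# Sketch only: similarity search plus a graph hop in a single SurrealQL query.
# Continues the illustrative schema above; `query_vec` would come from the same
# embedding model used at write time.
from surrealdb import Surreal

query_vec = [0.0] * 384  # placeholder; embed the user's question here

with Surreal("ws://localhost:8000/rpc") as db:
    db.signin({"username": "root", "password": "root"})
    db.use("demo", "agent_memory")

    results = db.query(
        """
        SELECT
            content,
            vector::similarity::cosine(embedding, $q) AS score,
            ->about->entity.name AS related_entities   -- graph hop from each memory
        FROM memory
        WHERE embedding <|5,40|> $q                    -- top-5 KNN via the HNSW index
        ORDER BY score DESC;
        """,
        {"q": query_vec},
    )
    print(results)
```

The point of the sketch is that the similarity search, the structured filter, and the graph traversal all live in one query against one database, rather than being stitched together from a vector store and a separate graph store.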