Sometimes, to understand the big picture, you need ~~to zoom out~~ a simple example.
Let me show you one that:

- creates the vector store (SurrealDBVectorStore) and graph (SurrealDBGraph) instances
- adds documents to the vector store, including the embeddings (what are embeddings?)
- builds a graph
- based on a provided topic, runs a vector search and a graph query to generate an answer in natural language
For this example, the data that we are going to store and then retrieve is:

- concept definitions: stored in the Vector Store
- people who know about those concepts: stored in the Graph (e.g. Martin -> knows about -> SurrealDB)
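Here is the sample data we will reuse throughout the steps below. A minimal sketch: the concrete concepts, definitions and names are illustrative, not taken from the repository examples.

```python
# Illustrative sample data.
# Concept definitions go in the vector store; the people -> concepts
# mapping becomes "knows about" edges in the graph.
concepts = {
    "SurrealDB": "SurrealDB is a multi-model database with native vector search and graph querying.",
    "LangChain": "LangChain is a framework for building applications on top of large language models.",
    "embeddings": "Embeddings are numeric vectors that represent the semantic meaning of a piece of text.",
}

people = {
    "Martin": ["SurrealDB"],
    "Ana": ["LangChain", "embeddings"],
}
```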
1. Create the vector store and graph instances
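A minimal sketch of the setup, assuming a local SurrealDB instance and the import paths used by the langchain-surrealdb package (double-check them against the repository; the graph module path in particular is an assumption):

```python
from langchain_ollama import OllamaEmbeddings
from langchain_surrealdb.vectorstores import SurrealDBVectorStore
from langchain_surrealdb.experimental.surrealdb_graph import SurrealDBGraph  # assumed path
from surrealdb import Surreal

# Connect to a local SurrealDB instance and pick a namespace/database
conn = Surreal("ws://localhost:8000/rpc")
conn.signin({"username": "root", "password": "root"})
conn.use("langchain", "demo")

# The vector store holds concept definitions; the graph holds who knows what
vector_store = SurrealDBVectorStore(OllamaEmbeddings(model="llama3.2"), conn)
graph_store = SurrealDBGraph(conn)
```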
2. Add documents to the vector store
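You pass plain `Document`s, and the vector store computes and stores the embeddings using the embedding model it was constructed with. A sketch, reusing the `concepts` dict from above:

```python
from langchain_core.documents import Document

documents = [
    Document(page_content=definition, metadata={"name": name})
    for name, definition in concepts.items()
]
# Embeddings are generated here by the configured OllamaEmbeddings model
vector_store.add_documents(documents=documents, ids=list(concepts.keys()))
```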
3. Build the graph
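One way to describe the graph is with LangChain's generic `GraphDocument` structures. This sketch assumes SurrealDBGraph accepts them via the usual `add_graph_documents` graph-store method; the node and edge type names are my own choices:

```python
from langchain_community.graphs.graph_document import GraphDocument, Node, Relationship
from langchain_core.documents import Document

nodes: list[Node] = []
relationships: list[Relationship] = []
for person, known in people.items():
    person_node = Node(id=person, type="Person")
    nodes.append(person_node)
    for concept in known:
        concept_node = Node(id=concept, type="Concept")
        nodes.append(concept_node)
        # e.g. Martin -> knows_about -> SurrealDB
        relationships.append(
            Relationship(source=person_node, target=concept_node, type="knows_about")
        )

graph_store.add_graph_documents(
    [GraphDocument(nodes=nodes, relationships=relationships,
                   source=Document(page_content="who knows what"))]
)
```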
4. Let’s get an LLM involved
For this example, we are using OllamaLLM from the LangChain components. You can use any of the other LLM components.
For Ollama, all the available parameters are listed in the LangChain API reference. In the example, I turned the temperature up to the max to get the craziest outcomes possible. You may want to leave it at around 0.7, but it depends on your use case.
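A sketch of the LLM setup (the model name is whatever you have pulled in Ollama):

```python
from langchain_ollama import OllamaLLM

# temperature=1 for maximum randomness; ~0.7 is a saner default
llm = OllamaLLM(model="llama3.2", temperature=1)
```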
Let’s try it out
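Putting the pieces together: a vector search finds the concept closest to the topic, a graph query finds who knows about it, and the LLM phrases the answer. A sketch: the SurrealQL assumes the graph store materialises `knows_about` edges between `Person` and `Concept` tables, which depends on how it maps the types defined above.

```python
from langchain_core.prompts import PromptTemplate

topic = "a multi-model database"

# 1. Vector search: which stored concept is closest to the topic?
docs = vector_store.similarity_search(topic, k=1)
concept = docs[0].metadata["name"]

# 2. Graph query: who knows about that concept?
#    Table and edge names are assumptions about the generated schema.
people_found = conn.query(
    "SELECT <-knows_about<-Person AS people FROM Concept WHERE name = $name",
    {"name": concept},
)

# 3. Let the LLM answer in natural language
prompt = PromptTemplate.from_template(
    "You are a helpful assistant.\n"
    "Concept: {concept}\nDefinition: {definition}\n"
    "People who know about it: {people}\n"
    "Question: who can I ask about {topic}? Answer in one short paragraph."
)
answer = (prompt | llm).invoke({
    "concept": concept,
    "definition": docs[0].page_content,
    "people": people_found,
    "topic": topic,
})
print(answer)
```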
Bonus stage
Did you notice the prompt in the code? Be creative and try different personalities:
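For instance, keeping the chain identical and only swapping the persona line (the grumpy pirate is my example, not necessarily the one from the repository):

```python
pirate_prompt = PromptTemplate.from_template(
    "You are a grumpy pirate; stay in character.\n"
    "Concept: {concept}. People who know about it: {people}.\n"
    "Question: who can I ask about {topic}?"
)
print((pirate_prompt | llm).invoke(
    {"concept": concept, "people": people_found, "topic": topic}
))
```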
🥁 Drum roll...
It never disappoints.
Ready to build?
Find all the code in the langchain-surrealdb repository examples.
Get started for free with Surreal Cloud.
Any questions or thoughts about this, or about graph queries with SurrealDB? Feel free to drop by our Discord to get in touch.
