This post is a follow-up to one from two weeks ago, which detailed how to build a medical chatbot with SurrealDB and LangChain in Python.
Rust developers can do the same, thanks to a crate called `langchain-rust`, which as of last year includes support for SurrealDB as a vector store. This implementation doesn't (yet!) include graph queries, but we can still use classic vector search to find treatment recommendations for a patient.
To start off, use a command like `cargo new medical_bot` to create a new Cargo project, move into the project directory, and add the following under `[dependencies]`.
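The exact dependency list isn't reproduced here; a plausible `[dependencies]` section might look like the following sketch. The version numbers and feature names are assumptions to be checked against crates.io before use.

```toml
[dependencies]
# "surrealdb" enables the vector store; "mistralai" adds the Mistral embedder.
langchain-rust = { version = "4", features = ["surrealdb", "mistralai"] }
serde = { version = "1", features = ["derive"] }
serde_yaml = "0.9"
serde_json = "1"
surrealdb = { version = "2", features = ["kv-mem"] }
tokio = { version = "1", features = ["full"] }
```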
The `langchain-rust` crate comes with OpenAI as a default and includes a large number of features. We will add `mistralai` to show how easy it is to switch from one platform to another with only about two lines of different code.
The original post assumes that we have a big YAML document with a number of symptoms along with their possible treatments, which is what the `serde_yaml` dependency will let us work with.
To keep the logic simple, we will take only the `description` of each symptom and its `possible_treatments`, giving us two structs that look like this.
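The structs themselves aren't reproduced here, but based on the field names above, a minimal sketch with `serde` derives might look like this (the top-level `symptoms` key is an assumption about the YAML layout):

```rust
use serde::Deserialize;

// Top-level document: assumed to hold the entries under a `symptoms` key.
#[derive(Debug, Deserialize)]
struct SymptomList {
    symptoms: Vec<Symptom>,
}

// One symptom entry, keeping only the two fields we care about;
// any other YAML fields are simply ignored during deserialization.
#[derive(Debug, Deserialize)]
struct Symptom {
    description: String,
    possible_treatments: Vec<String>,
}
```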
Then for each symptom, we will look through the possible treatments to create a document for each with text that looks like the following:
This needs to be turned into a `Document` struct on the `langchain-rust` side, which looks like this.
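Based on the three fields the crate exposes (`page_content`, `metadata`, and `score`), the struct is essentially:

```rust
use std::collections::HashMap;
use serde_json::Value;

// The langchain-rust Document: the text to embed, optional metadata,
// and a similarity score filled in by searches.
pub struct Document {
    pub page_content: String,
    pub metadata: HashMap<String, Value>,
    pub score: f64,
}
```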
The way to create a `Document` is via `Document::new()`, which takes a `String` for the `page_content`, followed by an optional `HashMap` for any metadata. The `score` will be 0.0 when inserting and is only used later on, when a similarity search is performed to return a `Document`.
For the `metadata`, we will add the other possible treatments so that any user will be able to first see a recommended treatment for a symptom, followed by all possible treatments for reference.
The `Value` part of the `Document` struct is a `serde_json` `Value`, which is why we have `serde_json` inside our `Cargo.toml` as well.
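Putting that together, constructing one of these documents might look like the following sketch. The symptom text is hypothetical, and the `with_metadata()` builder call is an assumption about the crate's API; check the crate docs for the exact signature.

```rust
use std::collections::HashMap;
use langchain_rust::schemas::Document;
use serde_json::json;

// Hypothetical symptom and treatments, for illustration only.
let mut metadata = HashMap::new();
metadata.insert(
    "possible_treatments".to_string(),
    json!(["rest and fluids", "antihistamines"]),
);

let doc = Document::new(
    "Symptom: runny nose. Recommended treatment: rest and fluids.".to_string(),
)
.with_metadata(metadata);
```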
All in all, the logic to grab the YAML file and turn it into a `Vec` of `Document`s looks like this.
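The original function isn't reproduced here, but a sketch of such a `get_docs()` helper could look like the following. The struct name, field names, and the page-content wording are all illustrative assumptions.

```rust
use std::collections::HashMap;
use langchain_rust::schemas::Document;
use serde_json::json;

// Reads symptoms.yaml and builds one Document per (symptom, treatment) pair.
fn get_docs() -> Vec<Document> {
    #[derive(serde::Deserialize)]
    struct Symptom {
        description: String,
        possible_treatments: Vec<String>,
    }

    let yaml = std::fs::read_to_string("symptoms.yaml").expect("symptoms.yaml not found");
    let symptoms: Vec<Symptom> = serde_yaml::from_str(&yaml).expect("could not parse YAML");

    let mut docs = Vec::new();
    for symptom in symptoms {
        for treatment in &symptom.possible_treatments {
            // Keep the full list in the metadata so the user can see
            // every option alongside the recommended one.
            let mut metadata = HashMap::new();
            metadata.insert(
                "possible_treatments".to_string(),
                json!(symptom.possible_treatments.clone()),
            );
            docs.push(
                Document::new(format!(
                    "Symptom: {}. Recommended treatment: {treatment}.",
                    symptom.description
                ))
                .with_metadata(metadata),
            );
        }
    }
    docs
}
```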
With this taken care of, it's time to do some setup inside `main()`. First we need to start the database, which can run in memory or at some other path, such as the address of a Surreal Cloud instance or a locally running one.
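A sketch of starting an in-memory instance via the `surrealdb` crate's `any` engine, with illustrative namespace and database names:

```rust
use surrealdb::engine::any::connect;

// "mem://" runs in memory; swap in e.g. "ws://localhost:8000"
// to point at a locally running instance or a Surreal Cloud address.
let db = connect("mem://").await?;
db.use_ns("ns").use_db("db").await?;
```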
The next step is to initialize an embedder from the `langchain-rust` crate. Here we have the choice of an `OpenAiEmbedder` or a `MistralAIEmbedder`, thanks to the added feature flag.
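Switching platforms then comes down to which embedder is constructed. The import paths and constructor names below are assumptions about the crate's layout; check its docs for your version.

```rust
use langchain_rust::embedding::openai::OpenAiEmbedder;

// Reads OPENAI_API_KEY from the environment by default.
let embedder = OpenAiEmbedder::default();

// Or, with the mistralai feature enabled, the MistralAIEmbedder
// is constructed in much the same way from MISTRAL_API_KEY.
```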
After that comes a `StoreBuilder` struct used to initialize a SurrealDB `Store`, which takes an embedder, a database, and a number of dimensions: 1536 in this case for OpenAI. If using Mistral, the dimensions would be 1024.
Note that we are wrapping this in an `Arc` so that the store can start adding the documents on startup inside a separate task, without making the user wait to see any CLI output.
At the very end is an `.initialize()` method, which defines some tables and fields that will be used when doing similarity searches.
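Assuming the builder-method names from the crate's SurrealDB module, that setup might look like:

```rust
use std::sync::Arc;
use langchain_rust::vectorstore::surrealdb::StoreBuilder;

let store = Arc::new(
    StoreBuilder::new()
        .embedder(embedder)
        .db(db)
        .vector_dimensions(1536) // 1024 if using the Mistral embedder
        .build()
        .await?,
);

// Defines the tables and fields used later for similarity searches.
store.initialize().await?;
```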
Then we will clone the `Arc` to allow the store to be passed into a new task that adds the `Vec<Document>` returned by our `get_docs()` function. Inside this task is a call to `add_documents()`, a built-in method from the `langchain-rust` crate.
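A sketch of that handoff, assuming `tokio` for the spawned task and a `get_docs()` helper that returns the `Vec<Document>`:

```rust
use std::sync::Arc;
use langchain_rust::vectorstore::{VecStoreOptions, VectorStore};

// Clone the Arc so the store can be moved into the background task
// while the main task goes on to run the CLI.
let store_clone = Arc::clone(&store);
tokio::spawn(async move {
    let docs = get_docs();
    store_clone
        .add_documents(&docs, &VecStoreOptions::default())
        .await
        .expect("failed to add documents");
});
```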
While the store adds these documents in its own task, we will start a simple CLI that asks the user for a query and passes it into the built-in `.similarity_search()` method. This method lets us specify the number of documents to return and a minimum similarity score; we will go with 2 and 0.6. The rest of the code just involves a simple loop to handle user input and print the results of the `.similarity_search()` method.
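The loop itself might be sketched as follows, with the option-builder name `with_score_threshold` being an assumption about the crate's API:

```rust
use std::io::Write;
use langchain_rust::vectorstore::{VecStoreOptions, VectorStore};

loop {
    print!("Describe the symptoms: ");
    std::io::stdout().flush()?;
    let mut input = String::new();
    std::io::stdin().read_line(&mut input)?;

    // Return at most 2 documents with a similarity score of at least 0.6.
    let options = VecStoreOptions::default().with_score_threshold(0.6);
    let results = store.similarity_search(input.trim(), 2, &options).await?;

    if results.is_empty() {
        println!("No matches found, please try again.");
    }
    for doc in results {
        println!("{}", doc.page_content);
        println!(
            "All possible treatments: {:?}",
            doc.metadata.get("possible_treatments")
        );
    }
}
```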
As the output shows, our bot is capable of returning meaningful results despite only having access to data from 236 lines of YAML!
Want to give it a try yourself? Save the content at this link to a file named `symptoms.yaml`, copy the following code into your Cargo project, set the env var `OPENAI_API_KEY` or `MISTRAL_API_KEY`, and then use `cargo run`.
You can also give a crate called archiver a try, which has its own command-line interface for using SurrealDB with Ollama via the same crate we used in this post.