In this post, we’re building a straightforward gRPC reference data service for common stocks. The service will sit on top of a Postgres database and expose gRPC endpoints to add and list stock entries.
Before diving into the specifics, ensure you’re up to speed with the setup covered in Part 1 as it lays the groundwork for the concepts and configurations we’ll build upon here.
Everything we do in this post takes place inside the ./api/ directory. Unless otherwise specified, commands should be run from within that directory.
Adding Rust dependencies
To build the gRPC service, we’ll use tonic and prost. Update your Cargo.toml as follows to include these dependencies alongside the existing ones:
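The version numbers below are only illustrative; pick a compatible set that is current when you read this. I’ve also listed the crates we’ll use later in this post (tonic-reflection, tracing, tracing-subscriber, dotenvy), and tokio will most likely already be there from Part 1.

```toml
[dependencies]
# gRPC stack
tonic = "0.11"
tonic-reflection = "0.11"
prost = "0.12"

# Async runtime -- most likely already present from Part 1
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }

# Logging and configuration, used later in this post
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
dotenvy = "0.15"

[build-dependencies]
tonic-build = "0.11"
```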
This adds the necessary crates for the gRPC service, such as tonic and prost.
Creating the gRPC service
The auto-generated service
Let’s kickstart our service by defining its structure in a proto file, which will later be used to auto-generate Rust code. Create a directory called protos and a new refdata.proto file within it:
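Here is a minimal sketch of what the proto definition might look like; the exact message and field names are my own choices rather than a fixed contract.

```protobuf
syntax = "proto3";

package refdata;

service RefData {
  // Add a new stock to the reference data store.
  rpc AddStock (AddStockRequest) returns (AddStockResponse);
  // List all stocks, e.g. to populate the combobox in the UI.
  rpc ListStocks (ListStocksRequest) returns (ListStocksResponse);
}

message Stock {
  int32 id = 1;
  string symbol = 2;
  string name = 3;
}

message AddStockRequest {
  string symbol = 1;
  string name = 2;
}

message AddStockResponse {
  Stock stock = 1;
}

message ListStocksRequest {}

message ListStocksResponse {
  repeated Stock stocks = 1;
}
```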
The service has just two endpoints: one to add a new stock and one to retrieve all stocks, which we’ll need to populate the combobox of stocks in the UI.
We will also need a custom build.rs script to compile the protobuf definitions into Rust code.
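A build script along these lines works with tonic-build; emitting the file descriptor set now means we can serve gRPC reflection later on.

```rust
// build.rs
use std::{env, path::PathBuf};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let out_dir = PathBuf::from(env::var("OUT_DIR")?);

    tonic_build::configure()
        // Emit a file descriptor set so we can serve gRPC reflection later.
        .file_descriptor_set_path(out_dir.join("refdata_descriptor.bin"))
        .compile(&["protos/refdata.proto"], &["protos"])?;

    Ok(())
}
```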
Finally, and crucially, we must register the generated code as a module within our library. Let’s create a services.rs module inside the src folder and include the auto-generated refdata code.
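A sketch of services.rs; the FILE_DESCRIPTOR_SET constant is only needed for the reflection service we add later.

```rust
// src/services.rs
// Pull in the Rust code generated by tonic-build from protos/refdata.proto.
pub mod refdata {
    tonic::include_proto!("refdata");

    // The encoded file descriptor set, used by the gRPC reflection service.
    pub const FILE_DESCRIPTOR_SET: &[u8] =
        tonic::include_file_descriptor_set!("refdata_descriptor");
}
```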
Make sure to also register services as a module in lib.rs by adding pub mod services. At this point, everything should compile again if you run cargo build.
Error handling
Before we go any further, let’s add a custom Result type and some error handling code that will make it a little easier to convert errors between the business layer and the gRPC service layer. Add a new module for this:
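Here is one way to shape it, assuming we call the module errors.rs and register it in lib.rs with pub mod errors; the error variants are placeholders we’ll grow as needed.

```rust
// src/errors.rs -- the module name and variants are assumptions.
use tonic::Status;

/// Errors raised by the business layer.
#[derive(Debug)]
pub enum Error {
    NotFound(String),
    InvalidArgument(String),
    Internal(String),
}

/// A convenience alias used throughout the service code.
pub type Result<T> = std::result::Result<T, Error>;

/// Map business-layer errors to the most appropriate gRPC status code.
impl From<Error> for Status {
    fn from(error: Error) -> Self {
        match error {
            Error::NotFound(msg) => Status::not_found(msg),
            Error::InvalidArgument(msg) => Status::invalid_argument(msg),
            Error::Internal(msg) => Status::internal(msg),
        }
    }
}
```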
This code maps the various errors that may occur to the most appropriate gRPC status. It may not seem very useful just yet, but we’ll extend it to cover SQLx errors later on.
The skeleton service code
To implement the refdata service, let’s create a new module called refdata.rs, with three submodules:
- refdata/models.rs will contain our data models.
- refdata/repository.rs will contain the persistence layer.
- refdata/service.rs will implement the auto-generated service interface.
There is only one simple model for stocks to put in models.rs:
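Something along these lines; the field names are assumptions that simply mirror the proto sketch above.

```rust
// src/refdata/models.rs -- field names are assumptions mirroring the proto sketch.
#[derive(Debug, Clone)]
pub struct Stock {
    pub id: i32,
    pub symbol: String,
    pub name: String,
}
```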
Note: Arguably, some types could be a little better; for example, the id could be an unsigned integer. However, SQLx maps integers in Postgres to i32, and the type conversions add a significant amount of noise to the code.
The repository code is a little more interesting. We’ll use a trait to describe the desired interface, and provide a Postgres implementation with stub functions for now.
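A sketch of refdata/repository.rs under those assumptions; the trait and struct names are my own, and we reuse the async_trait attribute that tonic re-exports, which saves us an extra dependency.

```rust
// src/refdata/repository.rs -- a sketch; names are assumptions.
use crate::errors::Result;
use crate::refdata::models::Stock;

/// The persistence interface for reference data.
#[tonic::async_trait]
pub trait RefDataRepository: Send + Sync + 'static {
    async fn add_stock(&self, symbol: &str, name: &str) -> Result<Stock>;
    async fn list_stocks(&self) -> Result<Vec<Stock>>;
}

/// Postgres-backed implementation; it will get a SQLx connection pool
/// and real queries in the next part of the series.
#[derive(Default)]
pub struct PostgresRefDataRepository;

#[tonic::async_trait]
impl RefDataRepository for PostgresRefDataRepository {
    async fn add_stock(&self, symbol: &str, name: &str) -> Result<Stock> {
        todo!()
    }

    async fn list_stocks(&self) -> Result<Vec<Stock>> {
        todo!()
    }
}
```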
Finally, to bring it all together, the code in service.rs will serve as the bridge between the gRPC interface and the repository.
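Here is a sketch of service.rs following the proto and repository sketches above; the generated module and message names come from tonic’s code generation for the refdata package.

```rust
// src/refdata/service.rs -- a sketch bridging the generated gRPC trait and the repository.
use tonic::{Request, Response, Status};

use crate::refdata::models::Stock;
use crate::refdata::repository::RefDataRepository;
use crate::services::refdata::ref_data_server::RefData;
use crate::services::refdata::{
    AddStockRequest, AddStockResponse, ListStocksRequest, ListStocksResponse,
};

/// The gRPC service, generic over the repository implementation.
pub struct RefDataService<R: RefDataRepository> {
    repository: R,
}

impl<R: RefDataRepository> RefDataService<R> {
    pub fn new(repository: R) -> Self {
        Self { repository }
    }
}

/// Convert the internal model into the generated protobuf message.
impl From<Stock> for crate::services::refdata::Stock {
    fn from(stock: Stock) -> Self {
        Self {
            id: stock.id,
            symbol: stock.symbol,
            name: stock.name,
        }
    }
}

#[tonic::async_trait]
impl<R: RefDataRepository> RefData for RefDataService<R> {
    async fn add_stock(
        &self,
        request: Request<AddStockRequest>,
    ) -> Result<Response<AddStockResponse>, Status> {
        let request = request.into_inner();
        // Errors from the repository are converted into gRPC statuses
        // by the From<Error> for Status implementation in the errors module.
        let stock = self
            .repository
            .add_stock(&request.symbol, &request.name)
            .await?;
        Ok(Response::new(AddStockResponse {
            stock: Some(stock.into()),
        }))
    }

    async fn list_stocks(
        &self,
        _request: Request<ListStocksRequest>,
    ) -> Result<Response<ListStocksResponse>, Status> {
        let stocks = self.repository.list_stocks().await?;
        Ok(Response::new(ListStocksResponse {
            stocks: stocks.into_iter().map(Into::into).collect(),
        }))
    }
}
```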
The RefDataService works with any implementation of the repository. In our case, that will be the Postgres implementation using SQLx.
Don’t forget to register the refdata module in lib.rs.
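With the modules from this post in place, lib.rs looks roughly like this:

```rust
// src/lib.rs -- module layout assumed in this post's sketches.
pub mod errors;
pub mod refdata;
pub mod services;
```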
We’ll use some of these newly created types later on in main.rs, so we need to make sure they are publicly visible. Let’s update refdata.rs to expose what we need.
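For example, refdata.rs could declare the submodules and re-export the types main.rs will use; the re-exported names match the sketches above.

```rust
// src/refdata.rs -- declare the submodules and re-export what main.rs needs.
pub mod models;
pub mod repository;
pub mod service;

pub use repository::PostgresRefDataRepository;
pub use service::RefDataService;
```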
At this point, everything should compile, albeit with some warnings around unused variables and functions.
Running the service
With our refdata service implementation now compiling successfully, it’s time to build and run the gRPC server using tonic to serve the refdata service.
Configuring logging
Proper logging is crucial for debugging and monitoring our service. While a comprehensive setup for structured logging and tracing is beyond this post’s scope, we’ll configure the tracing library for improved log output.
Let’s configure the tracing library for nicer logs and replace the print statement. So that we can control the log level, we’ll also add a call to dotenvy.
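A minimal version of main.rs with just the logging setup might look like this; the exact configuration (default level, formatting) is a matter of taste, and the server wiring comes in the next section.

```rust
// main.rs -- logging setup only; the server wiring comes next.
use tracing_subscriber::EnvFilter;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load environment variables from .env, if the file exists.
    dotenvy::dotenv().ok();

    // Honour RUST_LOG when it is set, and default to info-level logs otherwise.
    tracing_subscriber::fmt()
        .with_env_filter(
            EnvFilter::try_from_default_env().unwrap_or_else(|_| EnvFilter::new("info")),
        )
        .init();

    tracing::info!("refdata service starting");

    Ok(())
}
```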
For dotenvy to work, add a .env file at the same level as the Cargo.toml.
If you’d like to override the log level, just set the RUST_LOG variable in .env.
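For example, a minimal .env might contain nothing more than the default log level:

```
RUST_LOG=info
```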
The tonic service
With that out of the way, the next step is to integrate the refdata service into a tonic-powered gRPC server.
Building the refdata server is straightforward. We’ll also add a reflection service, which we’ll need for grpcui.
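Here is a sketch of the full main.rs; the library crate name (api) and the port number are assumptions, and the reflection builder call matches the tonic-reflection release pinned in the Cargo.toml sketch above.

```rust
// main.rs -- a sketch; the crate name `api` and the port number are assumptions.
use tonic::transport::Server;
use tracing_subscriber::EnvFilter;

use api::refdata::{PostgresRefDataRepository, RefDataService};
use api::services::refdata::ref_data_server::RefDataServer;
use api::services::refdata::FILE_DESCRIPTOR_SET;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    dotenvy::dotenv().ok();
    tracing_subscriber::fmt()
        .with_env_filter(
            EnvFilter::try_from_default_env().unwrap_or_else(|_| EnvFilter::new("info")),
        )
        .init();

    let addr = "0.0.0.0:50051".parse()?;

    // The reflection service lets tools like grpcui discover our endpoints.
    let reflection = tonic_reflection::server::Builder::configure()
        .register_encoded_file_descriptor_set(FILE_DESCRIPTOR_SET)
        .build()?;

    // Wire the gRPC service up to the (still stubbed) Postgres repository.
    let refdata = RefDataService::new(PostgresRefDataRepository::default());

    tracing::info!("refdata gRPC server listening on {}", addr);

    Server::builder()
        .add_service(reflection)
        .add_service(RefDataServer::new(refdata))
        .serve(addr)
        .await?;

    Ok(())
}
```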
With this code in place, you should now be able to run the service.
Using grpcui
Now that our server is up and running, let’s use grpcui to explore its capabilities interactively. This tool provides a graphical interface for testing and inspecting gRPC services, offering a practical way to verify our setup and functionality.
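Assuming the server listens on port 50051 as in the sketch above, you can point grpcui at it like so; the -plaintext flag is needed because we haven’t configured TLS.

```sh
grpcui -plaintext localhost:50051
```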
The port number needs to match the port specified for the tonic service.
grpcui should automatically open in your browser, but if not, it will also print the port where it’s exposed on localhost.
In the UI, you should see a dropdown for the list of services, which only contains RefData for now. Under RefData, you’ll see the methods listed.
Of course, the endpoints will fail, as our repository is just a bunch of todos, but being able to see the endpoints proves that the server is running correctly.
Conclusion
To wrap up, we’ve successfully established a foundational gRPC service for managing stock reference data, integrating essential Rust crates and setting the stage for a robust, type-safe communication layer. Although our endpoints aren’t fully functional yet, we’ve laid the groundwork for subsequent enhancements, which we’ll tackle in the next installment of this series.
The resulting code can be found in the v1-refdata-grpc branch on GitHub.
Next post - Part 3: Repository for the reference data service
David Steiner
I'm a software engineer and architect focusing on performant cloud-native distributed systems.