Using Valkey Glide as a library within a Rust Lambda on AWS? #3072
-
Hello Valkey Glide team! Would you be able to direct me to any documentation on using the inner Rust client as a library within a Rust Lambda on AWS?
-
Hello! 🙂
We currently don't publish our Rust client as a crate. However, if you'd like to try Valkey Glide in Rust, you can use the following example code:
Valkey Glide Rust Benchmark.
You can build and use it locally. If you try it out and find it useful, we’d love to hear your feedback! Also, if you're interested in contributing to the development of the Rust client, let us know; we’d be happy to collaborate.
Let us know how it goes! 🚀
-
I'm new to AWS Lambdas and Valkey, but decided to give this a go because we've been looking into ElastiCache.

Prereqs: Set up Cargo Lambda: https://www.cargo-lambda.info/
Adding glide-core to the project
Compilation Errors
Had to add a
Adding a glide-core client to the Lambda Function

I set the glide client up so that it can be shared among multiple requests:
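The sharing pattern can be sketched std-only. This is a hypothetical illustration, not the real glide client: `SharedResources` here is a stand-in struct, initialized once per Lambda execution environment and then borrowed by every invocation that environment handles.

```rust
use std::sync::OnceLock;

// Hypothetical stand-in for the real glide client; illustration only.
struct SharedResources {
    endpoint: String,
}

// Initialized once per execution environment, then reused across
// all invocations handled by that environment.
static SHARED: OnceLock<SharedResources> = OnceLock::new();

fn shared() -> &'static SharedResources {
    SHARED.get_or_init(|| SharedResources {
        // Placeholder endpoint; a real setup would read this from env vars.
        endpoint: "example.cache.amazonaws.com:6379".to_string(),
    })
}

fn main() {
    // Every call sees the same instance, so an expensive client
    // construction happens at most once per environment.
    assert!(std::ptr::eq(shared(), shared()));
    println!("shared endpoint: {}", shared().endpoint);
}
```

The borrowed-reference pattern in the full listing below achieves the same goal without a global; `OnceLock` is simply one alternative when a `'static` handle is more convenient.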
Sample Service Code (with redactions)
Setting up a Valkey cache in AWS

I'm sure there's more config that you can do, but I followed the instructions here:
You have to wait for the cache to be ready, and you can periodically check it with:
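That wait can also be scripted as a simple poll-with-backoff loop. A std-only sketch, where the `check` closure is a stub standing in for however you query the cache status (CLI or SDK); the names here are illustrative, not a real API:

```rust
use std::thread;
use std::time::Duration;

// Generic poll loop; `check` stands in for querying the cache status
// until it reports "available".
fn poll_until_available(mut check: impl FnMut() -> bool, max_attempts: u32) -> bool {
    for attempt in 0..max_attempts {
        if check() {
            return true;
        }
        // Linear backoff; real code would likely wait seconds, not millis.
        thread::sleep(Duration::from_millis(10 * u64::from(attempt + 1)));
    }
    false
}

fn main() {
    // Stub: the cache becomes "available" on the third check.
    let mut calls = 0;
    let ready = poll_until_available(
        || {
            calls += 1;
            calls >= 3
        },
        10,
    );
    assert!(ready);
    assert_eq!(calls, 3);
}
```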
Once the cache is ready, you should be able to get its endpoint information. Unfortunately, you can't test ElastiCache resources locally, because the endpoints are only accessible from inside an AWS VPC.

Runtime errors

Notice in the code above I have

For libraries, it's typically bad practice to set up logging yourself, specifically because you don't know what or how the main application is going to set it up.

Caveat

I haven't actually been able to test this fully yet. I'm waiting to get IAM permissions so I can actually deploy the lambda, but for now everything is at least compiling.
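The "libraries shouldn't install logging" point can be illustrated std-only. Global logger slots (like the ones behind `log::set_logger` or `tracing`'s global default subscriber) accept exactly one handler per process, so a library that installs one races with, or blocks, the application's choice. A toy model of that, with made-up names:

```rust
use std::sync::OnceLock;

// Toy model of a process-global logger slot: it accepts exactly one
// subscriber, like `log::set_logger` / tracing's global default.
static GLOBAL_LOGGER: OnceLock<&'static str> = OnceLock::new();

fn install_logger(name: &'static str) -> Result<(), &'static str> {
    GLOBAL_LOGGER
        .set(name)
        .map_err(|_| "a global logger is already installed")
}

fn main() {
    // The application (the Lambda binary) installs its subscriber first...
    assert!(install_logger("app-subscriber").is_ok());
    // ...so a library that also tries to install one fails, which is why
    // libraries should only emit log events, never install handlers.
    assert!(install_logger("library-subscriber").is_err());
    assert_eq!(*GLOBAL_LOGGER.get().unwrap(), "app-subscriber");
}
```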
-
@AfterThunk
Please keep us posted once you're able to fully test the deployment; we’d love to hear how it goes.
-
Update:

Changing Logging
Compilation

This page on cargo-lambda actually calls out the cross-compilation issues I ran into:

Deployment

Deployment worked fine with

Connection Issues

By default, ElastiCache is only accessible via VPC. I'm not sure if this is the correct way to handle it, but I ended up adding my lambda to the same VPC as ElastiCache.
Lambda Function URL

I needed my lambda to be publicly accessible, so I used the
Running

At this point, the lambda successfully spins up, I can reach it via the Function URL, and it works:

// main.rs
use lambda_http::{run, service_fn, tracing, Body, Error, Request, Response};
use serde::{Serialize, Deserialize};
struct SharedResources {
pub(crate) glide_client: glide_core::client::Client
}
async fn make_glide_client() -> glide_core::client::Client {
// These variables are set with the lambda deployments.
// https://docs.aws.amazon.com/lambda/latest/dg/configuration-envvars.html#:~:text=To%20set%20environment%20variables%20in,Under%20Environment%20variables%2C%20choose%20Edit.
let address_info = glide_core::client::NodeAddress {
host: std::env::var("GLIDE_HOST_IP").map_err(Box::new).unwrap(),
port: std::env::var("GLIDE_HOST_PORT").map_err(Box::new).unwrap().parse().map_err(Box::new).unwrap()
};
let connection_request = glide_core::client::ConnectionRequest {
addresses: vec![address_info],
cluster_mode_enabled: false,
request_timeout: std::env::var("GLIDE_REQUEST_TIMEOUT").ok().and_then(|v| v.parse::<u32>().ok()),
tls_mode: Some(glide_core::client::TlsMode::SecureTls),
..Default::default()
};
glide_core::client::Client::new(connection_request, None).await.unwrap()
}
type LambdaResult<T> = Result<T, lambda_http::Error>;
#[tokio::main]
async fn main() -> LambdaResult<()> {
lambda_http::tracing::init_default_subscriber();
    let sdk_config = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
    let shared_resources = SharedResources {
        glide_client: make_glide_client().await
    };
let shared_resources_ref = &shared_resources;
let handler = move |event: lambda_http::Request| async move {
function_handler(shared_resources_ref, event).await
};
lambda_http::run(lambda_http::service_fn(handler)).await
}
struct StringErr {
value: String
}
impl StringErr {
pub fn boxed(value: &str) -> Box<StringErr> {
Box::new(StringErr {
value: value.to_string()
})
}
}
impl std::fmt::Debug for StringErr {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.value)
}
}
impl std::fmt::Display for StringErr {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.value)
    }
}
// Needed so Box<StringErr> can convert into lambda_http::Error
// (Box<dyn std::error::Error + Send + Sync>).
impl std::error::Error for StringErr {}
///////////////////////////////////////////////////////////////////////////////
/// Request / Result types
///////////////////////////////////////////////////////////////////////////////
#[derive(Deserialize, Debug)]
enum RequestType {
GetStatus(GetStatusRequest),
UpdateStatus(UpdateStatusRequest)
}
#[derive(Serialize, Deserialize, Debug)]
enum ResultType {
GetStatus(GetStatusResult),
UpdateStatus(UpdateStatusResult)
}
trait RequestHandler {
async fn handle(self, shared_resources: &crate::SharedResources) -> LambdaResult<ResultType>;
}
impl RequestHandler for RequestType {
async fn handle(self, shared_resources: &crate::SharedResources) -> LambdaResult<ResultType> {
match self {
Self::GetStatus(request) => request.handle(shared_resources).await,
Self::UpdateStatus(request) => request.handle(shared_resources).await
}
}
}
///////////////////////////////////////////////////////////////////////////////
/// Update Ticket status
///////////////////////////////////////////////////////////////////////////////
#[derive(Deserialize, Debug)]
struct UpdateStatusRequest {
ticket: Ticket
}
#[derive(Deserialize, Serialize, Debug)]
struct UpdateStatusResult {}
impl RequestHandler for UpdateStatusRequest {
async fn handle(self, shared_resources: &crate::SharedResources) -> LambdaResult<ResultType> {
let mut cmd = redis::Cmd::new();
cmd.arg("SET")
.arg(self.ticket.ticket_id.as_ref().unwrap())
.arg(serde_json::to_string(&self.ticket).map_err(Box::new)?);
let _ = shared_resources.glide_client
.clone()
.send_command(&cmd, None)
.await
.map_err(Box::new)?;
Ok(ResultType::UpdateStatus(UpdateStatusResult {}))
}
}
///////////////////////////////////////////////////////////////////////////////
/// Get Ticket Status
///////////////////////////////////////////////////////////////////////////////
#[derive(Deserialize, Debug)]
struct GetStatusRequest {
ticket_id: String
}
#[derive(Deserialize, Serialize, Debug)]
struct GetStatusResult {
ticket: Ticket
}
impl RequestHandler for GetStatusRequest {
async fn handle(self, shared_resources: &crate::SharedResources) -> LambdaResult<ResultType> {
let mut cmd = redis::Cmd::new();
cmd.arg("GET")
.arg(self.ticket_id);
let result = shared_resources.glide_client
.clone()
.send_command(&cmd, None)
.await
.map_err(Box::new)?;
let ticket : Ticket = match result {
redis::Value::SimpleString(val) => serde_json::from_str(&val).map_err(Box::new)?,
redis::Value::BulkString(val) => serde_json::from_slice(&val).map_err(Box::new)?,
_ => return Err(StringErr::boxed("Invalid result from Valkey!"))
};
Ok(ResultType::GetStatus(GetStatusResult { ticket }))
}
}
///////////////////////////////////////////////////////////////////////////////
/// Main Function Handler
///////////////////////////////////////////////////////////////////////////////
async fn function_handler(shared_resources: &crate::SharedResources, event: Request) -> LambdaResult<Response<Body>> {
tracing::debug!("function_handler: invoked!");
let request : RequestType = match event.body() {
lambda_http::Body::Empty => Err(StringErr::boxed("Requests cannot be empty!"))?,
lambda_http::Body::Text(val) => serde_json::from_str(&val)?,
lambda_http::Body::Binary(val) => serde_json::from_slice(&val)?
};
    tracing::debug!("function_handler: request: {request:?}");
// TODO: Bailing with an error causes a 502 Bad Gateway to be received by the client.
let result = request.handle(shared_resources).await?;
tracing::debug!("function_handler: result: {result:?}");
let resp = Response::builder()
.status(200)
.header("content-type", "application/json")
.body(serde_json::to_string(&result).map_err(Box::new)?.into())
.map_err(Box::new)?;
Ok(resp)
}
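One rough edge in the listing: make_glide_client unwraps every environment variable, so a missing or malformed value panics the whole runtime at cold start. A std-only sketch of the same lookups with explicit fallbacks instead of panics (the variable names are reused from the code above; the helper name is made up):

```rust
use std::env;

// Fall back to a default when the variable is unset or unparsable,
// instead of panicking like the `.unwrap()` chain above.
fn env_or<T: std::str::FromStr>(var: &str, default: T) -> T {
    env::var(var).ok().and_then(|v| v.parse().ok()).unwrap_or(default)
}

fn main() {
    env::set_var("GLIDE_HOST_PORT", "6380");
    env::remove_var("GLIDE_REQUEST_TIMEOUT");

    let port: u16 = env_or("GLIDE_HOST_PORT", 6379);
    let timeout_ms: u32 = env_or("GLIDE_REQUEST_TIMEOUT", 250);

    assert_eq!(port, 6380);
    assert_eq!(timeout_ms, 250);
}
```

Whether defaulting or failing fast is right depends on the deployment; for a required endpoint like GLIDE_HOST_IP you may still prefer an explicit error over a default.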
-
@AfterThunk 👏🏽