Over the past few years, I've been using queues more and more in my applications. While this is a concept that many developers learn early on, for whatever reason I just didn't. I had heard about queues but never really understood where I would use them and what value they would give.
I started using queues in my clients' applications when I ran into issues looping through datasets to send transactional emails. I'd query for a list of customers to send scheduled emails to, and partway through the loop something would throw an error. This would inevitably crash the program, and I'd have to restart the loop with an offset so as not to duplicate the emails that had already gone out before the crash.
for (const customer of customers) {
  await sendReminderEmail(customer); // 1, 2, 3 are good but 4 crashes 💥
}
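To make that failure mode concrete, here's a minimal sketch of the crash-and-resume dance. Everything here is made up for illustration: the helper names, the fake sender, and the simulated failure on the fourth customer.

```javascript
// sendAll loops over customers starting from an offset; a single throw
// aborts the rest of the batch, which is the fragility described above.
async function sendAll(customers, sendEmail, startAt = 0) {
  for (const customer of customers.slice(startAt)) {
    await sendEmail(customer);
  }
}

async function demo() {
  const customers = ["a", "b", "c", "d", "e"];
  const delivered = [];
  let failOnce = true;

  // a fake email sender that fails once, on the 4th customer
  const sendEmail = async (customer) => {
    if (customer === "d" && failOnce) {
      failOnce = false;
      throw new Error("SMTP timeout"); // 1, 2, 3 are good but 4 crashes
    }
    delivered.push(customer);
  };

  try {
    await sendAll(customers, sendEmail);
  } catch (err) {
    // "restart the loop by offsetting": skip the ones already delivered
    await sendAll(customers, sendEmail, delivered.length);
  }
  return delivered;
}

demo().then((delivered) => console.log(delivered)); // each customer exactly once
```

That manual offset bookkeeping is exactly the chore a queue makes unnecessary.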
My life got infinitely better when I figured out that I could use queues to process these emails. Instead of relying on a single fragile loop to process everything, I could set up a dedicated server to manage the many aspects of the job. If the initial attempt at sending an email failed, it would be retried automatically. If something threw an error, it wouldn't impact the rest of the jobs.
emailQueue.process(async (job, done) => {
  await sendReminderEmail(job.data);
  // 1, 2, 3 are good but 4 crashes, and it's no big deal 😄
  done();
});
Typical Queues in Node
My experience with queues has mostly been in a traditional long-running server environment with Node.js. In an Express-style application, it's fairly easy to set up a job queue and process those jobs.
In my case, I've mostly used a library called Bull along with Redis. With these, setting up a job queue looks something like this:
# install the bull package
npm install bull
# run the redis server locally (install it if necessary)
redis-server
import Queue from "bull";
export const emailQueue = new Queue("emailQueue");
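One thing worth noting: by default, a Bull job only gets a single attempt, so the automatic retries mentioned earlier come from job options like attempts and backoff. Here's a sketch of what those defaults might look like; the specific values are illustrative, not a recommendation.

```javascript
// Illustrative Bull job options: `attempts` plus `backoff` is what gives
// failed emails their automatic retries.
const defaultJobOptions = {
  attempts: 3, // run a failed send up to 3 times in total
  backoff: { type: "exponential", delay: 5000 }, // wait 5s, then 10s, then 20s
  removeOnComplete: true, // don't let finished jobs pile up in Redis
};

// With Bull, these defaults attach at queue creation:
// export const emailQueue = new Queue("emailQueue", { defaultJobOptions });
```

Individual jobs can also override these options when they're added to the queue.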
The above emailQueue defines a queue that can be operated on. We can add jobs to it and then process those jobs in any way we like.
For example, if we wanted to queue up an email to go to a customer after some data is saved, we could do so in a route handler or resolver.
import { emailQueue } from "./queues";

export const createInvoice = async ({ input }) => {
  // create the invoice
  const invoice = await db.invoice.create({
    data: {
      // ... invoice data stuff
    },
  });

  // do some other stuff

  // once all of that is done, queue up an email to the customer
  await emailQueue.add(invoice.customer);

  return invoice;
};
Adding a job to the queue essentially pushes a record to our Redis server. That record then needs to be processed somehow, and it's in the processing of the job that the actual email sending happens.
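The add/process contract is easy to see with a toy in-memory stand-in. This is not Bull, just the shape of the relationship: whatever we pass to add shows up as job.data in the processor.

```javascript
// A toy queue: add() either hands the job to a registered processor or
// parks it in a backlog until process() registers one.
class TinyQueue {
  constructor() {
    this.handler = null;
    this.backlog = [];
  }
  add(data) {
    const job = { data }; // the "record" that would live in Redis
    if (this.handler) this.handler(job);
    else this.backlog.push(job);
  }
  process(handler) {
    this.handler = handler;
    this.backlog.forEach((job) => handler(job)); // drain anything queued early
    this.backlog = [];
  }
}

const q = new TinyQueue();
const sent = [];

q.add({ email: "a@example.com" }); // queued before any processor exists
q.process((job) => sent.push(job.data.email)); // registering drains the backlog
q.add({ email: "b@example.com" }); // processed as soon as it's added

console.log(sent); // → [ 'a@example.com', 'b@example.com' ]
```

A real queue adds persistence, retries, and concurrency on top of this, but the producer/consumer shape is the same.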
In a typical long-running Node server with something like a server.js file, this might look like the following:
// server.js
import express from 'express';
import { emailQueue } from './queues';
import { sendCustomerEmail } from './util';

const app = express();

// set up app stuff
// ...

// process the queued jobs
emailQueue.process(async (job, done) => {
  await sendCustomerEmail(job.data);
  done();
});
The emailQueue.process handler sits there, always listening for jobs to process. In this setup, a job is processed as soon as it's added to the queue. That might be appropriate for your situation, but it might not: perhaps you want to send the emails in a batch at a certain time. In that case, you might prefer something like a cron job to schedule queue processing at predefined times.
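A cron job is one option; Bull also supports a delay job option that holds a job until a future time. As a sketch, a hypothetical helper could compute the delay until the next send window. The 9am cutoff and the helper name are made up for illustration.

```javascript
// Hypothetical helper: milliseconds from `now` until the next 9:00am local time.
function msUntilNextNineAm(now = new Date()) {
  const next = new Date(now);
  next.setHours(9, 0, 0, 0);
  if (next <= now) next.setDate(next.getDate() + 1); // 9am already passed today
  return next.getTime() - now.getTime();
}

// With Bull, the delay would attach when the job is added:
// await emailQueue.add(invoice.customer, { delay: msUntilNextNineAm() });

console.log(msUntilNextNineAm(new Date("2024-01-01T08:00:00"))); // → 3600000 (one hour)
```

Delayed jobs still go out one at a time, though; for a true batch you'd want the cron-style approach.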
Queues in RedwoodJS
I recently wanted to bring this same setup to a RedwoodJS app that I'm building for a client. Since there isn't the typical server.js file we'd find in an Express-like Node application, I wasn't sure how to set things up. Redwood provides the structure to get CRUD resolvers with simple commands, but nothing in the api directory looked appropriate for processing the queued jobs.
After chatting with some other RedwoodJS users, I discovered something I didn't realize existed in Redwood: the yarn rw exec command for running arbitrary scripts.
These scripts can be both one-off and long-running. Since I'm using Render for hosting, I can use a long-running script that listens for queued jobs to process. Perfect!
Extending the above example, we can move this to a RedwoodJS script.
In the scripts directory, we can create a new file called emailQueue.
yarn rw g script emailQueue
Once the script is created, we can move the job processing logic there.
import { emailQueue } from "../api/src/lib/queue";
// this path assumes the email helper lives in the api side's lib directory
import { sendCustomerEmail } from "../api/src/lib/util";

export default async () => {
  emailQueue.process(async (job, done) => {
    // send the email
    await sendCustomerEmail(job.data);
    done();
  });
};
This is pretty much what we had before, but the file now has a single purpose: processing the queued jobs. We're no longer mixing that logic into a larger file like we would with a server.js file in a traditional Node app.
The queue instance now lives in the api directory. We don't have to put it there, but it's a decent place for it since it will be imported by our services so that jobs can be added to the queue.
The script is in place but now we need to execute it. We do this with a simple Redwood CLI command:
yarn rw exec emailQueue
When we exec this script, it will sit there and listen for queued jobs to process, which is exactly what we need.
Running the Script in Deployments
Running the script to process jobs is easy enough in development using the yarn rw exec command. But what about when we deploy our apps?
Turns out this is fairly simple as well. We just need to adjust the deployment command.
For example, in the settings for my app hosted with Render, I just change the start command from this:
yarn rw deploy render api
To this:
yarn rw deploy render api && yarn rw exec emailQueue
If you've got the funds for multiple instances in your deployment environment, it's probably wise to have a dedicated service for your queue processing. If your queue script crashes, you don't want it affecting the rest of your app.
In this scenario, you may want to create a new instance and point your RedwoodJS app to it. Then the start command would just be the stuff to get the script running:
yarn rw exec emailQueue
Wrapping Up
RedwoodJS allows us to have long-running processes through executable scripts. Without much effort at all, we can get the same kind of behavior we're used to from a traditional long-running Node app. This setup is perfect for working with queues in a RedwoodJS app.