Using Node to sling high volume requests into Redis as Resque jobs

A client of ours has a Rails application that needs to consume hundreds of requests per second from a third-party service, with the flexibility to scale up to thousands of requests per second. Rails just isn't well suited to handling such a high volume of requests.

Node.js, on the other hand, is a great tool for doing a simple task very fast and at high volume, and it scales easily to boot.

Beyond simply being consumed, each request needs to be processed individually, and at that volume a queue was necessary. Resque was an easy choice.

The Node Server

Disclaimer: The following code is simplified for brevity and is not production ready.

Setup modules, Redis, the Resque queue name, and the Resque worker class name:
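A sketch, assuming the classic node_redis ("redis") module; the queue name, worker class name, and connection details below are placeholders.

```javascript
var http  = require('http'),
    redis = require('redis');

// Connect to a local Redis instance (classic node_redis API).
var redis_client = redis.createClient(6379, '127.0.0.1');

var queue        = 'requests',          // Resque queue name (placeholder)
    worker_class = 'RequestProcessor',  // Ruby class Resque will run (placeholder)
    job_queue    = [];                  // in-memory buffer of pending jobs
```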

Consume the request:
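A minimal HTTP handler will do; the port and the choice to buffer the raw request body are assumptions here.

```javascript
http.createServer(function (req, res) {
  var body = '';

  req.on('data', function (chunk) {
    body += chunk;
  });

  req.on('end', function () {
    // Buffer the payload; a timer flushes the buffer to Redis once per second.
    job_queue.push(body);
    res.writeHead(200);
    res.end();
  });
}).listen(8000);
```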

Translate them into a format Resque understands:
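Resque expects each job to be a JSON payload of the form { "class": ..., "args": [...] }; a small helper handles the translation (passing the raw payload as the single argument is an assumption).

```javascript
function resque_job(payload) {
  return JSON.stringify({
    class: worker_class,
    args: [payload]
  });
}
```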

Send them to the Redis store:
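A sketch of the flush loop; Resque reads jobs from the resque:queue:&lt;name&gt; list and discovers queues via the resque:queues set.

```javascript
setInterval(function () {
  var count = job_queue.length;
  if (count === 0) { return; }

  // Splice off only the jobs we counted; anything that arrives between
  // these two lines stays in the buffer for the next batch.
  var jobs = job_queue.splice(0, count);

  // Register the queue, then push the batch.
  redis_client.sadd('resque:queues', queue);
  jobs.forEach(function (payload) {
    redis_client.rpush('resque:queue:' + queue, resque_job(payload));
  });
}, 1000);
```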

A batch of queued jobs will get sent to Redis once every second.

Take note that job_queue is spliced rather than cleared. Between count = job_queue.length; and job_queue.splice( 0, count ); it's possible, and at this volume likely, that another request will sneak into the queue, so it's important to splice the counted jobs off the front of the queue instead of clearing/resetting it.

Resque workers pick up the jobs from there.

A Demonstration

Run the following locally:
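The filenames and queue name here are assumptions; adjust them to match your setup.

```sh
# each in its own terminal
redis-server
node server.js
QUEUE=requests rake resque:work   # from the Rails app directory
```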

Then open Chrome, navigate to the Node server's URL, and paste the following into the Developer Tools console:
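Something like this; the request count and payload are arbitrary.

```javascript
// Fire a burst of requests at the Node server.
for (var i = 0; i < 500; i++) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/', true);
  xhr.send(JSON.stringify({ id: i, message: 'hello, resque' }));
}
```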

You'll see the records get slung into Redis, ready for those hungry Resque workers!

That's it. Cheers!

Props

Sam Breed gets the credit for whipping up the actual production Node server. And special thanks to Ryan Cook for a jump start on the Node code that builds jobs in a format Resque understands.