Hello World! This is the beginning of a two-part series on "How to make your REST APIs blazing fast 🚀". It is based on my personal experience and projects I've built.

Some time ago, I was working on a marketplace platform where users could list their products for sale. The home page would load a bunch of products, and along with each product's own data, it would also load stats, previous sale history, recent listings, etc. We also let users sort, filter, and perform other actions right on the page without reloading or re-fetching, for a snappy experience. But this came at a cost. To send all this data, the API had to run a bunch of calculations, which ended up taking a few hundred milliseconds, typically 200-400ms, and worse during high traffic. So we started looking into ways to improve this. This series covers those methods.

Part 1: A simple caching strategy for Node REST APIs

Part 2: Cache invalidation 😭

So let's jump right into Part 1

Here is the endpoint that we will work on. It simply takes in some query parameters, fetches data from the database, processes it, and returns a JSON response.



```javascript
// products/routes.js
router.get(
  '/',
  processQuery,
  productsController.index,
  responseHandler
)
```
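For context, here is a minimal sketch of what those middlewares might do. The names (`processQuery`, `productsController.index`, `responseHandler`) come from the route above, but the bodies are my assumptions; the important convention is that the controller puts its result on `res.locals.data` and the final handler sends it.

```javascript
// Hypothetical implementations of the route's middlewares (assumptions,
// not the platform's actual code). The key convention: the controller
// stores its result on res.locals.data, and responseHandler sends it.

// Parses/validates query params before the controller runs.
function processQuery (req, res, next) {
  req.query.limit = Number(req.query.limit) || 20
  return next()
}

const productsController = {
  // Would normally query the database; here it returns stubbed products.
  index (req, res, next) {
    const products = [{ id: 1, name: 'Widget' }]
    res.locals.data = { products: products.slice(0, req.query.limit) }
    return next()
  }
}

// Sends whatever the controller left on res.locals.data.
function responseHandler (req, res) {
  return res.status(200).json(res.locals.data)
}

module.exports = { processQuery, productsController, responseHandler }
```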

Okay, now let's add some cache 💸!

For this example, we will use node-cache. We will keep it in a single file, so it can easily be replaced by any other cache store by changing just a few lines.

First of all, install the node-cache package.

```shell
$ npm install node-cache --save
```

We will create a cache middleware that can easily be used with any endpoint we want. This is what the middleware looks like:



```javascript
// middlewares/cache.js
const NodeCache = require('node-cache')

// stdTTL: time to live in seconds for every generated cache element.
const cache = new NodeCache({ stdTTL: 5 * 60 })

function getUrlFromRequest (req) {
  const url = req.protocol + '://' + req.headers.host + req.originalUrl
  return url
}

function set (req, res, next) {
  const url = getUrlFromRequest(req)
  cache.set(url, res.locals.data)
  return next()
}

function get (req, res, next) {
  const url = getUrlFromRequest(req)
  const content = cache.get(url)
  if (content) {
    return res.status(200).send(content)
  }
  return next()
}

module.exports = { get, set }
```

Let's go over the functions one by one.

getUrlFromRequest takes the request and returns the complete request URL.

We use this URL as the unique KEY for our cache. set saves our processed response ( res.locals.data ) to the cache with the complete URL as the KEY. get uses the URL as the KEY to retrieve the previously cached response; if it finds the data, it sends it back as the response, otherwise the request is forwarded to the next middleware.
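The key-building and get/set flow can be sketched outside of Express with a plain Map standing in for node-cache (this snippet is a dependency-free illustration of the idea, not the middleware itself; the request object here is a hand-made stub):

```javascript
// Sketch of the cache-key convention and the get/set flow,
// using a plain Map as a dependency-free stand-in for node-cache.
const cache = new Map()

function getUrlFromRequest (req) {
  // Same key convention as the middleware: protocol + host + path + query.
  return req.protocol + '://' + req.headers.host + req.originalUrl
}

// A stubbed request, as Express would hand it to the middleware.
const req = {
  protocol: 'https',
  headers: { host: 'api.example.com' },
  originalUrl: '/products?sort=price'
}

const key = getUrlFromRequest(req)
// key === 'https://api.example.com/products?sort=price'

// First request: cache miss, so we compute and store the response.
if (!cache.has(key)) {
  cache.set(key, { products: [] }) // expensive work happens only here
}

// Any later request with the same URL: cache hit, no recomputation.
const hit = cache.get(key)
```

Note that because the full URL (including the query string) is the key, each distinct combination of filters and sort options gets its own cache entry.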

Our cache middleware is ready! Let's plug it in with our product route.



```javascript
// products/routes.js
const cache = require('./cache-middleware') // 👈 import our cache middleware

router.get(
  '/',
  cache.get, // 👈
  processQuery,
  productsController.index,
  cache.set, // 👈
  responseHandler
)
```

That's all, our endpoint is already faster! But how 😯??

We have added our two middlewares, get and set, to the route. When a new request comes in, it first goes through cache.get. Since we don't have anything in the cache yet, the request passes down to the next middlewares and arrives at cache.set, which saves the response in the cache for the next 5 minutes.

Any request that comes in during the next 5 minutes will retrieve this cached response from cache.get and immediately return it to the user. No calculations are done. The database isn't touched.
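The 5-minute window comes from node-cache's stdTTL option. Here is a minimal sketch of how such a TTL works, using my own simplified cache with an injectable clock (so expiry can be demonstrated without actually waiting; this is not node-cache's internals):

```javascript
// Simplified TTL cache illustrating the stdTTL behavior.
// `now` is injectable so expiry can be shown without real waiting.
function createTtlCache (ttlSeconds, now = () => Date.now()) {
  const store = new Map()
  return {
    set (key, value) {
      store.set(key, { value, expiresAt: now() + ttlSeconds * 1000 })
    },
    get (key) {
      const entry = store.get(key)
      if (!entry) return undefined
      if (now() > entry.expiresAt) { // entry is stale: drop it, report a miss
        store.delete(key)
        return undefined
      }
      return entry.value // fresh entry: hit
    }
  }
}

// Fake clock we can advance by hand.
let clock = 0
const cache = createTtlCache(5 * 60, () => clock)

cache.set('/products', { products: [] })
const withinTtl = cache.get('/products') // hit: 0s elapsed

clock += 6 * 60 * 1000                   // jump 6 minutes ahead
const afterTtl = cache.get('/products')  // miss: entry expired
```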

By doing this we were able to bring down our response time to just a few milliseconds 🎉.

But, yes, this is not the final solution; there are issues with this approach. Users on the site won't get real-time data: what they see can be up to 5 minutes old. While this approach may work for some use cases, it was not acceptable for us, since our users needed real-time data. So we had to look into this more. We had to look into cache invalidation 😈, which we will talk about in the next part. 👋

Follow me on Twitter | Github, I build and post cool stuff. 👨‍💻