Ice Fog Blog

Caching JSON objects in Node.js with Redis

Node.js is already fast, but adding the power, speed, and flexibility of Redis can take your apps to the next level.

There are all sorts of caching strategies you can implement to improve performance in your apps. This post is going to look at caching JSON objects in a basic Redis store without loading anything beyond the basic Redis client. While I don't have anything against modules, my preference is to keep things as vanilla as possible. Redis can store several types of values out of the box, but JSON isn't one of them, so the trick is to serialize on the way in and parse on the way out.
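Here's the idea in miniature; a minimal sketch, assuming a connected node_redis client named client:

    // Redis stores strings, so serialize the object before setting it...
    client.set('user/42', JSON.stringify({ id: 42, name: 'Ada' }));
    
    // ...and parse it back into an object after getting it.
    client.get('user/42', (err, reply) => {
        const user = JSON.parse(reply); // { id: 42, name: 'Ada' }
    });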

Node.js is already fast, my database is already fast, why cache?

Many web apps now run APIs in the backend, whether as a single service or as microservices, and speed is key to a great user experience. While Node can process a lot of data in a hurry and databases are fast, latency is still a killer. An API call will often grab data from a few tables, combine it into a single response, and return that. If you're lucky, you can do all of that in a single database select, but not always. If we can do that work once, or even once an hour (or whatever timeframe is appropriate), and serve the cached result in between, that's going to speed things up.

In the following code sample, I have wrapped the Redis-specific code inside a single utility to hide the caching implementation away from my other files and ensure that if I decide to switch from Redis to any other platform, I only need to change a single file. The code assumes you have already installed the redis package using npm.
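For reference, the config module it requires looks roughly like this; the values here are placeholders for your own environment:

    // config/index.js (placeholder values)
    module.exports = {
        cache: {
            host: '127.0.0.1',
            port: 6379,
            expiration: 3600 // seconds before cached keys expire
        }
    };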

cache.js

    const redis = require('redis');
    const config = require('../../config').cache;
    
    // One shared client for the whole app.
    const client = redis.createClient(config.port, config.host);
    const DEFAULT_EXPIRATION = config.expiration;
    
    async function getValue(key) {
        // Never let a cache failure break the request; treat errors as a miss.
        return pullFromCache(key)
            .catch(err => { console.error(err); return undefined; });
    }
    
    function pullFromCache(key) {
        // node_redis uses callbacks, so wrap the call in a Promise
        // to make it usable with async/await.
        return new Promise((resolve, reject) => {
            client.get(key, (err, reply) => {
                if (err)
                    return reject(err);
    
                // JSON.parse(null) returns null, so a cache miss resolves to null.
                resolve(JSON.parse(reply));
            });
        });
    }
    
    function setValue(key, value) {
        // Store the object as a JSON string and give it a TTL.
        client.set(key, JSON.stringify(value), (err, reply) => { });
        client.expire(key, DEFAULT_EXPIRATION);
    }
    
    function deleteValue(key) {
        client.del(key, function (err, reply) {
            console.log(`DELETED CACHE KEY ${key}`);
        });
    }
    
    module.exports = { getValue, setValue, deleteValue };
  

This code is mostly self-explanatory, but the one thing I'd like to point out is that pullFromCache wraps client.get() in a Promise. The Node Redis library is callback-based, and I prefer async/await for asynchronous calls, so the wrapper bridges the two.
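As an aside, if you'd rather not hand-roll that wrapper, Node's built-in util.promisify can produce the same thing; a minimal sketch, assuming the same callback-based client from cache.js:

    const { promisify } = require('util');
    
    // Bind to the client so the library's internal `this` still works.
    const getAsync = promisify(client.get).bind(client);
    
    async function pullFromCache(key) {
        return JSON.parse(await getAsync(key));
    }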

The following code shows how to set and get from the cache.

user.js

    const database = require('../database'); // this is just knex
    const cache = require('../util/cache');  // the cache wrapper from above
    
    async function getUser(req) {
        const cacheKey = `user/${req.params.id}`;
        const cacheVal = await cache.getValue(cacheKey);
    
        // Cache hit: skip the database entirely.
        if (cacheVal)
            return cacheVal;
    
        // Cache miss: load from the database and prime the cache for next time.
        const user = await database("user").where('id', req.params.id).first();
    
        if (!user)
            throw new Error("User not found");
    
        cache.setValue(cacheKey, user);
        return user;
    }
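One last thing: the deleteValue export hasn't been used yet. When the underlying row changes, the cached copy should be invalidated, along these lines (updateUser and the request shape are hypothetical):

    async function updateUser(req) {
        const cacheKey = `user/${req.params.id}`;
    
        // Persist the change first, then drop the stale cache entry so the
        // next getUser call repopulates it with fresh data.
        await database("user").where('id', req.params.id).update(req.body);
        cache.deleteValue(cacheKey);
    }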
  

I'm hoping these code snippets can help somebody else like me who wants blazing fast APIs without tons of extra packages in their projects.