Scaling WebSocket Applications with Redis Pub/Sub for Horizontal Deployment

WebSockets enable persistent real-time connections between clients and servers — but scaling them across multiple instances introduces complexity. In this article, we’ll explore how to use Redis Pub/Sub to broadcast messages between WebSocket servers in a horizontally scaled Node.js environment.

Why Redis Pub/Sub?

When multiple server instances sit behind a load balancer, a message sent by a client connected to instance A will never reach clients connected to instance B unless the instances share a message bus. Redis Pub/Sub provides that bus: every instance publishes incoming messages to a shared channel and rebroadcasts whatever arrives on that channel to its own local clients, so all instances stay in sync in real time.

Step 1: Install Dependencies

npm install ws redis uuid

Step 2: Create Redis Clients

Each instance will use two Redis connections: one client to publish messages and a dedicated client to subscribe to the shared channel (a connection in subscriber mode cannot issue regular commands, so it cannot double as the publisher).

// redis.js
const { createClient } = require('redis');

// Both clients default to redis://localhost:6379; every instance must point at the same Redis server.
// A connection in subscriber mode cannot publish, which is why each instance gets a dedicated subscriber.
const pub = createClient();
const sub = createClient();

pub.on('error', (err) => console.error('Redis pub error:', err));
sub.on('error', (err) => console.error('Redis sub error:', err));

// connect() is asynchronous; fail fast if Redis is unreachable at startup.
Promise.all([pub.connect(), sub.connect()]).catch((err) => {
  console.error('Failed to connect to Redis:', err);
  process.exit(1);
});

module.exports = { pub, sub };
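The defaults above connect to Redis on localhost:6379. In a multi-machine deployment every instance must point at the same Redis server, so it helps to make the connection string configurable. A minimal sketch, assuming a REDIS_URL environment variable (the variable name is just a convention, not something the redis package requires); the rest of the file stays the same:

// redis.js (configurable variant, sketch)
const { createClient } = require('redis');

// e.g. REDIS_URL=redis://redis.internal:6379
const url = process.env.REDIS_URL || 'redis://localhost:6379';

const pub = createClient({ url });
const sub = createClient({ url });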

Step 3: WebSocket Server with Redis Sync

// server.js
const WebSocket = require('ws');
const { v4: uuidv4 } = require('uuid');
const { pub, sub } = require('./redis');

const wss = new WebSocket.Server({ port: process.env.PORT || 3000 });
const clients = new Map(); // sockets connected to this instance, keyed by connection id

// Fan out every message published on the channel (by this instance or any other)
// to the clients connected to this instance.
sub.subscribe('chat-channel', (message) => {
  for (const ws of clients.values()) {
    if (ws.readyState === WebSocket.OPEN) {
      ws.send(message); // the payload was validated as JSON before it was published
    }
  }
});

wss.on('connection', (ws) => {
  const id = uuidv4();
  clients.set(id, ws);

  // Publish incoming messages to Redis instead of sending them directly;
  // every instance (including this one) rebroadcasts them to its own clients.
  ws.on('message', (message) => {
    try {
      const data = JSON.parse(message); // validate before broadcasting
      pub.publish('chat-channel', JSON.stringify(data));
    } catch (err) {
      // Ignore malformed payloads rather than let an uncaught exception crash the process.
    }
  });

  ws.on('close', () => {
    clients.delete(id);
  });
});

console.log('WebSocket server with Redis Pub/Sub running.');
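In production, an instance taken out of rotation should also stop accepting connections and close its Redis clients cleanly so pending publishes are flushed. A minimal shutdown sketch, assuming the wss, clients, pub, and sub objects defined above:

// graceful shutdown (optional sketch)
process.on('SIGTERM', async () => {
  wss.close();                                   // stop accepting new WebSocket connections
  for (const ws of clients.values()) ws.close(); // close existing connections
  await sub.unsubscribe('chat-channel');
  await Promise.all([pub.quit(), sub.quit()]);   // flush pending commands, then disconnect
  process.exit(0);
});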

Step 4: Testing with Multiple Instances

You can run multiple instances of the server on different ports:

PORT=3000 node server.js
PORT=3001 node server.js
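With both instances up (and sharing the same Redis), a quick way to confirm that messages cross instances is a small script that connects one client to each port and sends through the first. A sketch, with a hypothetical file name test-client.js:

// test-client.js (sketch)
const WebSocket = require('ws');

const a = new WebSocket('ws://localhost:3000');
const b = new WebSocket('ws://localhost:3001');

b.on('message', (msg) => console.log('received via instance B:', msg.toString()));

// Send only once both clients are connected, so B does not miss the broadcast.
let open = 0;
const maybeSend = () => {
  open += 1;
  if (open === 2) a.send(JSON.stringify({ user: 'test', text: 'hello across instances' }));
};
a.on('open', maybeSend);
b.on('open', maybeSend);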

Then use a load balancer (e.g., Nginx or HAProxy) to distribute clients across instances. The proxy must pass the HTTP Upgrade handshake through to the backend, otherwise WebSocket connections will fail to establish.
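For example, a minimal Nginx sketch (the upstream name ws_backend and the ports are just the illustrative values from above):

# nginx.conf (sketch)
upstream ws_backend {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 80;

    location / {
        proxy_pass http://ws_backend;
        proxy_http_version 1.1;                     # required for the Upgrade handshake
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 3600s;                   # keep idle WebSocket connections open
    }
}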

Benefits of This Approach

  • Scalable: connection load is spread across instances, so capacity grows by adding machines instead of resizing a single server.
  • Decoupled: each instance operates independently but stays in sync through the shared channel.
  • Performant: Redis Pub/Sub adds only a low-latency hop between nodes.

Limitations

This works well for simple messaging, but Redis Pub/Sub is fire-and-forget: messages are not persisted, and a subscriber that is disconnected (or falls behind) simply misses them. For delivery guarantees and message history, look at Redis Streams, Kafka, or another persistent queue; a brief sketch of the Streams model follows below.
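For a sense of the difference, Redis Streams append each message to a log that consumers can replay from any position, instead of only receiving what arrives while they are connected. A quick redis-cli sketch, with an illustrative stream name chat-stream:

# append a message to the log (returns the generated entry id)
XADD chat-stream * payload '{"user":"alice","text":"hi"}'

# block waiting for entries newer than the latest ($); use an explicit id (e.g. 0) to replay history
XREAD BLOCK 0 STREAMS chat-stream $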

Conclusion

Scaling WebSocket servers with Redis Pub/Sub is a powerful strategy to support real-time applications across distributed infrastructure. It keeps your messaging layer fast and synchronized with minimal effort.

If this post helped you, consider supporting me: buymeacoffee.com/hexshift