Let’s be real: streaming RTSP in the browser is like trying to fit a square peg into a round hole. RTSP streams are like the cool kids at the party that browsers just don’t invite. You can either use HLS, which works but adds a bit of a delay (15–60 seconds, anyone?), or you can get fancy and use WebRTC, which, spoiler alert, is way better for real-time performance.

In this post, I’m going to walk you through how we went from near-zero latency in our local prototype to 60-second delays in production and back again (with WebRTC, of course). Plus, how you can avoid the pain I went through, so you don’t end up pulling your hair out in frustration.


Table of Contents

  1. Why RTSP Doesn't Just Work in the Browser
  2. My First Attempt: RTSP → HLS with FFmpeg & hls.js
  3. The Latency Nightmare: AI + HLS = Bad News
  4. Why HLS Will Never Be Your Real-Time Friend
  5. Enter WebRTC & Janus Gateway: Our Savior
  6. How I Made It Work (and Stayed Sane)
  7. What I Learned (aka My Cautionary Tale)
  8. Conclusion: Don't Be Like Me, Do It Right From the Start

1. Why RTSP Doesn’t Just Work in the Browser

Okay, so RTSP (Real-Time Streaming Protocol) is the king of live video streaming. It’s what most cameras use, and it works great… until you try to get it into the browser. Browsers are all like, "Nah, I don't do RTSP." They prefer HLS (HTTP Live Streaming) because it’s super chill and easy to play.

But here’s the catch: HLS has latency—lots of it. Like, sometimes more than 30 seconds, and that’s not ideal when you're trying to build something that’s, you know, real-time. Think surveillance, AI detection, or anything where delays mean missed action (or worse—awkward silences in a live event stream).


2. My First Attempt: RTSP → HLS with FFmpeg & hls.js

So, I decided to hack together a quick prototype. I was like, "How hard can it be to just convert RTSP to HLS and get it working in Angular?" Spoiler: It was easier than I thought, but it still felt like I was solving a puzzle while blindfolded.

Step 1: Convert RTSP to HLS with FFmpeg

First, I needed to take the RTSP stream and convert it to HLS, because that’s what browsers like. Here’s the FFmpeg command I used:

```bash
ffmpeg -i rtsp://camera/stream \
  -c:v copy -c:a aac \
  -f hls \
  -hls_time 1 \
  -hls_list_size 3 \
  -hls_flags delete_segments \
  public/stream.m3u8
```

What does it do? It copies the video as-is (`-c:v copy`, so no re-encoding), transcodes the audio to AAC, and slices the stream into 1-second HLS chunks (`-hls_time 1`), keeping only the last three in the playlist (`-hls_list_size 3`) and deleting old segments as it goes. It’s like taking a pizza and slicing it into tiny, snackable pieces. The browser loves that.
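For context, the playlist FFmpeg keeps rewriting looks roughly like this (segment names and sequence numbers here are illustrative; yours will depend on your flags):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.000000,
stream42.ts
#EXTINF:1.000000,
stream43.ts
#EXTINF:1.000000,
stream44.ts
```

Each `#EXTINF` entry is one of those 1-second slices, and the player keeps re-fetching this file to discover new ones.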

Step 2: Play HLS with hls.js in Angular

Then, I used hls.js to play the stream in Angular. A bit of code, and boom—it worked. Here’s the magic:

```typescript
import { Component, AfterViewInit, OnDestroy, ViewChild, ElementRef } from '@angular/core';
import Hls from 'hls.js';

@Component({
  selector: 'app-video-player',
  template: `<video #video autoplay muted playsinline></video>`,
})
export class VideoPlayerComponent implements AfterViewInit, OnDestroy {
  @ViewChild('video') videoRef!: ElementRef<HTMLVideoElement>;
  private hls?: Hls;

  ngAfterViewInit() {
    const video = this.videoRef.nativeElement;
    const src = '/assets/stream.m3u8'; // wherever your server exposes the playlist

    if (Hls.isSupported()) {
      this.hls = new Hls();
      this.hls.loadSource(src);
      this.hls.attachMedia(video);
      this.hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Safari plays HLS natively, no hls.js needed
      video.src = src;
      video.addEventListener('loadedmetadata', () => video.play());
    }
  }

  ngOnDestroy() {
    this.hls?.destroy();
  }
}
```

Local Results: I was feeling pretty proud of myself. The latency felt practically zero. Real-time! I could almost hear the camera’s lens spinning. Everything was smooth.


3. The Latency Nightmare: AI + HLS = Bad News

Then, we decided to integrate it into the actual application. You know, the one where we run AI inference on the frames coming from the RTSP stream. This is where the fun begins—or should I say, the nightmare.

Everything worked fine locally. The stream was fast, I was happy, life was good. But once I threw it into production, the latency went through the roof:

  • 15–25 seconds of delay—seriously.
  • On slower connections, it even spiked to 1 minute.

Imagine trying to run real-time object detection with that kind of delay. It was like trying to stream a live event while watching it through a broken telescope.


4. Why HLS Will Never Be Your Real-Time Friend

HLS was great for a prototype, but it wasn’t real-time. Why? Let me break it down:

  • Segmented by Design: HLS takes your video, chops it into small segments, and sends them one at a time. That’s like waiting for a new episode of your favorite series, but only getting one minute at a time.
  • Client-Pull: The browser has to keep checking for new segments. It’s like sending someone to the store for milk and hoping they come back before you die of thirst.
  • Buffering: To avoid choppy video, browsers buffer multiple segments. Which means more latency. Fun, right?
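To put rough numbers on those three points, here’s a back-of-envelope estimate. The buffer count of 3 matches hls.js’s default live-sync distance, but all the numbers are illustrative; real latency depends on your encoder, server, and network:

```typescript
// Back-of-envelope HLS glass-to-glass latency estimate.
// Illustrative only: real values vary with encoder, server, and player config.
function estimateHlsLatency(
  segmentDuration: number,     // -hls_time, in seconds
  bufferedSegments: number,    // segments the player stays behind the live edge
  encodeAndPublishDelay: number // time to finish writing + publish a segment
): number {
  // A segment is only playable once fully written, and the player
  // deliberately sits `bufferedSegments` segments behind live.
  return encodeAndPublishDelay + segmentDuration * (bufferedSegments + 1);
}

// 1 s segments, 3 buffered, ~1 s publish delay:
console.log(estimateHlsLatency(1, 3, 1)); // 5 seconds, and that's a *good* day
// 6 s segments (a common default) on a slow connection:
console.log(estimateHlsLatency(6, 3, 4)); // 28 seconds
```

Even with aggressive 1-second segments, the segmented-pull design puts a hard floor of several seconds under your latency. That floor is structural, not a tuning problem.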

5. Enter WebRTC & Janus Gateway: Our Savior

I was done. I needed a solution that could handle real-time streaming. Enter WebRTC and Janus Gateway—the dynamic duo that saved the day.

WebRTC is built for low-latency, real-time communication, and Janus Gateway is a powerful media server that speaks it natively. It’s like the Batman and Robin of streaming: Janus pulls the RTSP feed and relays it to the browser over a single persistent WebRTC connection. And the best part? The browser isn’t polling for segments anymore, so there’s no segment buffering, no waiting.


6. How I Made It Work (and Stayed Sane)

Here’s how I switched from HLS to WebRTC with Janus:

  1. Install Janus Gateway

    Janus is like the Swiss Army knife of WebRTC servers, so I installed it and configured the streaming plugin for RTSP.

  2. Configure RTSP Streaming

    Janus can take an RTSP stream and serve it directly as a WebRTC stream. I added the camera’s RTSP URL in the config file and boom, ready to go.
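A mountpoint in `janus.plugin.streaming.jcfg` looks roughly like this (the name, id, and description are made up for illustration; check the sample config that ships with your Janus version for the exact syntax):

```
camera-stream: {
    type = "rtsp"
    id = 1
    description = "Office RTSP camera"
    audio = false
    video = true
    url = "rtsp://camera/stream"
}
```

The `id` here is what the browser asks for in the `watch` request later.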

  3. WebRTC Player in Angular

    I used the Janus JS API to connect to the Janus server and get the stream playing. It was as simple as:

```typescript
// WebRTC video player component: attach to the streaming plugin
// and request mountpoint #1 (the id from the Janus config)
this.janus.attach({
  plugin: 'janus.plugin.streaming',
  success: (pluginHandle) => {
    this.streamPlugin = pluginHandle;
    this.streamPlugin.send({ message: { request: 'watch', id: 1 } });
  },
  onmessage: (_msg, jsep) => {
    // Janus answers the watch request with an SDP offer; answer it
    // (receive-only) and tell Janus to start streaming
    if (!jsep) return;
    this.streamPlugin.createAnswer({
      jsep,
      media: { audioSend: false, videoSend: false },
      success: (answer) =>
        this.streamPlugin.send({ message: { request: 'start' }, jsep: answer }),
    });
  },
  onremotestream: (remoteStream) => {
    this.videoRef.nativeElement.srcObject = remoteStream;
  },
});
```

Now the latency was under 500 ms, and my AI model could process frames in real-time.


7. What I Learned (aka My Cautionary Tale)

  • Latency is real: HLS is fine for VOD or casual live viewing, but if you're doing AI inference, it’s too slow.
  • WebRTC is your friend: If you need true real-time performance, WebRTC + Janus is the way to go.
  • WebRTC ≠ Easy: It’s way more complex than HLS, but so worth it for real-time apps.

8. Conclusion: Don’t Be Like Me, Do It Right From the Start

If you’re trying to stream RTSP with minimal latency for real-time processing, HLS just won’t cut it. Skip the frustration, and go straight to WebRTC. It’ll save you from tearing your hair out (and trust me, I’ve been there).

So, save yourself the headache, grab a Janus server, and start streaming like a pro. You’ll thank me later. 😉



Happy streaming, and may your latencies always be low and your frame rates high!