
Custom Overlays

Oliver Hargreaves Mar 11, 2024 9:08:09 AM

Whether you are a social media sensation looking to spruce up your video or running a custom business conferencing tool, it is common practice to add logos or additional content to your video stream. We have previously discussed how to set up custom media pipelines, but this post dives directly into showing how to build a realistic overlay that you can use.

In this example, we take a video feed, add a score box with the current time as well as an image, and stream it out for others to view. We have also added buttons to the example to be able to update the scores based on external input.

To begin, let's add the helper logic to control our user input and add the ability to load our image from a URL.

// track each player's score and wire up the buttons that increment them
let p1Points = 0;
let p2Points = 0;
let p1Btn = document.getElementById("player1");
let p2Btn = document.getElementById("player2");
p1Btn.addEventListener("click", () => { p1Points++; });
p2Btn.addEventListener("click", () => { p2Points++; });
let loadImage = async (url) => {
    try {
        // Fetch the image
        const response = await fetch(url);
        const blob = await response.blob();
        // Create ImageBitmap from the blob
        return await createImageBitmap(blob);
    } catch (error) {
        console.error("Error loading image:", error);
    }
};

The bulk of the work will be focused on the local media and defining the new stream that we will create using the camera feed as a starting point.

Let's begin by creating our camera feed's stream, setting it as the source of our HTML video element, and triggering the video to begin playing once the metadata has loaded.

// create a reference to the local video stream
let stream;
try {
    stream = await navigator.mediaDevices.getUserMedia({
        audio: false,
        video: { width: 640, height: 480 }
    });
} catch (ex) {
    writeStatus("ERROR: Getting local stream.");
}
// set the source of the video DOM element to the local stream
video.srcObject = stream;
// create the video element's onloadedmetadata handler
video.onloadedmetadata = function (e) {
    // start playing the video once we have the metadata
    video.play();
};

Next, we will create a canvas object. This will give us the ability to “draw” additional content onto the screen.

// create a new canvas object
const canvas = document.createElement("canvas");
// get the canvas's 2D drawing context
const ctx = canvas.getContext("2d");
// add the new canvas object to the DOM
document.getElementById("canvasContainer").appendChild(canvas);
// set the size of the canvas
canvas.width = 640;
canvas.height = 480;
const image = await loadImage("https://i.imgur.com/dAx3ABE.jpeg");

We can now define our drawing function. First, we clear the canvas by drawing a clearing rectangle the full size of the frame. Then we draw the current frame from our video object as the base layer. Finally, we add our additional content: a background box for the text, three rows of text, and an image.

let draw = () => {

    // clear previous results
    ctx.clearRect(0, 0, 640, 480);
    // draw new results to the canvas
    ctx.drawImage(video, 0, 0, 640, 480);
    // draw white background rectangle to contain text
    ctx.fillStyle = "white";
    ctx.fillRect(5, 0, 150, 60);
    // add text values in black color
    ctx.fillStyle = "black";
    ctx.fillText("Player 1: " + p1Points, 10, 10);
    ctx.fillText("Player 2: " + p2Points, 10, 30);
    // get current date and time and write the time to the screen
    const date = new Date();
    ctx.fillText(
        "Time: " + date.getHours() + ":" + date.getMinutes() + ":" + date.getSeconds(),
        10, 50
    );
    // add the image to the top right corner
    ctx.drawImage(image, 500, 5, 100, 100);
    // schedule the next frame to keep the drawing loop running
    window.requestAnimationFrame(draw);
};
window.requestAnimationFrame(draw);

We finish our draw function by requesting a new animation frame with the draw function as the callback, and we start the drawing loop by calling the same requestAnimationFrame function once after our function declaration.
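One small refinement worth considering: getMinutes() and getSeconds() return unpadded numbers, so 9:05:03 renders as 9:5:3. A zero-padding helper keeps the clock readable; formatTime below is a sketch and is not part of the original example.

```javascript
// Hypothetical helper: format a Date as H:MM:SS with zero-padded
// minutes and seconds, so the overlay clock reads 9:05:03 rather than 9:5:3.
const formatTime = (date) => {
    const pad = (n) => String(n).padStart(2, "0");
    return date.getHours() + ":" + pad(date.getMinutes()) + ":" + pad(date.getSeconds());
};

// Inside draw(), the time line would then become:
// ctx.fillText("Time: " + formatTime(new Date()), 10, 50);
```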

The last step is to create a new stream from our canvas object and set it as the source for our local media object.

// get the stream from the canvas object with the drawing on it
let st = canvas.captureStream(60);
// create a local media object using the canvas stream
let localMedia = new fm.liveswitch.LocalMedia(false, st);
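From here, the localMedia object can be published like any other LiveSwitch local media. As a sketch only, assuming an already-registered channel from the LiveSwitch SDK (the registration step is not shown in this example), opening an upstream connection might look like this:

```javascript
// Sketch: publish the canvas-backed local media upstream.
// Assumes `channel` is a registered fm.liveswitch.Channel and `localMedia`
// is the object created above; this example does not cover registration.
const publishOverlay = async (channel, localMedia) => {
    // start capturing from the canvas stream
    await localMedia.start();
    // wrap the local media in a video stream (no audio in this example)
    const videoStream = new fm.liveswitch.VideoStream(localMedia);
    // open an SFU upstream connection carrying the overlaid video
    const connection = channel.openSfuUpstreamConnection(null, videoStream);
    await connection.open();
    return connection;
};
```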

Congratulations! You have now created your own custom scorekeeping view on top of a live video feed. If you want to see this in action, check out the full application in this codepen.

If you are interested in building this out further and combining it with some of the other examples we have discussed in our blogs, such as the chat example, please sign up for a free 30-day trial here and build your own application!


Need assistance in architecting the perfect WebRTC application? Let our team help out! Get in touch with us today!