
WebRTC Mutate Local Video with Canvas

Oliver Hargreaves Nov 22, 2023 12:00:00 PM

In this blog post, we'll explore how to use the canvas element to transform your local video. LiveSwitch's SDK lets you apply video transformations at different stages of the media life cycle; here, our focus is on applying a transformation at the last stage of the pipeline for local video. As an example, we'll demonstrate how to use canvas rendering to apply a blur filter to your local media.


To get started, we need to create a stream object for our local video, specifying the desired resolution and frame rate. Here's how you can set it up:

// create stream

let stream = await navigator.mediaDevices.getUserMedia({
  video: {
    width: { exact: 640 },
    height: { exact: 480 },
    frameRate: { exact: 30 }
  }
});
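One thing to keep in mind: exact constraints cause getUserMedia to reject with an OverconstrainedError if the camera cannot satisfy them. Here is a minimal sketch of a fallback, assuming you are willing to accept the closest available resolution when that happens (this is our own addition, not part of the original example):

// fall back to ideal constraints if the exact values are not supported
let stream;
try {
  stream = await navigator.mediaDevices.getUserMedia({
    video: {
      width: { exact: 640 },
      height: { exact: 480 },
      frameRate: { exact: 30 }
    }
  });
} catch (err) {
  console.warn("Exact constraints not supported, retrying with ideal values", err);
  stream = await navigator.mediaDevices.getUserMedia({
    video: {
      width: { ideal: 640 },
      height: { ideal: 480 },
      frameRate: { ideal: 30 }
    }
  });
}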

Next, we need to obtain the DOM element that will represent the canvas.

let canvas = document.getElementById("greywebcamcanvas");

Now, create a video element and configure it to play the stream we just captured.

let video = document.createElement("video");

// configure video element
video.autoplay = true;
video.srcObject = stream;
video.onloadedmetadata = function (e) {
  video.play();
};
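Depending on the browsers you target, you may also want to set a couple of extra properties on the video element. The snippet below is our own suggestion rather than part of the original example; muting the element and enabling inline playback helps with autoplay restrictions and keeps mobile Safari from switching to fullscreen playback:

// optional tweaks for mobile browsers (not in the original example)
video.muted = true;        // avoids autoplay restrictions in some browsers
video.playsInline = true;  // keeps iOS Safari from forcing fullscreen playback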

The last piece of setup involves initializing the canvas's 2D context and disabling image smoothing.

// get canvas context

var context = canvas.getContext("2d");
context.mozImageSmoothingEnabled = false;
context.imageSmoothingEnabled = false;
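Keep in mind that a canvas defaults to 300 x 150 pixels, so unless your markup already sizes it, you'll want to match it to the capture resolution. A small sketch, assuming the 640 x 480 stream requested above:

// size the canvas to match the capture resolution
// (assumes the markup does not already set these dimensions)
canvas.width = 640;
canvas.height = 480;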

With the groundwork laid, it's time to define the logic that will mutate our video stream. We will employ a custom draw function that applies a blur filter to the context, draws the current video frame onto the canvas, and then schedules the next frame so the canvas keeps updating. Execute our custom draw function to see the transformation in action.

// create draw function

let draw = () => {
  // add custom filter to blur video
  context.filter = "blur(4px)";
  // draw the current video frame on the canvas with the filter in place
  context.drawImage(video, 0, 0, canvas.width, canvas.height);
  // schedule the next frame
  window.requestAnimationFrame(draw);
};
// start the draw loop
window.requestAnimationFrame(draw);
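The filter property accepts any CSS filter function, so swapping the blur for a different effect is a one-line change inside the draw function. A couple of examples you could try (these are our own additions, not from the original example):

// alternative filters to experiment with inside the draw function
context.filter = "grayscale(100%)";               // black-and-white video
context.filter = "brightness(1.2) contrast(0.9)"; // combined adjustments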

We now have our fully transformed local video! We can add it to our layout alongside the remote video from the other participants in our meeting. Take a look at the full example and try out some additional transformations in our CodePen.
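One detail worth noting: drawing onto the canvas only changes what is rendered locally. If you also want the blurred frames to be what the other participants receive, one common approach (our suggestion here, not something shown above) is to capture a new MediaStream from the canvas and use its video track as your outgoing source:

// capture the transformed frames as a new MediaStream
// (assumes you want to publish the blurred video, not just render it locally)
let blurredStream = canvas.captureStream(30); // 30 fps, matching the capture rate
let blurredTrack = blurredStream.getVideoTracks()[0];
// blurredTrack can now be used wherever a local video track is expected,
// for example when constructing your local media with the LiveSwitch SDK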


It's important to note that this approach can be utilized for any type of video transformation. However, exercise caution and be mindful of the additional processing involved in these transformations. Since the process occurs on the local device, it's essential to test the performance of your transformations on older devices you intend to support. A transformation that runs smoothly on a Google Pixel 8 Pro might lead to overheating and device shutdown within minutes on an older Google Pixel 5.
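If performance becomes a problem, one simple mitigation (again, our own sketch rather than part of the original example) is to cap the draw loop at a lower frame rate so older devices do less work per second:

// cap the draw loop at a target frame rate to reduce load on older devices
const targetFps = 15;                   // hypothetical target for low-end hardware
const frameInterval = 1000 / targetFps;
let lastDrawTime = 0;

let throttledDraw = (timestamp) => {
  if (timestamp - lastDrawTime >= frameInterval) {
    lastDrawTime = timestamp;
    context.filter = "blur(4px)";
    context.drawImage(video, 0, 0, canvas.width, canvas.height);
  }
  window.requestAnimationFrame(throttledDraw);
};
window.requestAnimationFrame(throttledDraw);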


Need assistance in architecting the perfect WebRTC application? Let our team help out! Get in touch with us today!