Mutate Local Media with Canvas: Data Buffer Modifications

Oliver Hargreaves Dec 28, 2023 12:00:00 PM

In this blog, I will show you an alternative approach to modifying your local media before streaming it out. In this example, we will use two canvas objects and a data buffer to make our modifications to the video stream.

Our modification will be to switch our video stream from full color to grayscale; however, you can replace that logic with anything else based on your exact use case. This process is repeatable and intended to give you a starting point and ideas for your customizations.

Let's start by fetching our local video feed from our webcam.

// get access to the stream for your local video
let stream = await navigator.mediaDevices.getUserMedia({
  video: {
    width: { exact: 640 },
    height: { exact: 480 },
    frameRate: { exact: 30 }
  }
});

Take note of the resolution and framerate we are using here since we will reuse those values later on.
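
Since these values recur later in the example, one option (the constant names here are our own, not from the original post) is to hoist them into shared constants and build the constraints from them:

```javascript
// capture settings reused throughout the example (names are our own)
const WIDTH = 640;
const HEIGHT = 480;
const FPS = 30;

// the getUserMedia constraints can then reference the shared values
const constraints = {
  video: {
    width: { exact: WIDTH },
    height: { exact: HEIGHT },
    frameRate: { exact: FPS }
  }
};
```

This keeps the capture resolution, the canvas sizes, and the captureStream frame rate from silently drifting apart as you tweak the example.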

Next, we want to prepare the HTML Elements we will be using to display our final video stream.

// fetch the HTML Element wrapper that we will display our modified local video in
let canvas = document.getElementById("greywebcamcanvas");
// create a new HTML Element to represent the video stream
let video = document.createElement("video");
// configure default values of the video element
video.autoplay = true;
video.srcObject = stream;
// start playing the video once the meta data has loaded
video.onloadedmetadata = function (e) {
  video.play();
};

We are defining our video player element, setting the source as our stream object, and triggering the video to play once the metadata is loaded.

Now, we will set up our two canvases. Our “canvas” object will represent what we will ultimately display to the user, and the “backCanvas” will be our workspace that allows us to make the modifications we want.

// get the drawing context of our display canvas
var context = canvas.getContext("2d");
// create the off-screen canvas HTML Element we will use as our workspace
var backCanvas = document.createElement("canvas");
// size the back canvas to our capture resolution so getImageData covers the full frame
backCanvas.width = 640;
backCanvas.height = 480;
// get the drawing context of our second canvas
var backContext = backCanvas.getContext("2d");


With all of our objects defined and ready to be used, we can begin defining all of our modification logic inside of a draw() function.

// start by drawing our raw local video feed onto the back canvas
backContext.clearRect(0, 0, 640, 480); // clear canvas
backContext.drawImage(video, 0, 0, 640, 480); // draw the current frame at our capture resolution


Our first step is to reset our back canvas and then draw our next frame onto it.

We then want to read the data from the drawn on frame and pull it into our data buffer.

// We then read the image data to pull the frame back into a data object we can modify
var idata = backContext.getImageData(0, 0, 640, 480);
var data = idata.data;


Next, we apply our customization to the data. This is the block of code that you can replace or change based on what your exact use case is.

// Loop through the pixels, turning them grayscale
for (var i = 0; i < data.length; i += 4) {
  var r = data[i];
  var g = data[i + 1];
  var b = data[i + 2];
  var brightness = (3 * r + 4 * g + b) >>> 3; // fast integer approximation of luminance: (3r + 4g + b) / 8
  data[i] = brightness;
  data[i + 1] = brightness;
  data[i + 2] = brightness;
}
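
As one illustration of swapping in different logic, the same loop structure could apply a sepia tone instead. This is a sketch using the commonly cited sepia weighting coefficients; the helper name is our own, not from the original post:

```javascript
// apply a sepia tone to an RGBA pixel buffer in place (helper name is our own)
function applySepia(data) {
  for (var i = 0; i < data.length; i += 4) {
    var r = data[i];
    var g = data[i + 1];
    var b = data[i + 2];
    // standard sepia weighting, clamped to the 0-255 byte range; alpha is untouched
    data[i] = Math.min(255, Math.round(0.393 * r + 0.769 * g + 0.189 * b));
    data[i + 1] = Math.min(255, Math.round(0.349 * r + 0.686 * g + 0.168 * b));
    data[i + 2] = Math.min(255, Math.round(0.272 * r + 0.534 * g + 0.131 * b));
  }
  return data;
}
```

Calling `applySepia(data)` in place of the grayscale loop is the only change needed; the rest of the draw pipeline stays the same.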


We can now take the data from our modified data buffer and draw it to our original canvas object.

// take the modified data and draw that output to the first canvas
// (data is a live view into idata, so we can put idata back directly)
context.putImageData(idata, 0, 0);
// schedule the next call to draw(), creating our render loop
window.requestAnimationFrame(draw);


Finally, we trigger the first call to draw as the final step of our handleLocalVideo() function.

// trigger drawing of the frames to the screen
window.requestAnimationFrame(draw);
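
Putting the snippets together, the whole handleLocalVideo() function might look like the following sketch. The structure (and the factoring of the grayscale step into a helper, so the loop body stays swappable) is ours; the logic is the post's:

```javascript
// convert an RGBA buffer to grayscale in place, using the same fast approximation as above
function toGrayscale(data) {
  for (var i = 0; i < data.length; i += 4) {
    var brightness = (3 * data[i] + 4 * data[i + 1] + data[i + 2]) >>> 3;
    data[i] = brightness;
    data[i + 1] = brightness;
    data[i + 2] = brightness;
  }
  return data;
}

// one possible assembly of the snippets above into a single function
async function handleLocalVideo() {
  // capture the raw webcam feed at a fixed resolution and frame rate
  let stream = await navigator.mediaDevices.getUserMedia({
    video: { width: { exact: 640 }, height: { exact: 480 }, frameRate: { exact: 30 } }
  });

  // display canvas plus a hidden video element fed by the webcam stream
  let canvas = document.getElementById("greywebcamcanvas");
  let video = document.createElement("video");
  video.autoplay = true;
  video.srcObject = stream;
  video.onloadedmetadata = function () { video.play(); };

  // display context plus an off-screen workspace canvas sized to the capture resolution
  var context = canvas.getContext("2d");
  var backCanvas = document.createElement("canvas");
  backCanvas.width = 640;
  backCanvas.height = 480;
  var backContext = backCanvas.getContext("2d");

  function draw() {
    backContext.clearRect(0, 0, 640, 480);
    backContext.drawImage(video, 0, 0, 640, 480);
    var idata = backContext.getImageData(0, 0, 640, 480);
    toGrayscale(idata.data);        // swap this call out for your own modification
    context.putImageData(idata, 0, 0);
    window.requestAnimationFrame(draw);
  }

  // kick off the render loop
  window.requestAnimationFrame(draw);
}
```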


The remaining work is to establish the LiveSwitch connection, enable our custom functionality, and output the customized video stream for others to view.

First, we will trigger our custom logic.

// trigger local video logic defined above
await handleLocalVideo();


Next, we will use our first canvas object as the source of the video that we will stream.

// fetch the HTML Element that will be displayed
let canvas = document.getElementById("greywebcamcanvas");
// create a video stream by pulling frames from the canvas 30 times a second
let greyscaleStream = canvas.captureStream(30);
// create our local media object using our mutated greyscale stream
var localMedia = new fm.liveswitch.LocalMedia(false, greyscaleStream, false);
// start our local media object
await localMedia.start();


Note that we are using our greyscaleStream instead of the direct webcam feed to instantiate our local media object, so it will only ever be aware of this modified stream.

Now we need to create our client, generate a token that grants us access to connect, and register our client to gain access to the channel.

// create our LiveSwitch client
let client = new fm.liveswitch.Client(gatewayUrl, applicationId, username);
// generate a token to register with
let token = fm.liveswitch.Token.generateClientRegisterToken(
  client,
  claims,
  sharedSecret
);
// register our client on our defined channel using our token
let channels = await client.register(token);


Finally, we will pull the channel object we have registered with, create our video stream object from our local media, create a new SFU Upstream connection using our video stream, and open the connection to begin the flow of media.

// pull the channel object that we just registered with
let channel = channels[0];
// pull the video stream object from our local media
let videoStream = new fm.liveswitch.VideoStream(localMedia);
// create our SFU upstream connection to begin streaming just our video stream
let connection = channel.createSfuUpstreamConnection(videoStream);
// open the upstream connection
await connection.open();


Congratulations, you are now streaming your modified local video feed for others to view! Please note that this solution modifies the video stream on the source device, so if you are targeting low-performance devices, be careful about how much additional processing you add.
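
For low-powered devices, one mitigation (our own sketch, not from the post; the helper name and the 15 fps target are illustrative) is to process frames at a reduced rate inside the render loop, skipping the canvas work when not enough time has elapsed:

```javascript
// decide whether enough time has passed to process another frame (helper is our own)
function shouldProcess(nowMs, lastMs, targetFps) {
  return nowMs - lastMs >= 1000 / targetFps;
}

// usage inside the render loop: requestAnimationFrame passes a timestamp to its
// callback, so we can process at ~15 fps even if the browser fires at 60 fps
var lastFrame = 0;
function throttledDraw(nowMs) {
  if (shouldProcess(nowMs, lastFrame, 15)) {
    lastFrame = nowMs;
    // ... per-frame canvas work from draw() goes here ...
  }
  window.requestAnimationFrame(throttledDraw);
}
```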

If you would like to see the full solution in CodePen, please look here.

To try this out in your application, please sign up for a free 30-day trial of the LiveSwitch Cloud platform here and try this example out on any of the supported Development Kits we offer.

Need assistance in architecting the perfect WebRTC application? Let our team help out! Get in touch with us today!