
Join a LiveSwitch Session with Audio Muted

Oliver Hargreaves Jan 11, 2024 2:59:39 PM

When building a live streaming application, our first step is always to make sure that we can successfully stream our audio and video feeds from one device to another. While crucial, this first step doesn't entirely capture the practical usage that a production version of your application may encounter.

In this post, I will show how to join a LiveSwitch session with your audio enabled but muted. On a conference call with 10 people, you can imagine that at least one person has some kind of background noise, be it a passing fire truck, a screaming toddler, or noisy neighbors in the adjacent office. Muting audio upon entry is a thoughtful feature that avoids interruptions and maintains the flow of ongoing conversations.

You can add conditional logic or tie this behavior to a button press on your lobby page; here, I will show the basic implementation.
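As a sketch of that conditional approach, you could derive the initial muted state from a lobby setting, for example a query parameter. Everything below (the parameter name and the helper) is hypothetical, not part of the LiveSwitch SDK:

```javascript
// Hypothetical helper: decide whether to join muted based on a
// "?muted=..." query parameter set by the lobby page.
let shouldMuteOnJoin = (search) => {
  let params = new URLSearchParams(search);
  // default to muted unless the user explicitly opted out in the lobby
  return params.get("muted") !== "false";
};

// e.g. shouldMuteOnJoin(window.location.search)
```

Defaulting to muted (rather than unmuted) is the safer choice here, since an unexpected hot microphone is more disruptive than having to unmute.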

To start off, we want to establish some configurations and basic pieces of information for our application.

// setup LiveSwitch Configurations

let gateway = "";
let appId = "62c0809a-5671-426f-94a5-8edbdd1fe962";
let secret = "0070c9c582894ef7969986ba228399c527201d910ed9451eb8b45097194ad689";
// create a default username for our client
let username = "MutedAudio";
// generate a claim to join the "MuteOnJoin" channel
let claims = [new fm.liveswitch.ChannelClaim("MuteOnJoin")];

Next, we need the ability to create a client and token object that uses these configurations to initialize a session. We will create two helper functions to take care of this.

// return a new client

let createClient = () => {
  return new fm.liveswitch.Client(gateway, appId, username, "chrome-js-mac");
};

// return a new token
let getToken = (client, claims) => {
  return fm.liveswitch.Token.generateClientRegisterToken(client, claims, secret);
};

I am also going to add a helper function that will allow me to print messages to the screen to track my progress instead of having to open the console.

// helper function to print a message to the screen

let writeStatus = function (message) {
  let sdiv = document.getElementById("status-div");
  let p = document.createElement("p");
  p.innerHTML = message;
  sdiv.appendChild(p);
};

We can now dive into our core logic. Since the core action we will be taking in our application is to join a channel, we can create a helper function to handle the connection logic that allows us to join.

We will assume that we have already created a local media object and will use its audio and video streams to create a new SFU upstream connection, then open the connection to begin streaming.

// start upstream connection

let startSfuConnection = (channel, lm) => {
  writeStatus("Opening SFU connection");
  // pull the audio stream off our local media object
  let audioStream = new fm.liveswitch.AudioStream(lm);
  // set the stream to a muted state BEFORE we open the connection
  audioStream.setMuted(true);
  // pull the video stream off the local media object
  let videoStream = new fm.liveswitch.VideoStream(lm);
  // create our SFU connection object
  let connection = channel.createSfuUpstreamConnection(audioStream, videoStream);
  // open the connection to start streaming our local media
  connection.open();
  writeStatus("Starting SFU Connection");
  return connection;
};

Notice what we do right after defining our audioStream object: as soon as we have a representation of this stream, we set its muted state to true. Because we call this before we open the connection that uses the audio stream, the other users on the channel will be able to see but not hear this user. To unmute the user later, make the corresponding call:

audioStream.setMuted(false);

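If you want to let users flip between the two states, a minimal sketch is a toggle that tracks the current state and passes it to setMuted. The button id and the helper itself are hypothetical; only the setMuted(boolean) call comes from the LiveSwitch API used above:

```javascript
// Hypothetical mute toggle: keep a local flag and flip it on each call.
// `stream` only needs the setMuted(boolean) method that LiveSwitch
// audio streams expose.
let muted = true; // we joined the session muted

let toggleMute = (stream) => {
  muted = !muted;
  stream.setMuted(muted);
  return muted;
};

// wiring it up (assumes a hypothetical button with id "mute-btn"):
// document.getElementById("mute-btn").onclick = () => toggleMute(audioStream);
```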
We can now write our application logic and leverage the helper functions we have generated. We will create a client, generate a token, create our local media object, and add the local media view to the DOM.

let init = async () => {
  writeStatus("Creating Client");
  // initialize client
  let client = createClient();
  // create a client token
  let token = getToken(client, claims);
  writeStatus("Starting Local Media");
  // create a local media object being sure to enable audio and video
  let localMedia = new fm.liveswitch.LocalMedia(true, true);
  // get access to the video container element in the DOM
  const video = document.querySelector("#localVideo");
  // insert our local preview tile using the getView() helper
  video.appendChild(localMedia.getView());

We will trigger the remaining logic once our local media has been started. We begin by calling start() on our local media object and then joining our channel.

  // start local media
  localMedia
    .start()
    // trigger the join logic once media has been started
    .then((lm) => {
      // register the client using your token
      client
        .register(token)
        .then((channels) => {
          // client registered
          writeStatus("Connected to Server.");
          // fetch the channel you connected to
          let channel = channels[0];
          // create the new SFU upstream connection using your local media
          let sfuConnection = startSfuConnection(channel, localMedia);
        })
        .fail((ex) => {
          writeStatus("ERROR: " + ex);
        });
    })
    .fail((ex) => {
      writeStatus("ERROR: " + ex);
    });
};

init();

You can see how we use the token we generated above to register our client, pull our channel out of the list of channels the token allowed us to join, and create our SFU connection with our helper function.
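Grabbing channels[0] works here because our token carries a single channel claim. If you register with claims for several channels, a small lookup by id is safer; this helper and the stubbed usage are hypothetical, though Channel's getId() accessor is part of the LiveSwitch API:

```javascript
// Hypothetical helper: register() resolves with one Channel per claim
// in your token, so pick the one you want by id rather than by index.
let findChannel = (channels, id) =>
  channels.find((ch) => ch.getId() === id);

// usage inside the register callback:
// let channel = findChannel(channels, "MuteOnJoin");
```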

Congratulations! You are now streaming your video and muted audio streams. You can see the full solution here. If you would like to try this out for yourself, sign up for a free 30-day trial here and see how many different ways you can use this functionality!

Need assistance in architecting the perfect WebRTC application? Let our team help out! Get in touch with us today!