
The Complete Guide to Scaling Event Platforms With Multipoint Control Units (MCUs)

Michelle Chen
Jan 11, 2021 7:00:00 AM

The results are in: there is no sign that the virtual event industry will slow down in 2021. Hybrid virtual events gained popularity as a result of the transformative shift to virtual platforms in 2020, but 2021 will be the year for scaling and growing virtual event platforms to reach bigger, global audiences.

However, doing so with an unoptimized platform can be a massive technical challenge and cause performance bottlenecks, undesirable latency in video/audio, and an inability to integrate seamlessly with on-site studios and broadcasting equipment.

This guide will help developers and businesses 1) avoid these common struggles when scaling a virtual video event platform, and 2) understand how to optimize large-scale broadcasts with LiveSwitch Cloud's powerful Multipoint Control Unit (MCU) feature.

Keep reading to find out:

  • What Multipoint Control Units (MCUs) are
  • How MCUs can help overcome scalability challenges
  • How to set up MCUs in LiveSwitch Cloud
  • What makes MCUs in LiveSwitch Cloud flexible
  • How customers have used LiveSwitch to launch successful virtual event platforms
  • The best practices for scaling event platforms in 2021

What is a Multipoint Control Unit?

A software-based multipoint control unit (MCU) is a mixer capable of combining and composing multiple inbound media streams into a single outbound media stream. Often at the core of video platforms that display large groups of participants, the MCU is the critical technology for displaying participant groups beyond the typical nine streams arranged in a 3x3 grid commonly found in video conferencing apps today.

Additionally, when MCU streams are displayed in clustered layouts, the video platform can display even more participants, reaching into the thousands. MCUs often take care of both mixing and transcoding, enabling the platform to broadcast video with the optimal layout and codecs for online attendees and their connected devices - which is particularly useful in the burgeoning virtual event industry.
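As a rough illustration of the composition step, the sketch below (plain Python, not the LiveSwitch SDK; function name and frame size are our own for illustration) computes the tile rectangles an MCU-style mixer might use when composing N participant feeds into a single output frame:

```python
import math

def grid_tiles(n_participants, frame_w=1280, frame_h=720):
    """Compute (x, y, w, h) rectangles for composing n feeds into one frame.

    Uses the smallest square-ish grid that fits all participants,
    e.g. 9 feeds -> the familiar 3x3 layout.
    """
    cols = math.ceil(math.sqrt(n_participants))
    rows = math.ceil(n_participants / cols)
    tile_w, tile_h = frame_w // cols, frame_h // rows
    tiles = []
    for i in range(n_participants):
        row, col = divmod(i, cols)
        tiles.append((col * tile_w, row * tile_h, tile_w, tile_h))
    return tiles

# 9 participants -> a 3x3 grid of 426x240 tiles in a 1280x720 frame
tiles = grid_tiles(9)
```

A real mixer also scales and re-encodes each feed into its tile, but the layout math is essentially this: the number of tiles grows with the participant count while the outbound stream stays a single fixed-size frame.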

[Image: Live Video Topology - MCUs with LiveSwitch, Encoding, Decoding and Mixing Client AV Feeds]

MCUs: From Telephony to Virtual Event Platforms

The technology behind a multipoint control unit is not new. SIP and H.323 providers have long used MCUs to connect conference participants from their phones to a video meeting. A single, bi-directional audio and video feed is passed from the phone call to the Media Server, where it is transcoded into a mixed feed and broadcast to the other client devices.
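The audio side of that mixing step is conceptually simple: the server sums the decoded samples from each participant and clips the result to the valid range before re-encoding. A minimal sketch (plain Python on 16-bit PCM samples, not the LiveSwitch SDK):

```python
def mix_pcm(feeds):
    """Mix decoded 16-bit PCM feeds into one stream, as an MCU's audio
    mixer conceptually does: sum time-aligned samples and clip to int16."""
    mixed = []
    for samples in zip(*feeds):  # one sample per feed, aligned in time
        total = sum(samples)
        mixed.append(max(-32768, min(32767, total)))  # clip to int16 range
    return mixed

# Two callers speaking at once: samples sum, and loud peaks clip
a = [1000, 20000, -30000]
b = [500, 20000, -10000]
print(mix_pcm([a, b]))  # [1500, 32767, -32768]
```

Production mixers add resampling, jitter buffering, and smarter gain control, but summing decoded feeds into one outbound stream is the core idea.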

Now, particularly useful in the virtual events space, MCUs are paired with SFUs (Selective Forwarding Units) as the gold standard for flexible video live streaming.

LiveSwitch developers have leveraged MCUs to relay bi-directional AV feeds from the audience to the production stadium screens, and an SFU to connect the broadcast from the event venue back to thousands of remote viewers. The powerful combination of these technologies has made it possible for virtual event providers to delight fans and reach audiences of thousands worldwide without sacrificing video quality or adding latency.

[Image: Virtual Event Platforms - MCU Video Architecture and How To Do It]

Is a Multipoint Control Unit Right For Me?

WebRTC developers who work with live video understand that while Multipoint Control Units are an extremely useful component for the virtual event space, the underlying technology is CPU-intensive: media is decoded, mixed, and re-encoded in real time on the server side. The flip side is that this offloads the CPU resources normally spent encoding and decoding individual media streams from the client to the server, delivering the best possible experience for even the most under-powered devices out there.
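A quick back-of-envelope comparison makes the client-side saving concrete. In a forwarding (SFU) topology, each viewer's device decodes one stream per remote participant; with an MCU, the server does the mixing and each device decodes exactly one pre-mixed stream. The sketch below (plain Python, illustrative only) counts those decodes:

```python
def client_decode_count(participants, topology):
    """Number of video streams a single viewer's device must decode.

    With an SFU each client decodes every remote feed; with an MCU the
    server mixes them and each client decodes exactly one feed.
    """
    if topology == "sfu":
        return participants - 1  # one decode per remote participant
    if topology == "mcu":
        return 1                 # one pre-mixed feed from the server
    raise ValueError(f"unknown topology: {topology}")

for n in (10, 100, 1000):
    print(n, client_decode_count(n, "sfu"), client_decode_count(n, "mcu"))
```

At 1,000 participants the difference is 999 decodes versus 1 on the client, which is why the MCU's server-side CPU cost buys so much headroom for low-end devices.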

When MCUs are applied in parallel with other hybrid video architectures such as the SFU on a live video platform deliberately optimized for large-scale broadcasts, virtual platforms can enjoy cost-effective and high-performance connections. This flexibility has been demonstrated in the past by LiveSwitch customers who are now shaping the future of virtual event technology in 2021.


“We are often asked why we decided to include MCUs in LiveSwitch’s hybrid architecture when the powerful SFU alternative is readily available,” explains Anton, CTO of Frozen Mountain. “The simple answer is: we built LiveSwitch for ultimate flexibility. Our vision for the product is to empower developers to build the exact platform they need with one powerful SDK. We give developers the ability to decide when, if, and how to use the hybridization features in LiveSwitch Cloud.”

 

How To Work With LiveSwitch’s Multipoint Control Unit Feature

Now that we’ve discussed MCUs at a broader level, it’s time to dive into exactly how developers can work with LiveSwitch's Multipoint Control Unit feature. Configuring the MCU settings in the LiveSwitch Cloud console is quite simple, taking only four main steps with two optional customizations. Once the personalization in the Console is complete, the application code layer can be customized.

[Image: Step By Step MCU Configuration in LiveSwitch Cloud]

How to Display Hybrid Video Connections on a Single Layout

LiveSwitch supports multiple video connection types within a single session on a single client: clients may open both SFU connections and MCU connections simultaneously during a session.

var audioStream = new FM.LiveSwitch.AudioStream(localMedia, remoteMedia);
var videoStream = new FM.LiveSwitch.VideoStream(localMedia, remoteMedia);

// Create a bidirectional MCU connection that both sends and receives.
var mcuConnection = channel.CreateMcuConnection(audioStream, videoStream);
layoutManager.AddRemoteView(remoteMedia.Id, remoteMedia.View);


// Create an upstream SFU connection that sends my audio and video data.
var sfuUpstreamConnection = channel.CreateSfuUpstreamConnection(audioStream, videoStream);

// For every remote upstream connection I want to ingest, I also need to
// open a receiving downstream SFU connection.
channel.OnRemoteUpstreamConnectionOpen += (remoteConnectionInfo) =>
{
    ...
    var remoteMedia = new RemoteMedia();
    layoutManager.AddRemoteView(remoteMedia.Id, remoteMedia.View);
    var sfuDownstreamConnection = channel.CreateSfuDownstreamConnection(remoteConnectionInfo, audioStream, videoStream);
    ...
};

Accomplishing this with LiveSwitch Cloud brings several benefits, chief among them the ability to optimize bandwidth usage, accommodate under-powered client devices, and guarantee compatible video and audio codecs.

This is particularly useful for virtual event platforms with multiple sources and sinks that naturally benefit from displaying MCU streams for one purpose and SFU bi-directional feeds for others.

[Image: How To Mix Audio Video Feeds In MCU on LiveSwitch]

Can You Customize Remote Viewer MCU Layouts?

You can customize the remote viewer layouts in an MCU view with the LayoutManager class in LiveSwitch Cloud.

LayoutManager allows custom UI elements such as labels and buttons to be added to the MCU remote video view of an event platform. Hand-raising, crowd cheering, and polling UI elements are some of the features our clients have added in the past using LayoutManager, SetLocalView and AddRemoteView.

By using LayoutManager included in the LiveSwitch Cloud SDK, combined with further customization, developers can implement virtually any layout to recreate the in-person event experience their customers want.

[Image: How To Customize MCU Layouts for Live Video Platforms]

Successful Virtual Event Use Cases with LiveSwitch

Sports and entertainment, major business industry conferences, and virtual event platforms have all used MCUs as part of their large-scale video platforms. LiveSwitch Cloud's MCU feature is highly flexible and can be used for a number of different use cases in the virtual events space. Developers can enjoy the ability to customize live video and scale video platforms to reach a bigger audience.

One appealing factor in mixing participant video and audio feeds into a single screen is the ability to monitor participant interactions and behaviors. MCUs can display several video feeds concurrently on production control room screens at an optimal resolution and frame rate, letting event staff moderate their streaming participants.

This optimized resolution and frame-rate only applies to the moderators' view; the monitored participants will still receive the highest-quality feeds at the bitrates their devices and networks can handle. At any point in time, the quality of the monitor feed can be adjusted with an instant response that is not dependent on any extra bandwidth being available at either end of the broadcast.

Bringing It All Together

Best Practices For Virtual Event Platform Scalability

Practice #1: Leverage a Hybridized Video Architecture

LiveSwitch was built with the vision to be completely unique - a hybridization of forwarding (SFU), mixing (MCU), and peer-to-peer (P2P), all at the same time in the same session. It has been developed as a live video platform and API that uniquely supports hybrid video architecture in a way that empowers developers to meet the exact application requirements for their use case. No compromises.

Practice #2: Make an MCU an Integral Part Of Your Architecture

MCUs give virtual event platforms the capability to create real-time, bi-directional video and audio feeds for groups ranging from a handful of participants up to thousands. Choosing an advanced MCU, such as LiveSwitch's, can optimize operating costs for both 1) cost-effective streaming and 2) network quality by leveraging automatic bandwidth estimation and adaptive simulcast, so that participants always receive the best quality feed for their devices and internet connections.
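The adaptive part can be sketched simply: given a bandwidth estimate for a participant, pick the highest simulcast layer whose bitrate fits, with some headroom. The ladder and numbers below are hypothetical illustrations (plain Python, not LiveSwitch's actual bitrate ladder or algorithm):

```python
# Hypothetical simulcast ladder: (label, bitrate in kbps), highest first.
LADDER = [("720p", 2500), ("360p", 800), ("180p", 300)]

def pick_layer(estimated_kbps, ladder=LADDER, headroom=0.9):
    """Pick the highest simulcast layer that fits the estimated bandwidth,
    keeping some headroom; fall back to the lowest layer otherwise."""
    budget = estimated_kbps * headroom
    for label, kbps in ladder:
        if kbps <= budget:
            return label
    return ladder[-1][0]

print(pick_layer(3000))  # "720p"
print(pick_layer(1000))  # "360p"
print(pick_layer(200))   # "180p"
```

As the bandwidth estimate changes mid-session, the same selection re-runs, which is how each participant keeps receiving the best feed their connection can sustain.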

Practice #3: Choose the Right Server Hosting Solution

The performance of a virtual event platform is influenced by the infrastructure hosting the live video application. Since MCUs offload the bandwidth required to mix, decode, and encode live video from the client device to the Media Server, the right server infrastructure is critical for platform stability. Shared, cloud, on-premises, and private cloud are the four main options for hosting live video.

[Image: Scalable Virtual Event Platform Hosting Options and Comparison]

Practice #4: Understand Where You Might Need Support

The LiveSwitch Cloud platform and API is a powerful, flexible solution for the virtual events and event production space, but harnessing the true capabilities of any platform can be time-consuming. When businesses are trying to pivot to virtual events this year and grow their online business, why go it alone?

We believe one of the key reasons our virtual event clients succeed so often is that they reach out when they realize the inherent value in accelerating their product launches with the assistance of live video development experts. Whether you require a project manager, senior developers or an entire team to ensure your event platform scales perfectly and can run smoothly 24/7, we're here for you. An unbiased assessment of your platform requirements, full project build out, and/or feature creation are all options when you work with us to build your next virtual event.


Let's make your dream virtual event platform happen. Contact our team of experts to build the perfect virtual fan event application or to scale your existing platform to a global audience. 
