September 29, 2023

WHEP WebRTC low latency playback in Nimble Streamer

The Nimble Streamer team is continuously improving low latency streaming features to provide our customers with the best set of options to choose from.

Last year we introduced WebRTC WHIP (WebRTC-HTTP ingestion protocol) support in Nimble Streamer to provide ultra-low latency ingest into the media server. You can read the full WebRTC WHIP setup instructions and try it in action.

Now it's time to take the next step and give our customers low latency playback. Our team has always preferred to rely on open standards so we chose the best option available.

WebRTC WHEP (WebRTC HTTP Egress Protocol) provides easy communication between a server and a client, while being interoperable with other solutions that support WHEP signaling. This is a result of industry cooperation, thanks to Sergio Murillo and Cheng Chen who developed it into an IETF standard draft.

Nimble Streamer uses the Pion implementation of the WebRTC API. This framework not only provides a flexible API but also delivers high performance with low resource usage, which fully aligns with our own approach to creating sustainable and cost-effective software.

We'd like to thank Sean DuBois and all Pion contributors for maintaining such a great framework.

1. WHEP support overview

Nimble Streamer generates WHEP playback output with the following codecs:

  • Video: H.264/AVC, VP8, VP9 and AV1
  • Audio: Opus

You can refer to the supported codecs page to see how you can deliver pre-encoded content and re-package it without additional processing. For instance, you can ingest VP8 video and Opus audio using WebRTC WHIP and generate WHEP playback as is, with no additional overhead.

If your source has different codecs, you can transcode the content. E.g. if you get RTMP with H.264 video and AAC audio from your source, you can use Live Transcoder to transcode AAC into Opus and pass through H.264 with no need for decoding and encoding.

You can read more about various scenarios in the Further Usage section below.

The stream can be played in any browser using a WHEP player. Please refer to the Player setup and playback section below for more details.

Notice that WHEP AV1 playback currently works only in Chrome. We look forward to other browsers and platforms supporting it via WebRTC.

Our team likes to contribute back to the open source projects that we use. In order to handle AV1 WebRTC playback, we've made a code contribution to the Pion framework to add proper AV1 support.


2. Server setup process


Follow these steps to get WHEP working in your Nimble Streamer instance.
We assume you already have a paid WMSPanel account. If you don't, please sign up and subscribe.

2.1 Installation

First, make sure you've installed the latest Nimble Streamer or upgraded it to the latest version.

Second, the WHEP implementation has some dependencies on Nimble Live Transcoder, so you need to install it first, then subscribe for a license and register it on your server. Please refer to the Transcoder installation page for details.

2.2 Set up SSL for Nimble

The next step is to enable SSL for your Nimble instance for further playback.

You can set up your SSL certificate using this general instruction. You may obtain a free Let's Encrypt certificate via Certbot as we've described here. For testing purposes you may create your own self-signed certificate, but in order for it to work you'll first need to open any https:// page like https://yourhost.com:port/ and accept the security risk. For production purposes you need a valid SSL certificate, and your server must be assigned to the domain of this certificate.

Once you've set up SSL for Nimble, you need to test it. Open https://yourhost.com in your browser, where yourhost.com is the host of your Nimble. If you get error 404 and have no warnings from your browser, then SSL is set up properly and working.
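If you prefer checking from a terminal, here's a tiny sketch using the built-in fetch of Node.js 18+; yourhost.com and port 8443 are placeholders for your own host and SSL port.
// TLS sanity check: a 404 response with no certificate errors means SSL is configured correctly.
const response = await fetch("https://yourhost.com:8443/");
console.log(response.status, response.statusText);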

2.3 Add parameters into config

The next step is to enable the feature set in the Nimble config file. On Linux, it's located at /etc/nimble/nimble.conf. Please check the parameters reference page for more details about working with the config.

Here are the parameters you need to have in the config:
webrtc_whep_support = true
access_control_allow_headers = content-type,location
access_control_expose_headers = location
transcoder_change_params_on_the_fly_enabled=true
If you use WHIP ingest, you're already familiar with some of these parameters, as described in the respective article. So you just need to add webrtc_whep_support in addition.

Once you've added the parameters, don't forget to re-start Nimble as described here.

2.4 Enable WHEP for streaming application


Now you need to enable WHEP for the output applications that you will provide for your users.

Log into your WMSPanel account, go to the Nimble Streamer top menu and click the Live streams settings item.

Here you can enable WHEP either globally at the server level or for a specific application.


Once you enable the checkbox and save the settings, you won't need to re-start Nimble: the change is applied automatically within a few seconds.


3. Player setup and playback

Now let's see how you can play the WHEP low latency stream generated by Nimble Streamer.


3.1 WHEP Player

The Softvelum WHEP Player is available in our GitHub account for cloning and further usage.

We forked the WebRTC player by Eyevinn, as we consider it the best open source player for this purpose. Besides the fork, we contributed back the code for handling audio-only WHEP streams; we'll show this use case below.

Please refer to the Getting started section in our repo to see how you can set up and customize the player for your own website, e.g. enable audio-only playback.
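For a quick illustration, here's a minimal sketch based on the upstream Eyevinn player's npm package (@eyevinn/webrtc-player). Treat it as an approximation and follow the Getting started section of the repo for the exact, up-to-date usage; the host, application and stream names are placeholders.
import { WebRTCPlayer } from "@eyevinn/webrtc-player";

// Attach the player to a <video> element and load the WHEP URL produced by Nimble Streamer.
const video = document.querySelector("video") as HTMLVideoElement;
const player = new WebRTCPlayer({ video, type: "whep" });
await player.load(new URL("https://yourhost.com:8443/live/mystream/whep.stream"));
player.unmute();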

Try now: you may also try our WHEP player on the Stream test page where we provide several popular players to check your streams. You'll find webrtc-player as the first option in the list.


3.2 Playback URL

WHEP uses HTTP for establishing a connection, so the playback URL will look familiar:

https://127.0.0.1:8443/live/whip/whep.stream

You can see the HTTPS protocol and port 8443 for streaming via SSL.

Please also notice the "whep.stream" element after the stream name. This is how Nimble Streamer identifies the request as a WHEP playback request.

The app and stream names are defined based on your input and output streams setup; please refer to the respective protocol setup instruction.

You can use that URL for further playback in our WHEP Player or any other WHEP-enabled player.
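Under the hood, a WHEP player simply POSTs an SDP offer to that URL over HTTPS and applies the SDP answer returned by the server. Here's a bare-bones browser sketch in TypeScript to illustrate the exchange; the URL is a placeholder and error handling is omitted.
const whepUrl = "https://yourhost.com:8443/live/mystream/whep.stream";

// Prepare a receive-only peer connection for video and audio.
const pc = new RTCPeerConnection();
pc.addTransceiver("video", { direction: "recvonly" });
pc.addTransceiver("audio", { direction: "recvonly" });
pc.ontrack = (event) => {
  // Attach the incoming media to a <video> element on the page.
  (document.querySelector("video") as HTMLVideoElement).srcObject = event.streams[0];
};

// WHEP signaling: POST the SDP offer, read the SDP answer from the response body.
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
const response = await fetch(whepUrl, {
  method: "POST",
  headers: { "Content-Type": "application/sdp" },
  body: offer.sdp,
});
// The Location header identifies the session resource, which a player can later DELETE to stop playback.
console.log("WHEP session:", response.headers.get("Location"));
await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
This exchange is also why the content-type and location entries are listed in the CORS-related parameters from the config section above.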


4. Network-related parameters


By default, Nimble Streamer works in ice-lite mode.

If the Nimble server instance runs on a host with a public IP address, then no additional configuration is needed.

If a server instance runs on Amazon EC2, you'll need to create an additional config file at /etc/nimble/whep_config.json and add the following JSON there:
{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host"
}
where "a.b.c.d" is a public address assigned to AWS server instance. If it has multiple IP addresses, just add them in the same parameters separating by comma like this:
{
  "NAT1To1IPs":"a.b.c.d,w.x.y.z",
  "NAT1To1CandidateType":"host"
}
This file is processed by Nimble at the beginning of each new publishing session, so you can change it without re-starting the server.

To define a port range, you can also add these parameters:
{
  "PortMin":1000,
  "PortMax":40000
}
In this case the candidates will be selected only from the range of ports 1000 to 40000.

If you want to use the same port for all WHEP playback connections, you can use the following parameter instead of PortMin and PortMax:

{
  "ICEUDPMuxPort":1234
}

Notice that you cannot use the same port for both WHEP playback and WHIP ingest simultaneously.

If you use the network parameters mentioned above, the combined JSON will be:
{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host",
  "PortMin":1000,
  "PortMax":40000
}
The JSON format requires exactly this syntax: if you add the parameters in separate blocks or omit the commas between them, Nimble will not process the config.

5. Further Usage

WHEP playback is a great addition to the existing set of output protocols. This means that you can create new combinations of inputs and outputs, as well as use proven existing feature sets on top of WHEP playback to add more power to it.


5.1 Live transmuxing and transcoding

The number of use cases for combining WHEP with other streaming protocols is huge; here are some of them.

Combine WHIP input and WHEP output. You can take WHIP ingest into a Nimble instance and then re-package it into WHEP playback. It's a very light-weight operation, so ultra-low latency WebRTC ingest from your browser or mobile app will flow seamlessly into low latency playback in other browsers. Use this article to set up WHIP ingest and use the settings above to complete the WHEP part.

Convert RTMP to WHEP. You use your favorite RTMP-powered media source to deliver H.264 video with AAC audio into Nimble Streamer. Then you set up a Nimble Live Transcoder scenario which passes through the H.264 content without decoding/encoding, decodes AAC and encodes it into Opus output. With this H.264/Opus combination you then create WHEP output just like we described above. This will not require many resources because audio transcoding is a cheap operation.

Use an SRT HEVC source for WHEP output. You use some HEVC-powered encoder or media server which is able to deliver it via an SRT stream. Once you route it to a Nimble instance, you can then use Live Transcoder to transcode the HEVC video into H.264 or VP9, and also transcode AAC into Opus. The resulting content is then easily delivered via WHEP into viewers' browsers for convenient playback.

These are just a few possible options, but the general idea is that you may juggle the transmuxing and transcoding features of Nimble Streamer to achieve the best combination of codecs and protocols and provide the best user experience. And all that comes at a low cost of ownership.


A note on transcoding scenarios for Opus output:
Once you create a scenario and add an audio encoder element, please use FFmpeg as the encoder and libopus as the codec name, as shown below.


5.2 Audio-only low latency playback

Nimble Streamer is used extensively for audio-only scenarios, like online radios. This includes a huge Icecast feature set, audio-only HLS and audio-only SLDP.

With WHEP on board, you can add one more element to these kinds of use cases. Audio processing is a light-weight operation that will enable you to give more playback options to your listeners.

The most obvious case is when you ingest Icecast, transcode the audio from MP3 or AAC into Opus, and then just generate WHEP output for browser playback.

Speaking of audio-only, you can do the same with Dante audio, which is widely used in professional live production environments. Just set up a Dante input and transcode it into Opus output the same way as you do for Icecast.

In addition, you can use traditional sources like RTMP or SRT, pick up the audio track, transcode it into Opus and make an audio-only output via WHEP. It works the same way as the Icecast transformation.

MPEGTS input can carry multiple tracks, including several audio channels for different languages or additional commentary. You can pick up each track and generate Opus output.

Audio mode is something we wanted to take care of, knowing the interest of our online radio customers in it. As mentioned above, we even made a contribution to Eyevinn's player to handle this case.
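If you prefer to wire up the signaling yourself instead of using the player, audio-only playback is simply the video-less variant of the WHEP sketch shown in the Playback URL section above; the URL below is a placeholder.
// Audio-only WHEP sketch: request a single receive-only audio transceiver
// and attach the incoming stream to an <audio> element. No error handling.
const pc = new RTCPeerConnection();
pc.addTransceiver("audio", { direction: "recvonly" });
pc.ontrack = (event) => {
  (document.querySelector("audio") as HTMLAudioElement).srcObject = event.streams[0];
};
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
const response = await fetch("https://yourhost.com:8443/radio/mystream/whep.stream", {
  method: "POST",
  headers: { "Content-Type": "application/sdp" },
  body: offer.sdp,
});
await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });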

5.3 NDI to WHEP

NDI ingest can also be a good source for WHEP output. If you're running a live production based on that technology, you can get NDI into Nimble Streamer and transcode it into WHEP live stream. This will give you a seamless ultra-low latency bridge between your internal production environment and the Internet viewers in their browsers.

It's a use case unique among existing solutions, so if you use NDI, you should definitely try it. Just set up the NDI input and follow the procedure above to generate WHEP.

5.4 Playout (server playlist)


Nimble Streamer Playout, or server playlist as we often call it, is also able to generate WHEP output. With Playout you may combine live streams with pre-recorded videos. This allows you to create your own TV or radio station with just one Nimble Streamer instance.

So now your viewers can watch Playout output in the browser with the WHEP Player, just like they do with other protocols.

Notice that you will still need to transcode the Playout output into supported video codecs and Opus audio, unless your original content is already encoded with them. Once you set up Playout config, just add the Transcoder scenario to make the proper output.

5.5 Paywall and authorization features

The Nimble Streamer Paywall feature set fully covers WHEP playback, just like it does for other protocols.

So everything you need to protect your stream can be used for WebRTC playback. Just check the additional protocol in your WMAuth rule.


5.6 HTTP aliasing

WHEP URLs can also be used in combination with the HTTP Aliasing feature.

Aliasing allows mapping multiple names to a single media stream in order to add some flexibility to the streaming process. It's good for cases when you have a media stream and need to provide it under different names. For example, you are a content provider and want to give your stream to multiple partner websites and services. Aliased streams can use different security and monetization approaches as well.

So full alias-based flexibility is now available for WHEP as well.


5.7 Playback statistics

Last but not least, the WMSPanel web service provides a wide and rich feature set for reporting viewers' statistics, which now includes WHEP views. All the stats and metrics our customers have been using for more than a decade now cover low latency playback as well:

  • Daily stats with geo-location and devices report
  • Deep stats with per-stream daily stats
  • Unique viewers
  • High-precision reporting

WHEP playback is handled like any other HTTP-based protocol so it has the same proven reliability.


As WebRTC allows tracking lost packets, the daily stats have a Lost column which indicates the lost traffic in addition to the overall bytes sent and received.
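For reference, the same metric is also visible on the client side through the standard WebRTC statistics API, while the Lost column itself is collected by Nimble and WMSPanel. A small sketch, assuming pc is the RTCPeerConnection from the playback sketches above:
// The inbound-rtp entries of getStats() expose packetsLost per incoming track.
const stats = await pc.getStats();
stats.forEach((report) => {
  if (report.type === "inbound-rtp") {
    const inbound = report as RTCInboundRtpStreamStats;
    console.log(inbound.kind, "packets lost:", inbound.packetsLost);
  }
});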


6. Performance tuning

If you serve your live streams to tens of thousands of viewers, you may need to tune the performance of Nimble Streamer. There are a couple of parameters that you can put into the /etc/nimble/nimble.conf file.

  • webrtc_whep_worker_threads - this parameter sets the number of worker threads for WHEP session processing. It's "1" by default.
  • webrtc_whep_max_viewers - in some high load cases the number of simultaneous WHEP connections can be significant. By default Nimble processes 2500 simultaneous connections per WHEP worker thread, but you can increase or decrease this value with this parameter, as shown in the example below.
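For example, to run four worker threads and raise the per-thread connection cap, the relevant nimble.conf lines could look like this; the values are illustrative, so pick them according to your hardware and expected load.
webrtc_whep_worker_threads = 4
webrtc_whep_max_viewers = 5000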

Please refer to the parameters reference for more options.



Let us know how this feature works for you and what else we can do to enhance it from your practical perspective.

Later on we'll introduce more tutorials - both text and video - to show the full power of WebRTC WHEP playback.

Follow us on social media to get updates about our new features and products: YouTube, Twitter, Facebook, LinkedIn, Reddit, Telegram