December 11, 2022

License API and deferred payments for Transcoder and Addenda

Nimble Streamer has premium add-ons which require additional licenses to operate. Those are Live Transcoder for content transformation and Addenda for various features like DRM, Advertizer or SRT PASS.

You can create those licenses at the time of your first subscription, or later at any moment of your billing period whenever you need them. You also need to make a payment to activate them - either during your first subscription or at any time during the billing period.

This is not convenient in many cases, especially when you build an automation process where you cannot log into WMSPanel every time you create a license.

So we made two big adjustments for those of our long-time trusted customers who want to utilize more licenses:

  1. You can defer first-time license payments to your next billing date.
  2. You can create Transcoder and Addenda licenses via API.

Let's see what you can do now.


Defer your payments

In order to start working with a new Transcoder server you need to create a license for it and activate it.

Usually you activate the license by making a one-time payment proportional to the cost of the monthly license (50 USD) and to the number of days left until the expiration date of the current license (basically, days left until your WMSPanel monthly payment).

Now eligible customers may request to have this policy relaxed. If you're a long-time customer actively using Transcoder, you may ask us to defer these payments. Some existing customers have already had this feature enabled.

Deferred payments for Addenda are already available to all subscribed customers.

This deferment works like this:

  1. When you create a license, you may choose to defer payment by clicking "Activate and pay later".
  2. The license is then activated and you can use it right away.
  3. The deferred payment amount is added to your account billing.
  4. At the next payment date, that amount will be charged along with other expenses for the next billing period.
  5. Those expenses will include the license's regular price of 50 USD unless you cancel it.

This way, you will not need to pull out your credit card each time you need to create a license.

Contact us to see if your account is eligible and to enable this capability.


API for licenses 

With the deferred payment enabled, you can take another step and create your licenses using our WMSPanel API as described in the respective API calls' documentation.

So what you need to do is:

  1. make an API call which creates a license,
  2. get the license ID from the response,
  3. register this new license for your server, as sketched below.
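
Here's a minimal command-line sketch of that flow. The endpoint paths, response field and jq usage below are hypothetical placeholders for illustration only - check the WMSPanel API reference for the exact calls and parameters:

# 1. Create a Transcoder license (hypothetical endpoint and response field)
LICENSE_ID=$(curl -s -X POST \
  "https://api.wmspanel.com/v1/transcoder/licenses?client_id=YOUR_ID&api_key=YOUR_KEY" \
  | jq -r '.license_id')
# 2. Register the new license for your server (hypothetical endpoint)
curl -s -X POST \
  "https://api.wmspanel.com/v1/server/YOUR_SERVER_ID/licenses?client_id=YOUR_ID&api_key=YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"license_id\": \"$LICENSE_ID\"}"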

That's it, you can now either manually create a new scenario for this new server, or use API to operate Transcoder scenarios on that server.

This allows automating a lot of processes related to Nimble Streamer functionality.


Contact our team if you have any questions about this approach and if you'd like to enable it for your account.

November 28, 2022

AV1 support for VOD MPEG-DASH streaming via Nimble Streamer

Nimble Streamer has an extensive VOD feature set. It allows dynamically re-packaging static files into VOD HLS and MPEG-DASH streams. The MPEG-DASH protocol has been fully supported by Nimble for a long time, allowing you to reach a wide range of devices with various codecs on board.

The AV1 codec was introduced to the public a few years ago and has grown into a mature technology with the help of industry leaders. It's supported in all major browsers, which makes it pervasive across the web.


Now, following the requests of our customers, the Nimble Streamer team has implemented AV1 VOD transmuxing in the product. Having MP4 files containing AV1 content, you can set up Nimble to process these files and generate MPEG-DASH output.

Follow MPEG-DASH VOD transmuxing setup article to set up Nimble Streamer to generate AV1-powered VOD streams.
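
Before the setup, you may want to confirm that a source file actually carries AV1 video. Here's a quick check with ffprobe, assuming it's installed; the file path is just an example:

# Print the codec of the first video track; expect "av1"
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name \
  -of default=noprint_wrappers=1:nokey=1 /var/mp4/sample_av1.mp4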

Other standard VOD-related features of Nimble Streamer applicable to DASH are supported as well.

Remote HTTP storage support allows you to effectively stream files whose size exceeds the available file system capacity. AV1 files can be processed via remote storage as well. You may also make adaptive bitrate VOD streams using SMIL files. The generated streams can then be protected with the Paywall feature set, including the pay-per-view framework, hotlink protection, geo-lock and more.

In addition to Paywall, you can encrypt AV1 content with Widevine using Nimble DRM. You may use any DRM management solution supported by Nimble to protect your streams.


Live AV1 support: with the Enhanced RTMP spec, Nimble Streamer now supports HEVC/H.265 and AV1 processing for live re-packaging.

Please also check AV1 live streaming support article describing live scenarios with AV1 codec.


Feel free to let us know of your experience with AV1 and share your thoughts on its usage with Nimble and beyond.



Related documentation

VOD streaming in Nimble Streamer, MPEG-DASH support in Nimble Streamer, MPEG-DASH VOD transmuxing setup, Nimble DRM


October 12, 2022

CEA-608/708 subtitles support in SLDP

CEA-608/708 closed captions are now supported in SLDP low latency playback protocol by Softvelum, in both Nimble Streamer and SLDP HTML5 Player.

The pipeline works as follows:

  1. Closed captions are delivered in NAL units of your content via any live streaming protocol supported by Nimble Streamer, including SRT, RTMP, MPEGTS and others.
  2. Nimble Streamer delivers the content via SLDP as usual.
  3. SLDP HTML5 Player recognizes subtitles in the stream.
  4. End user may enable subtitles display and watch video with closed captioning.

Notice that subtitles processing works only with the SLDP Player SDK, which is available as a premium product. Learn more about the HTML5 Player SDK here. You can subscribe to the SDK in order to generate the package for your domains and get our team's support going forward.

You can try this feature on our free player testing page before purchasing the SDK.


Please also take a look at Subtitles digest page to see what else Nimble can do for you.


Let us know if you have any questions about closed captioning in Softvelum products.



October 10, 2022

HEVC support in Chrome

Bitmovin has recently pointed out that Google Quietly Added HEVC Support in Chrome. Also, Jan Ozer made an analysis of the current state of HEVC support. This means Chrome should be able to process HEVC live and VOD content via MPEG-DASH and HLS. So our team has run tests on all available devices to make sure it works as expected.

We found the following conditions work fine for HEVC playback in Chrome:

  • You have the latest Chrome browser (at least version 107)
  • Your device has hardware decoding of HEVC

The latter point proved to be important, as not all devices have built-in decoding capabilities for HEVC.

With the above conditions met, we could play HEVC on Windows, Linux, Mac, Android and iOS.


Re-package with Nimble Streamer

On the Nimble Streamer side, we added HEVC support a long time ago, so you can do various processing and delivery combinations.

For an HEVC VOD content file, you can simply perform on-the-fly re-packaging into VOD MPEG-DASH and fMP4 VOD HLS. Notice that the fMP4 container is preferred for HLS HEVC playback.

For live streams, you can take input via any protocols which support HEVC - RTSP, MPEG-TS, SRT and RIST - from any source. You may also send HEVC via RTMP as a non-standard feature. If you use Nimble Streamer to receive input from a WebRTC source with WHIP signaling, you can send WebRTC with HEVC from Apple devices.

As an output you have MPEG-DASH and fMP4 HLS.


Encode with Nimble Live Transcoder

If your live source provides the content encoded with other supported codecs, you can transcode it with Nimble Live Transcoder, a premium add-on for Nimble.

From the Live Transcoder perspective, HEVC is just another codec to provide as output. Currently HEVC can be encoded with several libraries supported by the Transcoder.

Having the content encoded with the HEVC codec, you can deliver it to Chrome with MPEG-DASH and fMP4 HLS as described in the re-packaging section above.


Take a look at Nimble Streamer HEVC support digest page for other details.



September 19, 2022

Play audio-only SLDP with Opus on iPhone

When using WebRTC as a source of your content, you need to consider that the audio codec for that protocol is Opus. So in order to play sound on the user device side, the streaming provider has two options.

The first option is to transcode it into the commonly used AAC codec. It's the default option for many customers, and we describe the transcoding in our WebRTC setup video and WebRTC to MPEGTS UDP video.

The second option is to pass Opus content through into the player without transcoding. In this case the protocol must be able to carry this codec. SLDP - the low latency playback protocol from Softvelum - is able to carry Opus.

The playback, though, will depend on the platform your consumer uses.

Video+audio SLDP with Opus can be played on Windows, Linux, Android, iPad and (with a recent update) iPhone browsers via the HTML5 SLDP Player, and also using native Larix Player for Android and Larix Player for iOS.


Opus on iPhone

Apple's platforms have limitations on Opus playback using system components, so out of the box it's impossible to play any video with Opus audio there.

This is why we made a step forward and created our own playback implementation.

Opus audio can be played in the iPhone browser in audio-only mode using SLDP HTML5 Player. You can embed the web player and play audio streams.

So when you receive audio via your WebRTC input, you can avoid additional transcoding and play it directly with the lowest possible latency.


Premium feature from SDK

Notice that Opus audio-only playback on iPhone is a premium feature available only as part of the SLDP HTML5 Player SDK. Feel free to subscribe in order to get access to this and other capabilities of the SLDP web player.



August 30, 2022

SEI metadata insertion support in Larix Broadcaster

Larix Broadcaster can now insert SEI metadata into SRT, RTMP and RTSP streams. It also allows specifying an NTP server to get precise time.

This allows synchronizing live streams published from Larix Broadcaster using any software or hardware capable of SEI sync up.

Learn more about time synchronization in Softvelum products.

Also watch the video tutorial showing Larix Broadcaster generating live streams with SEI, and Nimble Streamer producing synced-up NDI outputs.


August 8, 2022

SRT bonding and libsrt 1.5 in Nimble Streamer

Nimble Streamer has a wide SRT feature set which covers multiple capabilities. Our team has been an active contributor and a platinum member of the SRT Alliance, so we closely follow all updates.

At the moment Nimble uses libsrt version 1.4.4 by default when you install the SRT package; however, we continuously monitor new versions and features, so we wanted to give the latest update a try.

libsrt 1.5

Now Nimble allows using libsrt 1.5 as an option.

If you want to try it with the latest Nimble build, use the SRT package installation procedure with this command:

sudo apt-get install nimble-srt-1.5

You can always install the "nimble-srt" package to get libsrt 1.4.4.
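
To check which SRT package variant is currently installed on Ubuntu or Debian (a quick sanity check; the package names follow the commands above):

dpkg -l | grep nimble-srt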


Bonding

With version 1.5.0, libsrt provides a major new feature set: connection bonding.

Nimble Streamer allows enabling SRT bonding as provided by libsrt.


This capability is enabled by adding the "srt-bonding-enabled" parameter with "true" value in the MPEGTS In setting for the incoming SRT connection:

With this rule Nimble Streamer will listen for an incoming SRT stream 'in bond' on all the machine's available interfaces ("0.0.0.0") on the defined port 2022 (UDP).

If you'd like to specify interfaces to bind in addition to the one defined in the 'Local IP' and 'Local port' main input fields, please use the "srt-group-nodes" parameter.

The parameter syntax is "srt-group-nodes" with a value like "ip1:port1, ip2:port2, ip3:port3" etc.
Example value: "192.168.1.2:1101, 10.0.0.2:1100, 172.16.0.2:1100"

See the screenshot below.


If any of our customers have SRT sender software or hardware capable of bonding, please feel free to try it with Nimble Streamer and let us know how it works - your real-life feedback is always valuable for us.


Learn more about SRT features of Nimble Streamer and SRT capabilities of other Softvelum products.




August 7, 2022

Pull SRT by streamid with RTMP-style app and stream name

SRT (Secure Reliable Transport) has a lot of features that make it a reliable replacement for other delivery protocols like RTMP or RTSP. One of the features that RTMP and RTSP users enjoy is the ability to pull media streams from the source media server by application name and stream name via a single port. This gives a lot of flexibility to both source and recipient parties.

Nimble Streamer now brings that convenience of RTMP to SRT.

When providing output SRT streams via Listen mode, Nimble allows the following setup.

  • Live stream input is set up from any protocol like SRT, RTMP, NDI, MPEGTS UDP, streams from Live Transcoder or any other described here. The respective output streams are available for further usage as described in our tutorials.
  • SRT output with Listen mode is set with the Use stream ID option for sources, as shown below.
  • The receiving party uses the SRT streamid parameter defined as "appname/streamname", just like in RTMP pulled streams.
  • Nimble generates the SRT output using the live stream which was requested in the streamid parameter.

So having just one IP address and one port, Nimble can serve multiple output streams, just as the receivers request them.

This feature is part of the Nimble Addenda package and requires an active Addenda license registered on the Nimble server instance.


Setup process


Before moving forward, you need to enable this parameter in nimble.conf and restart the Nimble instance:

srt_multipoint_listener_enabled = true

Read this page for more details about changing parameters of Nimble Streamer.


We assume you already have some incoming stream to process, e.g. an RTMP encoder publishing content into Nimble Streamer, available in Nimble with "live" as the application name and "output" as the incoming stream name.

To set up the SRT part, go to the Live streams settings menu and choose the UDP streaming tab.

Just like for regular SRT output, click the Add SRT setting button.

In the new dialog, choose Listen mode - this feature works only in Listen - and specify local IP and port. We recommend using the All interfaces selection (0.0.0.0).

Besides other parameters, select the Use stream ID radio button. This will set Source app and stream name to the {STREAM_ID} placeholder, which means the streamid parameter will be used to determine the source stream.


You can read more about other options in SRT output setup.

Once you save settings, they will be applied to the server within a few seconds.

That's it, you can now pull streams via SRT in Caller mode with proper streamid.


Usage


Let's see a couple of examples of obtaining the stream with the proper streamid parameter.

First, here's an example of the srt-live-transmit tool getting the stream and playing it:

./srt-live-transmit "srt://192.168.0.114:2022?mode=caller&streamid=live/output" file://con | ffplay -

You can see it requesting the SRT stream with streamid containing the app and stream name (streamid=live/output).
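
If your ffplay build includes SRT support (libsrt), you can also pull the same stream directly, without srt-live-transmit; the address and streamid follow the example above:

ffplay "srt://192.168.0.114:2022?mode=caller&streamid=live/output"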


Here's how the SRT stream setup may look in the Larix Player mobile app, which supports SRT playback in all modes for both Android and iOS.


You can see the IP and port of the media server; SRT is set to Caller mode and the streamid parameter is set to "live/output".


Feel free to use this feature set for providing SRT streams.

Let us know if you have any questions about this feature.



Related documentation

Nimble Streamer SRT capabilities, SRT Publisher Assistance Security Set, Nimble Addenda

August 3, 2022

Passing MPEGTS data intact into MPEGTS, UDP, SRT and RIST

Nimble Streamer MPEG-TS streaming capabilities cover a lot of features, such as receiving UDP-based MPEGTS, SRT or RIST and re-packaging it into other protocols. You may also package it back into MPEGTS and stream out via UDP, SRT, RIST or HTTP MPEGTS.

However, there are cases when you need to pass MPEGTS through Nimble Streamer directly without re-packaging - for example, when the stream has some data that Nimble must keep intact, such as subtitles, metadata, PIDs, multiple tracks etc.

Now you can set Nimble to generate UDP-based output directly from MPEGTS input without additional re-packaging. In addition to that, you can generate HTTP-based MPEGTS output.

Let's see how it's done.

We assume that you've already added the incoming MPEGTS streams as shown in our MPEGTS setup tutorial article.

We have an input from an incoming SRT stream.


With the input available, we can set up all output scenarios.

UDP, SRT and RIST output

Now you need to set up the UDP output, so open the UDP streaming tab.


We've already added an SRT input as described in our SRT streaming capabilities, so now we'll add a UDP output, using the original SRT stream to pass it through.

Click on Add UDP setting and enter your destination IP address and port.

In addition to that, click the Raw MPEGTS source radio button, then select the input you'd like to use via the Source dropdown.


We've selected our SRT input and saved settings.


In the list of UDP streams, the Source streams column will show the respective input.


HTTP-based MPEGTS output


The same approach can be applied to general MPEGTS output.

Go to MPEGTS Out tab and click on Add outgoing stream.

In the setup dialog, choose Raw MPEGTS source and select the existing source from the list.

That's it - once you save the setting, it will be applied within a few seconds and you'll be able to use the MPEGTS HTTP output as a bypass. As per the above screenshot, the URL will look like https://nimble_server_ip_or_domain/live/mpegts/mpeg.2ts
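
To verify the bypass output, you can probe the resulting URL, e.g. with ffprobe (assuming a build with HTTPS support; the URL follows the example above):

ffprobe -v error -show_entries stream=codec_type,codec_name "https://nimble_server_ip_or_domain/live/mpegts/mpeg.2ts"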


Notice that in this mode you may have HTTP MPEGTS output only. Nimble doesn't process this stream, hence the Transcoder cannot be used.




Related documentation

MPEG-TS feature set, MPEGTS setup tutorial, SRT support in Nimble Streamer

July 19, 2022

SEI metadata NTP time sync support in Nimble Streamer

Remote production with multiple video sources has a known issue that may affect the viewers' perception. Each source (camera or encoder) delivers the stream with its own delay relative to the time when it was shot, and that delay typically differs across devices. Thus, when the first stream is delivered with a half-second delay and another one is a second behind it, with both cameras showing the same object from different angles, viewers will see a significant difference. That becomes critical in intense real-time events like sports matches.

So producers and technicians need a way to synchronize all sources in order to use them on a single time scale. The industry-proven way to solve this is to have this kind of setup:

  1. All sources are set to use the same reference time, e.g. get it from the same NTP server.
  2. Each source inserts SEI metadata into the stream's content frames when encoding the output.
  3. The destination media decoder is set to have a certain time window (a delay) before sending the content further.
  4. The decoder gets SEI from the frames of all sources, holds frames with the respective timestamps and sends them out at the same time when the delay is over.

Nimble Streamer allows using that approach and handles the following two use cases related to SEI metadata.

  • Synchronize multiple NDI output streams. If a Nimble Transcoder scenario has certain settings (described further), Nimble will process SEI metadata and will delay the output to provide simultaneous NDI outputs.
  • Forward SEI metadata. Nimble Live Transcoder may get SEI metadata from source frames and add it into the output content, i.e. simply pass it through regardless of the output protocol. This is useful if you need to process the content with Nimble Transcoder but then send it out for further processing.

Both H.264/AVC and H.265/HEVC video codecs are supported for SEI metadata extraction.

Let's see the setup and usage process. We also added a video tutorial with sync demo below.


Prerequisites

Both cases are set up in Nimble Streamer Live Transcoder so make sure you have the following:

This instruction assumes that you've already set up live stream input with the protocol of your choice. Take a look at the live streaming digest page to find proper instructions, like SRT full setup instruction, RTMP setup article and more.

In our example we've already set up our incoming stream "/live/stream1".

Make sure your source video streams have SEI metadata and they are all synced with the same NTP server.

Nimble Streamer uses system time, so make sure your server's OS is also synced up via NTP.
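
On systemd-based Linux distributions, a quick way to confirm the server clock is NTP-synced (assuming timedatectl is available):

timedatectl | grep -i synchronized
# Expected output: "System clock synchronized: yes"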


Set up transcoder scenario and decoder element

First, let's add a transcoder scenario for our first stream. You can watch our Transcoder tutorial videos to see how it's usually done. Go to the "Transcoders" menu and click the "Create new scenario" button.

In this new scenario, add a new Video source element which represents decoder settings. Put the first stream's app name and stream name in the respective edit boxes.

Then click on "Forward SEI timecodes" checkbox.



If that parameter is set, the Transcoder will take SEI data into further processing.

Notice that you can use any decoder for extracting SEI metadata, except for "quicksync" at the moment.


NDI output setup

To create NDI output, add a Video output element, check NDI there, then add the name of the output.

Then click the Expert setup link to open additional parameters. There you will find the "SEI timecode delay (ms)" edit box.

This delay parameter defines how long the Transcoder will wait, counting from the SEI timecode of a received frame, before sending the content to the output stream. Nimble will wait for frames to arrive within the "delay" time frame and then send them out. Thus, if you define the same delay for several output streams in the Transcoder, they will be sent out to their respective NDI streams at the same time even if some frames were delivered with some latency.

Since NDI has just a few milliseconds of delay in delivery within a local network, your production software will receive all your streams simultaneously.


In our example we set this delay to 1 second to make sure all source streams are delivered regardless of their own latency and network conditions.

Check the network conditions between your streams' sources and other parameters that may require increasing the timecode delay. If you use SRT, check what values of the "latency" parameter are used for your senders: your delay must be larger than that, with some spare time added.

As for NDI in general, you can read NDI setup article and watch NDI video tutorials playlist to learn more about the setup and usage process.


Add audio

Don't forget about audio content in your Transcoder scenario.

  • For NDI audio output, add a decoder element with default settings and an NDI encoder, just like you did for video.
  • For other types of audio output, you can either make a passthrough, or create a decoder-encoder couple if you need to transform the audio.


Set up all streams

Once you save the scenario and the settings are applied to your server, create similar scenarios for all other streams, all having the same delay setting. As soon as you apply them, all of them will be in sync with each other.


Forward SEI metadata

If you have an existing Transcoder scenario and you want to make sure the SEI metadata is kept intact, you can set up proper forwarding in the Video output element under the Expert setup section.

Currently SEI forwarding is supported for libx264 and NVENC encoding libraries only.

With that option enabled, the resulting output H.264 stream will have SEI time metadata. The recipient encoder will need to take care of synchronizing the streams by itself.


Passthrough

If you need to create a scenario with Passthrough element for video in it, then your SEI metadata will be automatically passed through Transcoder without any modification. You don't need to do anything specific in this case.


Video: tutorial with demo

Watch how this can be set up, using a Makito X4 encoder as a source.

That's it for the setup. Feel free to try this feature and let us know if you have any additional thoughts on it.



Related documentation

Nimble Streamer, Live Transcoder, NDI support in Nimble, Our YouTube channel

July 10, 2022

Publishing HEVC via WebRTC from Apple devices

The typical WebRTC video codec list for streaming solutions includes H.264, VP8 and VP9, with H.264 being dominant. However, with the rise of HEVC/H.265 popularity, technology vendors consider adding it into the WebRTC stack as well.

Apple devices and OSes support HEVC in WebRTC ingest.

So we also decided to add that capability into our products.

Now Nimble Streamer supports the HEVC codec in its WebRTC ingest feature set. You can publish a WebRTC stream from iPhone, iPad and Mac devices with that codec.

If you want to try it now, please refer to the Nimble Streamer WebRTC setup instructions and watch the WebRTC features playlist on our YouTube channel.

Notice that if you want to use HEVC specifically, use "h265" value for "videocodecs" parameter:

https://your_host/live/whip?whipauth=login:password&videocodecs=h265

You can try our Publisher demo page for that and use WebRTC JS publisher library in your own solutions.


To enable this feature in iPhone Safari, go to Settings >> Safari >> Advanced >> Experimental Features and enable the "WebRTC H265 Codec" parameter as shown below:

Also try Larix Broadcaster for iOS, which allows ingesting WebRTC with H.264/AVC video into Nimble Streamer or any other server/service with WHIP support.


Let us know if you have any questions regarding our WebRTC implementation.



July 6, 2022

Opus audio support in SLDP

SLDP protocol is widely used by Nimble Streamer for low latency last-mile delivery of live streams. It supports a wide variety of codecs, though customers had to rely on source and player capabilities to play their codec of choice.

Our team has recently released WebRTC support in Nimble Streamer, which introduced a new audio codec - Opus - the standard in the WebRTC world. So a customer who wanted to play audio properly in browsers via SLDP had to transcode it into AAC first.

Now Opus is fully supported in SLDP, so Nimble Streamer can take any input with Opus audio and transmux it into SLDP with no need to transcode it.

Our SLDP Player for HTML5 has full support for Opus playback in any browser on Windows, Linux and Android. With iOS 17, it also works on iPhone.

Play audio-only SLDP with Opus on iPhone - take a look at this capability of SLDP protocol as well.

Larix Player for Android and iOS have Opus playback support for SLDP and SRT protocols.

So if you get WebRTC ingest into Nimble Streamer, you may transmux it into SLDP output as a direct pipeline without the need for audio transcoding. You may still need to transcode your video to change the video codec or set the video resolution, but your audio can be used directly as a passthrough. And if you use WebRTC for audio-only transmission, this will be a direct pipeline for you.

We plan to add Opus support into Larix Player apps and SDKs for iOS and Android so you could add this codec into your mobile playback solutions as well.

June 7, 2022

Manage client sessions using Nimble API

Nimble Streamer allows controlling end-user client sessions using various approaches.

The most capable approach is the Pay-per-view framework which allows controlling the streaming process on per-stream and per-user level. You can use your own handler application with custom business logic to gather stats and block unwanted viewers and listeners. Nimble sends data to the PPV handler and acts according to the response.

Another approach is to use the playback session authorization framework, where Nimble communicates with a custom handler app on each streaming session start. Nimble sends data about a connection being established, and in response the handler returns the decision whether Nimble must allow or deny the new session.

Both approaches assume that Nimble sends requests to a handler and gets responses with decisions regarding current sessions.

New API

Now we introduce an additional approach that works the opposite way: it uses the Nimble Streamer HTTP API, where you make calls to the Nimble Streamer instance.

You make a direct API call, get the list of active sessions and then make follow-up calls to delete unwanted sessions.

Initial setup

First, follow the Pre-setup steps on the API description page. This is required in order to enable and use the API; it needs a couple of parameters in the nimble.conf file.

In addition, you can secure your calls by using a security token as described in the respective section.

Get list of sessions

Use the /manage/sessions method to get the list of current sessions as described in this docs section. In the response you'll get JSON containing data for each session, including app and stream name, IP and some other parameters. Each session has an ID which you can use to terminate it.

Having the list of all sessions, you can save it to your own database as well as make further decisions about each connection.

Delete specific session

If some clients need to be disconnected, you can call the /manage/sessions/delete method with a list of session IDs that must be disconnected. The full description is available in this section.

HLS and MPEG-DASH clients will get a 403 HTTP response, while clients of other protocols (MPEG-TS/Icecast/RTMP/RTSP/SRT) will just be disconnected.
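
Here's a minimal command-line sketch of that flow. It assumes the API was enabled on the default management port 8082 per the pre-setup steps, no security token is configured, and the request body field name is illustrative - check the API reference for the exact format:

# List current sessions as JSON
curl -s "http://127.0.0.1:8082/manage/sessions"
# Terminate specific sessions by ID (body field name is an assumption)
curl -s -X POST "http://127.0.0.1:8082/manage/sessions/delete" \
  -H "Content-Type: application/json" \
  -d '{"sessions":["SESSION_ID_1","SESSION_ID_2"]}'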


This set of APIs provides a simple way to control Nimble Streamer playback. If you need a more sophisticated way, check the other approaches described at the top of this article.

May 4, 2022

WebRTC publish setup for Nimble Streamer

In memory of Alex Gouaillard
who inspired our team for WebRTC

WebRTC has become a significant part of the live streaming landscape in various use cases and scenarios - from low latency streaming to live chatting. It's a big stack of technologies which are combined in various ways depending on the problem a customer needs to solve.

The Softvelum team got multiple requests from customers regarding WebRTC support, and we finally came to a combination of technology pieces that best fits this task. The streaming tasks which our customers specifically wanted us to solve are related to easy ingest of live streams from any browser.

Current WebRTC support in Nimble Streamer covers the following:

  • Ingest of WebRTC live stream into Nimble Streamer.
  • WHIP is used for signaling, see details below.
  • H.264, VP8 and VP9 video and Opus audio input.
  • H.265/HEVC video input from Apple devices.
  • AV1 video input from Chrome.
  • JavaScript client for publishing video and demo page with sample client.

Signaling is an important part of the WebRTC stack because it defines how a client connects to the host or to another client. Nimble Streamer uses the WebRTC-HTTP ingestion protocol (WHIP) for signaling. It's a standard with Internet Draft status and it's already used by various WebRTC products, so we decided to use WHIP to be compatible with as many solutions as possible.

Nimble Streamer uses Pion implementation of WebRTC API. Special thanks to Sean DuBois and all Pion contributors.

Output streams can be generated in all protocols supported by Nimble Streamer, e.g. HLS, SLDP, NDI, SRT, depending on output codecs and required transcoding, see more details below.

Let's go step by step to set up Nimble Streamer to receive WebRTC ingest.


Notice that currently only the Linux version of Nimble Streamer supports WebRTC. We're working on Windows support.

We assume you've already installed Nimble Streamer or upgraded it to the latest version. You also need an active WMSPanel account and a respective subscription.


1. Enable feature in Nimble config

First, you need to add a couple of parameters to nimble.conf to enable the feature. On Linux this file is available at /etc/nimble/nimble.conf. For more details about this file and its parameters, check the Configuration reference page.

Add these parameters:

webrtc_whip_support = true
access_control_allow_headers = content-type
access_control_expose_headers = location
transcoder_change_params_on_the_fly_enabled = true

Then restart the Nimble Streamer instance. On Ubuntu it's done with this command:

sudo service nimble restart

Check installation instructions for other platforms.


2. Set up SSL for Nimble

The next step is to enable SSL for your Nimble instance as it's required for secure WHIP signaling.

You can set up your SSL certificate using this general instruction. You may obtain a free Let's Encrypt certificate via CertBot as we've described here.

For testing purposes you may create your own self-signed certificate, but in order for it to work, you'll first need to open any https:// page like https://yourhost.com:port/ and accept the security risk.

For production purposes you need a valid SSL certificate. Also, your server must be assigned to the domain of this certificate.

Once you've set up SSL for Nimble, you need to test it. Open https://yourhost.com in your browser, where yourhost.com is the host of your Nimble instance. If you get a 404 error and have no warnings from your browser, then your SSL was set up properly and is working.
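
You can run the same check from the command line with curl (yourhost.com stands for your actual host, as above); a printed "404" with no certificate errors means SSL is working:

curl -sS -o /dev/null -w "%{http_code}\n" https://yourhost.com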


3. Set up WHIP client authorization

WHIP clients use URL parameters to pass their settings to the host.

A WHIP client allows publishing from any browser to the server, so Nimble Streamer requires a user and password to be defined for the application where the publishing will be performed. If you don't set up a user/password credentials pair for the WHIP client and the target application, your user won't be able to stream.

To set up an application, go to WMSPanel, open the Live Streams Settings menu, choose a designated server, open the Applications tab and create an application with the required user and password.


In our example the app name is "live" and we'll use "whip" as stream name later on.

You may use two options to authorize clients on the server for publication.

3.1 Simple user/pass authorization

In your client publishing URL, use "whipauth" parameter to send credentials like this:

https://your_host/live/whip?whipauth=login:password

Where "live" is the name of the application with credentials.

Notice that whoever opens your publishing page can see your app name and user/password pair. This means a high chance of leaking credentials for unauthorized publications. So use this authorization approach only for debugging, or if you provide separate applications for your trusted publishers.

In any other case, please use publish control framework.

3.2 Publish control framework

If you need more sophisticated authorization of your publishers based on your business logic, use the Publish control framework. With publish control, you can prevent leaking of your publishing credentials. Also, you'll be able to get the status of all published streams and decline any of them at any time.

When the setup is done, the URL will have "publishsign" parameter.

https://your_host/live/whip?publishsign=aWQ9SURfMSZzaWduPW95Zi9YVHBLM0c3QkQ4SmpwVnF1VHc9PSZpcD0xMjcuMC4wLjE=

Read this setup article to get all details.


4. Codecs support

As mentioned earlier, Nimble Streamer supports H.264, VP8 and VP9 video with Opus audio in WebRTC ingest. So if your client uses these codecs, Nimble will be able to process them and produce proper output.

In addition, you may use H.265/HEVC video input from Apple devices as well as AV1 browser publishing from Chrome.

If you need your users to publish from their browsers only with a certain video codec, you can indicate that by setting the "videocodecs" parameter. E.g. to make the server accept only H.264, you can set it like this:

https://your_host/live/whip?whipauth=login:password&videocodecs=h264

If you are ready to accept either H.264 or VP8, use comma in that parameter's value:

https://your_host/live/whip?whipauth=login:password&videocodecs=h264,vp8


5. Generating output

Once the content is ingested, Nimble Streamer provides the following options for further processing.

5.1 Direct output via limited protocols

If the ingest has H.264 and Opus codecs, Nimble Streamer will be able to generate H.264/Opus output via these protocols:
  • MPEG2TS-based protocols: MPEG-TS over UDP multicast, SRT and RIST, to be played via VLC or ffmpeg.
  • SLDP: the protocol supports Opus; it can be played in the SLDP web player on Windows, Linux and Android.

5.2 Full-featured transcoded output

All input codecs - VP8, VP9, H.264 and Opus - can be transcoded into any other codecs. That includes H.264/AAC output, the de-facto standard for the Internet, as well as HEVC (H.265) video output.

Use Nimble Live Transcoder to transform the input with a variety of decoders and encoders, using software libraries or NVENC and QuickSync hardware acceleration.

Watch Transcoder video tutorials for more setup examples. For those scenarios, the WebRTC ingest will be just another input stream. Here is a simple example of a transcoder scenario with H.264 and AAC output.

You can then use any combination of live streaming output protocols and options, like HLS, SRT, NDI, SLDP, etc. Nimble Streamer transmuxing engine will provide any combination you need.

You can also record the generated content using Nimble Streamer DVR and then provide the playback using HLS and MPEG-DASH protocols.

5.3 Notice on packet loss

Notice that if a publishing client and your server are located far from each other or communicate via bad quality networks, you should expect some video and audio frame loss. Protocols and players handle this type of frame loss differently. At this moment Nimble Streamer does not try to add fake video frames or audio silence to compensate for that behavior.

6. Network-related and general parameters

By default, Nimble Streamer works in ice-lite mode.

If the Nimble server instance runs on a host with a public IP address, then additional configuration is not needed.

If a server instance runs on Amazon EC2 then you'll need to create an additional config file at /etc/nimble/whip_input.json and add the following JSON there:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host"
}

where "a.b.c.d" is a public address assigned to AWS server instance. If it has multiple IP addresses, just add them in the same parameters separating by comma like this:

{
  "NAT1To1IPs":"a.b.c.d,w.x.y.z",
  "NAT1To1CandidateType":"host"
}

This file is processed by Nimble at the beginning of each new publishing session, so you can change it without restarting the server.

To define a port range, you can also add these parameters:

{
  "PortMin":1000,
  "PortMax":40000
}

In this case the candidates will be selected only from the port range 1000 to 40000.

If you want to use the same port for all WHIP ingest connections, you can use the following parameter instead of PortMin and PortMax:

{
  "ICEUDPMuxPort":1234
}

Notice that you cannot use the same port for both WHIP ingest and WHEP playback simultaneously.

If you use the network parameters mentioned above, the combined JSON will be:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host",
  "PortMin":1000,
  "PortMax":40000
}

In addition, you can use this config file to define supported codecs on the server level, instead of defining them per session, using the SupportedVideoCodecs parameter:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host",
  "PortMin":1000,
  "PortMax":40000, 
  "SupportedVideoCodecs":"vp8,h264" 
}

Notice that JSON requires strict syntax: all parameters must be in a single block, separated by commas. If you add them in different blocks or have no commas between parameters, Nimble will not process the config.


7. Browser publishing library and demo page

We've created a JavaScript library which you can use to add publishing capabilities to your web pages. Use its code in your projects or take it as is for embedding into your pages to connect your users to Nimble Streamer.

There's also a WebRTC publication demo page which uses that library to provide a simple way to check your server setup. Just enter a WHIP URL with the server address and publishing credentials, then click Publish. You will then be able to use your camera and microphone to stream, and you will see detailed logs of what's happening.

8. Mobile publishing client


Larix Broadcaster for iOS and Android allows ingesting WebRTC with H.264/AVC video and Opus audio into Nimble Streamer or any other server/service with WHIP support.

Watch the Ingesting WebRTC from Larix Broadcaster into Nimble Streamer tutorial as a setup example. There we make a passthrough scenario with Opus audio ingested from an iPhone and played in a browser.

9. Video tutorials


Watch this brief tutorial demonstrating the setup process.

Also watch setup process to take WebRTC ingest and produce NDI output from it.


The following tutorial shows how to set up Nimble Streamer to receive content via WebRTC and then send it as UDP multicast into the local network without transcoding.



10. WHEP ultra-low latency playback

If you'd like to use WebRTC ultra-low latency for further playback, Nimble Streamer fully supports it.

Please read the WHEP WebRTC low latency playback in Nimble Streamer article for more details regarding setup and usage.

Our team keeps improving WebRTC support in Nimble Streamer, so stay tuned for updates.

March 17, 2022

Quick URL Import

We'd like to introduce our new improvement to WMSPanel called Quick URL Import.

The Quick URL Import button in the MPEGTS IN and UDP Streaming tabs of the Live Streams Settings menu will help instantly transfer a publishing or ingest URL from your stream provider to Nimble. Feel free to use it with UDP, HTTP, HLS, SRT or RIST protocols.

The standard URI is accepted as:
protocol://HOST:PORT/PATH?PARAM1=VALUE1&PARAM2=VALUE2&...
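
For instance, pasting a hypothetical SRT source URL like the one below would set the protocol to SRT, fill in the host and port, and map the remaining query parameters into their corresponding fields:

srt://203.0.113.10:4200?mode=caller&latency=2000&passphrase=secret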

This will save you time on editing settings if you have a stream URL with parameters encoded in it.

Quick import will recognize the stream protocol and additional parameters in the URL, and the accepted parameters will be automatically filled in the corresponding fields.

For SRT, you may even use the streamid format proposed by Haivision. The RIST URL syntax is supported as described on this documentation page.


Just find the green Quick URL Import button on the MPEGTS IN or UDP Streaming tab.


Then fill in the URL. Depending on the protocol, a new window will appear after the Add setting button is pressed. As the URL is parsed, the parameters will be filled in the corresponding fields.



Add more parameters, like the stream name, to have a complete setting and you're good to go with the streaming.


January 31, 2022

HEVC support for Widevine and PlayReady DRM in Nimble Streamer

Nimble Streamer DRM provides a wide range of DRM encryption technologies and key management platforms.

Recently we've added support for the H.265/HEVC codec for Widevine and PlayReady encryption technologies.

They work for all major scenarios: live streaming, DVR and VOD.

So you can deliver your unprotected stream into Nimble Streamer, convert it into MPEG-DASH or HLS fMP4, record it into DVR if needed, encrypt it and then deliver it to your viewers for playback. And if you have any VOD files, you can define transmuxing rules and then set up DRM so they can be played via DRM-powered players.

On the viewers' side, Larix Player for Android can run MPEG-DASH streams via the embedded ExoPlayer in all streaming modes and decode Widevine and PlayReady streams. You can download it on Google Play and visit the Player website to learn more.

Feel free to try Nimble Streamer DRM in action and let us know if you have any questions.


Related documentation

Nimble Streamer DRM, Nimble Addenda package, Larix Player for Android


January 26, 2022

Support for CEA subtitles: CEA-608 in MPEG-DASH and CEA-708 in HLS

Nimble Streamer has wide support for MPEG-DASH live streaming, including subtitles processing.

When CEA-608 subtitles are embedded into a video track, most players require those subtitles to be declared in the manifest; otherwise a viewer cannot select them at all.

This tag is used in a manifest for the declaration:

<Accessibility schemeIdUri="urn:scte:dash:cc:cea-608:2015" value="CC1=lang">

where the value contains the number of the track with subtitles and their language, e.g. "CC1=eng".

This option can be set in server settings under the Nimble Streamer / Live Streams Settings menu, in the Global tab, in the CEA-608/CEA-708 settings field.

This setting is applied to live and DVR output streams.

The format is as follows:
<app1>[/<stream>]:N=<lang>[;N=<lang>] <app2>[/<stream>]:N=<lang>[;N=<lang>]
Each new application is separated by a space. Here's an example where all streams of the "live_app" application will have the first track with Russian subtitles:


For MPEG-DASH CEA-608 the setting is simply "live_app:1=rus". This is what you'll see in the manifest:
<Accessibility schemeIdUri="urn:scte:dash:cc:cea-608:2015" value="CC1=rus">
For HLS CEA-708 it's "live_app:708.1=rus".

This is how you'll see it in your player:

You may combine settings for multiple apps and streams. E.g. for DASH CEA-608 the setting
live_app:1=eng;2=rus live_app2/stream1:1=eng;2=fra
will set two tracks for all streams in the "live_app" application and also define two tracks for the single "live_app2/stream1" stream.

If you want to set a setting for the entire server, just skip the application part. E.g. for HLS CEA-708, set the parameter to "708.1=eng".


Please also take a look at Subtitles digest page to see what else Nimble can do for you.


Related documentation 

Nimble Streamer MPEG-DASH features



January 12, 2022

Server playlist support for live streams input

The Server playlist feature set for Nimble Streamer was introduced to provide capabilities to create output live streams from a set of VOD files.

Now Server playlist has got a couple more features to improve it:

  • take live streams as input for playlist entries;
  • define default streams in case current playlist entry is not available.

Notice that the new features do not change the playlist's basic principles and mechanics; they add new parameters which we describe below. So before reading about the updates, please get familiar with these materials:

Let's see what we've got.


Live streams input

You can specify any available live streams. No matter where your live stream is coming from - RTMP, SRT input or a stream from Nimble Transcoder - you can use it as your source.

You need to prepare your content for playlist input and also additionally transcode it afterwards, as we describe in section 2 "Preparing content" of the Server playlist spec.

The semantics of live stream input is similar to VOD input: it's inserted among other entries in the "Streams" block, with the "Type" parameter set to "live", as shown below.

{
  "SyncInterval": 5000,
  "Tasks":
  [
    {
      "Stream": "live/playlist",
      "Blocks": [
        {
          "Id":"1", "Start":"2022-01-17 08:00:00",
          "Streams":[
            {
              "Type":"vod", "Source":"/var/mp4/sample.mp4", "Duration":20000"
            },
            {
              "Type":"live", "Source":"live/stream", 
"Duration":600000"
            }
          ]
        }
      ]
    }
  ]
}

The following parameters can be used for live stream entry:

  • Source - input stream name, defined as "application_name/stream_name" as seen in output streams at Nimble Streamer live streams page.
  • Duration - the duration of the current stream.
  • TotalDuration is also supported, but it means the same as Duration. If both parameters are set, the one with the smaller value is used.


Default streams

If the live stream which is supposed to play now is unavailable for some reason, you may specify a default stream which will be played instead. The DefaultStream parameter can be defined on the block level as shown below.

{
  "SyncInterval": 5000,
  "Tasks":
  [
    {
      "Stream": "live/playlist",
      "Blocks": [
        {
          "Id":"1", "Start":"
2022-01-17 08:00:00",
          "DefaultStream": {
            "Type":"live",
            "Source": "live/default"
          },

          "Streams":[
            {
              "Type":"live", "Source":"live/stream"
            },
            {
              "Type":"vod", "Source":"/var/mp4/sample.mp4", "Duration":20000
            }
          ]
        }
      ]
    }
  ]
}

This default stream may have the "Type" parameter set to either "live" or "vod". The "Source" defines where the content is taken from; see the server playlist spec for details.

For VOD mode, it also supports "AudioStreamId" and "VideoStreamId" parameters to select a respective track if a VOD file has several tracks.


Playlist Generator

You can use our Playlist Generator to create a simple playlist using our UI wizard.

Watch this video tutorial to see the setup process in action.

Let us know if you have any further feedback regarding the server playlist.


Related documentation

Server playlist, Generate NDI stream from local files via Server Playlist, Web UI for Server Playlist