June 7, 2022

Manage client sessions using Nimble API

Nimble Streamer allows controlling end-user client sessions using various approaches.

The most capable approach is the Pay-per-view framework, which allows controlling the streaming process on a per-stream and per-user level. You can use your own handler application with custom business logic to gather stats and block unwanted viewers and listeners. Nimble sends data to the PPV handler and acts according to the response.

Another approach is the playback session authorization framework, where Nimble contacts a custom handler app on each streaming session start. Nimble sends data about the connection being established, and in response the handler returns a decision on whether Nimble must allow or deny the new session.

Both solutions assume that Nimble sends requests to a handler and gets back decisions regarding current sessions.

New API

Now we introduce an additional approach that works the opposite way: it uses the Nimble Streamer HTTP API, where you make calls to a Nimble Streamer instance.

You make a direct API call, get the list of active sessions and then make follow-up calls to delete unwanted sessions.

Initial setup

First, follow the Pre-setup steps on the API description page. This is required to enable and use the API; it takes a couple of parameters in the nimble.conf file.

In addition, you can secure your calls with a security token as described in the respective section.

Get list of sessions

Use the /manage/sessions method to get the list of current sessions as described in this docs section. The response is a JSON document with data for each session, including application and stream name, client IP and some other parameters. Each session has an ID which you can use to terminate it.

Having the list of all sessions, you can save it to your own database as well as make further decisions about each connection.
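
For instance, here's what the call might look like with curl. This is only a sketch: the host and port below are placeholders for whatever you configured during the pre-setup, and if you enabled the security token the request must also be signed as described in the API docs.

# Host and port are placeholders: use the address configured during the API pre-setup.
curl http://127.0.0.1:8082/manage/sessions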

Delete specific session

If some clients need to be disconnected, you can call the /manage/sessions/delete method with a list of session IDs that must be disconnected. The full description is available in this section.

HLS and MPEG-DASH clients will get a 403 HTTP response, while clients of other protocols (MPEG-TS/Icecast/RTMP/RTSP/SRT) will simply be disconnected.
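
As an illustration, here's a curl sketch of such a call. The request body shape and the "sessions" field name are assumptions made for this example, so verify the exact format against the docs section linked above.

# SESSION_ID_1 and SESSION_ID_2 come from the /manage/sessions response.
# The JSON body layout is an assumption; check the API reference for the exact field names.
curl -X POST http://127.0.0.1:8082/manage/sessions/delete \
     -H "Content-Type: application/json" \
     -d '{"sessions": ["SESSION_ID_1", "SESSION_ID_2"]}'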


This set of APIs provides a simple way to control Nimble Streamer playback. If you need a more sophisticated way, check the other approaches mentioned at the top of this article.

May 4, 2022

WebRTC publish setup for Nimble Streamer

In memory of Alex Gouaillard
who inspired our team for WebRTC

WebRTC has become a significant part of the live streaming landscape in various use cases and scenarios, from low latency streaming to live chatting. It's a big stack of technologies which are combined in different ways depending on the problem a customer needs to solve.

The Softvelum team got multiple requests from customers regarding WebRTC support, and we finally came to a combination of technology pieces that best fits this task. The streaming tasks our customers specifically wanted us to solve are related to easy ingest of live streams from any browser.

Current WebRTC support in Nimble Streamer covers the following:

  • Ingest of WebRTC live stream into Nimble Streamer.
  • WHIP is used for signaling, see details below.
  • H.264, VP8 and VP9 video and Opus audio input.
  • JavaScript client for publishing video and demo page with sample client.

Signaling is an important part of WebRTC stack because it defines how a client connects to the host or to another client. Nimble Streamer uses WebRTC-HTTP ingestion protocol (WHIP) for signaling. It's a standard with Internet Draft status and it's already used by various WebRTC products. So we decided to use WHIP to be compatible with as many solutions as possible.
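
To illustrate how that signaling works, here is a minimal browser-side sketch of a WHIP publish. This is not our JavaScript library (see section 7 below), just an illustration of the flow; the publishing URL format and its authorization parameters are covered in the setup steps below.

// A rough sketch of WHIP signaling from a browser, not the actual publishing library.
// The URL is a placeholder; see the authorization steps below for its real format.
const whipUrl = 'https://your_host/live/whip?whipauth=login:password';

async function publishWebRTC() {
  // Capture camera and microphone and attach the tracks to a peer connection.
  const media = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const pc = new RTCPeerConnection();
  media.getTracks().forEach(track => pc.addTrack(track, media));

  // WHIP: the SDP offer is sent in a single HTTP POST, the SDP answer comes back in the response body.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const response = await fetch(whipUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/sdp' },
    body: offer.sdp
  });

  // The Location header identifies the created session resource.
  const resource = response.headers.get('Location');
  const answer = await response.text();
  await pc.setRemoteDescription({ type: 'answer', sdp: answer });
  return { pc, resource };
}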

Nimble Streamer uses Pion implementation of WebRTC API. Special thanks to Sean DuBois and all Pion contributors.

Output streams can be generated in all protocols supported by Nimble Streamer, e.g. HLS, SLDP, NDI, SRT, depending on the output codecs and required transcoding; see more details below.

Let's go step by step to set up Nimble Streamer to receive WebRTC ingest.


Notice that currently only the Linux version of Nimble Streamer supports WebRTC. We're working on Windows support.


1. Enable feature in Nimble config

First, you need to add a couple of parameters to nimble.conf to enable the feature. On Linux this file is available as /etc/nimble/nimble.conf. For more details about this file and its parameters, check the Configuration reference page.

Add these parameters:

webrtc_whip_support = true
access_control_allow_headers = content-type
access_control_expose_headers = location

Then restart the Nimble Streamer instance. On Ubuntu it's done with this command:

sudo service nimble restart

Check installation instructions for other platforms.


2. Set up SSL for Nimble

The next step is to enable SSL for your Nimble instance as it's required for secure WHIP signaling.

You can set up your SSL certificate using this general instruction. You may obtain a free Let's Encrypt certificate via Certbot as we've described here.
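
For instance, requesting a certificate with Certbot in standalone mode typically looks like the command below; it assumes that yourhost.com resolves to your server and that port 80 is free while the certificate is being issued. Then point Nimble's SSL settings at the issued certificate and key files as per the instruction above.

sudo certbot certonly --standalone -d yourhost.com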

For testing purposes you may create your own self-signed certificate, but in order for it to work you'll first need to open any https:// page, like https://yourhost.com:port/, and accept the security risk.

For production purposes you need a valid SSL certificate, and your server must be assigned to the domain of that certificate.

Once you've set up SSL for Nimble, you need to test it. Open https://yourhost.com in your browser, where yourhost.com is the host of your Nimble. If you get a 404 error and no warnings from your browser, then SSL is set up properly and is working.


3. Set up WHIP client authorization

WHIP clients use URL parameters to pass their settings to the host.

A WHIP client allows publishing from any browser to the server, so Nimble Streamer requires a user and password to be defined for the application where the publishing will be performed. If you don't set up a user/password credentials pair for the WHIP client and for the target application, your user won't be able to stream.

To set up an application, go to WMSPanel, open the Live Streams Settings menu, choose the designated server, open the Applications tab and create an application with the required user and password.


In our example the app name is "live" and we'll use "whip" as stream name later on.

There are two options for authorizing clients on the server for publication.

3.1 Simple user/pass authorization

In your client publishing URL, use the "whipauth" parameter to send credentials like this:

https://your_host/live/whip?whipauth=login:password

Where "live" is the name of the application with credentials.

Notice that whoever opens your publishing page can see your app name and user/password pair. This means a high chance of leaking credentials for unauthorized publications. So use this authorization approach only for debugging or if you provide separate applications for your trusted publishers.

In any other case, please use the publish control framework.

3.2 Publish control framework

If you need more sophisticated authorization of your publishers based on your business logic, use the Publish control framework. With publish control you can prevent leaking your publishing credentials. You'll also be able to get the status of all published streams and decline any of them at any time.

When the setup is done, the URL will have a "publishsign" parameter:

https://your_host/live/whip?publishsign=aWQ9SURfMSZzaWduPW95Zi9YVHBLM0c3QkQ4SmpwVnF1VHc9PSZpcD0xMjcuMC4wLjE=

Read this setup article to get all details.


4. Codecs support

As was mentioned earlier, Nimble Streamer supports H.264, VP8 and VP9 video with Opus audio in WebRTC ingest. So if your client uses these codecs, Nimble will be able to process them and produce proper output.

If you need your users to publish from their browsers with only a certain video codec, you can indicate that by setting the "videocodecs" parameter. E.g. to make the server accept only H.264, set it like this:

https://your_host/live/whip?whipauth=login:password&videocodecs=h264

If you are ready to accept either H.264 or VP8, use a comma in that parameter's value:

https://your_host/live/whip?whipauth=login:password&videocodecs=h264,vp8


5. Generating output

Once the content is ingested, Nimble Streamer provides the following options for further processing.

5.1 Direct output via limited protocols

If the ingest has H.264 and Opus codecs, Nimble Streamer will be able to generate H.264/Opus output via MPEG2TS-based protocols (MPEG-TS over UDP multicast, SRT and RIST), which can be played via VLC or ffmpeg.
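
For example, if you define an MPEG-TS over UDP multicast output, playback in VLC could be started as shown below; the multicast address and port are placeholders that must match your own UDP streaming settings.

vlc udp://@239.255.1.1:1234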

5.2 Full-featured transcoded output

All input codecs - VP8, VP9, H.264 and Opus - can be transcoded into any other codecs. That includes H.264/AAC output, the de-facto standard for Internet delivery, as well as HEVC (H.265) video.

Use Nimble Live Transcoder to transform the input with a variety of decoders and encoders, using software libraries as well as NVENC and QuickSync hardware acceleration.

Watch Transcoder video tutorials for more setup examples. For those scenarios, the WebRTC ingest will be just another input stream. Here is a simple example of a transcoder scenario with H.264 and AAC output.






5.3 Transmuxing and DVR

You can then use any of the live streaming output protocols and options, like HLS, SRT, NDI, SLDP, etc. The Nimble Streamer transmuxing engine will provide any combination you need.

You can also record the generated content using Nimble Streamer DVR and then provide the playback using HLS and MPEG-DASH protocols.

5.4 Notice on packet loss

Notice that if a publishing client and your server are located far from each other or need to communicate via bad quality networks, then you should expect some video and audio frame loss. Protocols and players handle this type of frame loss differently. At the moment Nimble Streamer does not try to add fake video frames or audio silence to compensate for that.

6. Network-related and general parameters

By default, Nimble Streamer works in ice-lite mode.

If the Nimble server instance runs on a host with a public IP address, then no additional configuration is needed.

If a server instance runs on Amazon EC2 (where the public IP is not assigned directly to the instance's network interface), you'll need to create an additional config file at /etc/nimble/whip_input.json and add the following JSON there:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host"
}

where "a.b.c.d" is a public address assigned to AWS server instance. If it has multiple IP addresses, just add them in the same parameters separating by comma like this:

{
  "NAT1To1IPs":"a.b.c.d,w.x.y.z",
  "NAT1To1CandidateType":"host"
}

This file is processed by Nimble at the beginning of each new publishing session, so you can change it without restarting the server.

To define a port range, you can also add these parameters:

{
  "PortMin":1000,
  "PortMax":40000
}

In this case the ICE candidates will be selected only from the port range 1000 to 40000.

If you use the network parameters mentioned above, the combined JSON will be:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host",
  "PortMin":1000,
  "PortMax":40000
}

In addition, you can use this config file to define supported codecs at the server level via the SupportedVideoCodecs parameter, instead of defining them per session:

{
  "NAT1To1IPs":"a.b.c.d",
  "NAT1To1CandidateType":"host",
  "PortMin":1000,
  "PortMax":40000, 
  "SupportedVideoCodecs":"vp8,h264" 
}

The JSON format requires this exact syntax: if you add the parameters in separate blocks or omit the commas between them, Nimble will not process the config.


7. Browser publishing library and demo page

We've created a JavaScript library which you can use for adding publishing capabilities to your web pages. Use its code in your projects or take it as is and embed it into your pages to connect your users to Nimble Streamer.

There's also a WebRTC publication demo page which uses that library to provide a simple way to check your server setup. Just enter a WHIP URL with the server address and publishing credentials, then click Publish. You will then be able to use your camera and microphone for streaming, and will see detailed logs of what's happening.


8. Video tutorials


Watch this brief tutorial demonstrating the setup process.



Also watch the setup process for taking WebRTC ingest and producing NDI output from it.


The following tutorial shows how to set up Nimble Streamer to receive content via WebRTC and then send it as UDP multicast into the local network without transcoding.




Our team keeps improving WebRTC support in Nimble Streamer, so stay tuned for updates.

March 17, 2022

Quick URL Import

We’d like to introduce our new improvement to WMSPanel called Quick URL Import.

The Quick URL Import button in the MPEGTS IN and UDP Streaming tabs of the Live Streams Settings menu helps you instantly transfer a publishing or ingest URL from your stream provider to Nimble. Feel free to use it with UDP, HTTP, HLS, SRT or RIST protocols.

The standard URI is accepted as:
protocol://HOST:PORT/PATH?PARAM1=VALUE1&PARAM2=VALUE2&...

This will save you time on editing settings if you have a stream URL with parameters encoded in it.

Quick import will recognize the stream protocol and additional parameters in the URL, and the accepted parameters will automatically be filled in as options in the corresponding fields.

For SRT, you may even use the streamid format proposed by Haivision. The RIST URL syntax is supported as described on this documentation page.
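
As an illustration, here's an example SRT URL which Quick URL Import could parse; the host, port, latency value and streamid are placeholders, with the streamid following Haivision's access control syntax:

srt://example.com:4200?streamid=#!::r=live/stream,m=request&latency=2000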


Just find the green Quick URL Import button on the MPEGTS IN or UDP Streaming tab.


Then fill in the URL. Depending on the protocol, a new window will appear after the Add setting button is pressed. As the URL is parsed, the parameters will be filled into the corresponding fields.



Add more parameters, like the stream name, to complete the setting and you're good to go with streaming.




January 31, 2022

HEVC support for Widevine and PlayReady DRM in Nimble Streamer

Nimble Streamer DRM supports a wide range of DRM encryption technologies and key management platforms.

Recently we've added support for the H.265/HEVC codec for these encryption technologies:


They work for all major scenarios:

So you can deliver your unprotected stream into Nimble Streamer, convert it into MPEG-DASH or HLS fMP4, record it into DVR if needed, encrypt it and then deliver it to your viewers for playback. And if you have any VOD files, you can define transmuxing rules and then set up DRM so they can be played via DRM-powered players.

On the viewers' side, Larix Player for Android can play MPEG-DASH streams via the embedded ExoPlayer in all streaming modes and decode Widevine and PlayReady streams. You can download it on Google Play and visit the Player website to learn more.

Feel free to try Nimble Streamer DRM in action and let us know of any questions.


Related documentation

Nimble Streamer DRM, Nimble Addenda package, Larix Player for Android


January 26, 2022

CEA-608 support in MPEG-DASH streams

Nimble Streamer has wide support for MPEG-DASH live streaming, including subtitles processing.

When CEA-608 subtitles are integrated into a video track, most players require those subtitles to be declared in the manifest, otherwise a viewer cannot select them at all.

This tag is used in a manifest for the declaration:

<Accessibility schemeIdUri="urn:scte:dash:cc:cea-608:2015" value="CC1=lang">

where value contains the number of the track with subtitles and their language, e.g. "CC1=eng".

This option can be set in server settings under the Nimble Streamer / Live Streams Settings menu, in the Global tab, in the CEA-608 settings field.

This setting is applied to live and DVR output streams.

The format is as follows:
<app1>[/<stream>]:N=<lang>[;N=<lang>] <app2>[/<stream>]:N=<lang>[;N=<lang>]
Each application entry is separated by a space. Here's an example where all streams of the "live_app" application will have the first track with Russian subtitles:


The setting is simply "live_app:1=rus". This is what you'll see in a manifest:
<Accessibility schemeIdUri="urn:scte:dash:cc:cea-608:2015" value="CC1=rus">
This is how you'll see it in your player:




You may combine settings for multiple apps and streams, e.g.
live_app:1=eng;2=rus live_app2/stream1:1=eng;2=fra
will set two tracks for all streams in the "live_app" application and will also define two tracks for the single "live_app2/stream1" stream.

If you want to apply a setting to the entire server, just skip the "app:" part, e.g. set the parameter to "1=eng".


Related documentation 

Nimble Streamer MPEG-DASH features



January 12, 2022

Server playlist support for live streams input

The Server playlist feature set for Nimble Streamer was introduced to provide the ability to create output live streams from a set of VOD files.

Now Server playlist has got a couple more features to improve it:

  • take live streams as input for playlist entries;
  • define default streams in case the current playlist entry is not available.

Notice that the new features do not change the playlist's basic principles and mechanics; they add new parameters as described below. So before reading about the updates, please get familiar with these materials:

Let's see what we've got.


Live streams input

You can specify any available live stream as a playlist entry. So no matter where your live stream is coming from - RTMP, SRT input or a stream from Nimble Transcoder - you can use it as your source.

You need to prepare your content for playlist input and additionally transcode it afterwards, as described in section 2 "Preparing content" of the Server playlist spec.

The semantics of live stream input are similar to VOD input: it's inserted among other entries in the "Streams" block with the "Type" parameter set to "live", as shown below.

{
  "SyncInterval": 5000,
  "Tasks":
  [
    {
      "Stream": "live/playlist",
      "Blocks": [
        {
          "Id":"1", "Start":"2022-01-17 08:00:00",
          "Streams":[
            {
              "Type":"vod", "Source":"/var/mp4/sample.mp4", "Duration":20000
            },
            {
              "Type":"live", "Source":"live/stream", "Duration":600000
            }
          ]
        }
      ]
    }
  ]
}

The following parameters can be used for a live stream entry:

  • Source - input stream name, defined as "application_name/stream_name" as seen in output streams at Nimble Streamer live streams page.
  • Duration - the duration of the current stream.
  • TotalDuration is also supported, but it means the same as Duration. If both parameters are set, the one with the smaller value will be used.


Default streams

If the live stream which is supposed to be playing now is unavailable for some reason, you may specify a default stream which will be played instead. The DefaultStream parameter can be defined at the block level as shown below.

{
  "SyncInterval": 5000,
  "Tasks":
  [
    {
      "Stream": "live/playlist",
      "Blocks": [
        {
          "Id":"1", "Start":"2022-01-17 08:00:00",
          "DefaultStream": {
            "Type":"live",
            "Source": "live/default"
          },

          "Streams":[
            {
              "Type":"live", "Source":"live/stream"
            },
            {
              "Type":"vod", "Source":"/var/mp4/sample.mp4", "Duration":20000
            }
          ]
        }
      ]
    }
  ]
}

The default stream's "Type" parameter may be either "live" or "vod". The "Source" defines where the content is taken from; see the server playlist spec for details.

For VOD mode, it also supports the "AudioStreamId" and "VideoStreamId" parameters to select the respective tracks if a VOD file has several tracks.


Playlist Generator

You can use our Playlist Generator to create a simple playlist using our UI wizard.

Watch this video tutorial to see the setup process in action.




Let us know if you have any further feedback regarding the server playlist.


Related documentation

Server playlist, Generate NDI stream from local files via Server Playlist, Web UI for Server Playlist