December 30, 2020

Recap of 2020: NDI, Low Latency HLS, DRM, SRT, RIST, Larix Broadcaster and more

The year 2020 is over and our team is glad to wish you a Happy New Year!

Despite everything that happened this year, our team kept moving forward and improving our products.

So let us briefly show you what we accomplished this year. You can find more details and minor updates in our earlier newsletters and social media, so here we'll only point to the significant ones.


Nimble Streamer

Low Latency HLS

Apple Low Latency HLS is the next generation of Apple's HTTP Live Streaming protocol created to achieve latency of around three seconds. Nimble Streamer now supports LL HLS and it generates the output stream according to the latest spec.
Read the LL HLS setup and usage article, which includes a video tutorial. Currently the output plays only on Apple platforms and via THEOPlayer. You can watch our sample stream on their LL HLS demo page.

DRM on board

Nimble Streamer now has full support for DRM. Our customers can protect their live streams, DVR and VOD output by encrypting them with Widevine, Playready and FairPlay. Key management can be done via BuyDRM, EZDRM, Pallycon and Verimatrix. Visit the DRM feature set page to learn more about setup and usage.
DRM is available as part of our Addenda premium package, which covers DRM, Advertizer and a set of other premium features.

More protocols

NDI support was added to Nimble Streamer. It allows receiving NDI input and transcoding it into any supported output protocol, as well as turning any input stream into NDI output. This is extremely useful for live production teams. Read the NDI setup article to learn more and watch the video tutorial to see it in action.

The SRT protocol feature set had a breakthrough year among our customers, and we saw it gaining huge momentum across the industry. Our team also improved its implementation in Nimble Streamer this year:

  • SRT Publisher Assistance Security Set (SRT PASSet) is our new security and management framework for SRT. It allows processing the incoming streamid, performing per-server, per-application and per-stream authentication, managing published streams and more. Read the framework overview in this article to see what it can do for you.
  • In addition, we've added SRT playback protection to the Paywall feature set and added playback stats to WMSPanel.
Read more about how Softvelum products implement SRT.

RIST is another technology we've added this year. It's a new protocol for reliable streaming over UDP; you can read more about its advantages on the RIST forum website. Nimble Streamer supports RIST in Push, Pull and Listen modes.

SLDP, our low latency protocol, was another streaming technology that we improved. Synchronized playback on multiple devices became available in all SLDP implementations: you can stream simultaneously from Nimble Streamer to SLDP Player on an HTML5 web page, Android and iOS for a better user experience. View the video demonstration of this feature in action; you'll love it.

If you build large infrastructures based on Nimble Streamer and WMSPanel, you will also like slice-wide permissions, which let you assign any group of non-admin users to any group of servers and give them a specific set of permissions for controlling Nimble instances.

Improving Transcoder 

Nimble Live Transcoder was also significantly improved this year with a number of sophisticated features:


More media

These articles mentioning the usage of Nimble Streamer in complex scenarios may be of interest:

Side note: we've started producing video tutorials on various subjects, e.g. Nimble on Amazon EC2: installation and SRT setup. Check our YouTube channel to see more and subscribe if you want to get new videos as soon as they arrive.


Larix Broadcaster and mobile streaming

This year Larix Broadcaster had a huge increase in usage.

Larix was mentioned in a few articles on SVG News:

You can find more links, including industry leaders mentioning Larix, on our documentation reference.

This year we've put significant effort into the quality control of our mobile solutions. Softvelum apps now pass through a series of tests before being released into production. We've published a basic description of our connectivity testing to demonstrate the most important aspect of it.

If you subscribe to any of our mobile SDKs, you can be sure you get a properly tested solution.

Before we move on to the features from the past year, we'd like to share an awesome upcoming update.
  • Talkback provides an audio return feed from the studio back to the mobile streamer. You can get feedback via SRT (all receiver modes), RTMP, SLDP and Icecast.
  • Besides Talkback, Larix Broadcaster will support publishing via SRT Listen and SRT Rendezvous modes.
  • To use these features right now, join the Beta program on Google Play, and use this link to install it via TestFlight on iOS. Let us know how it works for you.

As for major Larix Broadcaster functionality, we've added plenty of major and minor updates, including broader RTMP authentication compatibility with servers and services, audio-only mode, streaming pause, RIST streaming and more. We also introduced Larix Grove, a simple format that allows distributing streaming setup details across mobile devices.

Larix Player is the new name of our playback solution formerly known as "SLDP Player", available on Android, iOS, Android TV and tvOS. It can play SRT in Caller, Listen and Rendezvous modes, as well as RTMP, SLDP, Icecast, HLS and MPEG-DASH.

The SDK release notes have a list of all significant features added this year.

Also, we've created a mobile products playlist on YouTube with our video tutorials. Check the SRT to OBS from Larix tutorial as an example.


That's it for now.

Our team wishes you a Happy New Year and we'll see you in 2021!


Follow us any way you like for future updates: Twitter, Facebook, Telegram, LinkedIn, YouTube and Reddit.

November 27, 2020

VOD DRM support in Nimble Streamer

We've previously introduced DRM support in Nimble Streamer which included Widevine, Playready and FairPlay encryption support for MPEG-DASH and HLS streams in live and DVR mode.

Nimble Streamer now supports DRM for VOD, allowing you to encrypt MPEG-DASH and HLS streams generated from VOD files. VOD DRM covers CENC-based encryption for MPEG-DASH and fMP4 HLS.

When a viewer connects to an MPEG-DASH or fMP4 HLS VOD stream within an application protected by DRM, the output will be encrypted with the respective DRM engine.

  1. For VOD setup please refer to this digest page. Check MPEG-DASH VOD setup and HLS VOD setup articles.
  2. For DRM setup details visit DRM feature set page.
  3. If you use FairPlay, you need to use the fmp4 container in the playlist name, e.g. https://servername/stream/name/playlist_fmp4.m3u8
  4. You also need to explicitly forbid access to VOD files via progressive download as described in this article.
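As a quick sanity check for point 3, you can verify that the chunklist you get back actually references the fMP4 container. This is a minimal sketch; the playlist text below is a made-up sample, and in practice you would fetch it from your playlist_fmp4.m3u8 URL.

```python
# Sample chunklist standing in for one fetched from
# https://servername/stream/name/playlist_fmp4.m3u8 (hypothetical host/stream).
chunklist = """#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MAP:URI="video.fmp4?nimblesessionid=1"
#EXTINF:6.000,
v_80_0_1.fmp4?nimblesessionid=1
"""

def uses_fmp4(playlist_text: str) -> bool:
    """FairPlay requires the fMP4 container, so segments must be .fmp4 files."""
    return ".fmp4" in playlist_text

print("fMP4 container detected" if uses_fmp4(chunklist)
      else "WARNING: no fMP4 segments, FairPlay will not work")
```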


Watch this video tutorial showing EZDRM setup for live, DVR and VOD content protection.




Contact us if you have any questions regarding this feature.





November 24, 2020

Apple Low Latency HLS setup in Nimble Streamer

Apple has introduced Low Latency HLS spec back in 2019 and has officially released the final version of the spec in the fall of 2020, along with making the playback available on iOS and Mac.

Nimble Streamer now has full support for Apple LL HLS in addition to "legacy" RFC8216 HLS. You can read about major advantages on LL HLS digest page. This article describes the setup process of Nimble Streamer.

We assume you already have Nimble Streamer installed on one of your servers, or upgraded to the latest version, and that you have full access to it.


You can watch a short version of this article in our video tutorial below.

However, we highly recommend reading this article to use this feature efficiently.


LL HLS uses HTTP/2 over SSL as its transport protocol, so you need to enable it before performing any further setup.


1. HTTP/2 setup

1.1 First, install the latest Nimble Streamer or upgrade your existing instance. 

1.2 Now set up the SSL certificate for Nimble Streamer. This is an important step, so please make sure SSL is working before moving forward.

1.3 Add the ssl_http2_enabled=true parameter to nimble.conf and restart Nimble Streamer. Read the parameters reference to find out more about the config location and restart procedure.
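The step above amounts to a single line in the config; the /etc/nimble/nimble.conf path shown below is the Ubuntu default and may differ on your platform, so check the parameters reference.

```
# /etc/nimble/nimble.conf (path may differ per platform)
ssl_http2_enabled=true
```

After saving the change, restart Nimble Streamer so the parameter takes effect.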
A note on the LiteSpeed HPACK library: Nimble Streamer uses the LS-HPACK library for encoding and decoding HTTP headers with the HPACK compression mechanism. The Softvelum team would like to thank the LiteSpeed team; we appreciate their effort and technical expertise.
1.4 After that you'll be able to use HTTP/2 to reach live streams with HLS and MPEG-DASH protocols enabled. You can check whether Nimble Streamer delivers HTTP/2 by looking at the access log; read this article to get familiar with logging. E.g. you can enable logging with the log_access=file parameter. Once you add it and restart Nimble, you'll be able to view the log; on Ubuntu it's located at /var/log/nimble/access.log by default.
Now when you request your regular HLS live stream via https:// with curl or an HTTP/2-capable player, you'll get this kind of record in the access log:
Dec 24 17:43:09 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/playlist.m3u8 HTTP/2" 200 84 1114 372 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
You can see HTTP/2 there, which means it's working. Otherwise the record will show HTTP/1.1, which means you need to check what's wrong.
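To automate this check, you can extract the HTTP version from an access-log record. The sketch below parses the sample line shown above; on a live server you could also probe the playlist URL directly with curl -skI --http2, assuming your curl build supports HTTP/2.

```python
import re

# Sample access-log record (shortened) as produced with log_access=file enabled
line = ('Dec 24 17:43:09 nimble[5500]: [nimble-access.log] 192.18.1.2 - - '
        '"GET /livell/stream/playlist.m3u8 HTTP/2" 200 84 1114 372')

def http_version(log_line: str):
    """Return the HTTP version from the quoted request part, or None."""
    m = re.search(r'"(?:GET|POST|HEAD) [^"]+ HTTP/([\d.]+)"', log_line)
    return m.group(1) if m else None

version = http_version(line)
print("HTTP/2 OK" if version == "2" else f"HTTP/{version} - check your HTTP/2 setup")
```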

If a client tries to access an LL-HLS stream via HTTP/1.1, or if HTTP/2 is not properly set up, the player will fall back to legacy HLS and will not get any of the advantages of LL-HLS.

2. Live streaming setup


Now you need to set up transmuxing settings via the WMSPanel web service. If you are not familiar with live streaming setup in Nimble Streamer, please refer to the live streaming digest page or the respective input protocol pages, such as RTMP streaming.

You may use any RTMP, RTSP, SRT, RIST or MPEG-TS input stream as a source for LL HLS.
Please make sure you have a correctly set up live stream (e.g. regular-latency HLS, RTMP or SLDP) before trying to use LL HLS.

Once you have a live stream set up in WMSPanel, go to the Nimble Streamer top menu and select Live streams settings. You will see the Global settings tab for the selected server. You may create application-specific settings as well if your streaming scenario needs them.


Currently, Nimble Streamer supports all three container types available for HLS; you can see their respective checkboxes on the screenshot above. For Low Latency HLS these types have the following meaning:
  • HLS - this type is recommended for audio-only LL streams. It's optimized for audio delivery and has a reduced chunk size, which saves significant bandwidth for audio streams. ID3 tags are also inserted in each audio part.
  • fMP4 (CMAF) - fragmented MP4 for video+audio, audio-only and video-only modes. We highly recommend this container for video+audio and video-only cases, as it allows utilizing all the advantages of LL HLS.
  • HLS (MPEGTS) - this type supports video+audio, audio-only and video-only modes; however, we do not recommend it for LL HLS. It brings a lot of overhead, which diminishes all the advantages and increases the latency. It also does not support HEVC, unlike fMP4. We added support for this type as a workaround for a bug in iOS 12, which implemented a previous LL HLS spec. So even though you can use it with LL, Apple does not recommend it and neither does the Nimble Streamer team.
Once you select any of those containers, WMSPanel will show the Enable Apple's Low Latency HLS checkbox, which you need to select. You will then see the HLS part duration edit box to define the parts' duration in milliseconds.
Please consider the following when choosing part duration:
  • We recommend a part duration of 2000ms with a chunk duration of 6 seconds. The expected latency will be about 6 seconds in this case. This combination provides optimal bandwidth and latency as well as a good playback buffer.
  • The minimum recommended part duration is 1000ms. The expected latency will be about 4-5 seconds in this case. This duration gives better latency but a smaller playback buffer, with higher bandwidth consumption and server resource usage.
  • The smallest value allowed by the web UI is 500ms, as it's the minimum that makes sense for low latency delivery. The latency will be around 2 seconds in this case. This duration gives a very small playback buffer with severe bandwidth and server resource consumption. Use it only if you have a managed network and latency is crucial for your case.
  • A part size smaller than 500ms does not make any practical sense because of the overhead in the delivery and processing chain. Even though Nimble is able to produce smaller parts, we don't want our customers to deal with potential bottlenecks in other areas.
  • You may align the key frame interval with the chunk size. E.g. for a 6-second chunk and 1000ms part duration, the valid key frame intervals would be 1, 2 and 3 seconds. For the same chunk size and a part duration of 2000ms, the valid key frame intervals are 1, 2, 3 and 6 seconds.
  • The key frame interval can be set on your encoder side; Nimble Live Transcoder allows setting it as well.

Once LL HLS is enabled, you need to restart the input stream so Nimble Streamer can start producing the LL HLS output stream.

3. Workflow and playlists


Now that the setup is done, you can use a player to consume the stream via the usual playback URL. The main playlist will contain the proper chunklists, whose content follows the LL-HLS spec. Here is an example of such a chunklist (obtained via video.m3u8):
   #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-INDEPENDENT-SEGMENTS
    #EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
    #EXT-X-TARGETDURATION:7
    #EXT-X-MEDIA-SEQUENCE:1
    #EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
    #EXT-X-PART-INF:PART-TARGET=0.512
    #EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:02.609Z
    #EXTINF:5.995,
    a_6_6016_1.fmp4?nimblesessionid=1
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_0.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_1.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_2.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_3.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_4.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_5.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_6.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_7.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_8.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_9.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_10.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.363,URI="a_6_12011_2_11.fmp4?nimblesessionid=1"
    #EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:08.604Z
    #EXTINF:5.995,
    a_6_12011_2.fmp4?nimblesessionid=1
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_0.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_1.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_2.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_3.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_4.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_5.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_6.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_7.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_8.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_9.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_10.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.362,URI="a_6_18006_3_11.fmp4?nimblesessionid=1"
    #EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:14.599Z
    #EXTINF:5.994,
    a_6_18006_3.fmp4?nimblesessionid=1
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_1.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_2.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_3.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_4.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_5.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_6.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_7.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_8.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_9.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_10.fmp4?nimblesessionid=1"
    #EXT-X-PART:DURATION=0.384,URI="a_6_24000_4_11.fmp4?nimblesessionid=1"
    #EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:20.593Z
    #EXTINF:6.016,
    a_6_24000_4.fmp4?nimblesessionid=1 
    #EXT-X-PRELOAD-HINT:TYPE=PART,URI="a_6_30000_5_0.fmp4?nimblesessionid=1"

Parts in the chunklist. Compared to regular HLS, you see a lot of lines representing parts like this:
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
The full chunk which contains these parts is described after all the parts' lines:
a_6_24000_4.fmp4?nimblesessionid=1
All parts within a chunk are numbered starting from zero, so "a_6_18006_3_0.fmp4" means it's the first part of chunk number 3.
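That naming convention can be captured with a tiny parser. Note that the pattern below is inferred from the sample playlist above rather than from a documented spec, so treat it as illustrative.

```python
import re

# Matches part names like "a_6_18006_3_0.fmp4": the last two numbers are
# the chunk number and the zero-based part index (inferred from the samples).
PART_RE = re.compile(r'^[av]_\d+_\d+_(\d+)_(\d+)\.fmp4$')

def parse_part(name: str):
    """Return (chunk, part) for a part file name, or None if it doesn't match."""
    m = PART_RE.match(name)
    return (int(m.group(1)), int(m.group(2))) if m else None

print(parse_part("a_6_18006_3_0.fmp4"))  # → (3, 0): first part of chunk 3
```

Full chunk names like "a_6_24000_4.fmp4" have one number fewer and therefore don't match, which makes the same check usable for telling parts and chunks apart.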

Part length. This attribute declares the designated size of upcoming parts:
#EXT-X-PART-INF:PART-TARGET=0.512
In this example it's 512 milliseconds.

Can block reload. Check this line:
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
The "CAN-BLOCK-RELOAD" attribute declares that the media server allows holding a playlist request.

Hold playlist request. LL-HLS allows asking the server to hold the playlist response until a specific chunk and/or part of the stream is available.
So a player may request a part which will only become available within a few seconds; Nimble Streamer will check whether that part is available, and once the requested part is ready, Nimble will return the playlist.

Check this request example:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59&_HLS_part=5"
The _HLS_msn=59 and _HLS_part=5 parameters indicate that the server must hold the request until Nimble Streamer has part number 5 of chunk number 59 (or later), and only then return the playlist. You can also use the _HLS_msn=59 parameter alone; in that case the playlist will be sent out only once the full chunk is available.
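The blocking-reload logic can be sketched as follows: given the last chunk and part a player has seen, compute the next hold request. The base URL and the parts-per-chunk value here are just example assumptions.

```python
from urllib.parse import urlencode

def next_hold_url(base: str, msn: int, last_part: int, parts_per_chunk: int) -> str:
    """Build the next blocking playlist request after seeing part `last_part`
    of chunk `msn`, rolling over to the next chunk when the current one ends."""
    part = last_part + 1
    if part >= parts_per_chunk:      # current chunk is complete
        msn, part = msn + 1, 0
    return base + "?" + urlencode({"_HLS_msn": msn, "_HLS_part": part})

# After part 4 of chunk 59, ask the server to hold until part 5 exists:
print(next_hold_url("https://localhost:8443/livell/stream/chunks.m3u8", 59, 4, 12))
# → .../chunks.m3u8?_HLS_msn=59&_HLS_part=5
```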

The resulting chunklist will look like this:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:55
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:26.599Z
#EXTINF:5.994,
a_6_330006_55.fmp4?nimblesessionid=1
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:32.593Z
#EXTINF:6.016,
a_6_336000_56.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_342016_57_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:38.609Z
#EXTINF:5.995,
a_6_342016_57.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_348011_58_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:44.604Z
#EXTINF:5.995,
a_6_348011_58.fmp4?nimblesessionid=1

#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_5.fmp4?nimblesessionid=1"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="a_6_354006_59_6.fmp4?nimblesessionid=1"

You can see it ends with part a_6_354006_59_5.fmp4 - that's part number 5 of the upcoming chunk 59. The full chunk will be available only a few seconds later, but the player can already start playback; this helps a lot with reducing the latency.

Preload hint. Check this line:
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="a_6_354006_59_6.fmp4?nimblesessionid=1"
Here Nimble declares that part 6 of chunk 59 is being created at this very moment. The player may request this part and will receive it as soon as it's available.

4. Playback


Low Latency HLS is now supported on all Apple devices capable of HLS playback, so you can use an iPhone with iOS 14 to play it right in the browser. We also support LL HLS in Larix Player for iOS, which uses the platform playback component.

Also, THEOPlayer has a test page with their implementation of LL HLS which works in various browsers on other platforms. You can read their configuration guide to see how to adapt THEOPlayer to your use case.

Other platforms do not have LL HLS support yet. Once we learn of such solutions, we'll mention them here.

5. Known issues and troubleshooting


5.1. Chunk duration and key frame alignment
If playback has issues, make sure that the actual chunk duration matches the duration you specified in your stream settings.

Let's check a couple of examples.
In this example we set up a chunk duration of 6 seconds with a part duration of 1000ms, so you see 6 parts per chunk, and every chunk has the same duration:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="video.fmp4?nimblesessionid=249"
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:1296
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=3
#EXT-X-PART-INF:PART-TARGET=1
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:14.406Z
#EXTINF:6,
v_80_7777992_1296.fmp4?nimblesessionid=249
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:20.406Z
#EXTINF:6,
v_80_7783992_1297.fmp4?nimblesessionid=249
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:26.406Z
#EXTINF:6,
v_80_7789992_1298.fmp4?nimblesessionid=249
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:32.406Z
#EXTINF:6,
v_80_7795992_1299.fmp4?nimblesessionid=249
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_0.fmp4?nimblesessionid=249",INDEPENDENT=YES
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_1.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_2.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_3.fmp4?nimblesessionid=249",INDEPENDENT=YES
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_4.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7801992_1300_5.fmp4?nimblesessionid=249"
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:38.406Z
#EXTINF:6,
v_80_7801992_1300.fmp4?nimblesessionid=249
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_0.fmp4?nimblesessionid=249",INDEPENDENT=YES
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_1.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_2.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_3.fmp4?nimblesessionid=249",INDEPENDENT=YES
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_4.fmp4?nimblesessionid=249"
#EXT-X-PART:DURATION=1,URI="v_80_7807992_1301_5.fmp4?nimblesessionid=249"
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:39:44.406Z
#EXTINF:6,
v_80_7807992_1301.fmp4?nimblesessionid=249
#EXT-X-PART:DURATION=1,URI="v_80_7813992_1302_0.fmp4?nimblesessionid=249",INDEPENDENT=YES
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="v_80_7813992_1302_1.fmp4?nimblesessionid=249"

Now check your chunklist. Get your playlist via a URL like https://yourdomain.com/test/stream/playlist.m3u8 and then find the chunklist that corresponds to the video, like https://yourdomain.com/test/stream/video.m3u8. You can do that via various tools like curl.

Check the following real-life example of LLHLS stream.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="video.fmp4?nimblesessionid=57"
#EXT-X-TARGETDURATION:11
#EXT-X-MEDIA-SEQUENCE:9873
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=3.003
#EXT-X-PART-INF:PART-TARGET=1.001
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:28:36.050Z
#EXTINF:5.338,
v_26_60085735_9873.fmp4?nimblesessionid=57
#EXT-X-PART:DURATION=1.001,URI="v_26_60091073_9874_0.fmp4?nimblesessionid=57",INDEPENDENT=YES
#EXT-X-PART:DURATION=1.001,URI="v_26_60091073_9874_1.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60091073_9874_2.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60091073_9874_3.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=0.333,URI="v_26_60091073_9874_4.fmp4?nimblesessionid=57"
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:28:41.223Z
#EXTINF:4.338,
v_26_60091073_9874.fmp4?nimblesessionid=57
#EXT-X-PART:DURATION=1.001,URI="v_26_60095411_9875_0.fmp4?nimblesessionid=57",INDEPENDENT=YES
#EXT-X-PART:DURATION=1.001,URI="v_26_60095411_9875_1.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60095411_9875_2.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60095411_9875_3.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60095411_9875_4.fmp4?nimblesessionid=57"
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:28:45.740Z
#EXTINF:5.005,
v_26_60095411_9875.fmp4?nimblesessionid=57
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_0.fmp4?nimblesessionid=57",INDEPENDENT=YES
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_1.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_2.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_3.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_4.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_5.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_6.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_7.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_8.fmp4?nimblesessionid=57"
#EXT-X-PART:DURATION=1.001,URI="v_26_60100416_9876_9.fmp4?nimblesessionid=57"
#EXT-X-PROGRAM-DATE-TIME:2020-12-14T01:28:50.745Z
#EXTINF:10.01,
v_26_60100416_9876.fmp4?nimblesessionid=57
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="v_26_60110426_9877_0.fmp4?nimblesessionid=57"
It has chunks of various durations which differ from the duration requested during setup (you can see 4.338, 5.005 and even 10.01). This may cause some players to work incorrectly.
So your original stream source needs to have a proper key frame setup in order to produce playlists for smooth playback.
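A quick way to spot this problem is to scan the chunklist's #EXTINF durations and flag outliers. A minimal sketch follows; the 5-second target and 0.5-second tolerance are arbitrary example values.

```python
import re

def uneven_chunks(playlist_text: str, target: float, tolerance: float = 0.5):
    """Return #EXTINF durations that deviate from the configured chunk duration."""
    durations = [float(d) for d in re.findall(r'#EXTINF:([\d.]+),', playlist_text)]
    return [d for d in durations if abs(d - target) > tolerance]

# Durations taken from the problematic chunklist above, checked against a 5s target:
sample = "#EXTINF:5.338,\n#EXTINF:4.338,\n#EXTINF:5.005,\n#EXTINF:10.01,\n"
print(uneven_chunks(sample, target=5.0))  # → [4.338, 10.01]
```

If this returns a non-empty list for your stream, review the key frame interval on the encoder side before blaming the player.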

As you can see, LLHLS can be capricious when it comes to key frame alignment, so you have to make sure your source stream is properly encoded and delivered prior to re-packaging into LLHLS. Reliable delivery protocols like SRT are excellent for the vast majority of streaming scenarios, but this case may require additional effort prior to transmuxing.
If you want to produce reliable output for last-mile delivery regardless of your source, consider the SLDP low latency protocol. We designed and implemented it with various real-life cases in mind: whatever you bring into Nimble Streamer will be properly played in SLDP-powered players.

5.2. Interleaving compensation
If you have a video+audio stream, you may have playback issues due to interleaving, as described in this article. This kind of issue becomes even more severe in low latency playback. To fix it, try enabling interleaving compensation with Min. delay set to zero, as shown in the image below.



Related materials


Watch the Converting NDI to Apple Low Latency HLS video which demonstrates how to get NDI source input and produce LLHLS output with proper latency.


Feel free to try Nimble Streamer with Low Latency HLS and let us know if you have any questions.

November 22, 2020

BuyDRM KeyOS support in Nimble Streamer

The Softvelum team is glad to announce that the Nimble Streamer DRM framework now supports the BuyDRM KeyOS DRM solution for protecting content using Widevine, FairPlay and Playready encryption technologies.


In order to use KeyOS, please follow the setup instructions on the Nimble DRM page and let us know if you have any questions.

November 20, 2020

Nimble Streamer on Amazon EC2: installation and SRT setup

We've released a new video tutorial about Nimble Streamer on Amazon EC2.

In this video we do the following:

  • create Amazon EC2 instance;
  • install Nimble Streamer there;
  • set it up to receive SRT stream;
  • run vMix to publish SRT stream;
  • test the playback via HLS, MPEG-DASH and SLDP.

You can watch it now:


Feel free to subscribe to Softvelum YouTube channel to watch more videos as they come out.

November 18, 2020

Nimble Live Transcoder API

The Nimble Streamer Live Transcoder add-on has a convenient web UI for controlling transcoding scenarios and pipelines from the browser. However, many customers need even more flexibility and would like an API to control server behavior.

The WMSPanel control API now has a Transcoder scenarios section which provides a list of methods for controlling Live Transcoder behavior via API calls.

The overall approach for Transcoder API usage is as follows.

  • Use the web UI to create some general-purpose scenarios which you consider templates. You may use your existing scenarios as well. The rest is done via API calls.
  • Clone a designated scenario on the same server. This clone is what you will change via further API calls.
  • Apply any scenario to multiple servers, just as you would via the web UI, by updating it and setting the servers_to_apply parameter in your call.
  • Besides updates, each scenario can be paused, resumed and even removed.
  • Each scenario has its video and audio pipelines (you can see them via the UI), and you can remove those that are not needed in a particular case.
  • For each pipeline, you can change the application and stream names for encoder input and decoder output.
  • For each pipeline you can change some basic parameters of existing filters.

This way you can create general-purpose scenarios, pause them so they serve as templates, then clone as many as you need and narrow them down to specific cases.
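As a rough illustration of that workflow, the snippet below only composes call URLs. The endpoint paths, parameter names and credentials here are placeholders rather than the actual WMSPanel API methods, so check the API reference page for the real ones.

```python
from urllib.parse import urlencode

API = "https://api.wmspanel.com/v1"  # base URL; the paths below are placeholders

def build_call(path: str, client_id: str, api_key: str, **params) -> str:
    """Compose an authenticated API call URL (illustrative only)."""
    query = {"client_id": client_id, "api_key": api_key, **params}
    return f"{API}{path}?{urlencode(query)}"

# 1. Clone a template scenario; 2. apply the clone to several servers:
clone = build_call("/transcoders/TEMPLATE_ID/clone", "CLIENT_ID", "API_KEY")
apply = build_call("/transcoders/NEW_ID/update", "CLIENT_ID", "API_KEY",
                   servers_to_apply="server-1,server-2")
print(clone)
```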

Read more about WMSPanel API and transcoder control API methods on API reference page.


In addition to that, take a look at Transcoder documentation reference just to see if you've missed some of the latest features. Our YouTube channel has a Transcoder playlist with video tutorials on some popular use cases.

If you have any questions regarding the Transcoder or its API, please feel free to contact our helpdesk.

November 2, 2020

DRM-powered DVR in Nimble Streamer

Earlier this year we introduced DRM support in Nimble Streamer which included Widevine, Playready and FairPlay encryption support for MPEG-DASH and HLS live streams. Our customers like using live streaming along with DVR functionality which allows recording their streams for later playback via HLS and DASH.

Nimble Streamer now combines these two feature sets into one, allowing you to encrypt recorded live streams when providing their playback.

When a viewer connects to an MPEG-DASH or fMP4 HLS stream within an application protected by DRM, the output will be encrypted with the respective DRM engine. It's that simple.

Also notice that DVR DRM covers CENC-based encryption for MPEG-DASH and fMP4 HLS.

For DRM setup details visit DRM feature set page.

For DVR setup please refer to this digest page and to this DVR setup article as an example.


Watch this video tutorial showing EZDRM setup for live, DVR and VOD content protection.


Contact us if you have any questions regarding this feature.



October 28, 2020

Larix team approach to quality control

Softvelum team puts a lot of effort into quality assurance of our products, and our mobile products are no exception. The Larix Broadcaster quality control process allows our team to deliver high-quality products.

We wanted to share some details of the approach that we use so our Larix SDK customers could use it as an example for their own QA procedures.

Check this Wiki from Larix test plans github repo as a starting point. It has links to test plans of Larix Broadcaster for Android and for iOS.

Those test plans describe what we call connectivity tests to check how Larix handles connections via RTMP(S), RTSP, SRT and RIST to a number of streaming recipients like Nimble Streamer, Wowza, Facebook, Twitch, vMix, OBS, and a number of other tools.

Check our Wiki for more details and let us know if you'd like to get even more details.


October 20, 2020

Pallycon DRM support in Nimble Streamer

Softvelum team is glad to announce that Nimble Streamer DRM framework now supports PallyCon multi-DRM for protecting the content using Widevine, FairPlay and Playready encryption technologies.

Softvelum is now among PallyCon's technology partners, providing this technology to mutual customers.

Please take a look at our joint PallyCon announcement to find out more about this partnership.



In order to use PallyCon DRM, please follow setup instructions on Nimble DRM page and let us know of any questions.







September 30, 2020

Q3 2020 news: NDI in Nimble Streamer, RIST in Larix Broadcaster, SRT in TV Players

Softvelum team keeps improving our product bundle and we'd like to share the most significant updates with you.


Nimble Streamer

Our flagship product has a number of interesting updates.


Larix Broadcaster

Larix Broadcaster was described in a couple of tutorials.

As for new features, we have them as well.

  • RIST streaming protocol is now supported for Larix Broadcaster for both Android and iOS. It's built with libRIST version 3.0.0 and uses RIST Main profile. Watch video tutorial showing RIST streaming from mobile.
  • Larix Broadcaster now has streaming pause for both Android and iOS. A long tap on Start will pause the stream without disconnecting it: the video track will contain a black screen and the audio track will contain silence.

Last but not least: we've started to describe our approach to quality assurance of our mobile products in our github Larix testing repo. We've covered several areas already, and more pages are coming soon, as well as new tests for RIST.


Larix Player

Larix Player is the new name of our playback solution for Android and iOS formerly known as "SLDP Player". It's capable of SRT playback as well as Icecast, SLDP and other protocols.

  • Larix Player free app is available in AppStore and Google Play while Larix Player SDK is available for premium licensing.
  • Larix Player is now available in Apple TV / tvOS which means SRT can be played on all Apple devices. Check our Apple developer page for all apps. Watch this video tutorial for setup.
  • Larix Player is also available on Android TV. This allows playing SRT on STBs and TVs, as well as creating your own playback apps with our SDK. Visit player page for details.


That's all for now.

Follow our social networks and channels for future updates: Twitter, Facebook, Telegram, LinkedIn, YouTube and Reddit.

September 29, 2020

RIST support in Larix Broadcaster

RIST protocol support is now available in Larix Broadcaster for iOS and Android. It implements RIST Main Profile, Baseline Level. As of December 2020, the beta version of Larix uses libRIST version 0.2.0 rc2.

To use RIST in a Larix connection, enter a URL with the rist:// prefix, like
rist://192.168.0.114:2030
If you need to control RIST behavior via custom parameters, just add them to the URL, like

rist://192.168.0.114:2030?buffer=2000

The full list of available parameters is available in the libRIST headers here, under the "Rist URL parameter names for peer config" section.


This tutorial video shows the setup and usage example of RIST.



Check other features of Larix Broadcaster for your respective platform.

September 15, 2020

Building Quick Sync-only pipeline with Nimble Transcoder

Live Transcoder for Nimble Streamer provides many features for transforming live content using both software libraries and hardware acceleration.

Intel® Quick Sync is fully supported in Live Transcoder for decoding and encoding, but all filtering operations were performed using the CPU. That caused extra resource usage to transfer processed frames among CPU, GPU and RAM.

Nimble Live Transcoder now allows building transcoding pipelines which are performed completely with Quick Sync hardware acceleration. This is done using specific FFmpeg libraries which we use in addition to our own code.

This article shows how to set up this Quick Sync-powered processing chain.

1. Installation and initial setup


We assume you've already installed Nimble Streamer, set it up to receive an incoming live stream and tested basic streaming. In our example we'll use a stream whose application name is "input" and stream name is "source".

If you're not familiar with Live Transcoder, take a look at Transcoder documentation reference.

Notice that the described functionality is available on Ubuntu 20.04 only. We'll support other upcoming LTS Ubuntu releases as well.

The basic steps to make Quick Sync working are as follows:
  1. Create a transcoder license and subscribe for it.
  2. Install Live Transcoder add-on.
  3. Create some simple scenario with CPU transcoding (e.g. downscale your stream to 240p). This way you'll make sure the transcoder was set up properly.
Now create a new scenario to start a new pipeline setup.

2. Decoder setup


Once you create a new scenario, drag and drop a blue decoder element onto the dashboard. There you need to specify "quicksync-ffmpeg" in the Decoder field.


That's it. Now let's set up filtering.

3. Using filters


Once the frame is decoded you can process it via a set of FFmpeg filters which work via Quick Sync. Nimble Transcoder supports a number of those; here are the most frequently used. Notice that you can refer to the FFmpeg source code for more details about custom filters. As of September 2020, Nimble Transcoder uses FFmpeg 4.3.1 for some of its operations, including Quick Sync filters.

"Split" - allows creating several identical outputs from the input video. It's available as a filter element in the toolbox of the Transcoder UI.

"Picture" - another filter available via a UI element. Its setup for Quick Sync will look like this:



"fps" filter sets the frames per second rate and is defined via a custom filter. Its name is "fps" and the value is "fps=<number>", like "fps=30".

"scale_qsv" filter allows resizing the image. Add a custom filter into your scenario, set name to "scale_qsv" and then use filter parameters of your choice separated by comma. Common parameters are "w" (width) and "h" (height).


Other options include "format", "mode", "low_power", "hq". Please refer to FFmpeg sources file vf_scale_qsv.c for more details.

"vpp_qsv" filter allows transforming the content in a lot of ways, some of its parameters include "deinterlace", "denoise", "framerate" as well as flipping, scaling and a lot more. Please refer to FFmpeg source file vf_vpp_qsv.c for a full list of options.

4. Encoder setup


In order to encode video using Quick Sync within the hardware pipeline, you need to set Encoder to "FFmpeg" and Codec to "h264_qsv" for the H.264/AVC codec.


You can then define its custom parameters like "profile" for encoding profile, "b" for bitrate and many others which you can find in the FFmpeg source file qsvenc_h264.c.

If you need to encode to H.265/HEVC, your Codec field must be set to "hevc_qsv".


It also supports a number of parameters like "b" for bitrate and you can find a full list in qsvenc_hevc.c from FFmpeg sources.

Once you finish your scenario and save it, you'll be able to transform your incoming video stream using only Quick Sync hardware acceleration.
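The decode, filter and encode elements described above map onto standard FFmpeg components, so a rough command-line analogue can help you sanity-check that Quick Sync works on your hardware before building the scenario. This is not what Nimble runs internally; file names, resolution and bitrate below are example values, and exact flags may vary by FFmpeg build.

```shell
# Fully hardware decode -> filter -> encode chain on Quick Sync:
# decode H.264 with h264_qsv, keep frames on the GPU, apply vpp_qsv
# and scale_qsv filters, then encode back with h264_qsv
ffmpeg -hwaccel qsv -hwaccel_output_format qsv -c:v h264_qsv -i input.ts \
       -vf "vpp_qsv=framerate=30,scale_qsv=w=1280:h=720" \
       -c:v h264_qsv -profile:v high -b:v 2500k \
       -c:a copy -f mpegts output.ts
```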

When you have a video pipeline set up, you need to define the audio part. If you don't need any sound transformation, you can add a passthrough for it just like it's described in other setup examples.




If you have any questions or issues, please feel free to contact us.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder documentation reference.

September 10, 2020

Make thumbnails for live streams

Nimble Streamer now allows generating thumbnails on-the-fly for any outgoing stream. At the moment it's a single-frame MP4 file which can be embedded into any web page via <video> tag.

In order to start generating those thumbnails, you need to enable the Generate MP4 thumbnails parameter either in the global settings of the server or in specific applications' settings, as shown below.

Once the parameter is enabled, you can access the generated thumbnail via this kind of URL:

http://server_URL:8081/live/stream/thumbnail.mp4
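To illustrate the embedding mentioned above, here's a minimal page using the <video> tag. The application ("live") and stream ("stream") names come from the example URL, and "server_URL" stays a placeholder for your own server address.

```shell
# Write a minimal HTML page that embeds the single-frame MP4 thumbnail;
# replace server_URL and the app/stream names with your own values
cat > thumbnail.html <<'EOF'
<video src="http://server_URL:8081/live/stream/thumbnail.mp4"
       autoplay muted playsinline></video>
EOF
```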


Here are examples of the global and per-application "Generate MP4 thumbnails" setting enabled. The thumbnails are renewed every Interval period, which you set up after checking the checkbox; by default it's 6 seconds.


Here's the application setting:



Related documentation

Live streaming in Nimble Streamer

August 25, 2020

Install Nimble Streamer with SRT on MacOS using Docker

Nimble Streamer has full support of the SRT protocol on the majority of platforms, such as Linux and Windows. macOS support has been missing due to a number of technical reasons. However, our customers have been asking us about such a capability.

With Docker, you can now bring SRT support in Nimble Streamer to macOS. Nimble will start and work properly on your Mac.

This article assumes you are familiar with macOS Terminal and you have an understanding of networking technologies. It's not a detailed description of Docker technology but rather a how-to describing our specific use case.

Here's a video that follows this article to demonstrate all the steps. Also read the text below to get more details on every step.




Now let's follow the steps below.

1. Install Docker for MacOS


Download Docker from its official website and follow the installation instructions for macOS. Those are very easy steps so we won't describe them in detail.

2. Get Dockerfile from our GitHub


Create a new directory on your disk and download Dockerfile from our GitHub.

Change the WMSPANEL_ACCOUNT and WMSPANEL_PASS parameters to your WMSPanel login and password. Use the WMSPANEL_SERVER_NAME parameter to set the server name: it will be used in WMSPanel at the moment of container build, and your server will be registered under that name.

3. Build image


Open the Terminal and go to the directory you've created earlier.

Run this command to create an image with the latest version of Nimble Streamer and SRT package:
docker build --no-cache -t nimble-srt:latest .
Now wait for completion. If the operation was successful, you'll get a "Successfully tagged nimble-srt:latest" message and your server will appear in WMSPanel marked as grey.

If anything goes wrong, please contact our helpdesk with a full log of your operations.

4. Run and Stop container


Now run your newly created container using this command:
docker run -d --rm -p 8081:8081 -p 1935:1935 -p 4444:4444/udp nimble-srt
After that you'll see your server in the panel change its color to green and its status to "Running".

Once the container is launched, you can connect to it using the IP address of your Mac and the mapped ports.

This new container runs in the closed network of your Docker, so if you'd like to play streams produced by Nimble Streamer in that container, you need to specify the proper IP in the server settings at WMSPanel. Go to the Servers top menu, then open the server info page and click on Manage custom IP/Ports to add the IP of your Mac. After that you'll be able to perform the playback.

To stop the container, you need to get its ID using the "docker ps" command and then stop it using the "docker stop CONTAINER_ID" command:

osa-vm-macos:nimble_img osa$ docker ps -q
53778662dec3
osa-vm-macos:nimble_img osa$ docker stop 53778662dec3
53778662dec3
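If you'd rather not copy container IDs by hand, the two commands above can be combined into one; the --filter ancestor option selects containers started from the nimble-srt image built earlier.

```shell
# Stop every running container that was started from the nimble-srt image
docker stop $(docker ps -q --filter ancestor=nimble-srt)
```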


5. Port mappings


In the previous section we used TCP ports 8081 (for HLS and other HTTP-based protocols), 1935 (for RTMP) and UDP port 4444 (for SRT). You may want to use other ports instead or add more ports. Please make sure these ports are not used by macOS.

To add another UDP port to the container mapping, just add a "-p" parameter specifying the required port, e.g. "-p 5555:5555/udp". The full command will look like this one:
docker run -d --rm -p 8081:8081 -p 1935:1935 -p 4444:4444/udp -p 5555:5555/udp nimble-srt
If you need a TCP port, then just remove "/udp".

6. Set up SRT in WMSPanel


Now you can set up SRT streaming using WMSPanel to control your Nimble instance in the container with the previously mapped ports. We used port 4444 in our example, so the SRT listener setup in the "MPEGTS In" tab will look like this:


You can refer to the SRT setup article to learn more about the SRT setup process in Nimble Streamer.

Once you complete that, you can start streaming SRT via the IP and port specified above, using any SRT tool of your choice, like ffmpeg + srt-live-transmit, vMix, OBS or Larix Broadcaster.

7. Connect to the console of your container


If you need to log into the console of your container, get the container ID first using this command:
docker ps -q
Then use the result value in this command:
docker exec -it CONTAINER_ID /bin/bash

Once you get access, you can work with Nimble Streamer logs to track problems or update server parameters via nimble.conf file.

More on virtualization

The Dockerfile which we use in this article can be used for creating containers on other OSes.

Docker for macOS basically uses a small virtual Linux machine to work, so you can also use other virtualization methods to run Nimble Streamer with SRT under Mac, like VirtualBox or VMware Workstation. Let us know if you need some special virtual machine to work with hypervisors.

Feel free to tell us about your experience of using Docker with Nimble Streamer.

August 18, 2020

Live Transcoder upgrade

Nimble Streamer Live Transcoder is widely used among Softvelum customers. The core technology of the Transcoder combines both Softvelum team's own know-how and third-parties' work. Those third parties are listed on a corresponding page. One of those elements is FFmpeg which is used for filtering and some decoding operations. We periodically upgrade our code base to work with one of the latest stable releases as it has a number of important fixes and improvements. So in order to keep pace with FFmpeg, our team had to make adjustments and use FFmpeg version 4.2.4.

The new FFmpeg version requires changes in both Nimble Streamer and Live Transcoder. So if you decide to upgrade Nimble Streamer, then in order to make a smooth transition the Nimble and Transcoder packages will have to be upgraded simultaneously. If one of the packages is upgraded without its counterpart, then live transcoding will stop working.

We'll be releasing new packages for all platforms during the next few days.
If you plan to upgrade Nimble Streamer to 3.7.0-1, please also upgrade the Transcoder package.

Here is what you need to do in order to complete this upgrade the correct way.

For Ubuntu and Debian, run these commands:
apt-get update
apt-get install nimble nimble-transcoder

For CentOS, run these commands:
sudo yum makecache
sudo yum install nimble nimble-transcoder


You may also run the procedures from the Live Transcoder installation page first and then the Nimble Streamer upgrade page, one after another, to get the same result. If you have Windows, you also need to follow this path.

So we recommend you perform this simultaneous upgrade when you have the time and resources for it.


After the upgrade is complete, your Nimble Streamer package version will be 3.7.0-1 and Live Transcoder package version will be 1.1.3-1.


If you have any questions or face any issues during the upgrade, please contact us using our helpdesk.

August 12, 2020

Larix Player on Apple TV

Softvelum team introduces Apple TV support for Larix Player.

You can now play SRT, SLDP, Icecast, RTMP and HLS live streams on tvOS devices.



If you'd like to create your custom Apple TV application you can subscribe for our Larix Player mobile SDK.

August 11, 2020

SRT playback protection and stats

Nimble Streamer has had SRT support since the protocol's inception and our team keeps adding new SRT features into our products.

We see a growing demand for SRT playback using various solutions including Softvelum Larix Player for Android, Android TV, iOS and tvOS. So our customers want to see two major features which they already have for other playback protocols.

Those familiar features are playback statistics and paywall-based playback protection.

So our team followed the feedback and made support for these features.

We've made those features available via Addenda, so you'll need to subscribe for this premium package for as long as you need them. Notice that Addenda also covers the SRT Publisher Assistance Security Set which allows controlling the publication process via SRT.

Please make sure you subscribe and register an Addenda license on your Nimble server before moving forward.


SRT listener settings


To make the reporting and paywall features work, you need to set up an SRT listener properly. We assume you are already familiar with SRT setup in Nimble, so we'll show just the specific settings.

Go to your SRT output settings in the Live streams settings menu, UDP streaming tab, and either create a new stream or edit an existing one.


Listen is currently the only supported mode for this feature set, so you need to select it in the Mode drop-down.

All other fields are defined just like you would for regular playback: Local IP and Local port for connections, Source application name and Source stream name for defining the source, as well as latency and maxbw parameters for proper error recovery.

nimble-srt-report-stream-name is the new parameter which you need to set to make this feature work. Its value defines the name of the stream: all playback sessions will be reported under that name and your paywall settings will also use it.

It's set as <app name>/<stream name>. In our example it's live/srtstream.
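Once the listener is saved, you can test it from the viewer side with any SRT-capable player; for example, with an FFmpeg build that includes libsrt. The IP and port below are assumed example values, not taken from the article.

```shell
# Connect to the SRT listener as a viewer; the session will be reported
# in WMSPanel under the nimble-srt-report-stream-name value (live/srtstream)
ffplay "srt://192.168.0.106:2020?mode=caller"
```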


Viewing statistics in WMSPanel


When your viewer connects to a previously configured SRT listener, one connection will be reported into your stats. If you have several ports opened for different listeners with this feature enabled then you'll see several simultaneous connections.

You'll have these connections counted in all other reports of WMSPanel, such as daily stats, geo-location, streams stats, in-depth streams reports, unique visitors and others.

If you create separate listeners on separate ports but use the same nimble-srt-report-stream-name then both connections will be registered (counted) under that same stream name. This way you can combine data for a single content publisher.

If you have a stream with the same name available via a different protocol, then their stats will be combined too. They'll also be shown under different columns in daily stats, like you see on the screenshot above.

Multipoint listeners

Nimble Streamer supports a multipoint listener option, which means that more than one viewer can connect to the same port and, in the case of this feature, to the same stream. If you'd like to enable multipoint, read the "One-to-many multipoint listener" section in this article.

From the statistics perspective, if you create a multipoint listener and 2 viewers connect there, then 2 connections will be registered in our reporting.


Paywall support for SRT playback


With nimble-srt-report-stream-name set, a number of features from the Softvelum paywall become available for that specific listener.

They are set up the same way as for other protocols; follow the links above to see detailed instructions. The specific thing for SRT is that you need to specify the app name and stream name according to the nimble-srt-report-stream-name value. That is <app name>/<stream name>, so you put the app name into the Application field and the stream name into the Stream field as shown below.

One limitation for paywall is that user agent and referrer restrictions do not apply to SRT.

All other features and approaches described on the respective pages and in our paywall FAQ fully apply to SRT. For example, as described in Q6, you can allow a stream for just one country and forbid it for all others by defining the country in the allow list and setting any non-empty password, with the SRT checkbox enabled:



If you then try playing this SRT stream from a restricted location, you won't be able to do it and you'll see a "failed to accept srt socket" error in the Nimble log.

That's just one example, visit Paywall page for more information.

Hotlink protection and PPV

This feature is set up the same way as you would do it for HLS or other protocols. You'll need to set up a WMSAuth rule with a password, then provide your viewers with a URL to your stream with a signature based on the password and other parameters. Read this page to get all the details and code examples.

Once you get a signature, you will need to add it into your stream URL in the streamid parameter this way:

srt://<ip>:<port>/?streamid=wmsAuthSign=<signature> 

Here's an example of running srt-live-transmit this way:

./srt-live-transmit srt://127.0.0.1:2020/?streamid=wmsAuthSign=c2VydmVyX3RpbWU9MDgvMTEvMjAyMCAwMzoxNzowMSBBTSZoYXNoX3ZhbHVlPVZvak5wT3RvSDJWaHJ1UW1UNUdDaHc9PSZ2YWxpZG1pbnV0ZXM9MTA= file://con > /dev/null

You may generate that signature on your website where people open your link with our Larix Player for Android and iOS, which handles srt:// deep links to open them for playback. Or you can use your own method of delivering this link to your viewers.
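Base64-decoding the example above shows that the signature wraps server_time, hash_value and validminutes. As a rough shell sketch of generating one, assuming the hash input is the concatenation of viewer IP, password, server time and valid minutes (the exact composition is an assumption here; verify it against the code examples on the page mentioned above):

```shell
# Sketch of WMSAuth signature generation; the hash input composition
# below (IP + password + time + minutes) is an assumption, check the
# official code examples before relying on it
IP="192.168.0.1"          # viewer IP, if your WMSAuth rule checks it
PASSWORD="mypassword"     # password from your WMSAuth rule
VALIDMINUTES="10"
SERVER_TIME="$(date -u '+%m/%d/%Y %I:%M:%S %p')"
HASH="$(printf '%s' "${IP}${PASSWORD}${SERVER_TIME}${VALIDMINUTES}" | openssl md5 -binary | base64)"
SIGNATURE="$(printf '%s' "server_time=${SERVER_TIME}&hash_value=${HASH}&validminutes=${VALIDMINUTES}" | base64 | tr -d '\n')"
echo "srt://192.168.0.106:2020/?streamid=wmsAuthSign=${SIGNATURE}"
```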

Pay-per-view framework is also set up the same way as for other protocols, with the signature placed into the URL at streamid as shown above.


That's it. Please subscribe for an Addenda license to use this feature, and if you have any questions please contact our helpdesk.


Related documentation


July 30, 2020

Streaming SRT via OBS Studio with Nimble Streamer and Larix Broadcaster

OBS Studio team added support for SRT in their software recently. That is a great improvement and we highly recommend using that protocol for your delivery to and from OBS.

In this article we're going to show how you can use OBS with two products of our team. Here are the scenarios we'll cover:
  1. Set up OBS Studio to receive SRT.
  2. Set up Larix Broadcaster to send SRT to OBS Studio.
  3. Set up OBS Studio to send out SRT stream.
  4. Set up Nimble Streamer to receive streams from OBS Studio.
This will make a complete pipeline from your mobile device through OBS to a software media server located wherever you need it.

Notice that points 1 and 2 are also covered in our video tutorial about streaming SRT to OBS from Larix Broadcaster and Screencaster, you can also watch it below.

We assume you already installed all three products: the latest version (25.0.8+) of OBS on your local computer, Larix Broadcaster on your mobile and Nimble Streamer on one of your servers.

Set up Larix Broadcaster


Open Larix Broadcaster and tap the gear icon to enter the Settings menu. Tap on Connections.



In our example we already have some connections so let's add a new one by tapping on New connection.



The Name field can be defined to distinguish the new connection from existing ones.
The URL field has to be similar to "srt://192.168.0.106:10000", that is, your computer's IP and the port which you defined in the previous step.

Save the setting to see it in the list, then tap on its check box. This will enable the connection for streaming.



Now you can get back to the video preview and tap the big red Recording button to start streaming to OBS.

Please refer to Larix documentation reference page for other articles and videos of Larix setup and usage.

Set OBS for SRT input from Larix Broadcaster


Open OBS and check Sources area.


Add new or change existing Media Source.




Enter a URL like this into Input field.
srt://192.168.0.114:10000?mode=listener
In our case 192.168.0.114 is the IP of the computer running OBS; change it to the IP of your own computer. If you'd like to use some other port, change it as well.

Don't forget to allow incoming UDP traffic in the firewall settings for the specified port.

That's it, click on OK to save the media source. If you have Larix Broadcaster streaming, you'll see the picture.

Also, take a look at the following tutorial of Larix Broadcaster and Screencaster setup for OBS Studio streaming.



Set OBS for SRT output to Nimble Streamer


Once you have some stream in your OBS, you can publish it to Nimble Streamer.
Here are the steps to follow:
  1. Go to File -> Settings menu.
  2. Open Output tab.
  3. In Output mode field choose Advanced.
  4. Choose Recording tab.
  5. Select Type as Custom Output.
  6. In FFMpeg Output Type select Output to URL.
  7. In output URL enter "srt://192.168.0.106:2020" or IP/port which you're going to use with Nimble Streamer - we'll define that in the next section. In this example 192.168.0.106 is the IP address of your Nimble Streamer instance.
  8. Set Container format to "mpegts".
  9. You can use default encoders and decoders, or select the one you want to use, such as libx264.
Here are the settings once we complete them:



That's it. Now let's set up Nimble.

Set Nimble Streamer


Nimble Streamer has full support for SRT. This article describes full SRT setup in Nimble Streamer and you should refer to it for all details.

Now open the Nimble Streamer -> Live streams settings menu, choose the MPEGTS In tab and click on the Add SRT stream button.

You see this dialog:


Here's what we do in our case:
  1. Set Receiver mode to Listen.
  2. Local IP can be set to "0.0.0.0" to process streams from all interfaces.
  3. Local port should be whatever you can use on your server. In our case it's "2020".
  4. Alias can be set to a name that will identify this stream later in the setup.
  5. If you click on Add outgoing stream and enter app and stream name, WMSPanel will create the outgoing stream automatically using the names which you provided.
Once you save, you'll have it all set.

To start streaming from OBS, press the Start Recording button.

Once you do that, Nimble will provide the outgoing stream.

E.g. if HLS is enabled for this server instance, your HLS stream URL would be http://192.168.0.106:8080/obs/stream/playlist.m3u8

If you need further streaming via RTMP, check the RTMP digest page and related articles there, like the RTMP republishing setup.

Please refer to full SRT setup instruction to see what else you can do with SRT setup or further re-streaming. Also, check Glass-to-glass delivery setup to see another example of SRT usage with our products.

Related documentation