February 26, 2020

Synchronized playback on multiple devices with SLDP

Playing a live stream simultaneously on multiple devices often requires synchronized playback.

The cases look simple:
  • One big screen shows something and viewers need to have the same audio on their individual devices.
  • A second-screen application needs to be in sync with an ongoing live stream on TV.
  • Multiple screens in the same room show the same content with a single source of sound.
  • A number of surveillance cameras need to be shown on the same page.

You probably know other cases where you might have the same requirement.

With traditional delivery protocols like HLS and MPEG-DASH, this is very hard to achieve without dramatically increasing the latency.

The SLDP live streaming protocol delivers streams in real time with low latency, adaptive bitrate and short zapping time. Now it also allows synchronizing playback among devices and browsers for all the cases listed above. It's supported on both the server side and client side.

Take a look at the sneak previews below.

Here are two browsers running the same stream, with one of them catching up with the playback.



Here are an iPhone and an iPad running the same stream. In the first scene, the video is catching up with its counterpart; in the second scene, the audio is catching up with the video.



Let's see how you can use this feature with SLDP.

Notice that all implementations use an additional buffer for proper synchronization, which increases latency. The buffer size must be the same across all platforms; check each player platform below for parameter setup.

Enable the feature in Nimble Streamer


We assume you are already familiar with Nimble Streamer setup and you have a working SLDP live stream. If not, please read SLDP setup and usage article to make it work.

On your server, edit nimble.conf to add this parameter and re-start Nimble Streamer:
sldp_add_steady_timestamps = true

You can visit Nimble Streamer parameters reference to learn more about operating that config file.

Once you re-start the server, every SLDP live stream will have the steady-clock timestamps needed for playback adjustments. If the connected player doesn't request steady-clock time, Nimble Streamer will not include it in the output stream, avoiding any overhead.

Playback in HTML5 SLDP player


If you want to have a synchronized playback in web browsers, use our freeware HTML5 SLDP player.

By default, the feature is turned off. To enable it, add the sync_buffer parameter, which specifies the buffer size in milliseconds. Recommended values are from 1000 to 5000, and it needs to be the same in all players.
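As a sketch, player initialization with synchronization enabled might look like this. The container and stream URL values here are placeholders, and the initialization call follows the pattern from the player's README; sync_buffer is the parameter described above:

```javascript
// Hypothetical setup for the HTML5 SLDP player; adjust names and URLs to your page.
var sldpPlayer = SLDP.init({
    container: 'player',                          // id of the player's element on the page
    stream_url: 'wss://example.com/live/stream',  // your SLDP stream URL
    sync_buffer: 2000                             // sync buffer in ms; must match on all players
});
```

Every player instance that should stay in sync must use the same sync_buffer value.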

Playback on iOS


SLDP Player for iOS allows enabling and using synchronized playback.
  1. Install SLDP Player via the AppStore.
  2. In the connection settings, enable the Steady clock flag as shown on the screenshot below.
  3. Use the Buffering field to define the buffer for this feature. As mentioned above, it needs to be the same in all players.



Playback on Android


Android SLDP Player will have that feature soon.



Once you start playback on multiple devices and browsers with this feature enabled, playback on all of them will catch up and stay in sync.

Let us know how it works for you and what improvements you'd like to have for it.

Related documentation


SLDP technology overview, SLDP support in Nimble Streamer, Softvelum playback solutions.

February 17, 2020

Fallback of published RTMP, RTSP and Icecast streams

RTMP, RTSP and Icecast live streams can be pulled by Nimble Streamer for further processing, and to improve robustness each pulled stream can have fallback streams. So if a primary stream cannot be pulled from its origin for some reason, an alternative stream is pulled as a failover. The playback is not stopped, so the user experience is barely affected.

The aforementioned protocols are often used in publishing mode, when the stream is pushed into Nimble Streamer for processing. In this case there is no built-in fallback mechanism.

Nimble Streamer provides another reliable mechanism for fallback of RTMP, RTSP and Icecast published streams: the Live Transcoder hot swap feature set. It allows shifting to a secondary stream if the primary one goes down for some reason, while maintaining the playback output for video and audio.

The following steps allow setting this up.

1. Install Live Transcoder


Hot swap feature set requires Live Transcoder premium add-on for Nimble Streamer.

There are two main reasons for Live Transcoder usage:

  • The secondary (substitution) stream needs to match the primary (original) stream's video resolution and audio sample rate.
  • The primary stream needs to be decoded in order to perform the substitution smoothly.

You need to obtain a license for Transcoder, then install the add-on and register a license for it.

2. Set up published inputs


You need to have both the primary (original) and secondary (substitution) streams set up and published into Nimble Streamer. In case you haven't done it yet, check the articles for RTMP, RTSP and Icecast publication setup.

3. Set up hot swap failover


Having both streams ready and Transcoder installed, you can set up failover hot swap for them. Follow the instructions and make sure you complete all steps.

4. Test the setup


As always, you need to test the setup before using it in production. If you have any questions or issues, please contact our team so we can help.

Related documentation


Live streaming via Nimble Streamer, Failover hot swap, Emergency stream hot swap.

February 13, 2020

Live Transcoder control API

Nimble Streamer Live Transcoder is well known for its drag-and-drop web UI which allows setting up live stream transformation of any complexity using any browser.

However we have a number of users who need to have some automation of Transcoder operations.

Our team has introduced its first approach to the Transcoder API.

Visit this WMSPanel API page to see all details of API setup and usage.

The operations which you can do over Transcoder instance are as follows:

  • Retrieve the list of transcoder scenarios
  • Get details of particular scenario
  • Pause and resume particular scenario
  • Delete an existing transcoder scenario

So, having a set of scenarios for your servers, you can operate them just like you do from the scenarios list in the UI.

If you need more API calls, please feel free to share them via our helpdesk so we can prioritize features in our wishlist.

Related documentation


Nimble Streamer Live Transcoder, Transcoder documentation reference.

February 6, 2020

HbbTV MPEG-DASH support in Nimble Streamer

Hybrid Broadcast Broadband TV (HbbTV) has been working with MPEG-DASH for some time now, and the Nimble Streamer MPEG-DASH implementation supports it as well.

To enable this support, a specific profile needs to be added to outgoing DASH streams. This can be done by adding the following parameter into the nimble.conf file:

dash_live_profiles = urn:hbbtv:dash:profile:isoff-live:2012,urn:mpeg:dash:profile:isoff-live:2011

You need to re-start Nimble Streamer after changing the config. Read this page to learn more about operating config file.

Related documents


MPEG-DASH support in Nimble Streamer

January 23, 2020

Mobile streaming to DaCast and Akamai

Larix mobile SDK allows publishing live streams from mobile devices to a wide variety of destinations, such as media servers and streaming services. Some destinations require special handling due to authorization or other concerns.

DaCast service provides a turn-key solution for live streaming. It uses the Akamai CDN for ingest and delivery, making it simple for an average user to get it working. However, Akamai has its own requirements for authorization and other stream parameters. Nimble Streamer already allows publishing to Akamai, so we've added the same support into Larix Broadcaster.

Here is how you can get it working.

Set up DaCast stream


We assume you already have a DaCast account, so just open the dashboard and add a new stream.



Click on the stream name to see its full setup details. Click on Encoder tab on top to see the encoder setup details.

Click on Other RTMP encoder to see full parameters of RTMP connection.



Here you see Login and Password values which you will use later in Larix.

Now click on "Click if your encoder has one field" link to see a field with full URL for publishing.


Copy this full URL for later use; it should look like this:
rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456

While you're in the DaCast dashboard, check the Publish settings tab to get a player page so you can check the results of your setup later.

Now let's get to Larix setup.

Set up Larix Broadcaster


Larix Broadcaster is available for both Android and iOS platforms so just install it as usual.

Open the app and enter settings by tapping the gear icon.

Tap on Connections -> New connection to enter a form below.



  • Name field can contain any alias you want for your connection. Larix Broadcaster allows streaming to multiple destinations simultaneously, so this is how you will distinguish them from one another.
  • URL field defines the target destination. Insert the URL which you copied in the previous section.
  • Target type must be set to Akamai/DaCast.
  • Login and Password need to be exactly as you've seen them in the DaCast connection settings.

Save the connection, go back to the connections list and make sure you select this new connection.
Now return to the image preview screen and just hit the red button to start streaming.

Now check the DaCast player page from the previous section to watch the results.

Akamai


This setup procedure applies the same way to publishing to the Akamai CDN via RTMP. The publishing URL will have the same structure with the same type of credentials, and the Larix Broadcaster target type is also "Akamai/DaCast". Please refer to the Akamai website to learn more about its setup.



If you have any issues with this feature set, just let us know.

Related documentation


Larix mobile apps and SDK, Nimble Streamer RTMP feature set, Publishing to Akamai from Nimble Streamer.

January 20, 2020

FFmpeg custom build support in Live Transcoder

Live Transcoder for Nimble Streamer supports a variety of decoding, filtering and encoding libraries. All the libraries which we have there were checked for reliability, performance and proper licensing before being added into the deployment package.

Our customers ask us to add new libraries into the Transcoder deployment package so they could be available by default in the UI. Mostly these are existing open-source encoders, commercial encoder libraries, or even custom encoders built by our customers themselves. However, we couldn't add every library we were asked for, which kept the door closed for new functionality and gave our customers a bad experience.

To solve this problem, it's now possible to use custom FFmpeg library builds to utilize any video and audio encoders, as well as filters, which are not supported in the default Transcoder package. Live Transcoder uses FFmpeg and its libraries for certain tasks under the LGPL license, which allows re-building it as necessary. So now you can just add more libraries if you need them.

Linux packages of Live Transcoder can pick up custom libraries and use them for further encoding.
Re-building FFmpeg on Windows is also possible. If you are familiar with building FFmpeg for Windows you may try it, however we do not provide support for this case.

Here's how you may re-build FFmpeg and use it further.

1. Legal disclaimer


This article describes the process of building custom third-party FFmpeg libraries and using them in Softvelum Live Transcoder in addition to the libraries which are deployed as part of Live Transcoder package.

Every custom library which is a result of building FFmpeg has its own licensing terms. So every library must be examined for its licensing terms prior to any usage or distribution, including but not limited to the patent licensing terms.

Softvelum, LLC is not responsible for any license or patent infringement which can occur as a result of any FFmpeg custom build usage by Live Transcoder users.

2. Building FFmpeg


This section describes how you can build FFmpeg for further usage in Transcoder.

We strongly recommend trying the custom build approach in a testing environment first. Once you get consistent results there, you may apply it to your production environment.

If something goes wrong after any of the steps and you'd like to revert it, just re-install Live Transcoder. This will overwrite all libraries with their default copies.

2.1 Making default FFmpeg build


To make sure your environment is ready for custom builds, let's start with building FFmpeg with the default libraries for Live Transcoder.

First, download the FFmpeg package. As required by the FFmpeg license, we've uploaded the FFmpeg package and its build script on our website.

Second, run the shell script in the same directory where you've just downloaded FFmpeg. It has all commands needed for getting a working copy of FFmpeg. Its compiled libraries can be used with Live Transcoder as is.

You may get errors related to missing packages, like the Freetype or Speex libraries. Just install the respective packages, using this command on Ubuntu:
sudo apt install libfreetype6-dev libspeex-dev
and this command on CentOS:
yum install freetype-devel speex-devel bzip2

You'll be able to proceed with building after that.

2.2 Making and using custom FFmpeg build


Now that you have FFmpeg ready for builds, you may add a third-party encoder. This depends on which encoder you'd like to add, so refer to your library's documentation for details on installation.

With the encoder installed, you need to modify your build script to include it. Find and modify the following line:
--enable-encoder=aac,png,mjpeg,customvideocodec,customaudiocodec \
Append your custom encoder's name to that line. This is the name used within FFmpeg, and it will later be used in Live Transcoder; in this example those are "customvideocodec" and "customaudiocodec". You may also need to append additional lines for other parameters, so check the library documentation for more information.
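For instance, if your library registered an FFmpeg encoder named "myh264enc" (a hypothetical name used here purely for illustration), the modified line would become:

```shell
--enable-encoder=aac,png,mjpeg,customvideocodec,customaudiocodec,myh264enc \
```

The rest of the downloaded build script stays as-is.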

You can find examples of other custom build scripts on our GitHub.

Once the build is over, you can use the new library.

2.3 Using libraries


You can ingest the libraries into Live Transcoder by copying them from the "build/lib/" subdirectory of your build directory into the proper location.

Run this command on Ubuntu to see where Transcoder libraries are located:
dpkg -L nimble-transcoder
Most probably your directory will be /usr/lib/x86_64-linux-gnu/.

On CentOS you can run this command to see where it is:
rpm -ql nimble-transcoder

Once you find the location, you can overwrite the libraries by copying them from your build directory to the Transcoder location.
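As an illustrative example, assuming the default Ubuntu location mentioned above and a build directory named "build" (both paths may differ on your system, so verify them first):

```shell
# Verify the actual Transcoder library location before copying:
dpkg -L nimble-transcoder
# Then overwrite the libraries with your custom builds:
sudo cp build/lib/*.so* /usr/lib/x86_64-linux-gnu/
```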

2.4 Re-start Nimble Streamer


The last step to make the new libraries work is to re-start Nimble Streamer using the command required by your specific OS.

For Ubuntu it's this one:
sudo service nimble restart
You can find commands for other OSes on the installation page.

3. Using custom libraries in Live Transcoder


Now that the custom library is available, you can start using it from the Live Transcoder web UI.

Create a Transcoder scenario as usual and add a new encoder element. You can watch this tutorial video to see how transcoding scenarios are created.

For custom video codec follow these steps:
  1. Drop video encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" dropdown to "FFmpeg".
  3. In the "Codec" field, specify the encoder name as defined in the video encoder library, e.g. "customvideocodec" in our example. See section 2.2 regarding the codec name in build parameters.


Custom audio codec is added the same way:

  1. Drop audio encoder element.
  2. In "Add output stream" dialog, set the "Encoder" field to "FFmpeg".
  3. In the Codec field, specify the encoder name as defined in the audio encoder library, e.g. "customaudiocodec" in our example. See section 2.2 regarding the codec name in build parameters.



Besides that, you can specify any parameters that your custom encoder uses.

That's it. Using this technique, you may use third-party libraries which are not yet available in Live Transcoder out of the box.

If you have any questions regarding this feature set usage, don't hesitate to contact us and show us your use case.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder tutorial videos, Transcoder documentation reference

December 30, 2019

2019 summary

As the year 2019 is over, we want to recap the most significant products and features which we introduced during the past 12 months.

We'd like to remind you that you can track all the latest Softvelum changes via our Twitter, Facebook and LinkedIn posts and even our YouTube channel news podcasts.

Qosifire web service


This year we released a new product called Qosifire. It's a streaming quality monitoring web service which allows tracking live HLS, RTMP and Icecast streams. Qosifire checks for stream correctness from protocol integrity and general content consistency viewpoints. Qosifire agent software checks streams 24/7 using your own server, then our web service console collects data to analyse and sends alerts via email and mobile push notifications.

Read more about why you may need Qosifire for your streaming infrastructure and how you can get started with Qosifire. In addition, read a Qosifire review article by Jan Ozer and find more information in Qosifire knowledge base.

You can also run a free 30-second checkup for your live stream without a sign-up.

Nimble Streamer bundle updates


As we explained in January, Flash has been progressively removed from all browsers, which caused the decline of RTMP playback. This primarily affects low latency live streaming, which is why we've been working on low latency improvements in our products.

SLDP. First of all, SLDP ABR capabilities ignited wide adoption among our customers. They use Nimble Streamer for their delivery edges and play their content via HTML5, Android and iOS players to achieve latency of just about a second.

Apple introduced Low Latency HLS earlier and released it for developers community.
Now Apple Low Latency HLS is supported in Nimble Streamer with MPEGTS, audio-only and fMP4 containers. Read this introduction article which describes Nimble Streamer setup and LL-HLS usage. As of iOS 13.3, Apple hasn't released LL-HLS from beta stage yet, so we don't have player app available. But our iOS SLDP Player SDK is able to provide this for our subscribers.
BTW, LL-HLS is working on top of HTTP/2 implementation available in Nimble Streamer. You can use it for HLS and MPEG-DASH live streaming delivery.

SRT. Another outstanding technology which we improved this year was SRT, reliable delivery over UDP. As a member of the SRT Alliance we contributed to the community and kept improving the user experience, allowing users to tune latency and maxbw to improve reliability. Our products always carry the latest stable version of the SRT library to make sure they have all the latest improvements.

Icecast. Speaking of other improvements, we added more features related to Icecast metadata as described on our Icecast feature page.

SSL. For those of our customers who use Certbot with Let's Encrypt, we made a detailed description for using this duo with Nimble Streamer.


Live Transcoder has been improved in several ways as well. First, take a look at Transcoder overview screencast and Transcoder documentation reference page to see what we got.

We've added the SVT-HEVC software library for H.265/HEVC encoding in Live Transcoder for Nimble Streamer.
This feature utilizes the latest improvement, the ability to run an encoder out-of-process, which secures the main Nimble Streamer process in case some encoder library crashes.

HEVC in general has been on the rise this year. To meet customers' demands we've released experimental support of H.265/HEVC over RTMP in Nimble Streamer and the Larix Broadcaster apps.

As for encoder libraries, QuickSync hardware acceleration is now available on Ubuntu, which makes it easier to install.

Nimble Advertizer was improved throughout this year to handle SCTE-35 markers.
Read the Advertizer spec for full details.

Reference pages. Last but not least, we added a couple of useful digest pages.


Mobile solutions


Our mobile solutions were improved over this year.

One of the most significant improvements is adding SRT playback into SLDP Player for Android and iOS. You can also take a look at our SRT digest page to find out more about product support for this technology.

As was mentioned earlier, our iOS SLDP Player SDK is able to provide Low Latency HLS playback capabilities for those who would like to try this new technology. Feel free to subscribe to iOS SLDP Player in order to obtain the latest version and build your own app with LL-HLS.

We also released Larix Screencaster for iOS, a highly anticipated addition to our mobile apps bundle.

Larix Broadcaster can now produce RTMPS and RTSPS, which means RTMP and RTSP can be delivered via SSL. It's a great advantage for those who would like to secure their live streams in insecure environments like mobile networks or public WiFi.
Larix also has ABR support for outgoing streams, which means it can lower bitrate and framerate according to network conditions.

Softvelum website now has Larix documentation reference which has links to all articles and pages related to mobile streaming with our products.

You can read SDKs release notes to find out more about our updates and releases.




Softvelum team wishes you a Happy New Year and looks forward to bringing you more features and improvements!



Follow us via Twitter, Facebook, LinkedIn and YouTube to get updates on time.

December 29, 2019

The State of Streaming Protocols - 2019 Q4

Softvelum team keeps tracking the state of streaming protocols. It's based on stats from the WMSPanel reporting service, which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data about more than 17.8 billion views. Total view time for our server products is 2.25 billion hours this quarter, or 24+ million view hours per day.

The State of Streaming Protocols - Q4 2019

You can compare these numbers with metrics from Q3 2019:

The State of Streaming Protocols - Q3 2019

Most protocols kept the same share, with HLS controlling most of the delivery landscape.


We'll keep analyzing protocols to see the dynamics. Check our updates on Facebook, Twitter and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

December 24, 2019

Introducing Apple Low Latency HLS in Softvelum products

HLS, as defined in RFC 8216, is currently a de-facto standard for live and VOD delivery. Playlists and chunks delivered over HTTP/1.1 provide simplicity of implementation, wide platform availability, adaptive bitrate support and scalability. These advantages allowed it to gain the biggest share of the customer base. However, it has some disadvantages for live streaming, primarily because chunk-based delivery means many seconds of delay at the start of a live stream. The industry has been making improvements on top of HLS, and then Apple released its spec update to address the existing issues.

Low Latency HLS


Low Latency HLS (LL-HLS) is the next generation of Apple's HTTP Live Streaming protocol introduced in early 2019.
Several key features improve it over regular HLS for live delivery:
  • Partial segments (parts) can be accessed before full chunks of content are available.
  • The server side can use HTTP/2 Push for sending parts.
  • The server can hold playlist requests so clients obtain the latest parts as soon as they appear.
Softvelum team added LL-HLS into the bundle. We provide customers with playback capabilities using SLDP Player for iOS, and Nimble Streamer allows generating LL-HLS for all supported containers such as fMP4, audio-only and MPEGTS.

1. SLDP Player to play LL-HLS


Apple still has LL-HLS in beta stage as of iOS 13.3 at the end of December 2019, so there are a number of shortcomings in its implementation. The main concern is that apps using the native player implementation cannot be published to the AppStore yet. The lack of availability in browsers and on other platforms is also a big blocker so far. So the only way to try the playback for development purposes is to build your own app for that.

SLDP Player SDK for iOS allows having full-featured Low Latency HLS playback on iOS devices. It covers live streams from any source capable of LL-HLS like Wowza Streaming Engine and Nimble Streamer, and it also supports regular HLS from any available source.

If you'd like to build your own low latency playback app, you can get player SDK from our team for further test and integration. Once the LL-HLS technology goes from Apple beta to production (in early 2020 as per Apple), you'll be able to have full-featured app and publish it under your brand.

2. Nimble Streamer for LL-HLS transmuxing


Nimble Streamer software media server allows taking any supported live input streams and re-packaging them into Low Latency HLS. Here are the steps you need to follow in order to make it work.

2.1 HTTP/2 and config setup


2.1.1. LL-HLS uses HTTP/2 via SSL as a transport protocol. So you need to enable it before performing any further setup.
Please follow this HTTP/2 setup article to make this protocol working for Nimble Streamer.

2.1.2. In addition to that, you need to add this parameter into nimble.conf and restart Nimble Streamer; read this page for config setup details:
hls_add_program_date_time = true
If a client tries to access an LL-HLS stream via HTTP/1.1, or if HTTP/2 is not properly set up, the player will fall back to regular HLS and will not use any of the advantages of LL-HLS.

You can check whether Nimble Streamer delivers HTTP/2 by checking the access log. To enable access logs, add this parameter into nimble.conf the same way you've done it for other parameters:
log_access = file
Once you re-start Nimble, you'll be able to view the log. On Ubuntu it's located in /var/log/nimble/access.log by default. Now when you request your regular HLS live stream via https:// using curl or an HTTP/2-capable player, you'll get this kind of record in the access log:
Dec 24 17:43:09 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/playlist.m3u8 HTTP/2" 200 84 1114 372 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
You can see HTTP/2 there, which means it's working. In other cases it will show HTTP/1.1, which means you need to check what's wrong. Contact us in case of issues.

2.2 Live streaming setup


Now you need to set up transmuxing settings via WMSPanel web service. If you are not familiar with live streaming setup of Nimble Streamer, please refer to live streaming digest page, or respective input protocol pages, such as RTMP streaming. Please make sure you have a correctly set up live stream (like regular-latency HLS or SLDP) before trying to use LL-HLS.

Once you have a live stream set up in WMSPanel, go to the Nimble Streamer top menu and select Live streams settings. You will see the Global settings tab for the selected server (and you may create application-specific settings as well).


Currently, Nimble Streamer supports all 3 containers available for HLS; you can see their respective checkboxes on the screenshot above:
  • HLS - HLS with audio-only container. Audio-only is optimized for audio delivery, having a reduced size. ID3 tags are also inserted in each audio part.
  • HLS (MPEGTS) - MPEG-TS, the only container with both video and audio support for LL-HLS.
  • fMP4 - fragmented MP4. Notice that fMP4 container playback has a couple of issues related to Apple's current player implementation as of iOS 13.3; please refer to section "3. Known issues" below for more information.
Once you select any of these containers, WMSPanel will show the Enable Apple's Low Latency HLS checkbox, which you need to select. It will also show the HLS part duration edit box to define the parts' duration in milliseconds; we recommend using the default value of 1000ms, see section "3. Known issues" for details.

Once LL-HLS is enabled, you need to re-start the input stream so Nimble Streamer can start producing the LL-HLS output stream.

2.3 Workflow and playlists


Now that the setup is done, you can use the player to consume the stream using the usual playback URL. The main playlist will have proper chunklists whose content follows the LL-HLS spec, as shown in the example below.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:02.609Z
#EXTINF:5.995,
a_6_6016_1.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_12011_2_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:08.604Z
#EXTINF:5.995,
a_6_12011_2.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.362,URI="a_6_18006_3_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:14.599Z
#EXTINF:5.994,
a_6_18006_3.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.384,URI="a_6_24000_4_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:20.593Z
#EXTINF:6.016,
a_6_24000_4.fmp4?nimblesessionid=1


Parts in chunklist. Compared to regular HLS, you see many lines representing parts, like this:
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
The full chunk that contains these parts is listed after all of its part lines:
a_6_24000_4.fmp4?nimblesessionid=1
All parts within a chunk are numbered starting from zero, so "a_6_18006_3_0.fmp4" means it's the first part of chunk number 3.
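As an illustration, this naming scheme can be decoded with a few lines of code. The sketch below is based purely on the sample names in the chunklist above; the regular expression and helper are our own assumption about the pattern, not part of any Nimble Streamer API:

```python
import re

# Assumed pattern, inferred from the sample names above:
# <track>_<n>_<offset>_<chunk>[_<part>].fmp4
NAME_RE = re.compile(
    r"^(?P<track>[A-Za-z]+)_(?P<n>\d+)_(?P<offset>\d+)"
    r"_(?P<chunk>\d+)(?:_(?P<part>\d+))?\.fmp4$"
)

def decode_name(name):
    """Return the chunk number and part index for a media file name.

    A full chunk (no trailing index) yields part=None.
    """
    m = NAME_RE.match(name)
    if not m:
        raise ValueError("unexpected media name: %s" % name)
    part = m.group("part")
    return {"chunk": int(m.group("chunk")),
            "part": None if part is None else int(part)}

print(decode_name("a_6_18006_3_0.fmp4"))  # first part (index 0) of chunk 3
print(decode_name("a_6_24000_4.fmp4"))    # full chunk 4, no part index
```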

Part length. This attribute declares the target duration of upcoming parts:
#EXT-X-PART-INF:PART-TARGET=0.512
In this example it's 512 milliseconds.

Can block reload. Check this line:
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
The "CAN-BLOCK-RELOAD" attribute declares that the media server supports holding playlist requests.
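A quick sanity check on the numbers above: the PART-HOLD-BACK value of 1.536 is exactly three part target durations (3 × 0.512 s), and it tells the player how far behind the live edge it should start. Compare that with regular HLS, where a player typically starts about three full target durations (here, three 6-second chunks) behind the live edge:

```python
# Values taken from the chunklist above.
part_target = 0.512      # from #EXT-X-PART-INF:PART-TARGET=0.512
part_hold_back = 1.536   # from #EXT-X-SERVER-CONTROL:...PART-HOLD-BACK=1.536
chunk_duration = 6.0     # approximate EXTINF of the full chunks

# PART-HOLD-BACK equals three part target durations:
assert abs(part_hold_back - 3 * part_target) < 1e-9

# Rough comparison with a classic HLS player starting
# about three full chunks behind the live edge:
print("LL-HLS start: %.3fs behind live edge" % part_hold_back)
print("classic HLS start: ~%.0fs behind live edge" % (3 * chunk_duration))
```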

Hold playlist request. LL-HLS allows a player to ask the server to hold the playlist response until a specific chunk and/or part of the stream becomes available.
So a player may request a part that will become available within a few seconds; Nimble Streamer checks whether that part exists yet and, once it's ready, returns the playlist.

Check this request example:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59&_HLS_part=5"
The _HLS_msn=59 and _HLS_part=5 parameters tell the server to hold the request until Nimble Streamer has part number 5 of chunk number 59 (or later), and only then return the playlist. You can also use the _HLS_msn parameter alone; in that case the playlist is sent out only once the full chunk is available.
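For illustration, here is how a player could construct such a blocking request programmatically. The base URL and session ID are taken from the curl example above; the helper function itself is just our sketch, not an official client API:

```python
from urllib.parse import urlencode

def blocking_playlist_url(base, session_id, msn, part=None):
    """Build an LL-HLS blocking playlist request URL.

    The server holds the response until chunk `msn` (and part `part`,
    if given) is available, then returns the updated chunklist.
    """
    params = {"nimblesessionid": session_id, "_HLS_msn": msn}
    if part is not None:
        params["_HLS_part"] = part
    return "%s?%s" % (base, urlencode(params))

url = blocking_playlist_url(
    "https://localhost:8443/livell/stream/chunks.m3u8",
    session_id=1, msn=59, part=5)
print(url)
# The same request with curl:
#   curl -k "<url printed above>"
```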

The resulting chunklist will look like this:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:55
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:26.599Z
#EXTINF:5.994,
a_6_330006_55.fmp4?nimblesessionid=1
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:32.593Z
#EXTINF:6.016,
a_6_336000_56.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_342016_57_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:38.609Z
#EXTINF:5.995,
a_6_342016_57.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_348011_58_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:44.604Z
#EXTINF:5.995,
a_6_348011_58.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_5.fmp4?nimblesessionid=1"
You can see it ends with part a_6_354006_59_5.fmp4 - part number 5 of the upcoming chunk 59. That chunk will only be fully available a few seconds later, but the player can already start playing its parts, which significantly reduces latency.

Push the requested part. In addition to requesting a specific part upon its arrival, a player may ask Nimble Streamer to deliver that part via HTTP/2 push to reduce playback latency even further. This is done by adding the "_HLS_push=1" parameter to the URL. If we look at the Nimble Streamer access logs, we'll see the following actions:

Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=0&_HLS_push=1 HTTP/2" 200 84 1114 372 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_0.aac?nimblesessionid=18 HTTP/2" 200 0 49896 662 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=1&_HLS_push=1 HTTP/2" 200 84 1180 341 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_1.aac?nimblesessionid=18 HTTP/2" 200 0 49828 568 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

As you can see, the player sends a hold-playlist request (described earlier) with a specific part number and the _HLS_push=1 parameter. Nimble Streamer returns the playlist in response and also performs an HTTP/2 push of the requested part.
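To see the pattern in those logs, the sample lines above can be parsed with a small script. The regular expression below is tailored to these specific sample lines, not to Nimble Streamer's general log format:

```python
import re

# Extract the HTTP/2 method, request path and status code
# from an access-log line like the samples above.
LOG_RE = re.compile(r'"(?P<method>GET|PUSH) (?P<path>\S+) HTTP/2" (?P<status>\d+)')

sample = (
    'Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - '
    '"PUSH /livell/stream/l_4_27008_9_0.aac?nimblesessionid=18 HTTP/2" '
    '200 0 49896 662'
)

m = LOG_RE.search(sample)
print(m.group("method"), m.group("path"), m.group("status"))
# A "PUSH" method with status 200 confirms the part was pushed
# alongside the playlist response.
```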

Performance. Despite all these extra actions, Nimble Streamer generates and serves LL-HLS parts with high efficiency: from a resource consumption perspective, LL-HLS processing costs the same as handling regular-latency playlists and chunks.

3. Known issues and troubleshooting


At the moment, Apple's native player on iOS 13.3 has the following problems with its LL-HLS implementation, which may affect the end-user experience.

1. fMP4 video+audio. If you use the fMP4 container, you will get either the video or the audio component working, but not both: video+audio fMP4 streaming does not work properly yet. You can try using the MPEG-TS container for video+audio instead.

2. Part duration. If you set the part duration to less than 1000 ms, video will not work at all, so we recommend setting the part duration to "1000".

We are sure these issues will be fixed in Apple's upcoming releases. Meanwhile, on iOS 13.3 you'll have to test with the aforementioned limitations.

3. Interleaving compensation. If you have a video+audio stream, you may experience playback issues due to interleaving, as described in this article. These issues become even more severe with low latency playback. To fix this, you can try enabling interleaving compensation with Min. delay set to zero, as shown in the image below.





Feel free to try Nimble Streamer with Low Latency HLS, and buy the SLDP Player SDK to get hands-on experience with iOS playback.

Let us know if you have any questions.

December 17, 2019

Running encoders in out-of-process mode

Live Transcoder for Nimble Streamer allows using a number of encoder libraries to produce output stream content. All encoding activities are performed in the same process as the other transcoding pipes.

However, there are cases when encoder libraries cause issues. Some experimental encoders may be unstable under heavy usage, and even long-established encoders may occasionally act unpredictably, because encoding is a very complicated process. Such failures can bring down the main Nimble Streamer process and stop stream processing, which affects the overall robustness and stability of your streaming infrastructure.

To address that, Live Transcoder now supports running encoders in out-of-process mode. It makes Nimble Streamer start a separate process for each encoder in a transcoding scenario. If the encoder fails, its process is stopped and automatically restarted without affecting the rest of the transcoding. The output is not interrupted either, so your end users will notice just a short image freeze.
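Conceptually, this works like a supervisor that contains a crash inside a child worker and restarts it, instead of letting the failure take the parent down. The sketch below is only an illustration of that idea in plain Python, not Nimble Streamer's actual implementation:

```python
def supervise(worker, max_restarts=3):
    """Run `worker` and restart it on failure, up to `max_restarts` times.

    Illustrates the out-of-process idea: a crash in the worker is
    contained and recovered from rather than crashing the caller.
    """
    restarts = 0
    while True:
        try:
            return worker()
        except Exception as exc:
            restarts += 1
            if restarts > max_restarts:
                raise
            print("worker failed (%s); restart #%d" % (exc, restarts))

# A simulated encoder that crashes twice before succeeding:
attempts = {"n": 0}
def flaky_encoder():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("encoder crash")
    return "frames encoded"

print(supervise(flaky_encoder))  # recovers after two restarts
```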

By default, out-of-process mode is enabled only for encoders that use the SVT-HEVC encoding library; for all other encoders, it is disabled.

To enable this feature, open the encoder details dialog, enter the new nimble-execution-mode parameter on the left and set its value to out-of-process on the right, as shown in the image below.


Once you save the transcoding scenario and it's synced to the Nimble Streamer instance (which happens within 30 seconds), the change takes effect immediately.

Out-of-process encoding is available in Nimble Streamer starting from version 3.6.3-2 and Live Transcoder starting from version 1.1.1-2.

This improvement makes transcoding more robust and reliable. Let us know if you have any questions about using this feature.


Related documentation


Live Transcoder, Transcoder documentation reference