January 23, 2020

Mobile streaming to DaCast and Akamai

Larix mobile SDK allows publishing live streams from mobile devices to a wide variety of destinations such as media servers and streaming services. Some destinations require special handling due to authorization or other concerns.

The DaCast service provides a turn-key solution for live streaming. It uses the Akamai CDN for ingest and delivery, making it simple for an average user to get it working. However, Akamai has its own requirements for authorization and other stream parameters. Nimble Streamer already allows publishing to Akamai, so we've added the same support into Larix Broadcaster.

Here is how you can get it working.

Set up DaCast stream


We assume you already have a DaCast account, so just open the dashboard and add a new stream.



Click the stream name to see its full setup details, then click the Encoder tab on top to see the encoder setup details.

Click Other RTMP encoder to see the full parameters of the RTMP connection.



Here you see the Login and Password values which you will use later in Larix.

Now click on "Click if your encoder has one field" link to see a field with full URL for publishing.


Copy this full URL for later use; it should look like this:
rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456

While you're in the DaCast dashboard, check the Publish settings tab to get a player page so you can verify the result of your setup later.

Now let's get to Larix setup.

Set up Larix Broadcaster


Larix Broadcaster is available for both Android and iOS, so just install it as usual.

Open the app and enter settings by tapping the gear icon.

Tap Connections -> New connection to open the form shown below.



  • Name field can contain any alias you want for your connection. Larix Broadcaster allows streaming to multiple destinations simultaneously, so this is how you will distinguish them from one another.
  • URL field defines the target destination. Insert the URL which you copied in the previous section.
  • Target type must be set to Akamai/DaCast.
  • Login and Password need to be exactly as you've seen them in the DaCast connection settings (see the example below).
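
For illustration, a filled-in connection using the sample URL from the previous section could look like this (the login and password below are hypothetical placeholders; use the exact values shown in your DaCast dashboard):

Name: DaCast stream
URL: rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456
Target type: Akamai/DaCast
Login: dacast-login
Password: dacast-password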

Save the connection, go back to the connections list and make sure you select this new connection.
Now return to the preview screen and just hit the red button to start streaming.

Now check the DaCast player page from the previous section to watch the result.

Akamai


The same setup procedure applies to publishing to the Akamai CDN via RTMP. The publishing URL has the same structure with the same type of credentials, and the Larix Broadcaster target type is also "Akamai/DaCast". Please refer to the Akamai website to learn more about its setup.



If you have any issues with this feature set, just let us know.

Related documentation


Larix mobile apps and SDK, Nimble Streamer RTMP feature set, Publishing to Akamai from Nimble Streamer

January 20, 2020

FFmpeg custom build support in Live Transcoder

Live Transcoder for Nimble Streamer supports a variety of decoding, filtering and encoding libraries. All libraries included there were checked for reliability, performance and proper licensing before being added to the deployment package.

Our customers ask us to add new libraries to the Transcoder deployment package so they are available by default in the UI. Mostly those are existing open-source encoders, commercial encoder libraries, or even custom encoders built by our customers themselves. However, we couldn't add every library we were asked for, which kept the door closed for new functionality and gave our customers a bad experience.

To solve this problem, it's now possible to use custom builds of FFmpeg libraries to utilize video and audio encoders, as well as filters, which are not supported in the default Transcoder package. Live Transcoder uses FFmpeg and its libraries for certain tasks under the LGPL license, which allows re-building them as necessary. So now you can add more libraries if you need them.

Linux packages of Live Transcoder can pick up custom libraries and use them for further encoding.
Re-building FFmpeg on Windows is also possible: if you are familiar with building FFmpeg for Windows you may try it, however we do not provide support for this case.

Here's how you may re-build FFmpeg and use it further.

1. Legal disclaimer


This article describes the process of building custom third-party FFmpeg libraries and using them in Softvelum Live Transcoder in addition to the libraries which are deployed as part of Live Transcoder package.

Every custom library produced by building FFmpeg has its own licensing terms, so every library must be examined for those terms prior to any usage or distribution, including but not limited to patent licensing terms.

Softvelum, LLC is not responsible for any license or patent infringement which can occur as a result of any FFmpeg custom build usage by Live Transcoder users.

2. Building FFmpeg


This section describes how you can build FFmpeg for further usage in Transcoder.

We strongly recommend trying the custom build approach in a testing environment first. Once you get a consistent result there, you may apply it to your production environment.

If something goes wrong after any of the steps and you'd like to revert it, just re-install Live Transcoder. This will overwrite all libraries with their default copies.

2.1 Making default FFmpeg build


To make sure your environment is ready for custom builds, let's start with building FFmpeg with the default libraries for Live Transcoder.

First, download the FFmpeg package. As required by the FFmpeg license, we've uploaded the FFmpeg package and its build script to our website.

Second, run the shell script in the same directory where you've just downloaded FFmpeg. It has all the commands needed to get a working copy of FFmpeg. Its compiled libraries can be used with Live Transcoder as is.

You may get errors related to missing packages, like the Freetype or Speex libraries. Just install the respective packages using this command:
sudo apt install libfreetype6-dev libspeex-dev
You'll be able to proceed with building after that.
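
To recap, the overall flow of this step looks roughly like the following sketch (the archive and script names are hypothetical placeholders; use the actual files downloaded from our website):
tar xzf ffmpeg-custom-package.tar.gz   # hypothetical package name
./build-ffmpeg.sh                      # hypothetical build script name, run in the same directory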

2.2 Making and using custom FFmpeg build


Now that you have FFmpeg ready for builds, you may add a third-party encoder. This depends on what encoder you'd like to add, so refer to your library's documentation for installation details.

Once the encoder is installed, you need to modify your build script to include it by changing the following line:
--enable-encoder=aac,png,mjpeg,customvideocodec,customaudiocodec \
Append your custom encoder name to that line. This is the name used within FFmpeg and which will later be used in Live Transcoder; in this case you can see "customvideocodec" and "customaudiocodec". You may also need to append additional lines for other parameters, so check the library documentation for more information.
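
As an illustration only: if you were adding the open-source libx265 HEVC encoder (note that x265 is GPL-licensed, so review its licensing terms and section 1 above before using it), the relevant configure lines could look like this:
--enable-gpl \
--enable-libx265 \
--enable-encoder=aac,png,mjpeg,libx265 \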

You can find examples of other custom build scripts on our GitHub.

Once the build is over, you can use the new library.

2.3 Using libraries


You can install the libraries for Live Transcoder by copying them from the "build/lib/" subdirectory of your build directory into the proper location.

Run this command on Ubuntu to see where Transcoder libraries are located:
dpkg -L nimble-transcoder
Most probably your directory will be /usr/lib/x86_64-linux-gnu/.

On CentOS you can run this command to see where it is:
rpm -ql nimble-transcoder

Once you find the location, you can overwrite the libraries by copying them from your build directory to the Transcoder location.
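
For example, on Ubuntu the copy step might look like this (a sketch assuming the default location mentioned above; double-check the path reported by dpkg or rpm before overwriting anything):
sudo cp build/lib/*.so* /usr/lib/x86_64-linux-gnu/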

2.4 Re-start Nimble Streamer


The last step to make those new libraries work is to re-start Nimble Streamer using the command required by your specific OS.

For Ubuntu it's this one:
sudo service nimble restart
You can find the commands for other OSes on the installation page.

3. Using custom libraries in Live Transcoder


Now that you have the custom library available, you can start using it from your Live Transcoder web UI.

Create a Transcoder scenario as usual and add a new encoder element. You can watch this tutorial video to see how transcoding scenarios are created.

For a custom video codec, follow these steps:
  1. Drop a video encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" dropdown to the "FFmpeg" value.
  3. In the "Codec" field, specify the encoder name as defined in the video encoder library, e.g. "customvideocodec" in our example. See section 2.2 regarding the codec name in build parameters.


A custom audio codec is added the same way:

  1. Drop an audio encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" field to "FFmpeg".
  3. In the "Codec" field, specify the encoder name as defined in the audio encoder library, e.g. "customaudiocodec" in our example. See section 2.2 regarding the codec name in build parameters.



Besides that, you can specify any other parameters supported by your custom encoder.

That's it. Using this technique you can use third-party libraries which are not yet available in Live Transcoder out of the box.

If you have any questions regarding this feature set usage, don't hesitate to contact us and show us your use case.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder tutorial videos, Transcoder documentation reference

December 30, 2019

2019 summary

As the year 2019 is over, we want to recap the most significant products and features which we introduced during the past 12 months.

We'd like to remind you that you can track all the latest Softvelum changes via our Twitter, Facebook and LinkedIn posts and even our YouTube channel news podcasts.

Qosifire web service


This year we released a new product called Qosifire. It's a streaming quality monitoring web service which allows tracking live HLS, RTMP and Icecast streams. Qosifire checks stream correctness from the protocol integrity and general content consistency viewpoints. The Qosifire agent software checks streams 24/7 on your own server, while our web service console collects the data for analysis and sends alerts via email and mobile push notifications.

Read more about why you may need Qosifire for your streaming infrastructure and how you can get started with Qosifire. In addition, read a Qosifire review article by Jan Ozer and find more information in Qosifire knowledge base.

You can also run a free 30-second checkup of your live stream without signing up.

Nimble Streamer bundle updates


As we explained in January, Flash has been gradually removed from all browsers, which caused the decline of RTMP playback. This primarily affects live low latency streaming, which is why we've been working on low latency improvements in our products.

SLDP. First of all, SLDP ABR capabilities sparked wide adoption among our customers. They use Nimble Streamer for their delivery edges and play their content via HTML5, Android and iOS players with a latency of about one second.

Apple introduced Low Latency HLS earlier this year and released it to the developer community.
Apple Low Latency HLS is now supported in Nimble Streamer with MPEGTS, audio-only and fMP4 containers. Read this introduction article which describes Nimble Streamer setup and LL-HLS usage. As of iOS 13.3, Apple hasn't moved LL-HLS out of beta yet, so we don't have a player app available, but our iOS SLDP Player SDK can provide this capability for our subscribers.
By the way, LL-HLS works on top of the HTTP/2 implementation available in Nimble Streamer, which you can also use for HLS and MPEG-DASH live streaming delivery.

SRT. Another outstanding technology which we improved this year was SRT, reliable delivery over UDP. As a member of the SRT Alliance, we contributed to the community and kept improving the user experience, allowing latency and maxbw to be tuned for better reliability. Our products always ship the latest stable version of the SRT library to make sure they include all the latest improvements.

Icecast. Speaking of other improvements, we added more features related to Icecast metadata as described on our Icecast feature page.

SSL. For those of our customers who use Certbot with Let's Encrypt, we made a detailed description of using this duo with Nimble Streamer.


Live Transcoder has been improved in several ways as well. First, take a look at Transcoder overview screencast and Transcoder documentation reference page to see what we got.

We've added the SVT-HEVC software library for H.265/HEVC encoding in Live Transcoder for Nimble Streamer.
This feature utilizes the latest improvement, the ability to run an encoder out-of-process, which protects the main Nimble Streamer process in case some encoder library crashes.

HEVC in general has been on the rise this year. To meet customers' demands we've released experimental support of H.265/HEVC over RTMP in Nimble Streamer and the Larix Broadcaster apps.

As for encoder libraries, QuickSync hardware acceleration is now available on Ubuntu which makes it easier to install.

Nimble Advertizer was improved throughout the year to handle SCTE-35 markers.
Read the Advertizer spec for full details.

Reference pages. Last but not least, we added a couple of useful digest pages:


Mobile solutions


Our mobile solutions were also improved this year.

One of the most significant improvements is adding SRT playback into SLDP Player for Android and iOS. You can also take a look at our SRT digest page to find out more about product support for this technology.

As was mentioned earlier, our iOS SLDP Player SDK is able to provide Low Latency HLS playback capabilities for those who would like to try this new technology. Feel free to subscribe to iOS SLDP Player in order to obtain the latest version and build your own app with LL-HLS.

We also released Larix Screencaster for iOS - a highly anticipated great addition to our mobile apps bundle.

Larix Broadcaster is now able to produce RTMPS and RTSPS, which means RTMP and RTSP can be delivered via SSL. It's a great advantage for those who would like to secure their live streams in insecure environments like mobile networks or public WiFi.
Larix also has ABR support for outgoing streams, which means it can lower the bitrate and framerate according to network conditions.

The Softvelum website now has a Larix documentation reference with links to all articles and pages related to mobile streaming with our products.

You can read the SDK release notes to find out more about our updates and releases.




Softvelum team wishes you a Happy New Year and looks forward to bringing you more features and improvements!



Follow us via Twitter, Facebook, LinkedIn and YouTube to get updates on time.

December 29, 2019

The State of Streaming Protocols - 2019 Q4

The Softvelum team keeps tracking the state of streaming protocols. It's based on stats from the WMSPanel reporting service, which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data on more than 17.8 billion views. Total view time for our server products is 2.25 billion hours this quarter, or 24+ million view hours per day.

The State of Streaming Protocols - Q4 2019

You can compare these numbers with metrics from Q3 2019:

The State of Streaming Protocols - Q3 2019

Most protocols kept the same share, with HLS controlling most of the delivery landscape.


We'll keep analyzing protocols to see the dynamics. Check our updates on Facebook, Twitter and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

December 24, 2019

Introducing Apple Low Latency HLS in Softvelum products

HLS, as defined in RFC 8216, is currently a de-facto standard for live and VOD delivery. Playlists and chunks delivered over HTTP/1.1 provide simplicity of implementation, wide platform availability, adaptive bitrate support and scalability. These advantages allowed it to gain the biggest share of the customer base. However, it has some disadvantages for live streaming, primarily because chunk-based delivery means many seconds of delay at the start of a live stream. The industry has been making improvements on top of HLS, and then Apple released its spec update to address the existing issues.

Low Latency HLS


Low Latency HLS (LL-HLS) is the next generation of Apple's HTTP Live Streaming protocol introduced in early 2019.
Several key features improve it over regular HLS for live delivery:
  • Partial segments (parts) can be accessed before full chunks of content are available.
  • Server side can use HTTP/2 Push for sending parts.
  • The server can hold playlist requests so that the latest parts are delivered as soon as they appear.
The Softvelum team added LL-HLS to its product bundle. We provide customers with playback capabilities via SLDP Player for iOS, and Nimble Streamer allows generating LL-HLS for all supported containers: fMP4, audio-only and MPEGTS.

1. SLDP Player to play LL-HLS


Apple still has LL-HLS in beta as of iOS 13.3 at the end of December 2019, so there are a number of shortcomings in its implementation. The main concern is that an app using the native iOS LL-HLS player implementation cannot be published to the App Store yet. The lack of availability in browsers and on other platforms is also a big blocker so far. So the only way to try the playback for development purposes is to build your own app.

The SLDP Player SDK for iOS provides full-featured Low Latency HLS playback on iOS devices. It covers live streams from any LL-HLS-capable source such as Wowza Streaming Engine and Nimble Streamer, and it also supports regular HLS from any available source.

If you'd like to build your own low latency playback app, you can get the player SDK from our team for further testing and integration. Once LL-HLS goes from Apple beta to production (in early 2020, per Apple), you'll be able to have a full-featured app and publish it under your brand.

2. Nimble Streamer for LL-HLS transmuxing


The Nimble Streamer software media server can take any supported live input stream and re-package it into Low Latency HLS. Here are the steps you need to follow to make it work.

2.1 HTTP/2 and config setup


2.1.1. LL-HLS uses HTTP/2 over SSL as its transport protocol, so you need to enable it before performing any further setup.
Please follow this HTTP/2 setup article to make this protocol work in Nimble Streamer.

2.1.2. In addition to that, you need to add this parameter to nimble.conf and restart Nimble Streamer (read this page for config setup details):
hls_add_program_date_time = true
If a client tries to access an LL-HLS stream via HTTP/1.1, or if HTTP/2 is not properly set up, the player will fall back to regular (legacy) HLS and will not get any advantages of LL-HLS.

You can check whether Nimble Streamer delivers HTTP/2 by looking at the access log. To enable access logs, add this parameter to nimble.conf the same way you did for the other parameters:
log_access = file
Once you re-start Nimble, you'll be able to view the log. On Ubuntu it's located in /var/log/nimble/access.log by default. Now when you request your regular HLS live stream over https:// using curl or an HTTP/2-capable player, you'll get this kind of record in the access log:
Dec 24 17:43:09 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/playlist.m3u8 HTTP/2" 200 84 1114 372 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
You can see HTTP/2 there, which means it's working. Otherwise it will show HTTP/1.1, which means you need to check what's wrong. Contact us in case of issues.
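
To summarize this step, the relevant nimble.conf lines look like this (the file is typically located at /etc/nimble/nimble.conf; this is just an excerpt, the rest of your config stays unchanged):
hls_add_program_date_time = true
log_access = file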

2.2 Live streaming setup


Now you need to set up transmuxing settings via the WMSPanel web service. If you are not familiar with live streaming setup in Nimble Streamer, please refer to the live streaming digest page or the respective input protocol pages, such as RTMP streaming. Please make sure you have a correctly working live stream (like regular-latency HLS or SLDP) before trying to use LL-HLS.

Once you have a live stream set up in WMSPanel, go to the Nimble Streamer top menu and select Live streams settings. You will see the Global settings tab for the selected server (and you may create application-specific settings as well).


Currently, Nimble Streamer supports all 3 containers available for HLS; you can see their respective checkboxes on the screenshot above:
  • HLS - HLS with the audio-only container. Audio-only is optimized for audio delivery with reduced size. ID3 tags are also inserted into each audio part.
  • HLS (MPEGTS) - MPEG-TS, currently the only container with video+audio support for LL-HLS.
  • fMP4 - fragmented MP4. Notice that fMP4 container playback has a couple of issues related to Apple's current player implementation as of iOS 13.3; please refer to section "3. Known issues" below for more information.
Once you select any of those containers, WMSPanel will show the Enable Apple's Low Latency HLS checkbox, which you need to select. It will also show the HLS part duration edit box to define the parts' duration in milliseconds; we recommend using the default value of 1000 ms, see section "3. Known issues" for details.

Once LL-HLS is enabled, you need to re-start the input stream so Nimble Streamer can start producing the LL-HLS output stream.

2.3 Workflow and playlists


Now that the setup is done, you can use the player to consume the stream via the usual playback URL. The main playlist will have proper chunklists with content conforming to the LL-HLS spec, as shown in the example below.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:02.609Z
#EXTINF:5.995,
a_6_6016_1.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_12011_2_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:08.604Z
#EXTINF:5.995,
a_6_12011_2.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.362,URI="a_6_18006_3_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:14.599Z
#EXTINF:5.994,
a_6_18006_3.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.384,URI="a_6_24000_4_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:20.593Z
#EXTINF:6.016,
a_6_24000_4.fmp4?nimblesessionid=1


Parts in chunklist. Compared to regular HLS, you see a lot of lines representing parts, like this:
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
The full chunk which contains these parts is described after all of its parts' lines:
a_6_24000_4.fmp4?nimblesessionid=1
All parts within a chunk are numbered starting from zero, so "a_6_18006_3_0.fmp4" means it's the first part of chunk number 3.

Part length. This attribute declares the designated duration of upcoming parts:
#EXT-X-PART-INF:PART-TARGET=0.512
In this example it's 512 milliseconds.

Can block reload. Check this line:
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
The "CAN-BLOCK-RELOAD" declares that media server allows holding playlist request.

Hold playlist request. LL-HLS allows a player to ask the server to hold sending the playlist until a specific chunk and/or part is available for the stream.
So a player may request a part which is going to be available within a few seconds from now, and Nimble Streamer will check whether that part is available. Once the requested part is ready, Nimble returns the playlist.

Check this request example:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59&_HLS_part=5"
The _HLS_msn=59 and _HLS_part=5 parameters indicate that the server must hold the request until Nimble Streamer has part number 5 of chunk number 59 or later, and only then return the playlist. You can also use only the _HLS_msn=59 parameter; in this case the playlist is sent out only once the full chunk is available.
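
For instance, a hold request for the full chunk only (same hypothetical host and session as in the example above) would look like this:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59"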

For the original request above (with _HLS_part=5), the resulting chunklist will look like this:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:55
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:26.599Z
#EXTINF:5.994,
a_6_330006_55.fmp4?nimblesessionid=1
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:32.593Z
#EXTINF:6.016,
a_6_336000_56.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_342016_57_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:38.609Z
#EXTINF:5.995,
a_6_342016_57.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_348011_58_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:44.604Z
#EXTINF:5.995,
a_6_348011_58.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_5.fmp4?nimblesessionid=1"
You can see it ends with part a_6_354006_59_5.fmp4 - part number 5 of the upcoming chunk 59. That full chunk will be available only a few seconds later, but the player can already start playback, which helps a lot with reducing latency.

Push the requested part. In addition to requesting a specific part upon its arrival, a player may ask Nimble Streamer to make an HTTP/2 Push of that part to reduce the playback latency even further. This is done by adding the "_HLS_push=1" parameter to the URL. If we look at the Nimble Streamer access logs, we'll see the following actions:

Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=0&_HLS_push=1 HTTP/2" 200 84 1114 372 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_0.aac?nimblesessionid=18 HTTP/2" 200 0 49896 662 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=1&_HLS_push=1 HTTP/2" 200 84 1180 341 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_1.aac?nimblesessionid=18 HTTP/2" 200 0 49828 568 "-" "AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

As you can see, the player sends a hold-playlist request (described earlier) with a specific part number and the _HLS_push=1 parameter. Nimble Streamer returns the playlist in response and makes an HTTP/2 Push of the requested part.
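
If you'd like to reproduce such a request manually, here's a sketch based on the log records above (note that command-line clients like curl generally don't consume HTTP/2 server push, so this is mainly useful for inspecting the playlist response):
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=0&_HLS_push=1"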

Performance. With all these specific actions, Nimble Streamer generates and serves parts within the HLS stream with high efficiency. From a resource consumption perspective, LL-HLS processing costs the same as handling regular-latency playlists and chunks.

3. Known issues and troubleshooting


At the moment, Apple's native player on iOS 13.3 has the following problems with its LL-HLS implementation which may affect the end-user experience.

1. fMP4 video+audio. If you use the fMP4 container, you will only be able to get either the video or the audio component working. Video+audio fMP4 streaming is not working properly yet. You can try using the MPEGTS container for video+audio instead.

2. Part duration. If you set the part duration to less than 1000 ms, video will not work at all, so we recommend setting the part duration to "1000".

We are sure those issues will be fixed in Apple's upcoming releases. Meanwhile, on iOS 13.3 you'll have to test with the aforementioned limitations.

3. Interleaving compensation. If you have a video+audio stream, you may have playback issues due to interleaving as described in this article. This kind of issue becomes even more severe in the case of low latency playback. To fix it, you can try enabling interleaving compensation with Min. delay set to zero, see the image below.





Feel free to try Nimble Streamer with Low Latency HLS and buy the SLDP Player SDK to get your hands on iOS playback.

Let us know if you have any questions.

December 17, 2019

Running encoders in out-of-process mode

Live Transcoder for Nimble Streamer allows using a number of encoder libraries for producing output stream content. All encoding activities are performed in the same process as the other transcoding pipes.

However, there are cases when encoder libraries may cause issues. Some experimental encoders may be unstable under extensive usage, and even legacy encoders may in some cases act unpredictably because the encoding process is very complicated. These issues may cause the main Nimble Streamer process to go down and stop processing streams, which affects the overall robustness and stability of the streaming infrastructure.

To address that, we've made Live Transcoder support running encoders in out-of-process mode. It makes Nimble Streamer start a separate process for each encoder in a transcoder scenario. If the encoder fails, its process is stopped and automatically re-started without affecting the overall transcoding. The output is also not interrupted, so your end-users will notice just a short image freeze.

By default, out-of-process encoding is enabled only if you choose the SVT-HEVC encoding library for a specific encoder. For all other encoders, this capability is disabled by default.

To enable this feature, open the encoder details dialog, enter the new nimble-execution-mode parameter on the left and its out-of-process value on the right, as shown in the image below.
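
In key/value form, the custom parameter is simply:
nimble-execution-mode = out-of-process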



Once you save the transcoding scenario and it's synced to the Nimble Streamer instance (which happens within 30 seconds), the change takes effect.

Out-of-process encoding is available in Nimble Streamer starting from version 3.6.3-2 and Live Transcoder starting from version 1.1.1-2.

This improvement brings the ability to have more robust and reliable transcoding. Let us know if you have any questions regarding this feature's usage.


Related documentation


Live Transcoder, Transcoder documentation reference

December 12, 2019

SVT-HEVC H.265 encoding setup in Nimble Streamer Transcoder

Nimble Streamer Live Transcoder supports various codecs using a number of encoding libraries. H.265 (HEVC) encoding had been supported only via NVENC and QuickSync hardware acceleration, so we were looking for the best way to provide a software alternative.

Now Live Transcoder can use SVT-HEVC for software encoding. The Scalable Video Technology for HEVC Encoder by Intel® is an HEVC-compliant encoder library core highly optimized for Intel Xeon™ Scalable and Xeon™ D processors. However, it can also be used on other hardware supported by Live Transcoder.


The output can be delivered by Nimble Streamer via any protocol which supports HEVC delivery.

The library is delivered with Nimble Live Transcoder and can be used like any other software encoder. The setup process is described below.

Install Live Transcoder


Live Transcoder is a premium add-on for the Nimble Streamer freeware media server. You'll need to subscribe to a license in order to start using it.

You need to follow these installation instructions in order to set it up for further usage.

Create transcoding scenario


Live stream transcoding is set up using transcoding scenarios. Each scenario is a graphical representation of content transformation for video and audio components. It has decoder elements to specify how the stream is decoded, filter elements to define the content transformation, and encoder elements to put the content into the right codec.

You can refer to Documentation reference for setup details, including video tutorials.

The next section explains how to use the encoder element to set up HEVC encoding with SVT-HEVC.

SVT-HEVC encoder settings


Once you've set up the designated transcoding scenario, add an encoder element for your output and choose libsvthevc from the list of Encoder field values.


You'll be able to specify Key frame alignment from the list of supported values, similar to those used in libx264 setup. The profile value can be either "main" or "main10". In addition to that, you can define other parameters specific to SVT-HEVC.

Live Transcoder supports a subset of SVT-HEVC encoder parameters; you can read their respective descriptions on the project page (a brief example follows the list below):

  • asm
  • base-layer-switch-mode
  • brr
  • constrd-intra
  • deblock
  • encMode
  • fps-denom
  • fps-num
  • hierarchical-levels
  • hme
  • hrd
  • interlaced-video
  • intra-period
  • irefresh-type
  • lad
  • level
  • lp
  • max-qp
  • min-qp
  • pred-struct
  • profile
  • q
  • rc
  • rt
  • sao
  • scd
  • search-h
  • search-w
  • sharp
  • speed-ctrl
  • ss
  • tbr
  • threads
  • tier
  • tile_col_cnt
  • tile_row_cnt
  • tile_slice_mode
  • umv
  • use-default-me-hme
  • vbv-bufsize
  • vbv-init
  • vbv-maxrate
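
For example, a hypothetical combination of custom parameters in the encoder element could look like this (the values are for illustration only; check the SVT-HEVC documentation for exact meaning, ranges and units):

profile = main
rc = 1
tbr = 2000000
encMode = 9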


Also notice that SVT-HEVC runs in out-of-process mode by default; you can read about it in this article.


If you have any questions related to transcoding, feel free to contact us.

Related documentation

Live Transcoder, Transcoder documentation reference

November 20, 2019

HTTP/2 in Nimble Streamer

The Internet as we know it was created on top of HTTP versions 1.0 and 1.1, with HTTP/1.1 being dominant for the last 20 years. The growing demand for new features and capabilities revealed several shortcomings in it, which were addressed in HTTP/2, developed and adopted as a successor to 1.1. You can read the Introduction to HTTP/2 by the Google team to see what exactly HTTP/2 has to offer.

At the moment HTTP/2 is supported in all modern browsers, and every time users try to reach some resource on the Internet, their browser tries to connect through this new protocol version. If it succeeds, all further communication is performed via this new channel. If HTTP/2 is unavailable, the browser falls back to HTTP/1.1.

The Softvelum team has implemented HTTP/2 for some of Nimble Streamer's HTTP-based output protocols to provide our customers with one more advantage for their end-users and establish the ground for further development.

HTTP/2 in Nimble Streamer


Nimble Streamer now supports the HTTP/2 protocol in the most popular live streaming scenarios. No change to input streams is required; you only need to enable the feature as described in the "Enabling HTTP/2" section below.

HLS

HTTP Live Streaming (HLS) by Apple is a de-facto standard for end-user media consumption, so we implemented full live streaming support over HTTP/2:

  • Live HLS streams with fMP4/CMAF, MPEG-TS and audio-only containers are fully supported.
  • Ad insertion in live HLS via Nimble Advertizer works as usual.
  • HLS DVR output works fine with all of its features.

MPEG-DASH

Live MPEG-DASH streams can also be played via HTTP/2. Both HLS and DASH output can be generated from the same live input, so with HTTP/2 enabled you can get both outputs through it.


Other protocols

At the moment, only live streaming via the aforementioned protocols is supported over HTTP/2. HTTP re-streaming, VOD, Icecast and HTTP MPEG-TS are processed only via HTTP/1.1.

Enabling HTTP/2


HTTP/2 can be used only when Nimble Streamer streams over HTTPS, so in order to make it process HTTP/2 requests, you need to do the following.


After that, you'll be able to use HTTP/2 to reach live streams with the HLS and MPEG-DASH protocols enabled.
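
One quick way to verify the result is to request a playlist with an HTTP/2-capable client, for instance (a sketch assuming your server streams over HTTPS and curl is built with HTTP/2 support; the URL below is hypothetical):
curl -kv --http2 "https://your-server/live/stream/playlist.m3u8" -o /dev/null
If the negotiation succeeds, the verbose output shows an "HTTP/2 200" response line; otherwise the request falls back to HTTP/1.1.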

LiteSpeed HPACK library


Nimble Streamer uses the LS-HPACK library for encoding and decoding HTTP headers using the HPACK compression mechanism.

The Softvelum team would like to say thank you to the LiteSpeed team; we appreciate their effort and technical expertise. We've made several contributions to the LS-HPACK code base and plan to continue that support as we move forward in our HTTP/2 development.



Let us know if you have any thoughts or feedback regarding HTTP/2-based streaming.

Related documentation


Nimble Streamer, Nimble Advertizer, HLS in Nimble Streamer, MPEG-DASH in Nimble Streamer

November 6, 2019

Wowza Streaming Engine and WMSPanel agent upgrade

Wowza Media Systems has just released Wowza Streaming Engine version 4.7.8, which has a big set of major updates. Softvelum has a wide feature set for this media server, so we keep our software up-to-date with all new changes.

After this release, if you install the WMSPanel agent using the common procedure, it will work fine on any Wowza Streaming Engine version.

If you already have the WMSPanel agent installed on your server, it will stop working after this Wowza update.

So if you plan to upgrade to the latest Wowza Streaming Engine release, please follow these steps:
  1. Upgrade Wowza Streaming Engine per Wowza instructions.
  2. Upgrade WMSPanel agent for Wowza using this procedure: https://wmspanel.com/docs/wowza_upgrade
The new WMSPanel agent version number will be no less than 4.0.0.10308 after that.

This will ensure a smooth transition from the WMSPanel reporting viewpoint.

Please contact us if you have any issues or questions.

WMSPanel API to control push API

Nimble Streamer provides various push APIs which allow controlling the streaming process, such as Publish control, Pay-per-view, or playback authorization.

Typically you configure those API settings via the web UI by opening the "Control" -> "API setup" menu.

And now we have an API for API setup! You can use the "Push API settings" API methods to control these settings by making API calls. Please open the respective section of the API reference.
It has the following methods:

  • Get global push API settings gives parameters of "Global push API" tab.
  • Set global push API settings sets parameters for "Global push API" tab.
  • Get list of servers push API settings gets list of servers from "Servers push API" tab.
  • Get server push API settings gets parameters of selected server.
  • Set server push API settings sets parameters for a server.
  • Remove server push API settings removes parameters.

With those methods you can dynamically control your push-API-based frameworks.
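
As a rough sketch, calling such a method boils down to an authenticated HTTP request to the WMSPanel API; for example (the endpoint path below is hypothetical, take the exact path from the API reference, and replace client_id and api_key with your own credentials):
curl "https://api.wmspanel.com/v1/push_api_settings?client_id=YOUR_CLIENT_ID&api_key=YOUR_API_KEY"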

You can check our Nimble Streamer configuration reference page for more API descriptions.

Related documentation