October 30, 2017

fMP4 HLS ABR - live AVC and HEVC streaming in Nimble Streamer

Recently Apple announced support for fragmented MP4, or fMP4, in the HTTP Live Streaming protocol. This allows content to be played via HLS without the need for the traditional MPEG-2 Transport Stream container.

Proper support for fMP4 is now available in the latest versions of Apple's operating systems - iOS 11, macOS High Sierra and tvOS 11. So if your viewers have appropriate devices with the latest OS updates, you can use the full power of HLS for live streaming.

Nimble Streamer is the first software media server with full support for fMP4 in HLS for live streaming scenarios. Let's see which features of our products can be used to take advantage of this new capability.

fMP4 H.264/AVC live streaming


H.264/AVC is fully supported in Nimble Streamer for all protocols that are able to carry it. As H.264 is the default codec for HLS, it's been in our server from the beginning, so Nimble is now capable of delivering it with fMP4 in live mode.

fMP4 HLS with H.264 can be generated by both transmuxing and transcoding. Nimble Streamer allows taking RTMP, RTSP, MPEG-TS, SRT, UDT and HLS with H.264 as input.

The transmuxing engine takes that input and re-packages it into fMP4 HLS. Whatever protocol is used, Nimble handles it equally efficiently. You can read this article to see some examples of low resource consumption.

Live Transcoder allows generating AVC content which is fully compatible with HLS via any container. It takes content via the same protocols as the transmuxing engine to perform further transformations. These include decoding with a software decoder, Intel QuickSync or NVidia hardware acceleration. Once the content is decoded, the transcoder allows applying any available filters and then performs efficient encoding. Encoding can be done with libx264, libVA, Intel Quick Sync or NVidia NVENC.

Transcoding scenarios are controlled via a drag-n-drop web UI and applied dynamically without stream interruption. Just move scenario elements and apply the settings - your users will see the transformations in a few seconds.

Nimble Live Transcoder has low resource usage and high performance, which you can see in a benchmark test with a Tesla M60 performed on the IBM Bluemix Cloud Platform.

fMP4 H.265/HEVC live streaming


Nimble Streamer has wide support for H.265 (HEVC) in both transmuxing and transcoding. HEVC can be taken as input for processing via RTSP, MPEG-TS, SRT and UDT.

Transmuxing of protocols which contain HEVC is performed with the same highly efficient re-packaging engine as used for H.264.

Live Transcoder generates HEVC content fully compatible with fMP4. It allows decoding HEVC with a software decoder, Intel Quick Sync or NVidia hardware. Encoding currently can be performed only in hardware using Intel Quick Sync and NVidia NVENC. All filtering and dynamic web control capabilities are available for HEVC the same way as for other codecs.

So HEVC content delivered with HLS can now be played on all compatible devices with the latest OS updates.

Adaptive bitrate fMP4


One of the HLS innovations is the ability to combine H.264/AVC and H.265/HEVC streams in the same adaptive bitrate stream. This allows using the appropriate codec for each resolution according to your product requirements.

Nimble Streamer provides an easy way to create and use ABR streams with this new capability. The process is described in the ABR streaming setup article, which shows how to combine streams and use them as needed.

Setup in Nimble Streamer


To generate fMP4-powered HLS streams, you can use any source of live streams supported by Nimble Streamer. Check the instructions for the use cases below:

To enable fMP4 streaming, select the HLS (FMP4) checkbox on your Live Streams Settings page. This will un-check the "regular" HLS and HLS (MPEGTS) options. The former generates standard HLS chunks, while the latter can be used for audio-only streams with MPEG-TS chunks.
Warning! After you enable fMP4 for your application or for the entire server, its streams will play properly only on appropriate Apple devices. All other playback devices and software, such as Android or PC web players, will not be able to play them. We recommend creating a separate application for fMP4 delivery if you have different target platforms for your streams. This may change in the future, but for now you need to consider this factor.
In addition to fMP4 you can select any other supported output protocols, as shown in the picture below.

Selecting HLS with fMP4
As in any other live streaming scenario, the resulting output streams can be found on the Nimble Streamer -> Live Streams page in the Outgoing streams section. You can take a look at a setup example in the RTMP transmuxing article.
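For reference, a typical HLS playback URL from the Outgoing streams list looks like the example below. The host, port, application and stream names here are hypothetical - we assume the default HTTP port 8081 and an application named "live" with a stream named "stream" - so take the exact URL for your setup from WMSPanel.
http://your-server-ip:8081/live/stream/playlist.m3u8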

Take a look at the How to Create a Live HLS Feed With HEVC article by Jan Ozer, which describes HEVC HLS streaming setup.

To set up and use adaptive bitrate streams with fMP4, follow the ABR streaming setup article.

DVR for fMP4


The Nimble Streamer DVR feature set has full support for fMP4 recording and playback. So if you set up DVR recording for an fMP4-enabled application or server, both MPEG2TS and fMP4 playback will be available simultaneously via different URLs. Both HEVC (H.265) and AVC (H.264) codecs are supported.

Transcoding for fMP4


For H.264/AVC or H.265/HEVC transcoding setup, please refer to the links on our Live Transcoder main page and our Transcoder setup video guides on YouTube.


VOD fMP4 HLS


Nimble Streamer also supports fMP4 for VOD HLS with both H.264/AVC and H.265/HEVC content. It also supports ABR VOD via SMIL files.
Read this article for more details.
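As a minimal hypothetical illustration - the exact SMIL format, attributes and file layout are described in the article above, and the file names and bitrates below are just placeholders - an ABR VOD SMIL file combining two renditions might look like this:
<smil>
  <head></head>
  <body>
    <switch>
      <video src="sample_480p.mp4" system-bitrate="1000000"/>
      <video src="sample_720p.mp4" system-bitrate="2500000"/>
    </switch>
  </body>
</smil>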



Feel free to contact us with any feedback or issues regarding this capability.

Related documentation


Nimble Streamer, Live streaming with Nimble Streamer, HLS support in Nimble Streamer, Live Transcoder, ABR streaming setup, Transcoder wildcard scenarios, Paywall feature set in Nimble Streamer, Nimble Streamer supported codecs, fMP4 for VOD HLS

October 24, 2017

Publishing stream to Facebook Live

Nimble Streamer allows publishing live streams to various types of destinations, like other media servers, CDNs, YouTube or Twitch.

The Facebook Live platform accepts RTMPS streams as input, so Nimble Streamer users can use it as a publishing target as well. The setup and usage are simple, so we'll show you how to do it.

As a source for your stream you can use any delivery method supported by Nimble Streamer.

Once the input stream is received by Nimble, it can be re-published to Facebook. Let's see how this can be set up.
Notice that you may also find the Streaming from Larix Broadcaster to Facebook Live article useful.


In this example we assume that you have a stream which has application name "source_app" and stream name "source_stream".

Go to the Nimble Streamer -> Live Streams Settings menu and click on the Republishing tab.



Now you need to take a look at the settings provided by Facebook. They may look like this:
URL: rtmps://live-api-a.facebook.com:443/rtmp/
Key: 10156882222123456?ds=1&a=ATgNxRcW4p654321
Click the Add RTMP button to open the dialog for entering new settings. Fill them in with the source app and stream name and with the Facebook parameters, as shown in the picture below.



As you can see, the Destination address includes the domain part of the original "URL" setting.
You also need to check the "Use SSL" checkbox so the value of the Port field becomes "443". This is mandatory for Facebook publication.
The Destination application field is "rtmp" in this case, and it's taken from the same original URL.

The original "Key" setting is also split into two parts: the first sequence of numbers goes into the Destination stream field, while the rest of the parameters go into the Destination stream parameters field, without the leading question mark.

Now select the servers where you'd like to apply these settings and click OK - your settings will be applied within several seconds.

As soon as your original stream goes online, Facebook Live will start receiving it.

Further usage

You can use Nimble Streamer to deliver your content to multiple destinations simultaneously. Take a look at these examples of RTMP republishing to see what else can be used:

You may also consider re-publishing incoming RTMP streams with inserted ads. Nimble Advertizer provides a framework for inserting pre-roll and mid-roll ads into live streams for further output via RTMP, SLDP and Icecast, with custom business logic and per-user ads. So if you create an RTMP stream with ads inserted and pull it for further re-publishing, you can provide your target CDN with properly sponsored content.
Visit Advertizer web page to find out more about server-side ads insertion functionality.

If you need to change the outgoing content in any way, such as changing the bitrate, use our Live Transcoder for Nimble Streamer to transform it. It has high performance and low resource usage.
For other live streaming scenarios, check our live streaming use cases.

With that, you can create flexible delivery chains using Nimble Streamer as a media hub and WMSPanel as an easy-to-use control panel. Install Nimble Streamer if you haven't done that yet, and contact us if your streaming scenarios need any updates to the described functionality.

Related documentation


Live Streaming features, Live Transcoder for Nimble Streamer, RTMP feature set, Build streaming infrastructure

October 12, 2017

DVR timeline in Nimble Streamer

As most of our customers already know, the DVR feature set in Nimble Streamer allows recording any incoming live stream and provides playback capabilities for it. It includes time range selection and other features.

Now we introduce timeline browsing and playback. Here's how you can enable and use it.

Browse the timeline


To browse the recorded streams, you need to go to Nimble Streamer -> DVR streams menu. There you'll see the list of recorded streams.


To view the DVR stream directly you can click on the question mark on the right side of the list - this will open the sample player dialog showing current playback.

To view the timeline of an individual stream, click on the "clock" icon on the right side of the stream row. You will see the timeline dialog, which will immediately start obtaining data from the Nimble Streamer instance that holds that stream's archive.


This will take a few seconds - no more than the sync-up time, which is typically 30 seconds. Once the data is received, the dialog will show the full timeline of the selected stream's recording.


Here you will see green lines which represent recorded fragments - i.e. the time spans that have recorded data. The blank space shows the absence of any records.

Date and time selectors allow defining the intervals to view. As soon as you start selecting date or time, the display will immediately move the focus so you'll see how far you've moved.


Also, take a look at the navigation bar on top.

The Previous and Next buttons will move the focus to the beginning and ending points of the recorded fragments. Once the focus moves, you will see a red dot at that point.


Zoom in and out buttons allow changing the scale of the timeline. You may also scroll with your mouse or touchpad to do the same.

You may also drag the timeline by the scale at the bottom of the chart to move between zoomed fragments.

The Fit timeline button will reset all zoom and scroll to show the full archive again.

You can play the DVR stream from a selected point simply by clicking on it. See the Play records from selected point section below for more details.

Browsing multiple archives


Besides viewing a single stream archive, you can browse and compare data from multiple streams.

Select the required streams by clicking on the respective checkboxes and click the DVR timeline button.



You will see the same timeline dialog, but it will show all the streams you picked at the same scale and over a common range.


Here you can perform the same operations as for a single stream. Moving among fragments via the Previous and Next buttons is done on the common time scale as well, so you'll jump between streams according to their break points. To play a stream from a selected fragment, just click on it - see the next section for details.

Play records from selected point


You may click on any point of the timeline to start playback from that moment. When you click, you will see the sample player dialog starting the stream from the selected time.


Here you see the Choose URL to play dropdown list of playback URLs - if your server has multiple IP addresses, you will see stream URLs for all of them. Currently only HLS is supported for fragmented playback.

Notice that for HLS you can use both fMP4 and MPEG2TS containers for playback simultaneously via separate URLs. These will be
playlist_dvr_range-12345667-787.m3u8
for MPEG2TS and
playlist_fmp4_dvr_range-12345667-787.m3u8
for fMP4.

You'll be able to select a player which you'd like to test this stream with, and also get the player code by clicking the Show player's code button.

And of course you will see the playback of the recorded stream in the selected browser from the moment you pointed to. The recording will be played until the end of the fragment you selected.

In addition to the timeline, you may access DVR stream thumbnails using a specific URL to insert into your webpages.

Also take a look at this video tutorial to see the timeline in action and get familiar with the most useful DVR features.


Read other documentation articles for more details and full description of available options.


That's it - feel free to browse your archives to see what was going on.

Let us know if you have any questions regarding this functionality.

October 4, 2017

Troubleshooting Live Transcoder

Live Transcoder for Nimble Streamer is used by various companies around the globe in many live streaming scenarios. However, some customers run into typical problems which they bring to our support team.

Some typical troubleshooting techniques for Nimble Streamer itself are described in the Before you post a question to helpdesk article.

We'd like to describe Live Transcoder issues that appear in some cases and can be resolved by the customer.

Q1: Stream stops unexpectedly

Nimble Streamer goes offline and is shown as "offline" in the WMSPanel account.

Check Nimble Streamer logs located at /var/log/nimble/nimble.log

Look for messages of this type:
[2017-10-04 04:23:26 P1486-T2435] [encoder] E: encoder too slow, droped 2038/2039 audio frames for [live/stream]
[2017-10-04 04:24:58 P1486-T2435] [encoder] E: encoder too slow, droped 3765/3766 audio frames for [live/stream]
You can also check the system logs in /var/log/kern.log or /var/log/messages (depending on your Linux distro) to find messages of this type:
Oct 04 04:25:28 loft24330 kernel: [46756.231449] Out of memory: Kill process 1486 (nimble) score 957 or sacrifice child
Oct 04 04:25:28 loft24330 kernel: [46756.249823] Killed process 1486 (nimble) total-vm:52904636kB, anon-rss:32208832kB, file-rss:0kB
Both errors indicate that your server cannot encode all streams in time. The second, system-level error shows that the Linux kernel shut down the process due to an Out of Memory condition.

As a resolution, you should decrease the number of encoded streams on this server. You could set up another server with the Transcoder and process those extra streams there.

Speaking of memory, you should also check and set a proper RAM cache size and make sure that value is enough to store your live stream chunk cache.
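As a hypothetical illustration only - the exact cache-related parameter names and recommended values are covered in the performance tuning article, so verify them there before editing anything - such a setting lives in the Nimble config file and could look like this:
# /etc/nimble/nimble.conf - parameter name assumed, check the performance tuning article
max_cache_size = 4096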

Q2: Unsupported format 12


You may see errors like these:
[2017-10-13 22:18:14 P86076-T96639] [encoder] E: failed to encode video frame for [live/stream]
[2017-10-13 22:18:16 P86076-T96641] [encoder] E: unsupported frame format, 12 
This means you need to change the pixel format.
Open your scenario and add a new custom "format" element prior to your encoder element. The parameter name will be format and the value will be pix_fmts=yuv420p, as shown in this picture.
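In plain text form, the custom parameter from that picture is just the following name/value pair (the equals sign is only notation here - in the UI the name and value go into separate fields):
format = pix_fmts=yuv420p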


Once you apply the settings, the error should be gone.

Q3: NVENC is not available


If you fail to use NVENC as your decoder and/or encoder, you may see messages like this:
nvenc is not available
This means you don't have the NVENC driver installed or properly set up on your system.

Make sure you have the right packages installed. Those would be nvidia-encode-<driver_version> and nvidia-decode-<driver_version>, e.g. nvidia-encode-525.
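As a hypothetical example for an apt-based system - the actual package names vary by distribution and driver version, and on some Ubuntu releases they are prefixed with lib - the installation could look like this:
sudo apt-get install libnvidia-encode-525 libnvidia-decode-525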

Q4: What NVENC drivers should be used?


We highly recommend using official NVidia drivers.
We may decline technical support requests if a customer uses any unofficial or "patched" drivers from third parties.

Q5: Transcoder not found after installation on Windows


Some customers install Live Transcoder on Windows 2008 and, after registering the license, still get a message like
Please install 'nimble-transcoder' package
with additional info about the installation procedure.

First, please try re-starting Nimble Streamer and check again.
If that doesn't help, you may need to install Microsoft update KB2533623. Please follow this instruction to install the update and then re-start Nimble Streamer.

Q6: Failed to create NVENC context


If you use NVENC and experience issues, you may see messages like these:
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create cuda context, gpu=1, flag=4
[2017-10-18 12:32:54 P23976-T24004] [tranmain] I: create cuda ctx, gpu=1, flag=4...
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create nvenc ctx for gpu=1, flag=4, res=2
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create cuda context, gpu=1, flag=4 
The res=2 flag means that NVENC ran out of memory and couldn't create the proper number of contexts for content processing.

You need to use the context sharing approach described in this article.
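As a rough sketch only - the parameter names below are an assumption, so take the exact names and recommended cache sizes from the context sharing article - the approach comes down to enabling context caching and sharing in the Nimble config:
# /etc/nimble/nimble.conf - names assumed, verify against the context sharing article
nvenc_context_cache_enable = true
nvenc_context_share_enable = true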

Q7: Failed to encode, status 10


You may see these messages when using NVENC:
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to init encoder=0x7f1b540008c0, status=10
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to encode video frame for [live/stream_1080]
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to flush nvenc encoder
This also indicates too many contexts in use. Please refer to Q6 above for more details on how to use the context cache and find the right number of contexts.

Q8: Video and audio are out of sync


Sometimes streams are published to external destinations like Akamai or other CDNs with video and audio out of sync. This may be caused by the originating encoders.

Try applying interleaving compensation to your streams to eliminate the desynchronization.

Q9: Closed captions are doubled up after FPS change


When a stream is up-scaled via a filter (e.g. from 29.97FPS to 59.94FPS), the closed captions are doubled in the video stream. Closed captions are embedded into frames, the FPS filter duplicates frames with all their side data, and this leads to duplicated characters.
To fix that, add the video_filter_preserve_cc_enabled = true parameter to the Nimble configuration file; read this reference page for more details about Nimble configuration.
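For illustration, the fix is a one-line addition to the Nimble config file followed by a restart; the file path and restart command below are the usual Linux defaults and may differ on your system:
# /etc/nimble/nimble.conf
video_filter_preserve_cc_enabled = true

sudo service nimble restart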



Feel free to contact us if you have any other questions or issues with Live Transcoder.

Related documentation


Live Transcoder for Nimble Streamer, Nimble Streamer performance tuning, Live Streaming features, Zabbix monitoring of Nimble Streamer with NVidia GPU status

October 1, 2017

Softvelum 2017 Q3 news

The third quarter of 2017 is now over, so it's time to summarize what our team has accomplished.

General announcements


Notice that we've improved our Terms of Service - please take a moment to read them. They refer to the Before you post a question to helpdesk article, which you'll need to get familiar with before sending a question to our team.

Another general improvement: WMSPanel now supports two-factor authentication which can be enabled per-user.

A few words about money: FastSpring, our payment gateway, now accepts UnionPay, so there is yet another way to pay for our products.

As always, take a moment to read our State of Streaming Protocols to see how streaming technologies are currently used among our customers. Spoiler: DASH keeps rising, HLS goes down a bit, SLDP gains momentum.

Nimble Streamer



First, we're glad to announce an article in Streaming Media by Jan Ozer about our Live Transcoder:
Review: Softvelum Nimble Streamer Is Flexible and Well-Featured
We appreciate Jan sharing his opinion and we'll keep improving our products per his feedback.

Now let's check new features.

Our team continuously improves SLDP low-latency streaming technology.

The SLDP HTML5 JavaScript SDK for creating low-latency HTML5 players is now available for licensing, in addition to the existing mobile SDKs.

Other SLDP updates you may find useful:
SRT streaming technology, originally created by Haivision and opened to the open-source community, is now available in Nimble Streamer. It's a protocol which adds reliability to UDP transmission with error correction, encryption and other features that make it a great method for delivering live content across unreliable networks. Softvelum was also among the first participants of the SRT Alliance, taking part in improving the protocol.
You can enable SRT in Nimble by installing the SRT package (see the example below) and making the respective settings.
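As a hypothetical example for a Debian/Ubuntu setup - check the SRT setup article for the exact package name and steps for your OS - the installation could look like this:
sudo apt-get install nimble-srt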

Speaking of new protocols, the UDT streaming protocol is now available in Nimble Streamer. Read this article for setup details.

Live Transcoder has been updated: VP8 and VP9 decoding and encoding are now supported. Read this article for more details about setup and usage. VP8 was also added to the VA API implementation in our transcoder to give more flexibility.

Other interesting Nimble Streamer updates are:


Mobile Streaming


Last but not least.

Mobile playback solutions are now available among our products.
This includes SLDP Player apps for iOS and Android, as well as the respective SDKs for adding playback capabilities into your own apps. Currently the SLDP and RTMP protocols are supported.
The free apps are available on Google Play and in the AppStore.

Mobile streaming products were updated with multiple improvements on Android, iOS and Windows Phone. This includes the Larix Broadcaster and Larix Screencaster apps as well as the respective SDKs.
You can check the history of our mobile releases for all the details of recent SDK releases.


Stay tuned for more features - follow us on Facebook, Twitter or Google+ to get the latest news and updates on our products and services.