December 29, 2017

Softvelum 2017 summary

Happy New Year!

The year 2017 is over, so we'd like to look back at what our company has accomplished during these past months.

It's been a successful year for us - we've stepped into several new areas of the streaming landscape, and we're glad to provide new products and features to our customers.

Speaking of new territories, our team wrote an article to share our view on business competition in our industry - it answers a very frequently asked question, enjoy.

Before moving forward, take a look at the state of streaming protocols: we compared data from 2017, 2016 and 2015 to see the dynamics.

Let's see what we've introduced in 2017.

SLDP


Softvelum Low Delay Protocol is a new technology for reducing last-mile delivery latency for end-users. It's based on WebSockets for better accessibility.
The core features are:
  • Sub-second delay between origin and player
  • Codec-agnostic
  • ABR support
  • HTTP and HTTPS on top of TCP
  • Buffer offset support
Take a look at the SLDP usage description and the most frequent questions that show the protocol's best practices. In addition to low latency, SLDP provides a high level of live stream protection; read this article for more details. Server-side support is available in Nimble Streamer - it covers all capabilities of SLDP.

Client side is covered by SLDP Players.
The web player works in any MSE-enabled web browser on desktop and connected devices. The SLDP HTML5 JavaScript SDK for creating low latency HTML5 players is available for licensing.
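
As a quick illustration, here is a minimal sketch of embedding SLDP playback into a web page with the SDK. The script name, the SLDP.init call, its option names and the ws:// address below are assumptions made for this example only - check the SDK documentation for the exact API and use your own server URL:

<div id="player"></div>
<script src="sldp.min.js"></script>
<script>
  // Hypothetical embed: create a player inside the "player" element
  // and point it to an SLDP stream delivered over WebSockets.
  var player = SLDP.init({
    container: 'player',
    stream_url: 'ws://your-server:8081/live/stream'
  });
</script>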

SLDP is also supported by mobile SDKs. Both Android and iOS have free apps for SLDP and RTMP playback; you may also use the SDKs to add playback capabilities to your own apps.

SRT


SRT streaming technology, originally created by Haivision and released to the open-source community, is available in Nimble Streamer on x64 and ARM platforms. It's a protocol which adds reliability to UDP transmission with error correction, encryption and other features which make it a great method to deliver live content across unreliable networks. Softvelum was also among the first participants of the SRT Alliance (created by Haivision and Wowza) to take part in improving the protocol.

You can enable SRT in Nimble by installing the SRT package and making the respective settings.
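
For example, on Ubuntu the setup typically boils down to installing the package and restarting the server (the nimble-srt package name here is an assumption, check the SRT setup article for your OS and repository details):

sudo apt-get update
sudo apt-get install nimble-srt
sudo service nimble restart

After that, configure the SRT sender and receiver settings in WMSPanel as described in the setup article.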

SRT is a great choice for building delivery networks over unreliable environments. For last-mile delivery, you may use SLDP described above.

Nimble Streamer and Live Transcoder


We're glad to highlight an article in Streaming Media magazine by Jan Ozer about our Live Transcoder: Review: Softvelum Nimble Streamer Is Flexible and Well-Featured.
We appreciate Jan sharing his opinion and we'll keep improving our products per his feedback.


fMP4
Apple recently added fMP4 support to the HLS standard. Softvelum was the first to introduce fMP4 for live streaming. You may set up Nimble Streamer to produce fMP4 HLS live streams along with other protocols. It supports both HEVC/H.265 and AVC/H.264 video, so new Apple devices are able to play it.

DVR
We've also added fMP4 HLS DVR support in Nimble Streamer DVR - it allows streaming recorded content in both fMP4 and legacy containers at the same time.
Speaking of recording, Nimble Streamer and WMSPanel now provide DVR timeline browsing and playback. This allows browsing through the recorded history and playing what you need from any point.

Wildcard ABR was added to support dynamic names for adaptive bitrate streams in addition to the pre-defined ABR settings which were widely used previously.

We've added Periscope stream publication and Facebook Live publication support in our setup UI for the convenience of our users.

UDT streaming protocol is available in Nimble Streamer. Read this article for setup details.

We've ported Nimble to the IBM POWER8 architecture. It's a good addition to the traditional x64 and ARM platforms supported before.


Our Live Transcoder was updated significantly to add new features.

First, check the extensive testing of the NVidia Tesla M60 graphics card in the IBM Bluemix Cloud Platform to see how much it increases the performance of Live Transcoder.

New codecs were also added into the Transcoder.

HEVC
  • You can decode HEVC using NVENC and QuickSync as well as the software decoder.
  • Encoding can be done with NVENC and QuickSync.
Read more about NVENC encoding and QuickSync encoder parameters. Also, the QuickSync setup description was updated for Media Server Studio 2017.


VP8 and VP9
Read this article for more details about setup and usage of those codecs. VP8 was also added to VA API implementation in our transcoder to give more flexibility.

MP3
You can encode audio to MP3 from AAC, MP3, Speex, PCM and MP2.

Hot swap
Another major Transcoder update is stream hot swap. It allows the following two opposite scenarios:
  • Stream failover hot swap, where the main stream is backed up by a secondary stream in case publication fails.
  • Emergency swap, where the main stream is replaced by a replacement stream once that stream becomes available. This is used in cases like the US Emergency Alert System (EAS).
Streams are swapped without interruption, so user playback is not affected.

FDK and VAAPI
We've also added more encoding libraries to the already supported ones: FDK AAC for audio and VA API (libVA) for video.

If you have any issues with Live Transcoder, take a look at the Transcoder troubleshooting guide, which is updated with new cases as we keep supporting our customers.
Please also take a look at the "Before you post a question to helpdesk" article - this is what you'll need to get familiar with prior to sending a question to our team. It also helps you analyze issues without waiting for a response from our team.

Mobile SDKs


Our mobile streaming products were extended significantly.

Larix Broadcaster
Larix Broadcaster and its SDK were updated continuously throughout the year with new features. Its UI was reworked to allow multiple connections and other features.
Take a look at the mobile SDK FAQ to see answers you might be looking for.

SLDP Player
Mobile playback solutions are now available among our products.
This includes SLDP Player apps for iOS and Android, as well as the respective SDKs for adding playback capabilities into your own apps. Currently the SLDP and RTMP protocols are supported. iOS supports H.264/AVC and H.265/HEVC video with AAC and MP3 audio; Android has the same plus VP8 and VP9 on top.
Free apps are available on Google Play and the App Store to check all that in action.

You may also check the SDK release notes for all the latest updates and use this page to subscribe to the SDKs and their support.

WMSPanel


Our WMSPanel web service has been a foundation for most of our products, and we keep improving it with new capabilities.


That's it. Stay tuned for more features in the upcoming year 2018 - follow us on Facebook, Twitter or Google+ to get the latest news and updates of our products and services.


December 27, 2017

The State of Streaming Protocols - 2017 summary

The Softvelum team continues analyzing the state of streaming protocols. It's based on stats from the WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers - there were 3600+ servers on average this year. The number of views has grown from 20 billion in 2016 to 34 billion in 2017 - our customer base is growing, more and more large customers are coming from various verticals, so all major protocols are well represented.

Let's take a look at the chart and numbers of this year:

The State of Streaming Protocols - 2017

You can compare that to the picture of 2016 protocols landscape:

The State of Streaming Protocols - 2016
Here's what we can see from this year.

  • HLS keeps the throne with 60% of the share - it's a de-facto standard for end-user media consumption. Apple keeps improving it with fMP4 and HEVC support, so it's not going away any time soon.
  • RTMP is still in use for live streaming scenarios, especially when it comes to real-time delivery. Despite the continuous decline of Flash, this protocol will remain active for some time.
  • SLDP is a new protocol created by Softvelum for those who cannot use Flash and RTMP but still need real-time delivery. It's based on WebSockets and it can deliver content to MSE-enabled browsers on most platforms. Also, native mobile apps and SDKs can be used to add SLDP playback on Android and iOS. It was introduced back in late May but it has already got a few million views in our customers' networks.
  • MPEG-DASH made a great leap from nearly zero to 6% and it's still gaining momentum. Being a DASH Industry Forum member, Softvelum keeps contributing to the community and supports the wider adoption of this technology.
  • Progressive download is traditionally strong in VOD scenarios and many of our customers use it for media delivery along with HLS.
  • Icecast has stayed around 5% over the last couple of years, and this trend shows significant interest of our customers in audio streaming. Nimble Streamer has full support for this technology.
  • Other protocols are becoming more niche, getting a smaller share over time as you can see.

Bonus track: 2015

Check our stats from 2015 when we collected the data from 2300+ servers.

The State of Streaming Protocols - 2015

We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook and Twitter.

If you'd like to use these stats, please refer to this article by its original name and URL.

November 26, 2017

Mobile SDKs update for November 2017

Hello,

Our team is glad to make a brief announcement about the mobile solution updates released this month.


The following updates were made for the SDKs.

Larix SDK for Android version "1.0.32"
  • Audio callback added for custom audio processing
  • Several improvements
Larix SDK for iOS version "1.0.32"
  • Updated to Swift 4 & XCode 9.1
  • Several improvements
SLDP Player SDK for Android version "2017-11"
  • ABR support
  • HEVC support
  • VP8 and VP9 support
  • MP3 support
SLDP Player SDK for iOS version "2017-11"
  • ABR support
  • HEVC support
  • MP3 support

Subscribe on this page to the respective SDK if you'd like to get updates. You can try them all in action using the free apps listed on this page.


Take a look at the full list of releases and their changes on this release notes page.

If you have any questions regarding the library usage, please feel free to send a request to our engineering helpdesk.


Follow us on Facebook, Twitter or Google+ to get the latest news and updates of our products and services.

November 20, 2017

DVR for fMP4 HLS with HEVC and AVC in Nimble Streamer

As our customers already know, Apple announced the support for fragmented MP4, or fMP4, in the HTTP Live Streaming protocol. Proper support is available in the latest versions of Apple's operating systems.

Softvelum recently introduced live streaming fMP4 support in Nimble Streamer. It allows transmuxing of incoming streams with HEVC (H.265) and AVC (H.264) content. Nimble Streamer is the first software media server to have full support for fMP4 in HLS for live streaming scenarios.

Now we're the first to introduce full support for fMP4 DVR.

Nimble Streamer allows recording any incoming live streams with HEVC (H.265) and AVC (H.264) content to provide playback capabilities - now with the fragmented MP4 container in addition to the existing MPEG2TS container.

The DVR setup is done the same way as before - read this DVR setup article for more details. The setup is the same for AVC and HEVC content.

You can play fMP4 and MPEG2TS HLS DVR streams simultaneously with no need for additional settings - the only difference is in the output URL. If you have a live stream URL like
http://yourhost/live/stream/playlist.m3u8
then your MPEG2TS-based DVR stream URL will be
http://yourhost/live/stream/playlist_dvr.m3u8
To play the fMP4 DVR stream, your URL will be
http://yourhost/live/stream/playlist_fmp4_dvr.m3u8
So you can provide proper links for different target device types according to their capabilities.

The ABR DVR stream for fMP4 would be played the same way - just specify ABR app name in URL with the same playlist name:
http://yourhost/live_abr/stream/playlist_fmp4_dvr.m3u8
Of course, it can also be used with MPEG2TS ABR streams.

With this new type of URL you can use all the same DVR features.

Watch DVR recording and playback in Nimble Streamer tutorial to get familiar with the most used features.

That's it - feel free to use this new feature and let us know if you have any questions regarding this functionality.


Nimble Streamer also supports fMP4 for VOD HLS for both H.264/AVC and H.265/HEVC content. It also supports ABR VOD via SMIL files. Read this article for more details.

Follow us in social media to get updates about our new features and products: YouTube, Twitter, Facebook, LinkedIn, Reddit, Telegram

Related documentation


October 30, 2017

fMP4 HLS ABR - live AVC and HEVC streaming in Nimble Streamer

Recently Apple announced the support for fragmented MP4, or fMP4, in HTTP Live Streaming protocol. This allows the content to be played in HLS without the need for the traditional MPEG-2 Transport Stream container.

Now proper support for fMP4 is available in the latest versions of Apple's operating systems - iOS 11, macOS High Sierra and tvOS 11. So if your viewers have appropriate devices with the latest OS updates, you can use the full power of HLS for live streaming.

Nimble Streamer is the first software media server which has full support for fMP4 in HLS for live streaming scenarios. Let's see what features of our products can be used to utilize that new capability.

fMP4 H.264/AVC live streaming


H.264/AVC is fully supported in Nimble Streamer for all protocols that are able to carry it. As H.264 is the default codec for HLS, it's been in our server from the beginning, so now Nimble is capable of delivering it with fMP4 in live mode.

fMP4 HLS with H.264 can be generated by both transmuxing and transcoding. Nimble Streamer allows taking RTMP, RTSP, MPEG-TS, SRT, UDT and HLS as input with H.264.

The transmuxing engine takes the incoming stream and re-packages it into fMP4 HLS. Whatever protocol is used, Nimble handles it equally efficiently. You can read this article to see some examples of low resource consumption.

Live Transcoder allows generating AVC content which is fully compatible with HLS via any container. It takes content via the same protocols as the transmuxing engine to perform further transformations. They include decoding with the software decoder, Intel QuickSync or NVidia hardware accelerator. Having the decoded content, the transcoder allows applying any available filters and then performing efficient encoding. Encoding can be done with libx264, libVA, Intel Quick Sync or NVidia NVENC.

Transcoding scenarios are controlled via a drag-n-drop web UI dynamically, without stream interruption. Just move scenario elements and apply settings - your users will see the transformations in a few seconds.

Nimble Live Transcoder has low resource usage and high performance, which you can see in a benchmark test with Tesla M60 made in the IBM Bluemix Cloud Platform.

fMP4 H.265/HEVC live streaming


Nimble Streamer has wide support for H.265 (HEVC) in both transmuxing and transcoding. HEVC can be taken for processing via RTSP, MPEG-TS, SRT and UDT as input.

Transmuxing of protocols which contain HEVC is performed with the same highly efficient re-packaging engine as used for H.264.

Live Transcoder generates HEVC content fully compatible with fMP4. It allows decoding HEVC with the software decoder, Intel Quick Sync and NVidia hardware. Encoding can currently be performed only in hardware using Intel Quick Sync and NVidia NVENC. All filtering and dynamic web control capabilities are available for HEVC the same way as for other codecs.

So HEVC content delivered with HLS can now be played on all compatible devices with latest OS updates.

Adaptive bitrate fMP4


One of HLS' innovations is the ability to combine H.264/AVC and H.265/HEVC streams in the same adaptive bitrate stream. This allows using the appropriate codec for each resolution according to product requirements.

Nimble Streamer provides an easy way to create and use ABR streams with this new capability. This process is described in ABR streaming setup article. You can see how you can combine streams and use them as you need.

Setup in Nimble Streamer


To generate fMP4-powered HLS streams, you can use any source of live streams supported by Nimble Streamer. Check the instructions for your use case below.

To enable fMP4 streaming, select the HLS (fMP4) checkbox on your Live Streams Settings page. This will un-check the "regular" HLS and HLS (MPEGTS) options. The first option generates standard HLS chunks. The second option can be used for audio-only streams with MPEG-TS chunks.
Warning! After you enable fMP4 for your application or for the entire server, its streams will play well only on appropriate Apple devices. All other playback devices and software, like Android or PC web players, will not be able to play them. We recommend creating a separate application for fMP4 delivery if you have different target platforms for your streams. This may change in the future, but for now you need to consider this factor.
In addition to fMP4 you can select any other supported output protocols, as shown on the picture below.

Selecting HLS with fMP4
As in any other live streaming scenario, the resulting output streams can be found on the Nimble Streamer -> Live Streams page in the Outgoing streams section. You can take a look at a setup example in the RTMP transmuxing article.

Take a look at the How to Create a Live HLS Feed With HEVC article by Jan Ozer, which describes HEVC HLS streaming setup.

To set up and use adaptive bitrate streams with fMP4, follow the ABR streaming setup article.

DVR for fMP4


Nimble Streamer DVR feature set has full support for fMP4 recording and playback. So if you set up DVR recording for fMP4-enabled application or server, both MPEG2TS and fMP4 playback will be available simultaneously via different URLs. Both HEVC (H.265) and AVC (H.264) codecs are supported.

Transcoding for fMP4


Regarding H.264/AVC or H.265/HEVC transcoding setup please refer to the links located on our Live Transcoder main page and our Transcoder setup video guides on YouTube.


VOD fMP4 HLS


Nimble Streamer also supports fMP4 for VOD HLS for both H.264/AVC and H.265/HEVC content. It also supports ABR VOD via SMIL files.
Read this article for more details.



Feel free to contact us regarding any feedback or issues of this capability.

Related documentation


Nimble Streamer, Live streaming with Nimble Streamer, HLS support in Nimble Streamer, Live Transcoder, ABR streaming setup, Transcoder wildcard scenarios, Paywall feature set in Nimble Streamer, Nimble Streamer supported codecs, fMP4 for VOD HLS

October 24, 2017

Publishing stream to Facebook Live

Nimble Streamer allows publishing live streams to various types of destinations, like other media servers, CDNs, YouTube or Twitch.

The Facebook Live platform takes RTMPS streams as input, so Nimble Streamer users can use it as a publication target as well. The setup and usage are simple, so we'll show you how to do it.

As a source for your stream you can use any delivery method supported by Nimble Streamer.

Once the input stream is received by Nimble, it can be re-published via RTMP to Facebook. Let's see how this can be set up.
Notice that you may also find the Streaming from Larix Broadcaster to Facebook Live article useful.


In this example we assume that you have a stream which has application name "source_app" and stream name "source_stream".

Go to the Nimble Streamer -> Live Streams Settings menu and click on the Republishing tab.



Now you need to take a look at the settings provided by Facebook. They may look like this:
URL: rtmps://live-api-a.facebook.com:443/rtmp/
Key: 10156882222123456?ds=1&a=ATgNxRcW4p654321
Click on the Add RTMP button to see the dialog for entering new settings. Fill in the source application and stream names along with the Facebook parameters, as shown on the picture below.



As you can see, the Destination address includes the domain part of your original "URL" setting.
You also need to check the "Use SSL" checkbox so the value of the Port field becomes "443". This is mandatory for Facebook publication.
The Destination application field is "rtmp" in this case; it's taken from the same original URL.

The "Key" setting is also split into two parts. The first sequence of numbers goes into the Destination stream field, while the rest of the parameters go into the Destination stream parameters field, without the question mark.
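
Using the sample Facebook settings shown above, the republishing fields would be filled in as follows:

Destination address: live-api-a.facebook.com
Port: 443 (with "Use SSL" checked)
Destination application: rtmp
Destination stream: 10156882222123456
Destination stream parameters: ds=1&a=ATgNxRcW4p654321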

Now select the servers you'd like to apply these settings to and click OK - your settings will be applied within several seconds.

As soon as your original stream goes online, Facebook Live will start receiving it.

Further usage

You can use Nimble Streamer to deliver your content to multiple destinations simultaneously. Use these examples of RTMP republishing to see what can be used as well:

You may also consider re-publishing incoming RTMP streams with inserted ads. Nimble Advertizer provides a framework for inserting pre-roll and mid-roll ads into live streams for further output via RTMP, SLDP and Icecast, with custom business logic and per-user ads. So if you create an RTMP stream with ads inserted and pull it for further re-publishing, you can provide your target CDN with properly sponsored content.
Visit Advertizer web page to find out more about server-side ads insertion functionality.

If you need to change the outgoing content in any way, like changing the bitrate, use our Live Transcoder for Nimble Streamer to transform it. It has high performance and low resource usage.
For other live streaming scenarios, check our live streaming use cases.

Having that, you can create flexible delivery chains using Nimble Streamer for media hubs and WMSPanel as an easy-to-use control panel. Install Nimble Streamer if you haven't done that yet, and contact us if your streaming scenarios need any updates to the described functionality.

Related documentation


Live Streaming features, Live Transcoder for Nimble Streamer, RTMP feature set, Build streaming infrastructure

October 12, 2017

DVR timeline in Nimble Streamer

As most of our customers already know, the DVR feature set in Nimble Streamer allows recording any incoming live streams and providing playback capabilities for them. It has time range selection and other features.

Now we introduce timeline browsing and playback. Here's how you can enable and use it.

Browse timeline


To browse the recorded streams, you need to go to Nimble Streamer -> DVR streams menu. There you'll see the list of recorded streams.


To view the DVR stream directly, you can click on the question mark on the right side of the list - this will open the sample player dialog showing the current playback.

To view the timeline of an individual stream, click on the "clock" icon on the right side of a stream row. You will see the timeline dialog, which will immediately start obtaining data from the Nimble Streamer instance that has that stream's archive.


This will take a few seconds - not more than the sync-up time, which is typically 30 seconds. Once the data is received, the dialog will show the full timeline of the selected stream's recording.


Here you will see green lines which represent recorded fragments - i.e. the time which has recorded data. The blank space shows the absence of any records.

Date and time selectors allow defining the intervals to view. As soon as you start selecting date or time, the display will immediately move the focus so you'll see how far you've moved.


Also, take a look at the navigation bar on top.

The Previous and Next buttons will move the focus to the beginning and ending points of the recorded fragments. Once the focus moves, you will see a red dot at that point.


Zoom in and out buttons allow changing the scale of the timeline. You may also scroll with your mouse or touchpad to do the same.

You may also drag the timeline by the scale at the bottom of the chart to move between zoomed fragments.

The Fit timeline button will reset all zoom and scroll to show the full archive again.

You can play the DVR stream from a selected point simply by clicking on it. See the Play records from selected point section below for more details.

Browsing multiple archives


Besides viewing a single stream's archive, you can browse and compare data from multiple streams.

Select the required streams by clicking the respective checkboxes and click the DVR timeline button.



You will see the same timeline dialog, but it will show all the streams you picked, at the same scale and in a common range.


Here you can do the same operations as for a single stream. Moving among fragments via the Previous and Next buttons is done on the common time scale as well, so you'll jump between streams according to their break points. To play a stream from a selected fragment, just click on it - see the next section for details.

Play records from selected point


You may click on any point of the timeline to start playback from that moment. When you click, you will see the sample player dialog starting the stream from the selected time.


Here you see the Choose URL to play dropdown list of playback URLs - if your server has multiple IPs, you will see streams for all of them. Currently only HLS is supported for fragmented playback.

Notice that for HLS you can use both fMP4 and MPEG2TS containers for playback simultaneously using separate URLs. That will be
playlist_dvr_range-12345667-787.m3u8
for MPEG2TS and
playlist_fmp4_dvr_range-12345667-787.m3u8
for fMP4.

You'll be able to select a player which you'd like to test with this stream and also get the player code by clicking the Show player's code button.

And of course you will see the playback of the recorded stream in the selected browser from the moment you pointed to. The recording will play until the end of the fragment where you pointed.

In addition to the timeline, you may access DVR stream thumbnails via a specific URL to insert into your webpages.

Also take a look at this video tutorial to see the timeline in action and to get familiar with the most useful DVR features.


Read other documentation articles for more details and full description of available options.


That's it - feel free to browse your archives to see what was going on.

Let us know if you have any questions regarding this functionality.

Related documentation


October 4, 2017

Troubleshooting Live Transcoder

Live Transcoder for Nimble Streamer is used by various companies around the globe in many live streaming scenarios. However, some customers run into typical problems which they bring to our support team.

Some typical troubleshooting techniques for Nimble Streamer itself are described in Before you post a question to helpdesk article.

We'd like to describe Live Transcoder issues that appear in some cases and can be solved by the customer.

Q1: Stream stops unexpectedly

Nimble Streamer goes offline and is shown as "offline" in the WMSPanel account.

Check the Nimble Streamer logs located at /var/log/nimble/nimble.log.

Look for this type of messages:
[2017-10-04 04:23:26 P1486-T2435] [encoder] E: encoder too slow, droped 2038/2039 audio frames for [live/stream]
[2017-10-04 04:24:58 P1486-T2435] [encoder] E: encoder too slow, droped 3765/3766 audio frames for [live/stream]
You can also check the system logs in /var/log/kern.log or /var/log/messages (depending on your Linux distro) to find messages like these:
Oct 04 04:25:28 loft24330 kernel: [46756.231449] Out of memory: Kill process 1486 (nimble) score 957 or sacrifice child
Oct 04 04:25:28 loft24330 kernel: [46756.249823] Killed process 1486 (nimble) total-vm:52904636kB, anon-rss:32208832kB, file-rss:0kB
Both errors indicate that your server cannot encode all streams in time. The second, system-level error shows that the Linux kernel shut down the process due to an Out of Memory condition.

As a resolution, you should decrease the number of encoded streams on this server. You could set up another server with the Transcoder and process those extra streams there.

Speaking of memory, you should also check and set up a proper RAM cache size and make sure that value is enough to store your live stream chunk cache.
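
As a rough illustration only - the parameter name and value below are assumptions, so check the Nimble Streamer performance tuning article for the exact settings - the chunk cache size is defined in the Nimble configuration file:

# /etc/nimble/nimble.conf - assumed cache size setting, value in megabytes
max_cache_size = 4096

Restart Nimble Streamer after changing the config so the new value takes effect.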

Q2: Unsupported format 12


You may see errors like these:
[2017-10-13 22:18:14 P86076-T96639] [encoder] E: failed to encode video frame for [live/stream]
[2017-10-13 22:18:16 P86076-T96641] [encoder] E: unsupported frame format, 12 
This means you need to change the pixel format.
Open your scenario and add a new custom "format" element prior to your encoder element. The parameter name will be format and the value will be pix_fmts=yuv420p, as shown on this picture.


Once you apply the settings, the error should be gone.

Q3: NVENC is not available


If you fail using NVENC as your decoder and/or encoder, you may see messages like this:
nvenc is not available
This means you don't have the NVENC driver installed or properly set up in your system.

Make sure you have the right packages installed. Those would be nvidia-encode-<driver_version> and nvidia-decode-<driver_version>, e.g. nvidia-encode-525.

Q4: What NVENC drivers should be used?


We highly recommend using official NVidia drivers.
We may decline requests for technical support in case a customer uses unofficial or "patched" drivers from any third parties.

Q5: Transcoder not found after installation on Windows


Some customers install Live Transcoder on Windows 2008 and, after registering the license, they still get a message like
Please install 'nimble-transcoder' package
with additional info about the installation procedure.

First, please try re-starting Nimble Streamer and check again.
If that doesn't help, you may need to install update KB2533623 from Microsoft. Please follow this instruction to apply the update and then re-start Nimble Streamer.

Q6: Failed to create NVENC context


If you use NVENC and experience some issues you may see messages like these:
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create cuda context, gpu=1, flag=4
[2017-10-18 12:32:54 P23976-T24004] [tranmain] I: create cuda ctx, gpu=1, flag=4...
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create nvenc ctx for gpu=1, flag=4, res=2
[2017-10-18 12:32:54 P23976-T24004] [tranmain] E: failed to create cuda context, gpu=1, flag=4 
The res=2 flag means that NVENC ran out of memory and couldn't create the proper number of contexts for content processing.

You need to use the context sharing approach described in this article.

Q7: Failed to encode, status 10


You may see these messages when using NVENC:
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to init encoder=0x7f1b540008c0, status=10
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to encode video frame for [live/stream_1080]
[2017-10-16 10:58:36 P138405-T141721] [encoder] E: failed to flush nvenc encoder
This also indicates too many contexts in use. Please refer to Q6 for more details on how to use the context cache and find the right number of contexts.

Q8: Video and audio are out of sync


Sometimes streams are published to external destinations like Akamai or other CDNs with video and audio out of sync. This may be caused by the originating encoders.

Try applying interleaving compensation to your streams to eliminate the sync issue.

Q9: Closed captions are doubled up after FPS change


When a stream is up-scaled via a filter (e.g. from 29.97 FPS to 59.94 FPS), the closed captions are doubled in the video stream. Closed captions are embedded into frames, the FPS filter duplicates frames with all their side data, and this leads to duplicated characters.
To fix that, add the video_filter_preserve_cc_enabled = true parameter to the Nimble configuration file; read this reference page for more details about Nimble configuration.



Feel free to contact us if you have any other questions or issues with Live Transcoder.

Related documentation


Live Transcoder for Nimble Streamer, Nimble Streamer performance tuning, Live Streaming features, Zabbix monitoring of Nimble Streamer with NVidia GPU status.

October 1, 2017

Softvelum 2017 Q3 news

The third quarter of 2017 is now over, so it's time to sum up what our team has accomplished.

General announcements


Notice that we've improved our Terms of Service, please take a moment to read them. Notice that they refer to the Before you post a question to helpdesk article - this is what you'll need to get familiar with prior to sending a question to our team.

Another general improvement: WMSPanel now supports two-factor authentication which can be enabled per-user.

A few words about money: FastSpring, our payment gateway, now accepts UnionPay, so there's yet another way to pay for our products.

As always, take a moment to read our State of Streaming Protocols to see how the streaming landscape technologies are currently used among our customers. Spoiler: DASH keeps rising, HLS goes a bit down, SLDP gains momentum.

Nimble Streamer



First, we're glad to highlight an article in Streaming Media by Jan Ozer about our Live Transcoder:
Review: Softvelum Nimble Streamer Is Flexible and Well-Featured
We appreciate Jan sharing his opinion and we'll keep improving our products per his feedback.

Now let's check new features.

Our team continuously improves SLDP low-latency streaming technology.

SLDP HTML5 JavaScript SDK for creating low latency HTML5 players is now available for licensing in addition to existing mobile SDKs.

Other SLDP updates you may find useful:
SRT streaming technology, originally created by Haivision and released to the open-source community, is now available in Nimble Streamer. It's a protocol which adds reliability to UDP transmission with error correction, encryption and other features which make it a great method to deliver live content across unreliable networks. Softvelum was also among the first participants of the SRT Alliance to take part in improving the protocol.
You can enable SRT in Nimble by installing the SRT package and making the respective settings.

Speaking of new protocols, UDT streaming protocol is now available in Nimble Streamer. Read this article for setup details.

Live Transcoder has been updated: VP8 and VP9 decoding and encoding are now supported. Read this article for more details about setup and usage. VP8 was also added to VA API implementation in our transcoder to give more flexibility.

Other interesting Nimble Streamer updates are:


Mobile Streaming


Last but not least.

Mobile playback solutions are now available among our products.
This includes SLDP Player apps for iOS and Android, as well as respective SDKs for adding playback capabilities into your own apps. Currently SLDP and RTMP protocols are supported.
Free apps are available on Google Play and the App Store.

Mobile streaming products were updated with multiple improvements on Android, iOS and Windows Phone. This includes the Larix Broadcaster and Larix Screencaster apps as well as the respective SDKs.
You can check the history of our mobile releases for all details of recent SDK releases.


Stay tuned for more features - follow us on Facebook, Twitter or Google+ to get the latest news and updates of our products and services.



September 30, 2017

The State of Streaming Protocols - 2017 Q3

The Softvelum team, which operates the WMSPanel reporting service, continues analyzing the state of streaming protocols.

The third quarter of 2017 has passed, so let's take a look at the stats. The media servers connected to WMSPanel processed more than 9 billion connections on 3400+ media servers (running Nimble Streamer and Wowza) during the past 3 months.

First, let's take a look at the chart and numbers:

The State of Streaming Protocols - 2017 Q3

You can compare that to the picture of 2017 Q2 protocols landscape:

September 21, 2017

Two-factor authentication in WMSPanel

Softvelum team is continuously improving the security and reliability of our products. As one of the steps, we now add two-factor authentication for accessing WMSPanel. Two-factor authentication (also known as 2FA) is a method of confirming a user's claimed identity by utilizing a combination of two different components.

In the case of WMSPanel, this means using a user/password pair and then generating an access code on your mobile device. So when you log into your account, you'll need to enter your login information and then use your Android or iOS device to create a 6-digit code and provide it to WMSPanel.

Here's how you can use this new security measure.

Install mobile app


To use 2FA on your mobile device, you need to install one of the apps which support it. Some popular apps are Google Authenticator and Duo Mobile, but you can use any app you like. Use their respective links to install and set up the proper app before moving further.

Enable two-factor authentication


Go to the Settings menu and select the Security tab.


Click on the Enable two-factor authentication button to see the following wizard.


In your authenticator application you need to scan the QR code that appears in order to get a 6-digit code. Enter and submit it to see the page with backup codes. You will be able to use them in case you don't have your mobile device at hand.


Once it's set up, you will be able to disable this auth method as well as generate new backup codes if you want.



Use two-factor authentication


The next time you log into WMSPanel, after entering your login and password you'll see the following form requesting your 6-digit code:


Use your authenticator app to generate the code and enter it.


The Secure your account in 3 easy steps article gives more ideas about working securely in WMSPanel.

We keep searching for better ways to make our customers more secure, so we look forward to your feedback.

Related documentation



September 8, 2017

ABR live streaming setup

Nimble Streamer has a wide live streaming feature set which includes receiving streams via various protocols and generating output streams. Each stream can be transported via several protocols simultaneously, like HLS, MPEG-DASH, RTMP, SLDP, SRT etc.

Some protocols - HLS, MPEG-DASH and SLDP - support adaptive bitrate (ABR) streaming, which allows a player to switch between streams with different bandwidth in case network conditions change over time. Take a look at setting up real-time ABR streaming via SLDP as an example. The WMSPanel control web service allows setting up ABR for those protocols on Nimble Streamer using both a pre-defined set of streams and a wildcard pattern. Let's see how this can be done in our web UI.

Prepare streams


An adaptive bitrate stream is formed from a number of single-bitrate streams, which means you need at least two of those single-bitrate streams in order to make ABR.

Nimble supports a variety of protocols and they all have their own specifics for setting up outgoing streams. E.g. if you use RTMP or RTSP, once you get an incoming stream, you will get outgoing streams of the same name. For MPEG-TS and SRT, you need to define the outgoing streams explicitly, as one channel may contain multiple streams. Check the respective articles for setup details of outgoing streams.

Also notice that you can use Live Transcoder to create different resolution and bitrate streams from a single stream.

Once you have some streams to make ABR from, you can proceed with setup.

Go to Nimble Streamer -> Live Streams menu to open the list of servers.


There you need to find the server you'd like to add a new ABR stream for. Now either click on the respective number of streams in the "ABR streams" column or click on the server name and choose Adaptive stream on the opened page - both actions open the same page shown below.


On this page, click on Add ABR setting to open the settings dialog for a new ABR stream. Both pre-defined and wildcard ABR are created in the same dialog.

Create ABR with pre-defined streams


In this picture you see a typical ABR stream being set up.

New ABR setting dialog for predefined streams

The ABR application name and ABR stream name fields allow setting the names which will later be used in the playback URL - see the last section.

The Source application name and Source stream name fields allow specifying which existing streams should be used as parts of the ABR stream. As you type the name, you will get suggestions from the available streams. The Bandwidth value will be used in the playlist by your player. If you select Audio only for a stream, only its audio will be streamed, without video.

Advanced settings will open a drop-down list which allows defining the sorting option for the streams in the resulting playlist.

At the bottom of the dialog you can see the list of servers to select. The current ABR setting will apply to the servers you select with the checkboxes.

After clicking the Save button, your settings will be synced with each selected server within a few seconds.
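
For example (a hypothetical setup following the URL pattern used elsewhere in this post), if the ABR application name is live_abr and the ABR stream name is stream, the resulting HLS playback URL will be
http://yourhost/live_abr/stream/playlist.m3u8
and it will list the single-bitrate renditions you added in the dialog.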

Using multiple audio tracks


If you'd like to add different audio tracks into your live transmission, this can also be done via ABR stream setting.

The Adding multiple audio tracks for ABR HLS live streams article describes how you can do that.

Create wildcard ABR


Wildcard ABR is set up in a similar way and the fields are the same; the difference is in the stream names.



Leave the ABR stream name field blank and use the {STREAM} placeholder in the Source stream name fields. This will allow getting any single-bitrate stream directly by its name.

For example, if you have source streams live/sport_720, live/sport_480 etc. you can access ABR as
http://server/live_wildcard/sport/playlist.m3u8
with respective single-bitrate chunklists as
http://server/live_wildcard/sport_720/chunks.m3u8 
http://server/live_wildcard/sport_480/chunks.m3u8
etc. They will be used by your player for playback.

Please notice that only one wildcard ABR setting can be created per application. If you happen to create several, only one will be used.

Playback


Once the ABR streams are defined you will see them in the list.



Clicking the "?" icon (a question mark) will show a playback dialog where you'll be able to get the stream URL and test the stream in the most popular players. You'll also be able to get sample code for those players for further embedding (with your own player licenses if needed).


The player will show the user the full list of available bandwidths.


Graceful adaptive bitrate


When the player starts playback from the ABR playlist, it selects the stream best suited for the current bandwidth. When the network capability changes, new chunks start being picked up from the proper sub-playlist. The main playlist is continuously re-requested to make sure it has the latest changes.

During playback, the RTMP source may lose some of its incoming streams for many reasons. A camera may shut down, a failure may appear within the encoder itself, or some network may be unavailable for transmission. So one or more incoming streams may become unavailable for ABR streaming.

Nimble Streamer provides graceful handling of cases like that. When the respective RTMP stream goes down, it is excluded from the outgoing streams list, and thus from the ABR playlist. So when a new player gets the playlist, the broken stream will not be there.

This behavior means that an end-user will have a good experience with no playback interruption.

When a broken stream gets back online, a player receives its description and starts using it in its workflow.

Troubleshooting


Normally Nimble Streamer generates chunk names with the following template: l_<stream_index>_<timestamp>_<number>.ts
To skip the stream_index part in certain cases, you can use the add_stream_index_to_hls_chunk_name = false parameter in the Nimble config.
With this parameter set to false, different substreams of an ABR stream will have the same chunk names (e.g. l_250920_31.ts). It may affect the behavior of Apple devices, as they look up the same chunk name when switching to another resolution and play from the first present chunk if it is not found (so playback "jumps back"). For other device types the chunk names don't matter.

Live Transcoder


As mentioned, you can use Live Transcoder to create multiple resolution streams from a single input. This allows creating ABR streams as described above.
For your convenience, we've added wildcard capabilities to Live Transcoder in order to simplify the ABR setup process; read this article for more details.


That's it - you can use this approach to create streams for any supported protocols.

Related documentation


Nimble Streamer, Live streaming feature set, Live Transcoder, ABR control API