September 5, 2019

Larix Screencaster for iOS - live streaming from any app

Softvelum provides a number of solutions for mobile streaming and playback.

Today we introduce Larix Screencaster for iOS.

The Larix Screencaster application allows capturing the screen of a user's device and streaming it to a target media server or service. Supported protocols include RTMP/RTMPS, RTSP/RTSPS and SRT, and you can stream AVC/H.264 and HEVC/H.265.

Install Larix Screencaster here.

Streaming setup is similar to Larix Broadcaster; you can read the documentation reference for connectivity details.

As for the screen recording part, Apple requires additional setup:

Please study these sources to perform the setup.

Let us know if you have any questions regarding Larix Screencaster or other mobile solutions.

September 4, 2019

Generating Icecast metadata from RTMP

Nimble Streamer can re-package an incoming RTMP stream into an outgoing Icecast stream as part of our extensive audio streaming feature set.
The RTMP protocol can carry Icecast metadata, and encoders like Omnia Z/IPStream X/2 and StreamS Live Legacy Encoder, among others, can produce this kind of stream.
So Nimble Streamer can pass that metadata from RTMP input to Icecast output.

If you need to process incoming RTMP Icecast metadata for further use in Icecast output, you need to enable this capability at the server or per-application level. If you want to pass RTMP Icecast metadata through the Live Transcoder, you also need to enable it.

To enable the feature globally for the entire server, go to the Nimble Streamer / Live streams settings menu to open the setup page and choose the required server from the drop-down list. The Global tab is opened by default. Here you need to select the Icecast protocol to have it in the output. This will show the Generate Icecast metadata checkbox, which you also need to select as shown below.

If you'd like to define this separately for a specific application, choose the Applications tab. Click an existing application's settings or click the Add application settings button to create a new one. Its settings are similar to those described above, so you need to select Icecast among the protocols and check the Generate Icecast metadata checkbox.

Once you save the settings and re-start your input RTMP stream, the streams in the affected applications will carry the Icecast metadata. You can use it as part of Icecast streaming or pass it through Transcoder scenarios.

Notice that RTMP output streams will carry the Icecast metadata regardless of the described setting. This means that if you re-publish such an RTMP stream or make it available for further pull, it will have the Icecast metadata. However, if you decide to transcode the stream and keep the Icecast metadata, you'll have to follow this setup process.

Contact us if you have any questions regarding any of our features.

Related documentation

  • Live Transcoder for Nimble Streamer
  • Audio streaming feature set
  • RTMP support

Forwarding RTMP Icecast metadata through Live Transcoder

The RTMP protocol can carry Icecast metadata, and Nimble Streamer can forward that metadata into the output stream when transmuxing from RTMP to Icecast. However, some scenarios require Live Transcoder to be involved in order to transform audio. In this case the metadata from the RTMP stream needs to be carried through the transcoder pipeline by setting parameters in a transcoding scenario. Let's see how this can be done.

Notice that Nimble Live Transcoder is able to pass the Icecast metadata through. Please make the appropriate setup for your respective protocol use case:
If you have incoming Icecast with Icecast metadata, please follow this instruction and skip this article.
If you have incoming RTMP with Icecast metadata, follow the instructions below.

Enable RTMP Icecast metadata

First, the incoming stream must deliver the RTMP Icecast metadata, so you need to enable RTMP Icecast metadata processing as described in this article. If you don't set up the designated application or the entire server to process the RTMP Icecast metadata, it will not be available in transcoder scenarios.

Set up Transcoder

In this sample scenario we use the transcoder for re-sampling audio from an existing RTMP stream. You can see a decoder, then a filter performing the re-sampling, and then an encoder.

The forwarding needs to be specified on both the decoder and encoder sides. In the audio decoder element, check the Forward RTMP metadata checkbox to make the transcoder take the metadata into the pipeline.

Now go to the audio encoder element and click Expert setup.

Here you need to check the Forward RTMP metadata checkbox to make the transcoder grab the metadata from the pipeline and make it part of the output stream.

Once you save the scenario and re-start the input stream, the output stream will have the metadata, which can then be used for transmuxing into Icecast or re-publishing.

Contact us if you have any questions regarding any of our features.

Related documentation

  • Live Transcoder for Nimble Streamer
  • Audio streaming feature set
  • RTMP support

September 3, 2019

Processing DVB subtitles in Nimble Streamer and Transcoder

DVB (Digital Video Broadcasting) subtitles are used for media streaming from various sources via MPEG2TS and HLS. Nimble Streamer allows transferring DVB subtitles from incoming streams to its output.

Enable the DVB subtitles processing for MPEG2TS and HLS streams

If your incoming MPEG2TS or HLS stream has DVB subtitles and you need those subtitles passed through to the output MPEG-TS and HLS, add the following parameter to your Nimble config:
dvb_subtitles_processing_enabled = true
Once you add it and re-start Nimble Streamer, your output HLS and MPEG-TS will carry DVB subtitles if your source streams have them.
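As a minimal sketch, the change can be scripted; the snippet below appends the parameter idempotently. It defaults to a local nimble.conf copy so it's safe to dry-run; on a real server the path is typically /etc/nimble/nimble.conf and editing it requires root.

```shell
# Append the DVB subtitles parameter to a Nimble config file unless it's already set.
# NIMBLE_CONF defaults to a local file here; point it at /etc/nimble/nimble.conf on a server.
CONF="${NIMBLE_CONF:-nimble.conf}"
touch "$CONF"
grep -q '^dvb_subtitles_processing_enabled' "$CONF" || \
  echo 'dvb_subtitles_processing_enabled = true' >> "$CONF"
```

After changing the real config, re-start Nimble Streamer (e.g. sudo service nimble restart) for the parameter to take effect.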

Passing DVB subtitles through Live Transcoder

If you need to transcode your streams (e.g. make multiple resolutions or put graphics on top) and keep DVB subtitles from the source stream, you'll need to make some additional changes to your transcoding scenarios. Also notice that you must have the dvb_subtitles_processing_enabled parameter enabled as described above.

Let's take a look at a simple scenario which keeps the original stream and also makes a lower resolution.

Here you see the /live/source stream being processed to get the /live/output_480p output stream down-scaled to 480p (via the Scale filter box), while the original rendition is kept as the /live/output_original stream.

There are two ways of passing the DVB subtitles - via the video pipeline and via the audio pipeline. We'll demonstrate both.

The /live/output_original stream has its video passed through - you can see the long line with a blue-to-orange gradient. This way, if you have a full HD stream as input, you won't waste resources decoding and encoding the content to the same rendition. The audio, however, is split into a decoder and an encoder. Let's see their settings.


In the source stream decoder you see the enabled Forward DVB subtitles checkbox. It enables the transcoder to grab the subtitles for processing. In the audio encoder settings, click the Expert setup section to see the Forward DVB subtitles checkbox again. Check it to make the encoder take the subtitles which were previously grabbed into the pipeline by the decoder. After saving the transcoder scenario and re-starting the incoming stream, the output will have the DVB subtitles in the resulting stream.

The /live/output_480p stream has its audio passed through while the video is transformed to a lower rendition. So here we use the video pipeline to pass DVB subtitles. It's set up the same way - in both the decoder and encoder elements.

You need to check the Forward DVB subtitles checkbox in the video decoder and in the video encoder settings under the Expert setup section. As in the audio case, once you save the scenario and re-start the input stream, the output will have the initial DVB subtitles.

Related documentation

  • Live Transcoder
  • MPEG2TS streaming
  • HLS streaming

August 29, 2019

NVidia drivers upgrade issues

Recently some of our Live Transcoder NVENC users have faced specific problems with NVidia drivers, so we'd like to draw your attention to these cases.

NVENC is a technology which combines the capabilities of powerful hardware with sophisticated software drivers that let customers' applications use that power. There are cases when a software update changes the behavior of the hardware and affects your transcoding experience, causing errors and all kinds of related troubles. This month NVidia released a new version of their drivers, and that update caused transcoding issues for many users. Some of them were users of Nimble Transcoder, so our team researched the problem. Those customers had to roll back to the previous driver version and wait for updates from NVidia.

So here are our recommendations to those customers who use NVidia hardware for their transcoding process.

1. Once you've set up NVENC and Live Transcoder and made them work properly, record the NVidia driver version.

2. Make backups of all NVidia drivers which you use over time. You may also find them on this page.

3. If you'd like to upgrade some of those components, please make one upgrade at a time. So if you decide to upgrade NVENC and Nimble Streamer, make sure you first upgrade Nimble, then let it work for a few days, and then upgrade NVENC. Going step-by-step must be your key strategy.

4. Whatever component you upgrade, please run regression tests on your existing streams. E.g. you upgrade Nimble and your tests show that your streams are fine; after that you upgrade NVENC and your tests fail (streams are not encoded properly), so you know the driver change introduced the issue.

5. If some component fails after an upgrade, make sure you roll back to the previous version of that component. Once you do the rollback, don't forget to run your regression tests again.

If you use NVENC in your streaming scenarios, we strongly recommend using shared contexts to improve performance. In order to do that, add the following parameters to your Nimble config file:
nvenc_context_share_enable = true
nvenc_context_share_lock_enable = true
Please read NVENC shared context usage article for all related details.

We also recommend checking NVidia developer forums like this one to keep in touch with NVENC-related changes.

If you have any issues or questions regarding NVENC usage in Live Transcoder, please feel free to contact us any time.

August 23, 2019

Streaming Media Readers' Choice Awards 2019

Streaming Media magazine has started a vote for Readers' Choice Awards 2019.

Softvelum products have been nominated in 6 categories:

  • Analytics/Quality of Service Platform - Softvelum Qosifire
  • Encoding Software - Softvelum Nimble Live Transcoder 
  • Media Server - Softvelum Nimble Streamer
  • Quality Control/Monitoring Platform - Softvelum Qosifire
  • Server-Side Ad Insertion Solution - Softvelum Nimble Advertizer
  • Video Player Solution/SDK - Softvelum SLDP Player

Open the voting page and select our products to vote as shown on the screenshot below. Voting closes on October 1st. At that point, all voters will receive an email asking them to confirm their votes, and only the confirmed votes will be counted.

Your support is very important for our team, so take a minute to cast your vote for us!

Here's how you can find us in a huge list of nominations and nominees:

Feel free to support our team, every vote matters!

August 21, 2019

Live Transcoder upgrade

Nimble Streamer Live Transcoder was released in early 2016 and since then it has gained many useful features which our customers use widely. The core technology of the Transcoder combines both the Softvelum team's own know-how and third parties' work. Those third parties are listed on a corresponding page.

As we add new functionality, some core third-party technologies also advance. For instance, FFmpeg, which is used for filtering and some decoding operations, has moved from version 3 to version 4, getting some important fixes and improvements. So, to keep pace with FFmpeg, our team had to make adjustments and use the latest version 4.

The new FFmpeg version requires changes in both Nimble Streamer and Live Transcoder. So if you decide to upgrade Nimble Streamer, then in order to make a smooth transition between versions, the Nimble and Transcoder packages have to be upgraded simultaneously. If one of the packages is upgraded without its counterpart, live transcoding will stop working.

So here is what you need to do in order to complete this upgrade the correct way.

For Ubuntu, run these commands:
sudo apt-get update
sudo apt-get install nimble nimble-transcoder

For Debian, run these commands:
apt-get update
apt-get install nimble nimble-transcoder

For CentOS, run these commands:
sudo yum makecache
sudo yum install nimble nimble-transcoder

You may also run the procedures from the Live Transcoder upgrade page first and then the Nimble Streamer upgrade page, one after another, to get the same result. If you have Windows, you also need to follow this path.

So we recommend performing this simultaneous upgrade when you have the time and resources for it.

After the upgrade is complete, your Nimble Streamer package version will be 3.6.0-1 and Live Transcoder package version will be 1.1.0-1.

If you have any questions or face any issues during the upgrade, please contact us using our helpdesk.

August 20, 2019

Trigger SCTE-35 marker insertion into live stream

Nimble Advertizer has a wide variety of options for inserting ads, including insertion per SCTE-35 markers. So if your original stream has those markers and your Advertizer is set up to trigger ads for it, your output will have the proper ads.

In addition to reacting to available SCTE-35 markers, Nimble Advertizer is able to insert new markers right at the moment you need them using Nimble API. This will trigger almost immediate ads insertion so it's a good way to implement your own "big red button" for per-request ads insertion. Here's how you can do it.

Enable SCTE-35 processing

Before you start using SCTE-35 markers in Nimble Streamer, you need to enable SCTE-35 processing using the scte35_processing_enabled=true parameter as described in this article.

Set up Advertizer

The SCTE-35 insertion feature set is part of Nimble Advertizer, so first you need to create and subscribe to an Advertizer license.

Then you need to set up Advertizer according to the Advertizer tech spec. This includes preparation of ad files and setup configuration. As part of the setup, you'll define which streams will react to markers. You can read the article explaining SCTE-35 marker handling mechanics, including an example of a handler response.

Set up Nimble API

Marker insertion is performed via the Nimble native API. Use this API page to enable the API first and then get familiar with the "Insert SCTE-35 marker" call. Try calling that method on some test streams to see how it triggers the insertion of ads.
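To sketch the idea (the route and query parameters below are hypothetical placeholders; take the real ones from the native API reference page), such a call is just an HTTP request to the management interface configured via management_port:

```shell
# Build a hypothetical "Insert SCTE-35 marker" request; the /manage/scte35_insert
# route and the app/stream parameters are placeholders, not the documented API.
HOST="127.0.0.1"
PORT="8083"          # must match management_port in nimble.conf
APP="live"
STREAM="source"
URL="http://${HOST}:${PORT}/manage/scte35_insert?app=${APP}&stream=${STREAM}"
echo "curl -X POST '${URL}'"
```

Wiring such a one-liner to a button in your own tooling gives you the "big red button" mentioned above.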

Use markers insertion

Once you have your Advertizer ready for marker handling and know how to use the proper API call, you may develop further setup for triggering ads via API calls in your production environment.

Besides immediate insertion via Advertizer, you may pass through the inserted SCTE-35 markers for further processing. Read this article for details of this functionality.

If you find any issues with the SCTE-35 feature set, please file a support ticket so we can help you.

Related documentation

  • Nimble Advertizer, Inserting ads via SCTE-35 markers
  • Nimble Streamer live streaming scenarios, SCTE-35 markers passthrough

August 15, 2019

ABR in SLDP real-time streaming

The streaming media industry is currently moving away from the RTMP protocol to other real-time streaming technologies due to its upcoming decline. The next generation of technologies tries not just to replace the protocol but also to add new features like ABR and new codec support.

Our team introduced SLDP - Softvelum Low Delay Protocol - as our vision of how real-time streaming should be implemented. Of course, we added capabilities that were missing in RTMP.

ABR capabilities

Adaptive bitrate (ABR) is one of the key features available as part of SLDP. It's supported on both sides of transmission:
  • Nimble Streamer allows switching, per player command, among the available bitrates which are set up as part of an ABR stream
  • SLDP Player provides controls for switching between bitrates if a stream carries information about available sub-streams.
Switching between renditions happens nearly instantly: the player sends a command to the server to send media from another bitrate, and once the data is received, it takes just one GOP duration to start displaying the new sub-stream.
Every sub-stream in ABR may have its own codec. With this feature you can combine a high-resolution sub-stream using H.265/HEVC with sub-streams using H.264/AVC, or use other codecs like VP8/VP9.
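For a rough feel of that switching delay, here is a back-of-the-envelope estimate; the 30 fps frame rate and 60-frame GOP are example assumptions, not SLDP requirements:

```shell
# Worst-case switch delay is about one GOP duration at the new rendition:
# 60 frames per GOP at 30 fps gives up to 2.0 seconds.
awk 'BEGIN { fps = 30; gop_frames = 60; printf "%.1f\n", gop_frames / fps }'
```

Shorter GOPs make rendition switching snappier at the cost of compression efficiency.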

SLDP and ABR setup

The general process of SLDP playback setup in Nimble Streamer doesn't differ from RTMP playback setup. This article shows the step-by-step procedure, which is very simple with the WMSPanel control panel web UI.

ABR setup in Nimble Streamer is described in this article and it's also straightforward. It includes input stream processing and output setup. Notice the graceful adaptive bitrate stream approach used in Nimble.

If you don't have various renditions and need to create them from your original stream, you may use our Live Transcoder; try the wildcard setup in particular to simplify this process. Also check this Transcoder video showing the setup process.

If you use the Transcoder, you should also perform key frame alignment for all single-bitrate streams. Sometimes when you switch between a stream's renditions you can see a short glitch. This happens because a player needs a new GOP to start the playback. Different streams may have their key frames aligned differently, so each new GOP will start from a different point. To avoid that, you need to perform key frame alignment. Use this article to set key frame alignment in your transcoding scenarios.

SLDP Player usage

Having SLDP stream, you can now use our players to provide the playback to your users.

You may use 3 options:

  • HTML5 player provides playback in any browser which supports MSE. This includes basically any Windows, Linux and MacOS platform. Even Android browsers will allow you to load and play SLDP via HTML5 SLDP player.
  • Android native app provides playback in case you don't want to use web playback on user devices.
  • iOS native app is needed in case you need to play SLDP on Apple mobile devices. You can use it as a fallback for browser player.

All players are free of charge. They also have respective SDKs so you could customize them for your user experience.

SLDP HTML5 player setup also has optional parameters to tune the ABR playback.

If you'd like to set the initial resolution which you want your users to see by default when the player is loaded, use this setting:
initial_resolution = <width>x<height>
If you use key frame alignment as described above, you should use this parameter to obtain smooth rendition switching:
key_frame_alignment = true
You may also use latency_tolerance parameter to tune the streaming latency as described in this article.

That's it. With steps described above you will have full-featured SLDP ABR playback on any platform you need.

Also, you can take a look at the SLDP frequently asked questions to improve your SLDP usage. And read the Reliable Low Latency Delivery with SRT+SLDP post in the Haivision blog describing a combination of both protocols for building reliable delivery networks.

Visit SLDP website and contact us in case of any questions regarding SLDP technology usage.

August 7, 2019

Using Certbot with Nimble Streamer

Certbot is a popular tool for working with Let's Encrypt certificates. Nimble Streamer has full support for SSL-protected streaming so let's see how you can use Certbot with Nimble Streamer for your convenience.

1. Set up Certbot

First, go to the Certbot website and scroll down to the "My HTTP website is running" line. Choose the "None of the above" option in the Software field and then your OS in the System field.

Let's use Ubuntu 18.04 for our example.

You'll be redirected to Certbot page with necessary instructions.

Follow steps 1 through 4 to install and setup Certbot.

2. Set up certificate

On step 5 - "Install your certificate" - you need to use your new certificate in the Nimble Streamer configuration.

Add these lines to your /etc/nimble/nimble.conf file:
ssl_port = 443
ssl_certificate = /etc/letsencrypt/live/
ssl_certificate_key = /etc/letsencrypt/live/
and then re-start Nimble Streamer with this command:
sudo service nimble restart
You can find more info about nimble.conf on this page.

If you need more complex setup scenario like multiple domains or encryption methods, you can follow this article to set up SSL certificate properly.

By this step, you'll have a Nimble Streamer instance running with a valid SSL certificate.

3. Set up certificate renewal

The last step is to set up automatic renewal of the certificate. Certbot does this perfectly; however, we'll need to make it call Nimble Streamer to reload the certificate. This can be done via the Nimble Streamer native API.

First, set up the management API as described on this page under the "Starting point: enable API access" section.
Here's an example you can use:
management_listen_interfaces =
management_port = 8083
Then re-start Nimble Streamer instance:
sudo service nimble restart

The second step is to run the renew command as described in the "Test automatic renewal" section of the Certbot page, with an additional post-hook parameter like this:
sudo certbot renew --post-hook 'curl -X POST'
You can use the above command (which makes proper API call) to manually renew SSL certificates without Nimble Streamer restart.

The latest version of Certbot provides pre-configured automated renewal for Ubuntu via systemd timers.
To make sure that Certbot’s systemd timer is installed, use the following command:
systemctl list-timers
Its output should contain certbot.timer in the UNIT column.

To make Nimble Streamer apply new certificate settings without a restart, use the following commands to create the post-hook script (tee is used because sudo doesn't apply to shell redirection):
echo -e '#!/bin/sh\ncurl -s -X POST' | sudo tee /etc/letsencrypt/renewal-hooks/post/
sudo chmod 750 /etc/letsencrypt/renewal-hooks/post/
Now your certificates will be renewed automatically.

That's it. If you have any questions or issues, feel free to contact us via helpdesk.

Related documentation

SSL setup for Nimble Streamer, Paywall feature set

July 31, 2019

Constant bitrate and mux rate in UDP streaming

Nimble Streamer has a wide feature set for MPEG2TS streaming. This includes full UDP support, allowing Nimble Streamer to receive and send content via that protocol.

Streaming without bitrate setup

When you stream from Nimble Streamer via UDP, a variable bitrate is used for output by default. You can find all details of UDP streaming setup in this article.

However, in some cases involving DVB/ATSC and other hardware, a lot of ETR101290, CC and PCR errors appear on a regular basis, making the stream hard to use.

Take a look at this Wireshark session which logged an output of UDP stream.

You can see large spikes which cause problematic behavior on sensitive hardware.

Streaming with mux rate

The first step to mitigate this is to specify exact mux rate for outgoing stream to set the upper limit. To specify mux rate parameters, click on "Mux rate" checkbox in UDP settings as shown below.

Here you can fill the Mux rate field with the bitrate you'd like to have. Notice that it should be 20% to 30% bigger than the maximum bandwidth of your original content. Other mux rate tuning fields like Mux delay and Max mux delay can be specified as well.

Now you can re-start your stream to see how it affects the output. First we looked at the stream using Wireshark with a 1-second interval.

It looks much better, however if we select 100ms interval, the picture is much less smooth.

Even though the average bitrate is near the target value, you can see spikes which will still affect many types of hardware.

Streaming with constant bitrate

To keep the bitrate at the same level, Nimble Streamer allows setting a constant bitrate (CBR). Nimble calculates the exact time interval between packets using the MPEG-TS packet size and the required bitrate, then sends those packets out strictly at the designated time slots. With the CBR technique, packets are kept and processed via the CBR buffer.
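As an illustration of that calculation (the 5 Mbps target is an assumed example; 188 bytes is the standard MPEG-TS packet size):

```shell
# Interval between consecutive 188-byte MPEG-TS packets at a 5 Mbps constant
# bitrate, in microseconds: packet_bits * 1e6 / bitrate.
awk 'BEGIN { bitrate = 5000000; bits = 188 * 8; printf "%.1f\n", bits * 1e6 / bitrate }'
```

So at 5 Mbps, one TS packet would go out roughly every 300 microseconds to keep the rate constant.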

CBR behavior is enabled by the Max CBR buffer field. It defines the maximum total duration of packets in the CBR buffer for further processing. If the incoming data overflows the Max CBR buffer, the stream will be reset.

When the stream has started and incoming data starts arriving, Nimble Streamer stores the packets in the CBR buffer. The Start CBR buffer parameter defines the duration of packets which Nimble collects before starting to send them out.
E.g. if you set this to 1000ms and start the stream, Nimble will keep receiving data and putting packets into the buffer, and once the total duration of packets in the buffer reaches 1000ms, Nimble will start sending the packets with the calculated intervals.

As long as data keeps arriving, the packets will be placed in the buffer for further sending.

If the CBR buffer runs out of packets (due to some incoming stream problems), Nimble Streamer will then wait for incoming data to fill the CBR buffer for the duration set in Start CBR buffer before starting to send the data out again. So Start CBR buffer value is used in 2 cases - when the stream has just been started and when the stream has had some problems.

Notice that the bigger the Start CBR buffer you set, the bigger the output latency will be. If you set it to some low value or just keep it blank, you may face glitches in Nimble's output, because it will depend on your incoming stream's conditions.

The numbers may differ according to your input streams. If your source is an RTMP stream and your network conditions are good, you can use 1000ms Start CBR buffer and 10000ms in Max CBR buffer.

If you use HLS as an input, those numbers need to be larger as it's a chunk-based protocol. E.g. for a stream with standard 10-second chunks the start buffer needs to be 20000 and the max buffer can be 60000.
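The chunked numbers follow the same rule of thumb; assuming the start buffer should cover roughly two chunk durations (an interpretation of the example above, not an official formula):

```shell
# Suggested Start CBR buffer for HLS input: about two chunk durations,
# in milliseconds. 10-second chunks give 20000 ms, matching the example above.
awk 'BEGIN { chunk_ms = 10000; printf "%d\n", 2 * chunk_ms }'
```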

With these settings, once we start the input RTMP stream, the 100ms chart in Wireshark looks like this.

Some rare spikes are still present once in a while due to system network deviations, but the line is generally flat at the target bitrate.

Notice that CBR works only when Nimble Streamer is installed on Linux or MacOS because Windows doesn't allow working with precise intervals required for this technique without using complex and risky methods.

PCR metrics

Overall, according to StreamGuru, when it comes to UDP streaming, Nimble Streamer has PCR accuracy of 100%. While hardware receivers accept up to 500ns PCR drift, Nimble Streamer produces PCR with 0ns drift. Also, PCR interval is <20 ms while 40ms is acceptable by most hardware.

If you have any questions about working with UDP streaming via Nimble Streamer, please contact us.

July 28, 2019

Handling session IP change in Nimble Streamer

Nimble Streamer uses sessions to track users' stats as it's important to see how the content is consumed. It's used in every end-user connection unless you use HTTP origin feature to remove the session identifier.

Every session is bound to a user IP in order to distinguish users from one another. Normally, if a session starts with some IP, it keeps using that IP until the connection is closed. If the original connection IP changes, Nimble Streamer will close the connection, because an IP change most probably means that someone else is using your session ID, which is bad from a security standpoint. Viewers with a closed connection will get response code 403.

However, in some cases the IP change doesn't mean anything bad. For example, if your Android users use Lite mode (also known as Data Saver), Google will route traffic through its own proxy servers to reduce data usage. Also, your users may use other trusted proxies for their own legitimate purposes.

For cases like these you may use a few features of Nimble Streamer.

Disable IP check

First of all you may disable session IP check. This can be done using this parameter
restrict_session_ip = false
in nimble.conf. Please read the configuration reference page for details on parameter setup and usage.

Once you disable it, your streams' direct links may be used by several viewers, so you should use this approach only if your viewers use trusted proxy servers.

Tune hotlink protection

If you use hotlink protection from the WMSAuth paywall feature set and your viewers use proxies as described above, they will get error 403 and you'll find "cannot find hash match" in Nimble logs. That will happen even if you disable the session IP check.

So the next thing you should do after disabling restrict_session_ip is to use different headers for obtaining the end-user IP in the WMSAuth code on your web page. This can be the X_FORWARDED_FOR header or others, depending on your server and proxy software. Read this article regarding proxy usage to learn more about header usage.

Let us know if you experience any issues with the described features.

Related documentation

  • Nimble Streamer configuration reference
  • WMSAuth paywall, Using paywall with proxy servers

July 25, 2019

Streaming Media European Readers' Choice Awards 2019

Streaming Media Europe magazine has started a vote for European Readers' Choice Awards 2019.
At the moment the voting is closed. If you are a voter, you will receive an email asking you to confirm your votes - please confirm them in order for them to be counted!

Please also take a couple of minutes to participate in the Readers' Choice Awards 2019!

July 24, 2019

RTMP re-publishing control API in Nimble Streamer

Nimble Streamer has its own native status and control API which can be used to interact directly with a server instance rather than going through the WMSPanel control API. This native API is convenient when some action needs to be applied without any delay.

There's a wide RTMP feature set in Nimble Streamer which covers the vast majority of live streaming aspects. The re-publishing use case is very popular for building streaming infrastructures with origins and edges, so our customers have been asking us for native API methods to work with it.

Now you can use new methods to do the following:

  • Get list of re-publishing rules;
  • Get details of selected re-publishing rule;
  • Create new rule;
  • Delete existing rule;
  • Get status of all current re-publishing rules.
You can visit native API reference page to find all details about these methods.
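A hypothetical sketch of how such calls could be wrapped from a shell (the /manage/rtmp/republish routes below are placeholders; the actual routes, payloads and response formats are on the native API reference page):

```shell
# Placeholder URL builder for the re-publishing management calls; consult the
# native API reference for the real routes, payloads and response formats.
BASE="${NIMBLE_API:-http://127.0.0.1:8083}"

republish_url() {
  case "$1" in
    list)   echo "${BASE}/manage/rtmp/republish" ;;          # GET: list rules
    status) echo "${BASE}/manage/rtmp/republish/status" ;;   # GET: rules status
    *)      echo "${BASE}/manage/rtmp/republish/$1" ;;       # rule by id
  esac
}

republish_url list
```

You could then feed these URLs to curl with the appropriate HTTP method once you've confirmed the real routes.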

Notice that re-publishing settings defined by the "create" and "delete" API calls are not persistent: they are reset when Nimble Streamer re-loads. If you'd like to keep them, you should use the WMSPanel control API.

July 3, 2019

Passing Icecast metadata in Nimble Live Transcoder

Nimble Streamer has an advanced audio streaming feature set. Live audio streaming covers transmuxing of Icecast pulled and published streams, audio transcoding in Nimble Live Transcoder and ads insertion via Nimble Advertizer.

Metadata is an important part of Icecast usage. We've previously added appending Icecast metadata to live streams and Icecast metadata passthrough tags support. However, by default Live Transcoder doesn't pass metadata through when transforming audio content.

Our long-term customers StreamGuys, who extensively use our audio streaming features, asked if we could add a passthrough of Icecast metadata while using Live Transcoder. Our clients' feedback has always driven our development, so we keep adding elements which are helpful to the entire audio streaming community.

We're glad to announce that we've improved Live Transcoder to allow setting up metadata passthrough for audio. This feature needs some additional setup so follow the instructions below if you'd like to use it.

1. Set up Nimble configuration

First, this feature needs to be explicitly enabled for the server.

Add this parameter into your nimble.conf file:
icecast_forward_metadata_through_encoder_enabled = true
This reference page describes how you can work with Nimble configuration in general.
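If you prefer to script this step, a minimal sketch could look like the following. The typical config path is /etc/nimble/nimble.conf but it may differ on your install, and Nimble needs a restart after the change.

```python
# Sketch: make sure the metadata-forwarding flag is present in a Nimble
# config file. Restart Nimble after changing the real config.
from pathlib import Path

SETTING = "icecast_forward_metadata_through_encoder_enabled = true"

def ensure_setting(conf_path: str) -> bool:
    """Append SETTING to the config if missing; return True if modified."""
    p = Path(conf_path)
    text = p.read_text() if p.exists() else ""
    if SETTING in text:
        return False
    with p.open("a") as f:
        if text and not text.endswith("\n"):
            f.write("\n")
        f.write(SETTING + "\n")
    return True
```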

2. Add new parameter in transcoding scenario

We assume you're already familiar with Live Transcoder setup and know how to create the transcoding scenario where you need the metadata to pass. You can take a look at the Transcoder website and our set of videos to refresh your knowledge.

To make the output stream carry metadata from an incoming stream, you need to add the icecast_metadata_source custom parameter to the audio encoder settings.

As the encoder element of a transcoder scenario can receive content and other data from any decoded stream, with additional filtering on top, you need to specify which stream is used as the source of your metadata.

Specifying exact name

Take a look at an example of encoder setting.

Here you see the original content's blue decoding element with application name "live" and stream name "origin". The encoder output element has new app and stream names as well as the additional "icecast_metadata_source" parameter with the "live/origin" value, which points to the exact original stream.

Specifying wildcard stream name

Live Transcoder allows using wildcard stream name for the convenience of setup. Take a look at this example.

Here the decoder element has "live" as its app name and "*" as its stream name. This means no specific stream name is set, so any incoming stream name will match. If you look at the encoder settings, you can see "{STREAM}_output" as the stream name, where {STREAM} is a placeholder for the input stream name. So in the example above, if your input is "live/radio", your output will be "live_radio/radio_output".

This wildcard approach is also used for "icecast_metadata_source" parameter. In our example you see "live/{STREAM}" as a value. This means that if your input source is "live/radio", then "live/radio" will be used as a source of metadata.

You can also simplify the setup process further. If your decoder app name and encoder app name are the same, you can skip the app name in parameter value and keep only {STREAM} as shown below.

In this case the application name is "inherited" from the encoder application ("live" in this case) which is equal to source decoder app name.
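To make the substitution rules concrete, here's a small sketch mimicking the documented resolution of the {STREAM} placeholder. It only illustrates the behavior described above; it is not Nimble's actual implementation.

```python
# Mimic how an "icecast_metadata_source" template could resolve for a
# given input stream, per the rules described above.

def resolve_metadata_source(template: str, input_stream: str,
                            encoder_app: str) -> str:
    resolved = template.replace("{STREAM}", input_stream)
    # If the app name is omitted, it is "inherited" from the encoder app.
    if "/" not in resolved:
        resolved = encoder_app + "/" + resolved
    return resolved

# "live/{STREAM}" with input stream "radio" resolves to "live/radio"
print(resolve_metadata_source("live/{STREAM}", "radio", "live"))
# Plain "{STREAM}" inherits the encoder app name, also giving "live/radio"
print(resolve_metadata_source("{STREAM}", "radio", "live"))
```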

3. Apply new setting

Unlike other parameters, metadata passthrough cannot be applied on-the-fly to a transcoding scenario. To make it work, save your scenario and then perform either of these steps:

  • Re-start the input stream
  • Or pause/resume the scenario. Go to scenarios list, click Pause icon, wait for the pause to sync with the transcoder, click on resume icon to start it again.

Once you complete either of those two steps, the metadata will become available in your output.

RTMP Icecast metadata

RTMP protocol is able to carry Icecast metadata and Nimble Streamer can put such RTMP metadata into Icecast streams. However, if you'd like to pass that metadata through Live Transcoder, you need to follow these instructions for proper setup.

Let us know if you have any suggestions or questions about our Icecast feature set, we're open to discussion.

Related documentation

June 30, 2019

2019 Q2 summary

Our team has been working on improvements through Q2 of 2019 so let's see what we've accomplished.


Last quarter we introduced the Qosifire live streaming monitoring service. It allows tracking availability and consistency of HLS, RTMP and Icecast live streams. The HLS support was added this quarter: take a look at this page for more details and also watch this video on our channel showing the feature in action. We've also added debugging capabilities for HLS - you can track DTS/PTS timestamps and see their dynamics in real time. Read this article and watch this video for more details.

If you still haven't tried using Qosifire, read why Qosifire might be useful for you and how you can get started with Qosifire, the step-by-step guide to using our quality monitoring control.

Products snapshots

We've added a couple of new pages in our gallery of Softvelum products snapshots which show how you can use our products for building your streaming infrastructure.

Nimble Streamer and more

Nimble Advertizer, our server-side ads insertion framework, now has SCTE-35 markers support. Ads insertion time points can now be defined by absolute and relative time as well as by SCTE-35 markers.
Also, Nimble Streamer transmuxing engine now supports passthrough of SCTE-35 markers during live streaming.

The Advertizer now also allows using AAC and MP3 containers for advertisement files during audio-only streaming. Take a look at Advertizer tech spec for more details.

We have some updates for SRT users of Nimble Streamer:
  • SRT packet loss for Sender in Listener mode has been fixed. Read this article for more details and upgrade your SRT package.
  • SRT sender "maxbw" parameter is highly useful for preventing excessive bandwidth usage during lost packet re-transmission. We recommend using it for all streams on the sender side. Read this article for more information.
  • SRT "latency" parameter is also very important for re-transmission control and should be used along with "maxbw". Read this article to find out more.

If SRT technology is part of your streaming infrastructure, please read those articles to be up-to-date with latest improvements.

Nimble Streamer now supports RTMP republishing into Limelight Networks, Inc. Realtime Streaming service, take a look at Realtime Streaming Guide for setup details.

We've also improved RTMP to support HEVC for Nimble Streamer and Larix Broadcaster. This is an experimental non-standard feature available in limited number of players.

Starting from May 1st, Facebook deprecates plain RTMP and requires RTMP over SSL (RTMPS). We're glad to confirm that Nimble Streamer and Larix Broadcaster have full support for RTMPS.

Streaming to social media becomes more and more popular so we've made a couple of articles about that.
We have also collected all documentation for Larix applications in documentation reference page.

Another big update of Larix Broadcaster for iOS and Android and Larix Screencaster for Android is the release of adaptive bitrate (ABR) support, available in 2 modes:
  • Logarithmic descend - gracefully descend from max bitrate down step by step.
  • Ladder ascend - first cut bitrate by 2/3 and increase it back to normal as much as possible.

You can visit the Larix website to see the full feature set.

Last but not least, take a look at The State of Streaming Protocols for Q2 2019 with RTMP declining and HLS going up.

We'll keep you updated on our latest features and improvements. Stay tuned for more updates and follow us at Facebook and Twitter to get latest news and updates of our products and services.

The State of Streaming Protocols - 2019 Q2

Softvelum team continues analyzing the state of streaming protocols. It's based on stats from WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers - there were 4000+ servers on average this quarter. WMSPanel collected data about more than 14 billion views. Total view time for our server products is 2 billion hours this quarter, or 21+ million view hours per day.

Let's take a look at the chart and numbers of this quarter.

You can see the HLS share has increased to 78% with RTMP going down to 7%. People are moving towards the de-facto standard of delivery, and they are also shifting away from RTMP as it declines.

The State of Streaming Protocols - Q2 2019

You may compare that to the picture of Q1 streaming protocols landscape:

The State of Streaming Protocols - Q1 2019 

We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook and Twitter.

If you'd like to use these stats, please refer to this article by original name and URL.

June 21, 2019

Efficient usage of SRT latency and maxbw parameters

We continue educating streaming people on SRT protocol usage. SRT is a streaming protocol which adapts to network conditions and helps compensate for jitter and other fluctuations with an error recovery mechanism that minimizes packet loss. This relies on re-transmission of lost packets, which increases the bandwidth usage between sender and receiver.

We'll take a look at the "latency" parameter, which is tightly related to the re-transmission process.

The "latency" parameter specifies the delay applied when a packet reaches the destination quickly (and is therefore way ahead of its "time to play"). When a packet is lost, this delay gives extra time to recover it before its "time to play" comes. The original delivery takes one round-trip time (RTT), so it takes another RTT to send the correction. And if issues happen again on the way to the receiver, the sender will re-transmit again and again until the packet is correctly delivered, spending time in the process.

So too small a "latency" value may make re-transmission impossible. Suppose the RTT from sender to receiver is 200ms and "latency" is set to 300ms. If the first transmission is lost, the receiver won't be able to get any re-transmission because the latency "window" will close before the re-sent packet arrives.

The SRT community recommends setting "latency" as follows:

  • Your default value should be 4 times the RTT of the link. E.g. if you have 200ms RTT, the "latency" parameter should not be less than 800ms.
  • If you'd like to make low latency optimization on good quality networks, this value shouldn't be set less than 2.5 times the RTT.
  • Under any conditions you should never set it lower than default 120ms.
The "latency" value can be set on either side of the transmission:
  • sender;
  • receiver;
  • both sides.
The effective latency will be the maximum value of both sides: if you don't set it on the sender (leaving the default 120ms) and set 800ms on the receiver, the active value will be 800ms.
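The recommendations above can be summed up in a couple of lines; the helper names here are ours, just to illustrate the arithmetic.

```python
DEFAULT_LATENCY_MS = 120.0  # SRT's built-in default

def recommended_latency_ms(rtt_ms: float, low_latency: bool = False) -> float:
    # 4x RTT by default; no less than 2.5x RTT on good-quality networks,
    # and never below the 120ms default.
    multiplier = 2.5 if low_latency else 4.0
    return max(multiplier * rtt_ms, DEFAULT_LATENCY_MS)

def effective_latency_ms(sender_ms: float = DEFAULT_LATENCY_MS,
                         receiver_ms: float = DEFAULT_LATENCY_MS) -> float:
    # The effective latency is the maximum of the two sides' values.
    return max(sender_ms, receiver_ms)

print(recommended_latency_ms(200))            # 800.0
print(effective_latency_ms(receiver_ms=800))  # 800
```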

We've previously shown how you can control the maximum bandwidth available for transmission over SRT using the "maxbw" parameter. As you can see, you need to be aware not only of the bandwidth on your sender but also of the latency settings.

So we highly encourage you to set both the "latency" and "maxbw" parameters during SRT setup. Otherwise you may face one of these issues:
  • If you set "maxbw" to cover all network problems but leave "latency" too low, it won't tolerate the losses.
  • If you set a proper "latency" without "maxbw", re-transmissions may exhaust your bandwidth.

Both configuration options are available for SRT setup in Nimble Streamer. Please refer to SRT setup article to get familiar with the general setup process. Once you complete all required steps, use custom parameter fields to define appropriate entries as shown below.

The "maxbw" parameter is defined in bytes per second, NOT bits per second. So if you'd like to set the maximum bandwidth to 2Mbps, you'll have to specify 250000 as your designated value. This is an SRT-specific approach to value definition. The "latency" parameter is defined in milliseconds.
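The byte/bit distinction is a common source of confusion, so here's the conversion spelled out (the function name is ours):

```python
def maxbw_from_mbps(mbps: float) -> int:
    # SRT's "maxbw" is in bytes per second: divide the bit rate by 8.
    return int(mbps * 1_000_000 / 8)

print(maxbw_from_mbps(2))  # 250000, i.e. the value to set for a 2Mbps cap
```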

Larix Broadcaster mobile app allows streaming via SRT with defined "maxbw" and "latency" as well.

If you have any questions regarding SRT setup and usage, please feel free to contact our helpdesk so we could advise.

Related documentation

Nimble Streamer SRT feature set, SRT setup in Nimble Streamer, Haivision blog: Reliable Low Latency Delivery with SRT+SLDP

June 13, 2019

SCTE-35 markers in Nimble Advertizer

Nimble Advertizer is a server-side ads insertion framework for Nimble Streamer. Its business logic may be defined by both a dynamic handler and a pre-defined config, you can find full tech specification here.

Ads insertion time points within a live stream can be set in 3 ways:
  1. Specify the exact moment of time in GMT.
  2. Time spots relative to the beginning of the viewing session.
  3. According to SCTE-35 markers from the original stream.
The SCTE-35 markers are part of original streams delivered via MPEG-TS and HLS, which Nimble Streamer is able to accept as input.

Before you start using SCTE-35 markers in Nimble Streamer, you need to enable SCTE-35 processing with the scte35_processing_enabled=true parameter as described in this article.

Here's a brief summary of how Nimble Streamer handles the markers for ads insertion:
  1. You create a handler app which will tell Nimble how to operate ads. This may also be a plain JSON file with static pre-defined settings. The Advertizer spec describes the handler in more detail.
  2. Nimble Streamer processes incoming MPEG-TS and HLS streams with SCTE-35 markers to get the original content.
  3. Nimble Advertizer calls your handler app or static config and gets response with ads scenarios.
  4. Advertizer gets files with ads content to process them via Nimble Streamer according to ads scenarios logic defined by handler response.
  5. If the handler response defines that current stream needs to insert ads according to SCTE-35 markers (by using "time-sync":"scte35" field), Nimble inserts the ads into original media right at the time points specified in SCTE-35 marker.
  6. You can use this file as example of handler response which defines the usage of markers.
  7. End user connects to Nimble and watches media stream containing original content mixed with advertisements.
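As a rough sketch of step 5, a handler response enabling marker-based insertion carries a "time-sync":"scte35" field like below. Only that field is taken from the description above; all other field names and the ad file path are illustrative placeholders, so see the sample files in the Advertizer github repo for the real response schema.

```python
import json

# Hypothetical handler response: only the "time-sync" value is taken
# from the Advertizer description above; other fields are placeholders.
handler_response = {
    "ads": [
        {
            "time-sync": "scte35",          # insert ads at SCTE-35 marker points
            "source": "/var/ads/promo.ts",  # hypothetical ad file
        }
    ]
}
print(json.dumps(handler_response, indent=2))
```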

You can find example of handler response with marker-based insertion as well as other samples in Advertizer github repo. You can read all information regarding Advertizer usage on its website.

You may also insert SCTE-35 markers at any moment to trigger ads insertion or pass them through for further use.

If you find any issues with SCTE-35 ads insertion, please file us a support ticket so we could help you.

Related documentation

Nimble Advertizer, Nimble Streamer live streaming scenarios, SCTE-35 markers passthrough

June 11, 2019

Streaming from Larix Broadcaster to YouTube Live

YouTube Live has become extremely popular lately, and many users of Larix Broadcaster for Android and iOS and Larix Screencaster for Android have started publishing live video there. We get questions about the correct setup of Larix products, so this brief instruction explains the process.

To proceed with YouTube Live you need to get familiar with service setup using this support page. When you set up YouTube Live transmission, you get these settings:

Primary Server URL: rtmp://
Stream Name: username.1234-5678-abcd-efgh
Larix uses a single line for the target URL setup, so you just need to combine the URL and the stream name into one connection URL, like this one:
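As an illustration, combining the two fields is simple string concatenation; the server URL below is a placeholder, not YouTube's actual ingest address.

```python
def larix_connection_url(server_url: str, stream_name: str) -> str:
    # Join the RTMP URL and the stream name into the single line Larix expects.
    return server_url.rstrip("/") + "/" + stream_name

# Placeholder server URL; substitute the Primary Server URL from YouTube.
print(larix_connection_url("rtmp://example.com/live2",
                           "username.1234-5678-abcd-efgh"))
# rtmp://example.com/live2/username.1234-5678-abcd-efgh
```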

From Larix UI perspective the setup is performed with the following steps. We assume you've installed Larix using links from this web page.

Open the app and click on gear icon to see the main menu. Go to Connections then click on New connection menu.

It will open a new connection dialog. Enter any descriptive name for a new connection and then insert your connection URL.

Once you save changes, a new connection will appear in connections list. To use this new connection for further streaming, just click on its respective checkbox.

Once you get on the main video preview screen, you can tap on big red circle to start streaming.

This setup is applicable to Larix Broadcaster for both Android and iOS, as well as to Larix Screencaster for Android.

Visit our documentation reference page for more setup information.

If you have questions regarding Larix Broadcaster or other mobile products, please contact us via our helpdesk.

Related documentation

Softvelum mobile solutions, Larix documentation reference page, Publish from Nimble Streamer to YouTube Live

Streaming from Larix Broadcaster to Facebook Live

Facebook Live has become extremely popular lately, and many users of Larix Broadcaster for Android and iOS and Larix Screencaster for Android have started publishing live to Facebook. We get questions about the correct setup of Larix products, so this brief instruction explains the process.

When you set up Facebook Live transmission on Facebook setup page, you get these settings:

URL: rtmps://
Key: 1310310017654321?s_bl=0&s_sw=0&s_vt=api-s&a=Abw47R4F21234567
Larix uses a single line for the target URL setup, so you just need to combine the URL and the key into one connection URL, like this one:

Notice the rtmps:// prefix and port 443 - Facebook now requires RTMP over SSL for live streaming. Our products have full support for it.

From Larix UI perspective the setup is performed with the following steps. We assume you've installed Larix using links from this web page.

Open the app and click on gear icon to see the main menu. Go to Connections then click on New connection menu.

It will open a new connection dialog. Enter any descriptive name for a new connection and then insert your connection URL.

Once you save changes, a new connection will appear in connections list. To use this new connection for further streaming, just click on its respective checkbox.

Once you get on the main video preview screen, you can tap on big red circle to start streaming.

This setup is applicable to Larix Broadcaster for both Android and iOS, as well as to Larix Screencaster for Android.

Visit our documentation reference page for more Larix setup information.

If you have questions regarding Larix Broadcaster or other mobile products, please contact us via our helpdesk.

Related documentation

Softvelum mobile solutions, Larix documentation reference page, Publish from Nimble Streamer to Facebook Live

June 3, 2019

SRT sender max bandwidth

SRT is a streaming protocol which adapts to the network conditions, helps compensate for jitter, bandwidth fluctuations and has error recovery mechanism to minimize the packet loss.

These excellent features rely on re-transmission of lost packets, which increases the bandwidth usage between sender and receiver. This is why, if network conditions are bad enough, the re-sent packets may consume all available throughput.

Since you cannot control the network environment completely, you should set the maximum bandwidth available for transmission over SRT on every SRT sender. Even if your network is fine now, it's good practice to guard against future fluctuations.

This maximum bandwidth limit is set using the "maxbw" parameter for SRT streaming.
It needs to be set on the sender side.

We recommend setting the maxbw value to twice your stream bandwidth, e.g. if you send 1Mbps, your maxbw should be 2Mbps.
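Putting this rule of thumb together with SRT's bytes-per-second unit, the value can be derived like this (the function name is ours):

```python
def recommended_maxbw(stream_bitrate_bps: int) -> int:
    # Twice the stream bitrate, converted to bytes per second for SRT.
    return 2 * stream_bitrate_bps // 8

print(recommended_maxbw(1_000_000))  # 250000 for a 1Mbps stream (a 2Mbps cap)
```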

We also strongly encourage you to set both "maxbw" and "latency" parameters for streaming via SRT. Please read this article to learn more about this parameter.

The corresponding configuration option is available for SRT setup in Nimble Streamer. First, please refer to the SRT setup article to get familiar with the general setup process. Once you complete all required steps, use the custom parameter fields to define the appropriate entry as shown on the figure below.

Please notice that maxbw is defined in bytes per second, NOT bits per second. So if you'd like to set maximum bandwidth to 1 Mbps, you'll have to specify 125000 as your designated value. This is an SRT-specific approach to value definition.

If you have any questions regarding SRT setup and usage, please feel free to contact our helpdesk so we could advise.

Related documentation

Nimble Streamer SRT feature set, Haivision blog: Reliable Low Latency Delivery with SRT+SLDP

SRT packet loss is fixed

Softvelum team is an active participant in the SRT community, and we follow all updates as well as contribute to the code base.

Recently the SRT library was updated with a commit which fixes packet loss for the case when SRT Sender is working in Listen mode.

This fix is on the master branch already, and it's on its way to the next SRT package release. Nimble Streamer team made changes to SRT Nimble Streamer package so our customers could make proper update and fix the problem for their streaming scenarios.

So if you use SRT Sender in Listen mode, we highly recommend upgrading the SRT package.
This will fix the aforementioned problem and will improve your streaming experience.

If you have any questions, please feel free to contact our helpdesk so we could advise.

Related documentation

Nimble Streamer SRT feature set, SRT streaming setup

May 20, 2019

Support for HEVC over RTMP in Softvelum products

RTMP protocol is widely used for origin delivery, e.g. delivery from encoders to edge servers or delivery between origins and edges. The content codecs defined by the spec are H.264 and VP6.

However, the recent increase in H.265/HEVC usage has inspired third parties to modify RTMP so it can carry this new codec. E.g. ffmpeg has forks with such support.

Thus Softvelum team has added HEVC support for RTMP into our products:

  • Nimble Streamer supports HEVC via RTMP/RTMPS when it's published into Nimble or pulled by it, in order to transmux into HLS, MPEG-DASH and RTSP;
  • Larix Broadcaster allows publishing via RTMP/RTMPS from mobile devices with Android and iOS.

So if you need to use RTMP instead of RTSP, SRT or MPEG-TS for HEVC, you can try this new approach.

Please notice that HEVC support over RTMP is a non-standard experimental feature. To use it properly, both the sender and receiver sides need to support it.

In case of any questions or issues please contact our helpdesk.

Related documentation

Nimble Streamer, Nimble Streamer supported codecs, RTMP support in Nimble Streamer