July 3, 2019

Passing Icecast metadata in Nimble Live Transcoder

Nimble Streamer has an advanced audio streaming feature set. Live audio streaming covers transmuxing of Icecast pulled and published streams, audio transcoding in Nimble Live Transcoder and ads insertion via Nimble Advertizer.

Metadata is an important part of Icecast usage. We've previously added appending Icecast metadata to live streams and Icecast metadata passthrough tags support. However, by default Live Transcoder doesn't pass metadata through when transforming audio content.

Our long-term customers StreamGuys, who extensively use our audio streaming features, asked us to add passthrough of Icecast metadata to Live Transcoder. Our clients' feedback has always driven our development, so we keep adding features that are helpful to the entire audio streaming community.

We're glad to announce that we've improved Live Transcoder to allow setting up metadata passthrough for audio. This feature needs some additional setup so follow the instructions below if you'd like to use it.

1. Set up Nimble configuration


First, this feature needs to be explicitly enabled for the server.

Add this parameter into your nimble.conf file:
icecast_forward_metadata_through_encoder_enabled = true
This article describes how you can work with Nimble configuration in general.
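As a sketch, enabling the feature amounts to appending a single key = value line to the config text. Below is a hypothetical Python helper illustrating an idempotent edit; the actual file path and server reload procedure depend on your installation.

```python
# Hypothetical helper for nimble.conf-style text: appends a setting line
# only if it is not already present. This manipulates config text only;
# writing the file and restarting Nimble are up to your setup.

def ensure_config_line(config_text: str, line: str) -> str:
    """Append `line` to the config text if it is not already there."""
    existing = [l.strip() for l in config_text.splitlines()]
    if line.strip() in existing:
        return config_text
    if config_text and not config_text.endswith("\n"):
        config_text += "\n"
    return config_text + line + "\n"

conf = "### Nimble Streamer configuration\n"
conf = ensure_config_line(conf, "icecast_forward_metadata_through_encoder_enabled = true")
```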

2. Add new parameter in transcoding scenario


We assume you're already familiar with Live Transcoder setup and you already know how to create a transcoding scenario where you need the metadata to pass. You can take a look at Transcoder website and a set of videos to refresh your knowledge.

To make the output stream carry metadata from an incoming stream, add the icecast_metadata_source custom parameter to the audio encoder settings.

Since the encoder element of a transcoding scenario can receive content and other data from any decoded stream, with additional filtering on top, you need to specify which stream serves as the source of your metadata.

Specifying exact name

Take a look at an example of encoder settings.


Here the blue decoder element processes the original content with application name "live" and stream name "origin". The encoder output element has new app and stream names plus an additional "icecast_metadata_source" parameter with the "live/origin" value, which points to the exact original stream.
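To make the shape of this setup concrete, here is a purely illustrative dict mirroring the encoder settings described above. The real configuration is made in the Live Transcoder web UI; the structure below is a hypothetical representation, not Nimble's API, and the output app/stream names are assumptions.

```python
# Illustrative only: a dict mirroring the encoder settings from the example.
# Only "icecast_metadata_source" = "live/origin" comes from the article;
# the other names are placeholders for illustration.

encoder_settings = {
    "app": "live_output",        # assumed output app name
    "stream": "origin_output",   # assumed output stream name
    "custom_parameters": {
        # points at the exact decoded stream whose Icecast metadata is forwarded
        "icecast_metadata_source": "live/origin",
    },
}
```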

Specifying wildcard stream name

Live Transcoder allows using a wildcard stream name for convenient setup. Take a look at this example.


Here the decoder element has "live" as the app name and "*" as the stream name. This means no specific stream name is set, so any incoming stream name is accepted. In the encoder settings you can see "{STREAM}_output" as the stream name, where {STREAM} is a placeholder for the input stream name. So in the example above, if your input is "live/radio", your output will be "live_radio/radio_output".

This wildcard approach also works for the "icecast_metadata_source" parameter. In our example you see "live/{STREAM}" as the value. This means that if your input source is "live/radio", then "live/radio" will be used as the source of metadata.

You can also simplify the setup further. If your decoder app name and encoder app name are the same, you can skip the app name in the parameter value and keep only {STREAM} as shown below.


In this case the application name is "inherited" from the encoder application ("live" here), which equals the source decoder app name.
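The substitution rules above can be sketched as a small function. This mimics the described behavior for illustration; it is not Nimble's actual implementation.

```python
# Sketch of how the {STREAM} placeholder in "icecast_metadata_source"
# resolves, per the rules described above (not Nimble's real code).

def resolve_metadata_source(template: str, encoder_app: str, input_stream: str) -> str:
    """Substitute {STREAM}; inherit the app name when it's omitted."""
    resolved = template.replace("{STREAM}", input_stream)
    if "/" not in resolved:
        # no app given: "inherit" it from the encoder application
        resolved = f"{encoder_app}/{resolved}"
    return resolved
```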

3. Apply new setting


Unlike other parameters, metadata passthrough cannot be applied to a transcoding scenario on the fly. To make it work, save your scenario and then perform either of these steps:

  • Re-start the input stream
  • Or pause and resume the scenario: go to the scenarios list, click the Pause icon, wait for the pause to sync with the transcoder, then click the Resume icon to start it again.


Once you complete either of those steps, the metadata will become available in your output.


Let us know if you have any suggestions or questions about our Icecast feature set, we're open to discussion.

Related documentation


June 30, 2019

2019 Q2 summary

Our team has been working on improvements through Q2 of 2019 so let's see what we've accomplished.

Qosifire

Last quarter we introduced Qosifire, our live streaming monitoring service. It allows tracking the availability and consistency of HLS, RTMP and Icecast live streams. HLS support was added this quarter - take a look at this page for more details and watch this video on our channel showing the feature in action. We've also added debugging capabilities for HLS: you can track DTS/PTS timestamps and see their dynamics in real time. Read this article and watch this video for more details.

If you still haven't tried Qosifire, read why it might be useful for you and how you can get started with Qosifire, the step-by-step guide to using our quality monitoring service.


Products snapshots

We've added a couple of new pages in our gallery of Softvelum products snapshots which show how you can use our products for building your streaming infrastructure.

Nimble Streamer and more


Nimble Advertizer, our server-side ads insertion framework, now has SCTE-35 markers support. So now the ads insertion time point can be defined by absolute or relative time as well as by SCTE-35 markers.
Also, Nimble Streamer transmuxing engine now supports passthrough of SCTE-35 markers during live streaming.

The Advertizer now also allows using AAC and MP3 containers for advertisement files during audio-only streaming. Take a look at Advertizer tech spec for more details.

We have some updates for SRT users of Nimble Streamer:
  • SRT packet loss for Sender in Listener mode has been fixed. Read this article for more details and upgrade your SRT package.
  • SRT sender "maxbw" parameter is highly useful for preventing from excessive bandwidth usage during lost packets re-transmission. We recommend using it for all streams on sender side. Read this article for more information.
  • SRT "latency" parameters is also very important for re-transmission control and it should be used with "maxbw". Read this article to find out more.

If SRT technology is part of your streaming infrastructure, please read those articles to be up-to-date with latest improvements.

Nimble Streamer now supports RTMP republishing into Limelight Networks, Inc. Realtime Streaming service, take a look at Realtime Streaming Guide for setup details.

We've also improved RTMP to support HEVC for Nimble Streamer and Larix Broadcaster. This is an experimental non-standard feature available in a limited number of players.

Starting May 1st, Facebook deprecates plain RTMP and requires RTMP over SSL (RTMPS). We're glad to confirm that our Nimble Streamer and Larix Broadcaster products fully support RTMPS.

Streaming to social media becomes more and more popular so we've made a couple of articles about that.
We have also collected all documentation for Larix applications in documentation reference page.

Another big update of Larix Broadcaster for iOS and Android and Larix Screencaster for Android is the release of adaptive bitrate (ABR). ABR is available in 2 modes:
  • Logarithmic descend - gracefully descend from max bitrate down step by step.
  • Ladder ascend - first cut bitrate by 2/3 and increase it back to normal as much as possible.

You can visit the Larix website to see the full feature set.

Last but not least, take a look at The State of Streaming Protocols for Q2 2019, with RTMP declining and HLS going up.


We'll keep you updated on our latest features and improvements. Stay tuned for more updates and follow us at Facebook and Twitter to get latest news and updates of our products and services.

The State of Streaming Protocols - 2019 Q2

Softvelum team continues analyzing the state of streaming protocols. It's based on stats from WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers - there were 4000+ servers on average this quarter. WMSPanel collected data about more than 14 billion views. Total view time for our server products is 2 billion hours this quarter, or 21+ million view hours per day.

Let's take a look at the chart and numbers of this quarter.

You can see HLS share has increased to 78% with RTMP going down to 7%. People are moving towards de-facto standard of delivery, and they are also shifting from RTMP due to its future decline.

The State of Streaming Protocols - Q2 2019

You may compare that to the picture of Q1 streaming protocols landscape:

The State of Streaming Protocols - Q1 2019 

We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook and Twitter.

If you'd like to use these stats, please refer to this article by original name and URL.

June 21, 2019

Efficient usage of SRT latency and maxbw parameters

We continue educating the streaming community on SRT protocol usage. SRT is a streaming protocol which adapts to network conditions and compensates for jitter and other fluctuations with an error recovery mechanism that minimizes packet loss. This involves re-transmission of lost packets, which increases bandwidth usage between sender and receiver.

We'll take a look at the "latency" parameter, which is tightly related to the re-transmission process.

The "latency" parameter specifies the delay which is used if the packet reached the destination quickly (and therefore it's way ahead of "time to play"). But when the packet was lost, this gives it an extra time to recover the packet before its "time to play" comes. Original packet delivery takes some round-trip time (RTT), so it will take another RTT to send the correction. And if some issues happen again on its way to receiver, the sender will re-transmit it again and again until it's correctly delivered, spending time during this process.

So a "latency" value that is too small may make re-transmission impossible. Suppose the RTT from sender to receiver equals 200ms, and your "latency" is set to 300ms. If the first transmission fails, the receiver won't be able to get any re-transmission because the latency "window" will close before the correction arrives.

The SRT community recommends setting "latency" as follows:

  • Your default value should be 4 times the RTT of the link. E.g. if you have 200ms RTT, the "latency" parameter should not be less than 800ms.
  • If you'd like to make low latency optimization on good quality networks, this value shouldn't be set less than 2.5 times the RTT.
  • Under any conditions you should never set it lower than default 120ms.
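The community recommendations above can be expressed as a small helper, shown here as a sketch (all values in milliseconds; 120ms is SRT's default minimum from the last bullet):

```python
# SRT community latency guidance: 4x RTT by default, 2.5x RTT when tuning
# for low latency on good networks, and never below the 120ms default.

def recommended_latency(rtt_ms: float, low_latency: bool = False) -> float:
    """Return a recommended "latency" value in milliseconds."""
    factor = 2.5 if low_latency else 4.0
    return max(factor * rtt_ms, 120.0)
```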
The "latency" value can be set on any side of transmission:
  • sender;
  • receiver;
  • both sides.
The effective latency will be the maximum of the two sides' values, so if you don't set it on the sender (leaving the default 120ms) and set 800ms on the receiver, the effective value will be 800ms.
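That negotiation rule can be sketched as follows, with an unset side falling back to SRT's 120ms default:

```python
# Effective SRT latency is the maximum of the sender's and receiver's
# settings; an unset side uses the 120ms default.

DEFAULT_LATENCY_MS = 120

def effective_latency(sender_ms=None, receiver_ms=None):
    sender = DEFAULT_LATENCY_MS if sender_ms is None else sender_ms
    receiver = DEFAULT_LATENCY_MS if receiver_ms is None else receiver_ms
    return max(sender, receiver)
```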

We've previously shown how you can control the maximum bandwidth available for transmission over SRT using the "maxbw" parameter. As you can see, you need to consider not only the bandwidth on your sender but also the latency settings.

So we highly encourage you to set both the "latency" and "maxbw" parameters during SRT setup. Otherwise you may face one of these issues:
  • When you set "maxbw" to cover all network problems, the latency can be too low to tolerate the losses.
  • When you set a proper "latency" without "maxbw", re-transmissions may exhaust your bandwidth.

Both configuration options are available for SRT setup in Nimble Streamer. Please refer to SRT setup article to get familiar with the general setup process. Once you complete all required steps, use custom parameter fields to define appropriate entries as shown below.



The "maxbw" is defined in bytes per second, NOT bits per second. So if you'd like to set maximum bandwidth to 2Mbps, you'll have to specify 250000 as your designated value. This is an SRT-specific approach to value definition. The "latency" is defined in milliseconds.

Larix Broadcaster mobile app allows streaming via SRT with defined "maxbw" and "latency" as well.


If you have any questions regarding SRT setup and usage, please feel free to contact our helpdesk so we could advise.

Related documentation


Nimble Streamer SRT feature set, SRT setup in Nimble Streamer, Haivision blog: Reliable Low Latency Delivery with SRT+SLDP

June 13, 2019

SCTE-35 markers in Nimble Advertizer

Nimble Advertizer is a server-side ads insertion framework for Nimble Streamer. Its business logic may be defined by either a dynamic handler or a pre-defined config; you can find the full tech specification here.

Ads insertion time points within a live stream can be set in 3 ways:
  1. Specify the exact moment of time in GMT.
  2. Time spots relative to the beginning of the viewing session.
  3. According to SCTE-35 markers from the original stream.
The SCTE-35 markers are part of original streams delivered via MPEG-TS and HLS, which Nimble Streamer can accept as input.

Here's a brief summary of how Nimble Streamer handles the markers for ads insertion:

  1. You create a handler app which will tell Nimble how to operate ads. This may also be a plain JSON file with static pre-defined settings. The Advertizer spec describes the handler in more detail.
  2. Nimble Streamer processes incoming MPEG-TS and HLS streams with SCTE-35 markers to get the original content.
  3. Nimble Advertizer calls your handler app or static config and gets response with ads scenarios.
  4. Advertizer gets files with ad content and processes them via Nimble Streamer according to the ad scenario logic defined by the handler response.
  5. If the handler response defines that the current stream needs ads inserted according to SCTE-35 markers (by using the "time-sync":"scte35" field), Nimble inserts the ads into the original media right at the time points specified by the SCTE-35 markers.
  6. The end user connects to Nimble and watches a media stream containing the original content mixed with advertisements.
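As a rough illustration of step 5, here is a hypothetical handler response fragment. Only the "time-sync":"scte35" field comes from the text above; every other field name and value is an invented placeholder - see the Advertizer github repo for real samples.

```python
import json

# Hypothetical handler response: "time-sync": "scte35" is the documented
# field; the rest is made up for illustration only.

handler_response = json.dumps({
    "ads": [
        {
            "time-sync": "scte35",           # insert ads at SCTE-35 marker points
            "source": "/var/ads/promo.mp3",  # hypothetical ad file path
        }
    ]
})
```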

You can find an example of a handler response with marker-based insertion, as well as other samples, in the Advertizer github repo.

You can read all information regarding Advertizer usage on its website.

If you find any issues with SCTE-35 ads insertion, please file us a support ticket so we could help you.

Related documentation


Nimble Advertizer, Nimble Streamer live streaming scenarios



June 11, 2019

Streaming from Larix Broadcaster to YouTube Live

YouTube Live has become extremely popular lately, and many users of Larix Broadcaster for Android and iOS and Larix Screencaster for Android have started publishing live video there. We get questions about the correct setup of Larix products, so this brief instruction explains the process.

To proceed with YouTube Live you need to get familiar with service setup using this support page. When you set up YouTube Live transmission, you get these settings:

Primary Server URL: rtmp://a.rtmp.youtube.com/live2
Stream Name: username.1234-5678-abcd-efgh
Larix uses a single line for the target URL, so you just need to combine the URL and the Stream Name (key) into your connection URL, like this one:
rtmp://a.rtmp.youtube.com/live2/username.1234-5678-abcd-efgh
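The combination above is simple concatenation with a slash in between; a tiny sketch using the example values:

```python
# Combine YouTube's Primary Server URL and Stream Name into the single
# connection URL Larix expects (values from the example above).

def make_connection_url(server_url: str, stream_key: str) -> str:
    return server_url.rstrip("/") + "/" + stream_key

url = make_connection_url("rtmp://a.rtmp.youtube.com/live2",
                          "username.1234-5678-abcd-efgh")
```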

From Larix UI perspective the setup is performed with the following steps. We assume you've installed Larix using links from this web page.

Open the app and tap the gear icon to see the main menu. Go to Connections, then tap the New connection menu item.



It will open a new connection dialog. Enter any descriptive name for a new connection and then insert your connection URL.


Once you save the changes, a new connection will appear in the connections list. To use this connection for further streaming, just click its checkbox.


Once you get to the main video preview screen, you can tap the big red circle to start streaming.

This setup is applicable to Larix Broadcaster for both Android and iOS, as well as to Larix Screencaster for Android.

Visit our documentation reference page for more setup information.

If you have questions regarding Larix Broadcaster or other mobile products, please contact us via our helpdesk.

Related documentation


Softvelum mobile solutions, Larix documentation reference page, Publish from Nimble Streamer to YouTube Live


Streaming from Larix Broadcaster to Facebook Live

Facebook Live has become extremely popular lately, and many users of Larix Broadcaster for Android and iOS and Larix Screencaster for Android have started publishing live to Facebook. We get questions about the correct setup of Larix products, so this brief instruction explains the process.

When you set up Facebook Live transmission on Facebook setup page, you get these settings:

URL: rtmps://live-api-s.facebook.com:443/rtmp/
Key: 1310310017654321?s_bl=0&s_sw=0&s_vt=api-s&a=Abw47R4F21234567
Larix uses a single line for the target URL, so you just need to combine the URL and Key into your connection URL, like this one:
rtmps://live-api-s.facebook.com:443/rtmp/1310310017654321?s_bl=0&s_sw=0&s_vt=api-s&a=Abw47R4F21234567

Notice the rtmps:// prefix and port 443: Facebook now requires RTMP over SSL for live streaming. Our products fully support it.
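Since Facebook's key carries query parameters, simple concatenation works here. A quick sketch with the example values, including a sanity check on the RTMPS scheme and port:

```python
# Build the Facebook Live connection URL and sanity-check it uses RTMPS
# on port 443 (values from the example above).

server = "rtmps://live-api-s.facebook.com:443/rtmp/"
key = "1310310017654321?s_bl=0&s_sw=0&s_vt=api-s&a=Abw47R4F21234567"
url = server + key

assert url.startswith("rtmps://"), "Facebook requires RTMP over SSL"
assert ":443/" in url, "Facebook expects port 443"
```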

From Larix UI perspective the setup is performed with the following steps. We assume you've installed Larix using links from this web page.

Open the app and tap the gear icon to see the main menu. Go to Connections, then tap the New connection menu item.



It will open a new connection dialog. Enter any descriptive name for a new connection and then insert your connection URL.


Once you save the changes, a new connection will appear in the connections list. To use this connection for further streaming, just click its checkbox.


Once you get to the main video preview screen, you can tap the big red circle to start streaming.

This setup is applicable to Larix Broadcaster for both Android and iOS, as well as to Larix Screencaster for Android.

Visit our documentation reference page for more Larix setup information.

If you have questions regarding Larix Broadcaster or other mobile products, please contact us via our helpdesk.

Related documentation


Softvelum mobile solutions, Larix documentation reference page, Publish from Nimble Streamer to Facebook Live

June 3, 2019

SRT sender max bandwidth

SRT is a streaming protocol which adapts to network conditions, compensates for jitter and bandwidth fluctuations, and has an error recovery mechanism to minimize packet loss.

These features rely on re-transmission of lost packets, which increases bandwidth usage between sender and receiver. If network conditions are bad enough, the re-sent packets may consume all available throughput.

As you cannot control the network environment completely, you should set the maximum bandwidth available for transmission over SRT for every SRT sender. Even if your network is fine now, it's good practice to guard against future fluctuations.

This maximum bandwidth limit is set using the "maxbw" parameter for SRT streaming.
It needs to be set on the sender side.

We recommend setting maxbw to twice your stream bandwidth, e.g. if you send 1Mbps, your maxbw should be 2Mbps.

We also strongly encourage you to set both "maxbw" and "latency" parameters for streaming via SRT. Please read this article to learn more about this parameter.

This configuration option is also available for SRT setup in Nimble Streamer. First, please refer to the SRT setup article to get familiar with the general setup process. Once you complete all required steps, use the custom parameter fields to define the appropriate entry as shown in the figure below.



Please notice that maxbw is defined in bytes per second, NOT bits per second. So if you'd like to set maximum bandwidth to 1 Mbps, you'll have to specify 125000 as your designated value. This is an SRT-specific approach to value definition.
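The two rules above (bytes per second, and roughly twice the stream bitrate) can be sketched together:

```python
# "maxbw" is bytes per second; a safe cap is about twice the stream bitrate.

def maxbw_bytes_per_sec(mbps: float) -> int:
    """Convert a bitrate in Mbps to SRT's bytes-per-second maxbw units."""
    return int(mbps * 1_000_000 // 8)

def recommended_maxbw(stream_mbps: float) -> int:
    """Twice the stream bitrate, expressed in bytes per second."""
    return maxbw_bytes_per_sec(2 * stream_mbps)
```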


If you have any questions regarding SRT setup and usage, please feel free to contact our helpdesk so we could advise.

Related documentation


Nimble Streamer SRT feature set, Haivision blog: Reliable Low Latency Delivery with SRT+SLDP

SRT packet loss is fixed

The Softvelum team is an active participant in the SRT community: we follow all updates and contribute to the code base.

Recently the SRT library was updated with a commit which fixes packet loss for the case when SRT Sender is working in Listen mode.

This fix is already on the master branch and is on its way to the next SRT release. The Nimble Streamer team has updated the SRT package for Nimble Streamer so our customers can upgrade and fix the problem in their streaming scenarios.

So if you use SRT Sender in Listen mode, we highly recommend upgrading the SRT package.
This will fix the aforementioned problem and will improve your streaming experience.

If you have any questions, please feel free to contact our helpdesk so we could advise.

Related documentation


Nimble Streamer SRT feature set, SRT streaming setup

May 20, 2019

Support for HEVC over RTMP in Softvelum products

RTMP protocol is widely used for origin delivery, e.g. delivery from encoders to edge servers or delivery between origins and edges. The content codecs defined by the spec are H.264 and VP6.

However, the recent increase in H.265/HEVC usage has inspired third parties to extend RTMP so it can carry this new codec; e.g. ffmpeg has forks with such support.




So the Softvelum team has added HEVC-over-RTMP support to our products:

  • Nimble Streamer supports HEVC via RTMP/RTMPS when it's published into Nimble or pulled by it, in order to transmux into HLS, MPEG-DASH and RTSP;
  • Larix Broadcaster allows publishing via RTMP/RTMPS from mobile devices with Android and iOS.

So if you need to use RTMP instead of RTSP, SRT or MPEG-TS for HEVC, you can try this new approach.

Please notice that RTMP support for HEVC is a non-standard experimental feature. In order to use it properly, both sender and receiver sides need to support it.

In case of any questions or issues please contact our helpdesk.


Related documentation


Nimble Streamer, Nimble Streamer supported codecs, RTMP support in Nimble Streamer

April 10, 2019

SCTE-35 markers passthrough

Nimble Streamer covers a wide variety of live streaming scenarios for numerous use cases. Some of them relate to advertising solutions involving SCTE-35 markers, which need to be delivered through Nimble Streamer without interruption or alteration.

Following requests from our clients, we added support for passing SCTE-35 markers through from incoming MPEG-TS and HLS streams into outgoing MPEG-TS and HLS delivery. So if your original stream has markers in it, they will be passed through into the outgoing stream.

To enable this feature you need to add the following parameter into Nimble Streamer config file:
scte35_processing_enabled = true
You can read this page to find out how exactly you can make changes to Nimble Streamer config.
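Nimble's config uses simple "key = value" lines, so verifying the flag programmatically is straightforward. Here is a minimal sketch of such a check; Nimble's own parser may differ in details like comments or quoting.

```python
# Minimal parser for nimble.conf-style "key = value" text, useful for
# checking that scte35_processing_enabled is in place (sketch only).

def parse_nimble_conf(text: str) -> dict:
    settings = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

conf = parse_nimble_conf("# ads markers\nscte35_processing_enabled = true\n")
```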

If you find any issues with SCTE-35 passthrough, please file us a support ticket so we could help you.

Nimble Advertizer


Nimble Advertizer, our server-side ads insertion framework, has full support for ads insertion with SCTE-35 markers. Take a look at its tech spec and demo page.

Related documentation


Nimble Streamer live streaming scenarios, Nimble Streamer configuration file, Nimble Advertizer

March 31, 2019

2019 Q1 summary

The first quarter of 2019 brought a new product and several interesting updates from the Softvelum team.


Qosifire live streaming quality monitoring service

Qosifire is a web service for monitoring live streams' availability and quality.
This is our approach to quality monitoring for a variety of streaming companies which need a simple, powerful and reliable solution.

Qosifire has 3 basic components:

  • Agent software is installed on a customer's server to check streams;
  • Our web service console collects data from agents;
  • Free mobile apps notify users' devices about alerts and show stats.

At the moment Qosifire supports Icecast audio streaming quality monitoring. We are working on HLS quality monitoring for video and audio - stay tuned for our updates.
As for pricing, Qosifire is a SaaS with a simple and affordable cost structure.


Now let's get back to the flagship product - Nimble Streamer.

For those of our customers who are working with VOD, take a look at the video discussion between Jan Ozer of Streaming Learning Center and Yury Udovichenko of Softvelum, called Dynamic packetizing: pros/cons/recipes, covering the pros and cons of dynamic packetizing of live, VOD and DVR content. Your comments on the video are welcome!

As for live streaming, you may remember that in 2017 Adobe announced they would stop Flash technology support by the end of 2020. For all live streaming companies this means the decline of RTMP end-user "last mile" streaming.
We've released an article called Get ready for Flash farewell and RTMP decline which describes the timeline for that decline and the alternative our team created to solve the problem - SLDP, a low latency live streaming technology. Read this article and let us know your thoughts on it.

Meanwhile our customers keep using RTMP for "first mile" delivery of live streams, as well as for transferring streams between origin and edge nodes of their delivery infrastructure. Using that protocol, people care about the security of delivery over public networks, so we kept receiving requests to add RTMPS (RTMP over SSL) support to our products. So we did.

RTMPS is now supported in Nimble Streamer in all delivery modes: publishing, reception and playback.
In addition, RTMPS and RTSPS are now supported in Larix Broadcaster application and mobile SDK for Android and iOS.
Moreover, RTMPS is supported in SLDP Player to allow not just secure publishing but also secure playback where needed.

Those of our live streamers who use Nimble DVR for recording and playback might find the new API for DVR export to MP4 very convenient.

Speaking of convenience, take a look at server tags in WMSPanel. We made that for those who have a large number of servers and would like to access them more easily through the panel.


Last but not least, take a look at The State of Streaming Protocols for Q1 2019, with RTMP declining and HLS going up.


We'll keep you updated on our latest features and improvements. Stay tuned for more updates and follow us at Facebook and Twitter to get latest news and updates of our products and services.

The State of Streaming Protocols - 2019 Q1

Softvelum team continues analyzing the state of streaming protocols. It's based on stats from WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers - there were 3800+ servers on average this quarter. WMSPanel collected data about more than 14 billion views. Total view time for our server products is nearly 1.7 billion hours this quarter, or 18+ million view hours per day.

Let's take a look at the chart and numbers of this quarter.

You can see HLS share has increased to 74% with RTMP going down to 12%. It's early to say for sure but this might be one of the first signs of shifting from RTMP due to its future decline. We'll see the progress through the year.

The State of Streaming Protocols - Q1 2019 

You may compare that to the picture of 2018 streaming protocols landscape:

March 17, 2019

Server tags in WMSPanel

WMSPanel has been the best web service for media server reporting for many years. We're improving our reporting and monitoring capabilities all the time.

Following the feedback from our customers we've improved our servers management page.

Now you can assign any number of tags to each media server instance and filter by those tags in the servers list. This especially helps customers with a significant number of servers to distinguish them by their functions and purposes.

To define the tags for a server, go to the servers list (Servers top menu). Here you can click the gear icon and select the Edit item, or click the server name and then click Edit on the newly opened page. You will see the server details page.


The Tags edit field will allow entering multiple tags.

Once you save the data, you will see the server details. Back in the servers list, you will see the newly added tags as well as "All" and "No tags" items. You can click any number of tags to show only those servers which have the selected tags. Clicking "All" resets the filter and shows all servers. "No tags" shows those servers which currently have no tags.
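The filtering logic described above can be sketched roughly as follows. One assumption is labeled in the code: selecting several tags is treated as matching servers carrying any of them, and "No tags" matches servers with an empty tag list; WMSPanel's exact semantics may differ.

```python
# Sketch of tag-based server filtering (assumption: multi-tag selection
# matches servers carrying ANY selected tag; WMSPanel may differ).

def filter_servers(servers, selected_tags=None, no_tags=False):
    """servers: dict of server name -> set of tags."""
    if no_tags:
        return [name for name, tags in servers.items() if not tags]
    if not selected_tags:  # "All": no filter applied
        return list(servers)
    return [name for name, tags in servers.items() if tags & set(selected_tags)]

fleet = {
    "edge-1": {"edge", "eu"},
    "origin-1": {"origin"},
    "test-box": set(),
}
```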



We believe this will help you work with your servers more conveniently.

Related documentation


WMSPanel media servers reporting

February 20, 2019

DVR export to MP4

The DVR feature set in Nimble Streamer provides stream recording and playback. Basic playback is available via a direct HLS or MPEG-DASH link as described in this article.

However, you may need a "hard copy" of a recording - an MP4 file. It can now be exported via a Nimble API call. Let's see how you can use it.

First, you need to set up Nimble Streamer to process direct API calls. Open the API description page and read the "Starting point: enable API access" section regarding the initial setup. You may also want to secure your calls as described in the "Option: Making authorized requests" section.

Second, you need to use this HTTP GET call to obtain the archive:
/manage/dvr/export_mp4/<app>/<stream>[?start=<utc_time>&end=<utc_time>]
The resulting MP4 file content will be returned in the HTTP response body.

The "start" and "end" parameters allow defining the time frame which is going to be exported.


This command will get a portion of your archive:
curl -o archive.mp4 -v "http://127.0.0.1:8082/manage/dvr/export_mp4/live/stream?start=1542708934&end=1542712534" 

This one will download the entire archive: 
curl -o archive.mp4 -v http://127.0.0.1:8082/manage/dvr/export_mp4/live/stream

As mentioned above, the result will be returned as a body of HTTP response.
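Building the export URL from the call format above is plain string work; a sketch using the values from the curl examples (host and management port are from those examples - adjust them for your setup):

```python
# Build DVR export URLs matching the curl examples above. "start"/"end"
# are UTC Unix timestamps; omitting them exports the whole archive.

def dvr_export_url(base, app, stream, start=None, end=None):
    url = f"{base}/manage/dvr/export_mp4/{app}/{stream}"
    if start is not None and end is not None:
        url += f"?start={start}&end={end}"
    return url

partial = dvr_export_url("http://127.0.0.1:8082", "live", "stream",
                         start=1542708934, end=1542712534)
full = dvr_export_url("http://127.0.0.1:8082", "live", "stream")
```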



If you have any questions on this or other features, let us know.

Related documentation


Nimble Streamer DVR feature set, DVR setup for Nimble Streamer, Live streaming scenarios, Usage snapshots

February 10, 2019

RTMPS: SSL support for RTMP in Nimble Streamer

Today RTMP still remains the de-facto industry standard for publishing live streams from encoders and delivering them between nodes over the Internet. Nimble Streamer has full support for this protocol in all 4 delivery modes: publishing, receiving published streams, providing playback and pulling streams.

With a constant need for secure data transmission, our customers requested more security for RTMP delivery. So now Nimble Streamer supports RTMPS, which is RTMP delivery over SSL, in all 4 modes.

If you're not familiar with RTMP usage, please refer to RTMP feature set digest page.
If you already have some experience with Nimble Streamer setup, please follow the steps below to make adjustments to your current setup and use SSL for RTMP.

Set up SSL certificate


This step is required if your Nimble Streamer server will be providing RTMPS playback.

You need to set up SSL certificates via nimble.conf. Read this article to see what steps need to be followed.

Receiving RTMPS


Setup is done the same way as for un-encrypted RTMP: add a global or application setting and then define the interfaces to receive the stream.
The only difference is the interface setup: you need to check the Use SSL parameter for the interface in the UI.


If you need to use both RTMP and RTMPS, you need to add 2 ports, one of them with the SSL checkbox checked. The rest of the setup is the same as for un-encrypted RTMP; read this article for an example.

Also notice that Larix Broadcaster supports RTMPS publishing so you can use it for secure delivery from users' mobile devices to your streaming infrastructure.

Re-publishing RTMP over SSL


The re-publishing setup also doesn't differ from the common use case setup, except for specifying the Use SSL parameter for the port as shown below.


The rest of the parameters are the same as for the non-SSL stream setup described here.

Pulling RTMPS streams


Once you have an RTMPS stream available via some URL, you can pull it with Nimble Streamer. RTMPS stream pull is set up the same way as for un-encrypted RTMP. The only difference is the URL: use the rtmps:// prefix and specify the SSL port, e.g.
rtmps://192.168.0.1:443/live/stream
You can see example here:



Other fields are used the same way as usual.

Providing RTMPS playback


Once you have your certificate ready as described in the "Set up SSL certificate" section above, your RTMPS streams will be available just like any other RTMP streams. The only difference is the rtmps:// prefix and the port number in the URL, e.g.:
rtmps://192.168.0.1:443/live/stream
Other than that there's no difference from common playback setup.

Notice that even though RTMP is a great way to publish streams, its playback scenarios will soon be abandoned by the industry due to the discontinuation of Flash support. So if you need real-time latency for live streams, take a look at the Get ready for Flash farewell and RTMP decline article explaining how to handle this shift. However, the pulling and publishing capabilities will not go anywhere, so you can use them as much as you need.


If you need to change the outgoing content, e.g. create multiple streams for ABR, use our Live Transcoder for Nimble Streamer to transform it. It provides high performance with low resource usage.

For other live streaming scenarios, check our live streaming use cases.

Related documentation


  • RTMP feature set
  • Live Streaming features
  • Live Transcoder for Nimble Streamer
  • Pay-per-view for Nimble Streamer

January 30, 2019

Introducing Qosifire - live streaming quality monitoring service

Live streaming is an industry with rigorous quality requirements. If you participate in a live auction or track surveillance, you cannot afford to miss any part of a live stream. Even if you run an online radio station or a sports stream, your viewers must not find themselves in front of a black screen or hear silence.

A lot of companies are building their own streaming infrastructures, which is why QoS (quality of streaming service) and QoE (quality of content experience) are of high importance. Hence the need for tools and services which help monitor the quality of live streams 24/7.

Qosifire is a new approach to quality monitoring for streaming companies which need a simple, powerful and reliable solution. It's a web service for monitoring live streams' availability and quality.

Qosifire consists of these basic parts:
  • Monitoring agent software is set up on your servers. It tracks streams in real time for a number of protocol-specific items and sends the results to the monitoring service.
  • The monitoring service gives you the full picture of your streams' performance. It collects data about streams, tracks issues, provides end-users with access to stats, and sends email and mobile alerts.
  • Mobile applications notify users about alerts via push notifications and give them the ability to track stats.

Icecast is currently the only protocol available for monitoring; we are working on video stream monitoring at the moment.
The current audio monitoring feature set covers all Icecast-specific transfer details at both the network and protocol levels. Qosifire also checks for buffering issues and provides silence detection.

Qosifire is a subscription-based service with a free trial. You can use it at almost full scale and then subscribe with monthly payments.


The Qosifire product set is brought to you by Softvelum, the team behind your favorite Nimble Streamer media server with Nimble Live Transcoder, the WMSPanel reporting panel, a set of mobile streaming solutions and other products you might have used already. All of our experience in high performance and reliability went into Qosifire, and we'll keep rolling out more features during 2019.

Visit the Qosifire website to get more details and start your free trial period.

January 10, 2019

Get ready for Flash farewell and RTMP decline

In 2017 Adobe announced that they would stop supporting the Flash technology at the end of 2020.

This announcement agitated the media streaming industry, as it meant that Flash Player, a critical requirement for consuming RTMP streams in web browsers, would no longer be updated and distributed. Major browser vendors like Google, Mozilla and Microsoft released their timelines for the Flash decline. The common intention is to disable the Flash plugin by default in 2019 and then remove Flash completely in 2020. So with the final release of Flash and its removal from browsers, the majority of Internet users will not be able to play RTMP streams.

It's hard to overrate the impact of Flash and the RTMP streaming protocol on the online streaming industry. Flash gave a flexible way to implement interactive browser applications, while RTMP gave the ability to deliver video and audio to those apps. This tandem boosted the industry, showing the potential of media delivery and consumption over the Internet and opening the door to other technologies. Now we live in a world of HTML5/JavaScript web apps in MSE-enabled browsers, with protocols like HLS, MPEG-DASH and WebRTC delivering the content. As you can see in our quarterly and yearly State of Streaming Protocols reports, HLS now dominates the delivery field with nearly two thirds of traffic.

However, RTMP and Flash remain active, as you can see in the aforementioned report as well. The reason is simple: up until now, RTMP was the best way to deliver ultra-low latency live streams to end-users. Neither HLS nor other HTTP chunk-based protocols could provide a proper level of latency and start-up delay.

A number of areas require the streaming latency and start-up delay to be as low as possible:
  • Watching sports on your device, you don't want to get your picture a minute later than your friends who are watching TV. E-sports falls under the same category: regardless of the nature of the competition, you want to see what's happening right now.
  • Betting and bidding use cases have big money at stake, so you don't want to be even a couple of seconds behind the other participants when making your bet.
  • Law enforcement, security and surveillance scenarios demand a real-time finger on the pulse to enable an immediate response in case of emergency.

So with the fade of RTMP, a lot of people will get a very bad user experience, and many companies are looking for a replacement. Besides the core requirements of latency and delay, new solutions must support new codecs and adaptive bitrate (ABR). RTMP was designed long before these elements became must-haves, so currently they are available mostly in HTTP-based protocols like HLS and MPEG-DASH.

We understand our customers' striving for something better, so our team began creating a technology which would cover all the needs mentioned above. As the transport layer, we chose WebSockets, which is now the industry standard for cross-application interaction. A lot of communication solutions have been using WebSockets for several years, so the choice of the underlying foundation was not hard.

On top of that we built our media delivery layer to cover all the use cases discussed above. This required both server-side and player-side implementations to take full advantage of this new technology for last-mile delivery.

We called it SLDP - Softvelum Low Latency Protocol. Like all of our products, it's built with high performance and low resource usage in mind.

Its basic features are listed below.
  • Sub-second delay between origin and player. This applies both to start-up delay and to streaming latency, and is a crucial part of SLDP.
  • SLDP is codec-agnostic. You can deliver and play whatever codecs your end-user platform supports: currently H.264, VP8, VP9 and H.265/HEVC video with AAC or MP3 audio.
  • ABR support for switching between bitrates according to your network conditions. Switching channels takes just one GOP's time.
  • SLDP is firewall-friendly as it works over standard HTTP/HTTPS ports.

SLDP provides a full feature set for media delivery and consumption, available to you with little effort.

Let’s see what steps you will follow to replace Flash-based RTMP streaming with SLDP.

Input stream processing


The Softvelum Nimble Streamer freeware media server has full server-side support for SLDP. This includes processing incoming streams via any of the supported protocols and codecs.

If you already use Nimble Streamer for processing some live input, then you're all set and good to go. You won't need to do anything else to prepare your content for SLDP streaming.

If you haven’t yet tried live streaming with Nimble Streamer then first start with installing it on your server.

Second, you need to describe the incoming streams. Nimble Streamer is capable of receiving published RTMP, pulling RTMP, pulling and receiving RTSP, processing SRT streams via all existing modes, taking MPEG-TS input, pulling HLS and even Icecast streams. Each method has its own setup procedure; follow the respective links to get it done. All operations are performed via an easy-to-use web UI and take just a couple of minutes to complete.

The most common case is taking a published RTMP stream as input. Once you describe the incoming stream and push it to Nimble, it will appear among the output streams and you'll be able to use it for playback right away. You can also take a look at other RTMP-related articles.

The input codecs can be whatever you have in your source stream; this basically depends on your encoder and the transport protocol capabilities described on this page. At the moment it's H.264/H.265/VP8/VP9 video and AAC/MP3 audio.

If you'd like to use multiple resolutions in your streams, check the ABR setup how-to for the general steps to complete that.

Stream delivery


Once your streams are set up properly via the web UI, you can either use them directly for playback or build a network for further delivery. You may want to consider setting up an origin server to deliver your content to several edges over various networks using appropriate delivery protocols like RTMP, SRT or any other. Take a look at Softvelum usage snapshots to see how you can use SLDP with our other products. You can also check the frequently asked questions to see best practices of SLDP usage and tuning.

Speaking of complex use cases, check our article in the Haivision company blog describing stream delivery from a Haivision encoder via SRT through a delivery network to the end-user player using the SRT and SLDP protocols. It should give a great overview of both protocols' capabilities.

As an extension of Nimble Streamer's capabilities, you can use Amazon CloudFront, which has full support for WebSockets, so SLDP can be delivered via this network seamlessly. Take a look at CloudFront setup for SLDP delivery to learn more about the proper settings. You can also use any other network with WebSockets support and be sure Nimble Streamer will work fine there.

Up to this point the setup wasn't much different from any other solution: you just set up your infrastructure to receive the input and deliver the output to your client.

At the edge of your delivery chain (whether it’s one server or the entire network), you will get a playable SLDP stream URL which you can use on your client, both browser and mobile.
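For reference, an SLDP playback URL uses the WebSocket scheme. The host, port and stream name below are just placeholders - 8081 is a commonly used SLDP port in Nimble setups, but check your own interface settings:
ws://192.168.0.1:8081/live/stream
With SSL enabled on the interface, the same stream would use the wss:// prefix instead.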

Browser playback


Now it's time to start replacing your Flash-based web player with SLDP player.

The SLDP web player is an HTML5 JavaScript app. It connects to the media server via WebSockets, exchanges commands, receives media data, plays it via the browser's MSE engine and provides control over the playback process.

We provide a freeware version of the SLDP web player: you can download it here and set it up for your own use cases. There you can see sample JavaScript code and a description of the player's parameters. It works in any MSE-enabled browser regardless of your platform, e.g. Chrome or Internet Explorer on Windows, Chrome or Firefox on Linux, etc. You can even use it in Android browsers, even though we have a separate app for that platform (see the next section).
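As a minimal sketch of embedding the player on a page (the script file name, element id and parameter names below are illustrative assumptions - the exact API is described in the sample code bundled with the player download):
<div id="player"></div>
<script src="sldp-player.min.js"></script>
<script>
  // Initialize playback of an SLDP stream in the "player" div;
  // wss:// is used when the server interface has SSL enabled.
  var player = SLDP.init({
    container: 'player',
    stream_url: 'wss://your-server:8081/live/stream'
  });
</script>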

If you'd like to deeply customize the look and feel, you can obtain our web player SDK. It's available via a subscription which covers updates and technical support from our SLDP team.

Mobile devices playback


If your users need an iOS or Android playback experience, they either use your own branded native applications by now or have some free apps from the app markets installed.

In the first case, you probably have a mobile app in place, so you'll just need to add another stream receiving engine in addition to RTMP. Softvelum provides an SLDP Player SDK for Android and an SDK for iOS. Both SDKs are available via subscriptions which cover periodic updates and technical support from our mobile development team. With our SDKs, you'll be able to add SLDP processing and playback to your existing apps or create brand new ones. This won't require much effort from your mobile developers, as each SDK includes the easy-to-use code of our free SLDP Player app.

We provide free SLDP Player applications for iOS and Android. Your users can download them for free and play designated URLs. You can also use the player apps to evaluate our SDK capabilities before purchasing a premium subscription. The apps' source code is available as part of the SDK packages too.

Further usage


So at this point you should have your RTMP playback replaced by SLDP, with our solutions covering existing scenarios and also providing more features like new codecs and ABR. This overview article showed the basic steps to follow when considering SLDP as a replacement for RTMP in low latency end-user playback scenarios.

You don't have to make this technology shift right away, as RTMP still works perfectly. However, you should definitely consider researching its replacement with some new technology, and we hope SLDP will be the one you choose for production.

Please check some snapshots of Softvelum products usage to see how you can use SLDP with other products of our company, including content protection, DVR, Live Transcoder and Nimble Advertizer.

If you have any questions regarding our products, feel free to contact our helpdesk.