May 3, 2020

Larix Grove for distributing Larix Broadcaster settings

Usually all connections in Larix Broadcaster are set up in application settings, and you need to add each connection manually.

With Larix Grove you can distribute and import streaming settings using a special URL format.
You create a URL with the Grove wizard and share it via email, a messenger, a web page or a QR code.
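For illustration, a Grove link packs connection definitions into its query string. The scheme below is our sketch of what the wizard produces - treat the names and layout as an assumption rather than a spec, and note that real links percent-encode the JSON part:

```
larix://set/v1?conn[]={"name":"My server","url":"rtmp://example.com/live/mystream"}
```

Here "My server" and the RTMP URL are placeholder values.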

When you tap the link on a web page, Larix Broadcaster launches and asks you to confirm the settings import.

You can also create a QR code; once you scan it, the connection settings will be imported.

You may also open the Settings -> Connections -> Import Larix setting app menu to process links copied manually from an email or a messenger.

Try Larix Grove in action and let us know of any ideas to improve it.

Here's a demo video showing this feature in action.


April 22, 2020

Handling fuzzy FPS to get proper bitrate output

Nimble Live Transcoder allows creating transcoding scenarios with various transformations, including control over output bitrate. Usually you have an input stream with expected parameters like frame rate (FPS) and resolution and want to create one or more outputs with a certain bitrate.

Normally you can define output bitrate using various parameters of encoder libraries, as described for x264 bitrate, constant x264 bitrate, NVENC bitrate and QuickSync bitrate setup in the respective articles.

However, some sources may produce streams with uncertain frame rate. For example, mobile encoders may produce variable frame rate, so when your users stream with apps like Larix Broadcaster you may get streams with unexpected FPS. As a result, if you ingest such a fuzzy stream into a transcoding scenario, the encoder may produce an unexpected bitrate - higher or lower than you've defined in the encoder settings.

You can use two approaches to handle this: adding an FPS filter or defining FPS numerator and denominator parameters for the encoder.

Option A: Add FPS filter


The best way to handle this is to set a definite FPS value using a custom "fps" filter.
If you expect your users to stream at around 25-30 FPS, set the filter to 30 FPS. If you need a higher framerate, set it to a proper value like 60 FPS. This filter uses an optimized algorithm to insert duplicate frames, and encoders are optimized to process such repeating frames, so it will not add any noticeable processing overhead.

A.1 Add video_filter_reset_by_pts_enabled parameter to config

Before setting a filter, add the video_filter_reset_by_pts_enabled = true parameter to your nimble.conf. With the FPS filter, if you have MPEG-TS input (e.g. via SRT) and the input gets a time gap, the output may hang; this parameter avoids that.
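The resulting nimble.conf addition is a single line; restart Nimble Streamer afterwards so the change takes effect:

```
video_filter_reset_by_pts_enabled = true
```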

A.2 Add filter

Here's how you add the filter into a transcoding scenario. Open your scenario and drag a Custom filter onto the pipeline.


Then connect it to other filters and the encoder.
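For reference, the custom filter maps directly to ffmpeg's "fps" filter: set the filter name to fps and the value to the target frame rate. For a 30 FPS target the fields would look like this (the field labels here are illustrative):

```
Filter name:  fps
Filter value: 30
```

In plain ffmpeg notation this is equivalent to fps=30.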


Now, if the bitrate is set up in the encoder element, you'll get the output bitrate you've defined there once you save the scenario.

Option B: Using FPS numerator and denominator


Another option is to set the output FPS numerator and denominator. All Transcoder encoder libraries support them, so just set them as encoder parameters: fps_n for the numerator and fps_d for the denominator. These parameters tell the encoder which numbers to use for further bitrate calculation.

If you need 30 FPS, use fps_n=30 and fps_d=1, as shown below. The same formula applies to 60 FPS: use fps_n=60 and fps_d=1.
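As a sketch, the encoder parameter entries would look like the following; fractional NTSC rates are expressed the same way, e.g. 29.97 FPS as 30000/1001:

```
# 30 FPS
fps_n = 30
fps_d = 1

# 29.97 FPS (NTSC)
fps_n = 30000
fps_d = 1001
```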


In this example, once you save the encoder settings and then save the scenario, the encoder will likely produce 1Mbps output based on the expected frame rate of 30 FPS.
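To see why a wrong FPS assumption skews bitrate, here's a back-of-the-envelope calculation for the 1 Mbps / 30 FPS example above, in plain shell arithmetic (an illustration, not a Nimble command):

```shell
# The encoder derives a per-frame bit budget from target bitrate and expected FPS.
target_bps=1000000     # 1 Mbps target
expected_fps=30
per_frame=$((target_bps / expected_fps))
echo "$per_frame"      # bit budget per frame

# If the source actually delivers only 15 FPS, the same per-frame budget
# yields about half the target bitrate:
actual_fps=15
echo $((per_frame * actual_fps))
```

So an encoder expecting 30 FPS but fed 15 FPS emits roughly 0.5 Mbps instead of 1 Mbps; the FPS filter or the fps_n/fps_d parameters remove that mismatch.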



Let us know if you have any questions about these approaches.

Related documentation


Nimble Live Transcoder, Live Transcoder documentation reference

April 20, 2020

RIST protocol in Nimble Streamer

RIST (Reliable Internet Streaming Transport) is a new protocol for low-latency live video over unmanaged networks. You can find out more about this protocol on the RIST website. RIST is developed and promoted by the RIST Forum, where Softvelum is an affiliate member.

RIST is available via the libRIST open-source library, so Softvelum used it to integrate this technology into the Nimble Streamer media server. We give our customers all delivery options available on the market, and libRIST is now one more delivery protocol in our stack.


Nimble Streamer allows both receiving (Listen and Pull) and sending (Listen and Push) RIST streams.

All supported protocols can be used as input for re-packaging into RIST: SRT, RTMP, RTSP, MPEGTS, HLS and Icecast.

Any RIST input stream can be re-packaged into all supported output protocols: HLS, MPEG-DASH, SRT, RTMP, RTSP, MPEGTS and Icecast, as well as recorded into DVR.

This article describes the steps to set up RIST streaming via Nimble Streamer.

Install Nimble Streamer and RIST package


RIST is available via a separate package for Nimble Streamer. You need to install it in order to use this protocol.
  1. Sign up for a WMSPanel account if you haven't done it yet.
  2. Install Nimble Streamer or upgrade it to the latest version.
  3. Follow the instructions on the RIST page to install the RIST package. The RIST package is currently available only on Ubuntu 18.04 and later versions.
Now you may proceed with the setup.

Receiving RIST streams


Once logged into WMSPanel, click on the Nimble Streamer -> Live streams settings top menu, then choose the MPEGTS In tab.


Click on the Add RIST stream button to open the dialog shown below.

RIST provides two modes for obtaining the stream:

  • Listen sets Nimble to wait for incoming data and process it as soon as it arrives. Your source needs to be set to Push mode.
  • Pull sets Nimble to initiate the source server to start sending the data. Your source needs to be set to Listen mode.
If you select Listen, you'll need to specify the local interface - Local IP and Local port - which Nimble will listen to in order to get a stream. If you'd like it to listen to all interfaces, just set Local IP to 0.0.0.0.
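For comparison with other RIST tools: in libRIST's own URL notation a leading '@' marks the side that listens, so the two modes above roughly correspond to addresses like these (host and port are placeholders; Nimble's UI uses separate fields instead of URLs):

```
rist://@0.0.0.0:5000     # Listen: bind locally and wait for incoming data
rist://203.0.113.5:5000  # Pull: actively connect to the remote sender
```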



Alias is the name that will be used in the incoming streams list.

Also check the Add outgoing stream checkbox and fill in the Application name and Stream name fields if you'd like to automatically create an outgoing stream for further processing. This step is specific to MPEG-TS streams in Nimble.

If you'd like to get the stream in Pull mode, you'll need to specify the Remote IP and Remote port to pull your stream from.


Once you save the setting, it will be applied to the server. If you chose to create an outgoing stream, you can use the breadcrumbs at the top of the settings page to go to the live streams of a specific server (by clicking the "RIST demo" server link in this example):


You can now use this stream for further processing and playback similar to other MPEG-TS processed streams.

Sending RIST streams


We assume that you have a live stream ready to be sent out. If you don't, please refer to live streaming scenarios to see how you can set it up.

Go to the UDP streaming tab to add a new setting.


Click on Add RIST setting to see the dialog below.
RIST allows streaming in two modes:

  • Push will initiate the active sending of the selected channel to the destination IP/Port which is working in Listen mode.
  • Listen will make Nimble Streamer wait for the Pull command from remote server.

Mode is a field to set this behavior.


If you choose Push mode, you'll see Remote IP and Remote port fields to specify the destination.

If you choose Listen, you'll be able to specify the Local IP and Local port to be used for listening for a connection. Just like in the receiving use case, if you'd like it to listen on all interfaces, just set Local IP to 0.0.0.0.


Source application name, Source stream name and their respective PMT PIDs, Video PIDs and Audio PIDs describe where the content is taken from for further transmission. Those streams need to be defined prior to making this setup. You may define multiple sources - in this case the RIST channel will carry multiple streams.

Supported RIST parameters


Nimble Streamer allows using the following parameters. You need to set them on both the sender and receiver sides. You can refer to the RIST documentation for more details and tips on their usage.

  • aes-type is the encryption type; set it only if secret is not empty. Available values are: 0 - none, 128 - AES-128, 192 - AES-192, 256 - AES-256.
  • bandwidth is the RIST recovery bandwidth (Kbit/s), it's 100000 by default.
  • buffer is the maximum RIST recovery buffer size in milliseconds, it's 1000 by default.
  • cname is a manually configured identifier.
  • compression defines compression: 0 for disabled, 1 for enabled.
  • keepalive-interval is the keep-alive timeout in milliseconds, it's 100 by default.
  • miface is the multicast interface name.
  • profile is the RIST profile. Available values are: simple, main (default).
  • reorder-buffer is the reorder buffer size in milliseconds, it's 25 by default.
  • return-bandwidth is the RIST recovery return bandwidth (Kbit/s), it's 0 by default.
  • rist-logging is the logging level. Available values are: quiet (default), info, error, warn, debug, simulate.
  • rtt is the round-trip time (RTT) in milliseconds, it's 500 by default.
  • secret is the encryption password.
  • session-timeout is the session timeout in milliseconds.
If you have any questions about these parameters, please refer to the librist documentation as described below.
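Putting a few of these together, an encrypted main-profile link might use a parameter set like this on both ends (the values are purely illustrative; in WMSPanel each parameter is entered as a name/value pair):

```
profile = main
aes-type = 128
secret = mypassphrase
buffer = 2000
rtt = 300
```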

libRIST


Nimble Streamer uses the librist library version 2.6.5. If you have any questions on RIST parameters usage and other topics, please refer to the RIST documentation and contact the RIST team.


Related documentation


Softvelum Nimble Streamer, MPEG-TS support in Nimble Streamer

April 12, 2020

Nimble Streamer DRM with Widevine, FairPlay, Playready, EZDRM and VCAS

Softvelum team is glad to announce that Nimble Streamer now has full support for a number of Digital Rights Management (DRM) capabilities for live streaming.

The following encryption engines are supported to protect MPEG-DASH:
  • Google Widevine™
  • Microsoft Playready™
Also, Apple FairPlay is supported to protect Apple HLS streams.

The following key management servers are supported:
  • Widevine Cloud Service with key rotation
  • EZDRM™ key management support for Widevine, FairPlay and Playready
Also, the previously supported Verimatrix™ VCAS key management for HLS protection remains fully supported.

DRM configuration is performed via an easy-to-use configuration file which allows defining per-application settings for all available DRM features.

Visit DRM feature page to learn more about DRM setting.

Nimble DRM is part of Nimble Addenda premium package which requires a license accessible via monthly subscription. Addenda covers Nimble DRM and Nimble Advertizer capabilities.

Please read Nimble DRM spec and subscribe for a license to try it in action.

Later on we'll add support for other popular key management systems; let us know which one you need the most.

Related documentation


Nimble Streamer DRM, Nimble Addenda, Nimble Advertizer

April 7, 2020

iCloud support and file operations in Larix Broadcaster for iOS

Larix Broadcaster for iOS now has improved capabilities for recording live stream into local files.

Here are the options you can use for file storage:

  • Local storage available via macOS Finder as described in this article.
  • iCloud Drive
  • Photo Library

You can also split video into sections, as it's usually done in dash cams: the recording is divided into multiple files by length.

Here's how you can set this up.

First, install Larix Broadcaster from the App Store.

Go to app Settings / Capture and recording / Record menu.


Turn on Record stream to automatically record any live stream.

Tap on Storage to select the default storage for recorded videos and screenshots. This will be one of the three options mentioned above.

Split video into sections allows defining the length of the pieces the recording will be split into. It's Off by default.

Once you make recordings, go to Settings / Manage saved files menu.


Here you can long-tap on a file name to move it to the proper destination. You can also tap Edit to perform operations on multiple files. The iCloud tab shows the content of the respective iCloud Drive folder. The recorded or copied files can be found in the respective folders.

Take a look at a brief video overview of this feature.





Let us know if you'd like any improvements to this feature set.

Related documentation


Larix Broadcaster, Larix documentation reference, Softvelum YouTube channel

April 2, 2020

SRT FEC (forward error correction) support in Nimble Streamer

Softvelum is an active adopter of SRT technology and Nimble Streamer has extended support for it.

One of the features introduced in latest SRT library versions is the ability to set custom packet filters for SRT transmission. The first introduced built-in filter is Forward Error Correction (FEC).

Before using this feature, please carefully read the SRT Packet Filtering & FEC documentation in the SRT library github repo.

1. Disclaimer


We assume you are already familiar with SRT setup and usage, and you've successfully used SRT in other scenarios and use cases.

Before proceeding further, set up a test streaming scenario and make sure it works without any filters.

FEC filter is still under development and we've added it per requests from our customers.
Here is what you need to consider before using it:
  • Use the FEC filter feature at your own risk.
  • It may crash the server, so if you face any issues, check Nimble Streamer logs to analyse the problem.
  • Read Known issues section below in case of issues.
  • Try using it with test servers and test streams first, and then move to production only when you make sure it works as expected.

2. Upgrade


In order to use this filter, you must upgrade Nimble Streamer and make sure you have the latest SRT library package.

  1. Nimble Streamer version must be at least 3.6.5-6, use this procedure to upgrade.
  2. SRT library package must be at least 1.4.1; use this page to get upgrade instructions.

Once you upgrade and re-start Nimble Streamer instance, you may proceed to further setup.

3. Setup details


According to the SRT developers team, the FEC filter must be set on both sender and receiver sides, and at least one side should define a configuration for it. In our example we'll define the configuration parameter on the sender.

As mentioned, we assume you've set up your SRT streaming scenario. Let's modify it to set up the sender part.

3.1 Sender


Go to the "UDP streaming" tab on the "Live streams settings" page and open your SRT setting. Scroll down to the parameters list and add a new "filter" parameter with a value you find appropriate, as shown in the screenshot below.



We use "fec,cols:10,rows:5" there just as an example, but you can use any other value appropriate for your case; please refer to the FEC documentation to learn more.

As you can see, we also use the latency and maxbw parameters as described in this article; we highly recommend always using them as well.

3.2 Receiver


Now, on the receiver side, you need to define the "filter" parameter with the "fec" value. Notice that you don't need to set more details because you've defined them earlier on the sender side.

In case of Nimble Streamer setup, go to "Live streams settings" page, "MPEGTS In" tab and add incoming stream. Then enter "filter" parameter with "fec" value as shown below.
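In parameter form, the two sides of this example come down to the following pair of settings:

```
# Sender (UDP streaming tab, SRT setting)
filter = fec,cols:10,rows:5

# Receiver (MPEGTS In tab, SRT stream)
filter = fec
```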



This was an example of FEC usage in Nimble Streamer.

4. Known issues


In case of any issues, please check Nimble Streamer logs to get more details for further analysis.

As this feature is under development, it has a number of issues. We've faced one of the issues during testing.

When FEC is enabled and lots of packets are dropped, it does not recover. If a large drop is simulated on the line, SRT gets itself into a state where it's no longer transmitting packets, and the following message is displayed:
16:40:22.279871!W:SRT.c: FEC: LARGE DROP detected! Resetting all groups. Base: %1583134490 -> %1583139490(shift by 5000).
It's still not fixed as of April 3rd, 2020.

So if you find this error in your Nimble Streamer logs, just disable FEC filter for now.

This is also our advice for any other issues related to FEC: if you face any uncertainty, just remove the FEC filter and use SRT without it, as we won't be able to help with problems inside the filter itself.

You can look for existing issues and solutions in SRT issues on github. You can post your questions there in case of concerns with FEC filter.

Related documentation


SRT support in Softvelum products, SRT in Nimble Streamer

March 30, 2020

2020 Q1 summary

The first quarter of 2020 brought a lot of disruption into the lives of billions of people. The unprecedented global measures to reduce the harm from the pandemic require many businesses to move online, work remotely and use live streaming more intensively.

Softvelum is fully committed to providing the best support to our customers, as always. Since the early days of inception, we have been working remotely full-time, building and adjusting our business processes to keep efficiency while expanding our team. Now that we have to self-isolate in order to stay healthy, we keep doing what we've done through all these years, keeping our support at the same high level. Feel free to contact our helpdesk and take a look at the list of our social networks at the bottom of this message to stay in touch with us.

With the extreme rise of interest in live streaming, we keep working on new features. Here are this quarter's updates we'd like to share with you.



Mobile products

Mobile streaming is on the rise now, so we keep improving it.




SRT

SRT protocol is being deployed into more products across the industry. Our company was among the first to implement it into our products, and now we see more people building their delivery networks based on this technology. So we've documented this approach:
  • Glass-to-Glass Delivery with SRT: The Softvelum Way - a post for the SRT Alliance blog about building delivery from a mobile device through Nimble media server into a mobile player.
  • Glass-to-glass SRT delivery setup - a post in our blog describing setup full details.
  • All of our products - Nimble Streamer, Larix Broadcaster and SLDP Player - now use the latest SRT library version 1.4.1.
  • Just in case you missed it, watch the vMix video tutorial for streaming from Larix Broadcaster to vMix via SRT, which can also be used as the source for such a delivery chain.


Live Transcoder

We are continuously improving Live Transcoder so this quarter we made a number of updates to make it more robust and efficient. Here are the latest features we've made.

  • You can now create transcoding pipelines based only on NVENC hardware acceleration which works for Ubuntu 18.04+. Read this setup article for more details.
  • FFmpeg custom builds are now supported. This allows using additional libraries that are not supported by Transcoder at the moment. Read this article for setup details.
  • Transcoder control API is now available as part of WMSPanel API. It's a good way to automate some basic control operations.


Nimble Streamer

Read the SVG News article about how Riot Games built their streaming infrastructure with various products, including Nimble Streamer.

A number of updates became available for Nimble Streamer this quarter.

Also, take a look at the State of Streaming Protocols for 2020Q1.


If you'd like to get our future news and updates, please consider following our social networks. We've launched a Telegram channel recently and we now make more videos for our YouTube channel. As always, our Twitter, Facebook and LinkedIn feeds keep showing our news.



Stay healthy and safe, our team will help you carry on!

The State of Streaming Protocols - 2020 Q1

Softvelum team keeps tracking the state of streaming protocols. It's based on stats from WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data about more than 17.8 billion views. Total view time for our server products is 3.04 billion hours this quarter, or 33+ million view hours per day.

The State of Streaming Protocols - Q1 2020

You can compare these numbers with metrics from Q4 2019:

The State of Streaming Protocols - Q4 2019

You can see a slight decrease of HLS share, with views shifting to progressive download and MPEG-DASH.

We'll keep tracking protocols to see the dynamics. Check our updates at Facebook, Twitter, Telegram and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

March 22, 2020

Building NVENC-only pipeline with Nimble Transcoder

Live Transcoder for Nimble Streamer provides a wide feature set for transforming live content using both software libraries and hardware acceleration.

NVidia NVENC has always been fully supported in Live Transcoder for decoding and encoding, but all filtering operations were performed on the CPU. That caused extra resource usage for transferring processed frames between CPU, GPU and RAM.

Nimble Live Transcoder now allows building transcoding pipelines which are performed completely with NVidia GPU hardware acceleration. This is done using specific FFmpeg libraries which we use in addition to our own code.

We'll show you how to set up this NVENC-powered processing chain.

1. Installation and initial setup


We assume you've already set up Nimble Streamer to receive an incoming live stream and tested basic streaming. In our example we'll use a stream whose application name is "input" and stream name is "source".

If you're not familiar with Live Transcoder, take a look at Transcoder documentation reference.

Notice that the described functionality is available on Ubuntu 18.04 only. We'll support other upcoming LTS Ubuntu releases as well.

The basic steps to make NVENC working are as follows:

  1. Install the latest NVidia drivers on your server.
  2. Create a transcoder license and subscribe for it.
  3. Install Live Transcoder add-on.
  4. Create some simple scenario with CPU transcoding (e.g. downscale your stream to 240p). This way you'll make sure the transcoder was set up properly.
If you already have Transcoder installed, please run these commands to upgrade the package:
sudo apt-get update
sudo apt-get install nimble-transcoder

Now create a new scenario to start a new pipeline setup.

2. Decoder setup


Once you create a new scenario, drag and drop a blue decoder element onto the dashboard. There you need to specify "NVENC-ffmpeg" in the Decoder field.


Once the incoming stream is received, Nimble Transcoder will use the proper NVDEC/CUVID FFmpeg decoder: h264_cuvid, hevc_cuvid or mpeg2_cuvid. Each decoder has its own set of options in case you'd like to fine-tune it or use its extended feature set.

The GPU core number from the GPU field will be used throughout the pipeline you create, so all further filters and encoders will recognize the source GPU core and execute their transformations there.

One feature shared by all these decoders is the ability to resize the frame during decoding. This operation is highly optimized and you can use it to reduce further resource usage. It's available via the "resize" parameter as shown in the picture below.


This feature is especially helpful when you have FullHD stream input and need to downscale it further. That resolution requires a lot of resources to handle, so if you make an initial downscale to HD or an even lower resolution, all further operations will consume less RAM and processing power on the GPU.

Notice that all forwarding features (subtitles and SCTE-35 markers forwarding) mentioned at the bottom of the dialog will work regardless of the decoding option you choose.

Now let's set up filtering.

3. Filtering


Once the frame is decoded, you can process it via a set of ffmpeg filters that work within the NVENC pipeline.
Nimble Transcoder supports a number of those; here are the most frequently used:

  • "split" - allows creating several identical outputs from input video. It's available as a filter element in a tool box of Transcoder UI.
  • "scale_npp" performs frame scaling. You add a custom filter to your scenario, set its name to "scale_npp" and its value to resolution, e.g. "854:480" or "640:360".
  • "fps" is a filter which sets the frames per second value. It's also defined via custom filter.
Notice that scale_npp can have only one output.

Notice that the regular Scale filter from the UI toolbox, like other regular ffmpeg filters, will not work with GPU-decoded frames, because the processing is done internally on the GPU.

However, you can take the frame out of GPU and process it separately using "hwdownload" and "hwupload_cuda" filters. To add them, add a custom filter, set its name as mentioned and leave the value field empty. Your steps will be as follows:

  1. Add "hwdownload" to get the frame from GPU.
  2. Add "format" custom filter with "nv12" value to set proper frame format.
  3. After that you can use regular FFmpeg filters.
  4. Then add "hwupload_cuda" filter to put it back into GPU processing pipe.

Notice that it will increase RAM/CPU usage so use it only if you need to do something you cannot do on GPU.
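In ffmpeg's filter-chain notation the steps above form a sandwich around the CPU-side work; in Nimble's UI each element is a separate custom filter, but the equivalent chain would look like this (drawtext is just a placeholder for whatever CPU-only filter you need):

```
hwdownload,format=nv12,drawtext=text='demo',hwupload_cuda
```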

Let us know if you need information about other filters.

4. Encoder setup


Having the content transformed via filters, you can now encode it. Add encoder element to your scenario and select "FFmpeg" in "Encoder" field.

Then define "Codec" field as either h264_nvenc or hevc_nvenc - for H.264/AVC or H.265/HEVC codecs respectively.


You can use any parameters applicable to the h264_nvenc or hevc_nvenc encoders.

For h264_nvenc the most popular parameters are:
  • "b" defines bitrate. Example: "4.5M" for 4.5 Mbps.
  • "profile" defines encoding profile, its possible values are "baseline", "main", "high", "high444p".
  • "preset" stands for encoding preset, its values are "default", "slow", "medium", "fast", "hp", "hq", "bd", "ll", "llhq", "llhp", "lossless", "losslesshp".
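A typical parameter set for an output rendition might then look like this (the values are illustrative, combined from the options above):

```
b = 4.5M
profile = high
preset = llhq
```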


For more encoder settings, refer to FFmpeg documentation.

Just like in the decoder element, all forwarding features listed under Expert setup at the bottom of the dialog will work properly.

5. Audio setup


Once the video pipeline is set up, you need to define the audio part. If you don't need any sound transformation, you can add a passthrough, just like it's described in other setup examples.

6. Example


We've made a video showing an example of the setup process, take a look:



Here's what we set up there:

  • A decoder has a downscale to 720p as described in section 2 above.
  • A split filter which has 3 equal outputs.
  • One output goes directly to the encoder. It takes the downscaled frame and simply encodes it into live/stream_720 output. The encoding parameters are similar to what you see in section 4.
  • Another output is processed via the Scale_npp filter which scales it to 480p. That filter is described in section 3. Its output is encoded to the live/stream_480 output stream.
  • One more output of split filter goes through "Scale_npp" (to scale to 360p) to "Fps" filter which sets its "fps" value to "25". Then it's encoded into live/stream_360 output.
  • Audio input is passed through for all 3 available output renditions.

This scenario uses only NVENC capabilities for video processing. The output streams are then transmuxed into the output streaming protocols which you select in global server settings or specific settings for "live" application.

Later on we'll introduce a video tutorial showing this scenario creation step by step.

If you have any questions or issues, please feel free to contact us.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder documentation reference

March 16, 2020

SLDP Player setup for Android

SLDP Player is an application which allows playing SLDP, SRT, Icecast, RTMP, HLS and MPEG-DASH on Android devices.

In this article we'll overview all settings of SLDP Player.

You can install it from Google Play, and once it's installed, you'll see the connections menu, which will initially be empty.


Click on the plus button to open a dialog for creating a new connection.


The Name field sets the name of the current connection in the connections list.

URL is the field where you define your connection. Each connection has a common URI structure like protocol://server:port/app/stream/some-suffix, where the port can be omitted to use a default value and the suffix may not be needed:

  • HLS, MPEG-DASH and Icecast streams will have familiar URLs like http://servername:8081/app/stream/playlist.m3u8, https://servername/app/stream/manifest.mpd or https://servername/app/stream/icecast.stream
  • SLDP will have name like sldp://servername:8081/app/stream
  • RTMP will have URL like rtmp://servername:1935/app/stream - notice that stream name is appended to the end of the URL.
  • SRT address will look like srt://servername:1234/ . If you use streamid - see its description below.

Source type for HTTP is used when you play an HTTP-based protocol but the protocol cannot be determined from its parts, like the playlist or manifest name. E.g. a URL like https://servername/live/stream.index can mean both HLS and MPEG-DASH, so in that case you should specify it explicitly. In other cases just leave it as Auto.

SLDP offset parameter allows decreasing start time for streams, read this article for more details.

Buffering defines the size of the buffer used before playback starts. It's used to avoid re-buffering during connection issues.

Synchronized playback is described in this article.

Bitrate for Internet radio is for cases when you use SLDP for transmitting online radio and it has adaptive bitrate (ABR). This parameter defines the default bitrate which is used for starting the playback.

SRT passphrase and SRT pbkeylen are specific to your use case security settings so refer to your server admin for more details.

SRT latency and SRT maxbw are related to data re-transmission in SRT connection. Read this article to understand that better.

SRT streamid field is used only if your data source uses that field for identifying streams.


Once you tap Save, you'll see a new entry in the streams list. In the example below we've saved the SRT playback example from the glass-to-glass SRT delivery article.


Now you can just tap on the name and start watching the stream.

You can also make screenshots. To enable screenshots, long-tap on your connection and then select "Play (enable FX)". Once asked for more permissions - just enable them too. When you start playback, you can double-tap on the screen to get a screenshot.


You can also see that in action in this video.



Take a look at Softvelum Playback solutions and let us know if you have any questions.

March 9, 2020

Glass-to-glass SRT delivery setup

SRT delivery of live streams is gaining momentum as more companies add support for this protocol into their products. Being an SRT Alliance member, Softvelum provides extensive SRT support in various products.

Currently it's possible to create a glass-to-glass delivery with SRT using Softvelum products.


This article describes detailed setup of the following streaming scenario:
  • Content creator is streaming from Larix Broadcaster to Nimble Streamer server instance.
  • Nimble Streamer takes incoming SRT and provides output for playback.
  • Viewer uses SLDP Player for pulling live stream from Nimble Streamer.
To make this work, we'll set up each element of this chain. We'll show the setup within a local network and you can make it work across any network.

1. Set up Nimble Streamer


Before moving forward you need to complete the following steps:


We'll use an instance available via 192.168.0.104 IP address in local network, all mobile devices will be connected to the same local network.

Here's what we'll set up on Nimble Streamer side:

  • Larix Broadcaster app will use SRT in Push (Caller) mode to deliver the content, so we'll set up Nimble in "Listen" mode to receive it.
  • SLDP Player will work in "Pull" mode to retrieve live stream for playback, so we'll set up Nimble output in "Listen" mode to take those requests and respond with content.

So let's set up both these elements.

1.1 Receiving input SRT via Listen


In WMSPanel, go to Nimble Streamer -> Live streams settings top menu, then choose "MPEGTS In" tab.


Now click on Add SRT stream to see a new dialog.


Here you need to choose Listen from the Receive mode drop-down box, enter 0.0.0.0 in Local IP and use a port that is available on your server, like 2020 in this case. The Alias is used for further reference in the UI.

Check the Add outgoing stream checkbox and define Application name and Stream name; this creates the output which we'll use in the next step.

Once you click Save, you'll see this setting being synced to the Nimble Streamer instance.


Now if you click on the MPEGTS Out tab, you'll see that the corresponding output has also been defined for further use.


This is required because SRT uses MPEG-TS as its media transport, which requires this distinction due to the nature of that protocol. You can read more about MPEGTS setup in this article.

From this moment you'll be able to publish a stream into Nimble Streamer, so we have one more setup step left.

1.2 Providing the SRT output via Listen


Go to the UDP streaming tab.


Click the Add SRT setting button to see the following dialog.


Set the Mode field to Listen. Local port is selected from the available ports, 2021 in this case. Local IP is set to "0.0.0.0", which allows accepting requests on all interfaces.

If you want to use Nimble Streamer with connections from outside your network, make sure your firewall and network in general are configured to allow those connections.

Source application name and Source stream name are set to the application and stream names from the MPEGTS Out section above: "srt" and "output". This routes the source content into the SRT output.
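
For reference, the output side of this example uses the following values (the 2021 port is the one the player will connect to in step 3):

```
Mode:               Listen
Local IP:           0.0.0.0
Local port:         2021
Source application: srt
Source stream:      output
```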

In addition, you may define maxbw and latency parameters if you use an uncontrolled network for delivery. Read this article for more details.

Now that we have an instance of Nimble Streamer ready, we can set up the streaming app.

2. Set up Larix Broadcaster SRT streaming


Larix Broadcaster is a free app; you can install it from Google Play and the App Store. You can find the full lists of Larix capabilities on the Android page and the iOS page.

Let's use Larix Broadcaster for Android to set up live streaming to the Nimble Streamer instance. Once you install and launch it, you'll see the preview screen.


Tap the gear icon to enter the settings dialog.


You may keep the default settings or change some parameters, like the resolution in the Video menu. You can discover them by browsing the menus.

Now let's set up the SRT output stream. Go to the Connections menu.


We've previously set up some connections for testing purposes - you can see RTMP and RTSP both checked. This allowed streaming simultaneously to two destinations. We need to add a new one, so tap the New connection menu item.


Here you need to add a name for your connection and then enter the publishing URL. That URL consists of the srt:// prefix, the server address and the port number. In our case it's srt://192.168.0.104:2020/ - the server IP and the port which we used during the SRT setup in step 1.1. You may leave the other options as they are.
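
As a quick sanity check, the publish URL is just the srt:// scheme plus the host and port; a hypothetical helper like this (our own illustration, not part of Larix) shows how it's composed:

```python
# Hypothetical helper (not part of Larix) that composes an SRT publish
# URL from the server address and the port configured in step 1.1.
def srt_url(host: str, port: int) -> str:
    return f"srt://{host}:{port}/"

print(srt_url("192.168.0.104", 2020))  # srt://192.168.0.104:2020/
```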

After saving the setting you will see it in the list. You can un-check the other connections if you want to stream only via the new one.


Now return to the preview screen. Push the big red circle button to start streaming.


The button will change its shape and you will see FPS and streaming length at the top and per-stream stats at the bottom.

Now let's watch this stream on another device.

3. Set up SLDP Player SRT playback


SLDP Player is a solution which allows playing the SLDP low latency protocol on HTML5 pages and provides wide playback capabilities via Android and iOS apps. You can install it from Google Play and the App Store.

We'll use the Android app to demonstrate SRT playback. Once you install it, you'll see the connections menu, which is initially empty.


Tap the plus button to open the dialog for creating a new connection.


Here you will enter a connection Name and a URL. The URL in our case will be srt://192.168.0.104:2021/, where the IP and port are taken from step 1.2 above.

Also read more about additional SRT streaming settings and the general overview of SLDP Player.

Once you tap Save, you'll see a new entry in the streams list.


Now you can just tap on the name and start watching the stream.




That's it. You can change any of the described components for your streaming scenario as well as combine them with other products and features of our company.

Let us know if you have any questions about the described products and setup.

Related documentation


SRT support overview for Nimble Streamer, SRT setup in Nimble Streamer, Larix Broadcaster, Larix Broadcaster docs reference, SLDP Player

March 4, 2020

Using Certbot with Nimble Streamer working on port 80

When you start using Certbot with Nimble Streamer, you may face the case where Nimble Streamer is already running on port 80. We'll show how this part can be handled.

When following the Certbot instructions, at step "4. Choose how you'd like to run Certbot" you need to choose the "No, I need to keep my web server running." option.

This setup assumes serving Certbot's '.well-known' folder with Nimble Streamer.

To make it available, do the following.

1. Create the folder /pub/.well-known and assign nimble as its owner using the following commands:
sudo mkdir /pub/.well-known
sudo chown nimble:nimble /pub/.well-known
sudo chmod 775 /pub/.well-known
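
The permission bits applied by those commands can be illustrated with a small Python sketch; we use a temporary directory here since on the real server the path is /pub/.well-known and the owner is the nimble user:

```python
import os, stat, tempfile

# Recreate the folder layout in a temporary directory and apply the
# same mode bits as `chmod 775` (rwxrwxr-x).
root = tempfile.mkdtemp()
well_known = os.path.join(root, ".well-known")
os.makedirs(well_known)
os.chmod(well_known, 0o775)

mode = stat.S_IMODE(os.stat(well_known).st_mode)
print(oct(mode))  # 0o775
```
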
You can use any folder location, but if you do, change it accordingly in the other steps.

2. Go to the Nimble Streamer -> HTTP origin applications top menu. Click on Add origin application and set .well-known as shown on the screenshot:



3. Choose the Nimble Streamer -> Edit Nimble Routes menu and click on Add VOD streaming route, then set it up as shown on the screenshot.


4. Execute the following command to get certificates, with your_domain_name replaced with your domain name:
certbot certonly --webroot -w /pub -d your_domain_name
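
The reason -w /pub works together with the route above: Certbot drops its challenge files under <webroot>/.well-known/acme-challenge/, so the HTTP request path maps directly onto the folder we created. A sketch of that mapping (the function name and token are hypothetical, for illustration only):

```python
# Hypothetical illustration: map an ACME challenge request path onto
# the folder Nimble serves, assuming the webroot -w /pub used above.
def challenge_file(request_path: str, webroot: str = "/pub") -> str:
    # Certbot writes files under <webroot>/.well-known/acme-challenge/
    return webroot + request_path

print(challenge_file("/.well-known/acme-challenge/token123"))
# /pub/.well-known/acme-challenge/token123
```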

That's it. Now you may proceed with Certbot setup instructions from our original article.

Related documentation


Using Certbot with Nimble Streamer, SSL support for HLS, MPEG-DASH, Icecast, MPEG-TS and SLDP, Paywall feature set

February 26, 2020

Synchronized playback on multiple devices with SLDP

Playing a live stream simultaneously on multiple devices often requires synchronized playback.

The cases look simple:
  • One big screen shows something and viewers need to have the same audio on their individual devices.
  • A second screen application needs to be in sync with an ongoing live stream on TV.
  • Multiple screens in the same room show the same content with single source of sound.
  • A number of surveillance cameras need to be shown on the same page.
You probably know other cases where you might have the same requirement.

With traditional delivery protocols like HLS and MPEG-DASH this is very hard to achieve without dramatically increasing the latency.

The SLDP live streaming protocol allows delivering streams in real time with low latency, adaptive bitrate and small zapping time. Now it also allows synchronizing playback among devices and browsers for all the cases listed above. It's supported on both the server side and the client side.

Web demo. You can see how this feature works on the demo web page with two SLDP players.

Mobile and web demo. Take a look at the feature demonstration below with an HTML5 browser, Android and iOS.


BTW, check our YouTube channel for other videos.



Now let's see how you can start using this feature.

Notice that all implementations use an additional buffer for proper synchronization, which increases latency. This buffer must be the same across all platforms. Check each player platform for the parameter setup.

Enable feature in Nimble Streamer


We assume you are already familiar with Nimble Streamer setup and you have a working SLDP live stream. If not, please read SLDP setup and usage article to make it work.

On your server, edit nimble.conf to add this parameter and re-start Nimble Streamer:
sldp_add_steady_timestamps = true

You can visit Nimble Streamer parameters reference to learn more about operating that config file.

Once you re-start the server, every SLDP live stream will carry the steady clock timestamps needed for playback adjustments. If a connected player doesn't request the steady clock time, Nimble Streamer will not include it in the output stream, to avoid any overhead.

Playback in HTML5 SLDP player


If you want to have a synchronized playback in web browsers, use our freeware HTML5 SLDP player.

By default, the feature is turned off. To enable it, add the sync_buffer parameter, which specifies the buffer size in milliseconds. Recommended values are from 1000 to 5000, and it needs to be the same in all players.

Playback on iOS


SLDP Player for iOS allows enabling and using synchronized playback.
  1. Install SLDP Player from the App Store.
  2. In connection setting, enable Synchronized playback flag as shown on the screenshot below.
  3. Use Buffering field to define the buffer for this feature. As mentioned above, it needs to be the same in all players.

Now save the setting and start playback.

Playback on Android


Android SLDP Player also allows enabling and using synchronized playback.

  1. Install SLDP Player from Google Play.
  2. In connection setting, enable Synchronized playback flag.
  3. Use Buffering field to define the buffer for this feature, it needs to be the same in all players.


Now save connection and start playing the stream.


You may also use this kind of custom URL:
sldp://ap.address:8081/live/stream?steady=true&buffering=2000
to set "Synchronized playback" value to "On" and "Buffering" value to "2000". This will allow passing URL among viewers with no need for additional instructions.
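
The query part of that URL can be parsed like any other URL; a short sketch showing how the two parameters travel in it (the parsing code is our own illustration, not part of the player):

```python
from urllib.parse import urlparse, parse_qs

# Parse the custom playback URL from above and extract the two
# synchronization parameters it carries.
url = "sldp://ap.address:8081/live/stream?steady=true&buffering=2000"
params = parse_qs(urlparse(url).query)

print(params["steady"][0])     # true
print(params["buffering"][0])  # 2000
```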


Once you start playback on multiple devices and browsers with this feature enabled, playback on all of them will catch up with each other.

Let us know how it works for you and what improvements you'd like to have for it.

Related documentation


SLDP technology overview, SLDP support in Nimble Streamer, Softvelum playback solutions

February 17, 2020

Fallback of published RTMP, RTSP and Icecast streams

RTMP, RTSP and Icecast live streams can be pulled by Nimble Streamer for further processing, and in order to improve robustness each pulled stream can have fallback streams. If the primary stream cannot be pulled from the origin for some reason, an alternative stream is pulled as a failover. Playback is not stopped, so the user experience is barely affected.

The aforementioned protocols are often used in publishing mode, when the stream is pushed into Nimble Streamer for processing. For published streams there is no built-in fallback mechanism.

Nimble Streamer provides another reliable mechanism for fallback of RTMP, RTSP and Icecast published streams: the Live Transcoder hot swap feature set. It allows shifting to a secondary stream if the primary one goes down for some reason, while maintaining the playback output for video and audio.

The following steps allow setting this up.

1. Install Live Transcoder


The hot swap feature set requires the Live Transcoder premium add-on for Nimble Streamer.

There are two main reasons why Live Transcoder is needed:

  • The secondary (substitution) stream needs to match the primary (original) stream in video resolution and audio sample rate.
  • The primary stream needs to be decoded in order to perform the substitution smoothly.

You need to obtain a license for Transcoder, then install the add-on and register a license for it.

2. Set up published inputs


You need to have both the primary (original) and secondary (substitution) streams set up and published into Nimble Streamer. If you haven't done that yet, check the articles on RTMP, RTSP and Icecast publication setup.

3. Set up hot swap failover


Having both streams ready and Transcoder installed, you can set up failover hot swap for them. Follow the instructions and make sure you complete all steps.

4. Test the setup


As always, you need to test the setup before using it in production. If you have any questions or issues, please contact our team so we can help.

Related documentation


Live streaming via Nimble Streamer, Failover hot swap, Emergency stream hot swap

February 13, 2020

Live Transcoder control API

Nimble Streamer Live Transcoder is well known for its drag-and-drop web UI which allows setting up live stream transformation of any complexity using any browser.

However, a number of our users need to automate Transcoder operations.

Our team has introduced its first approach to a Transcoder API.

Visit this WMSPanel API page to see all details of API setup and usage.

The operations you can perform on a Transcoder instance are as follows:

  • Retrieve the list of transcoder scenarios
  • Get details of a particular scenario
  • Pause and resume a particular scenario
  • Delete an existing transcoder scenario

So, having a set of scenarios for your servers, you can operate them just like you would from the scenarios list in the UI.

If you need more API calls, please feel free to share your requests via our helpdesk so we can prioritize features in our wishlist.

Related documentation


Nimble Streamer Live Transcoder, Transcoder documentation reference

February 6, 2020

HbbTV MPEG-DASH support in Nimble Streamer

Hybrid Broadcast Broadband TV (HbbTV) has been working with MPEG-DASH for some time now, and the Nimble Streamer MPEG-DASH implementation supports it as well.

To enable this support, a specific profile needs to be added to outgoing DASH streams. This can be done by adding the following parameter to the nimble.conf file:

dash_live_profiles = urn:hbbtv:dash:profile:isoff-live:2012,urn:mpeg:dash:profile:isoff-live:2011

You need to re-start Nimble Streamer after changing the config. Read this page to learn more about operating the config file.

Related documentation


MPEG-DASH support in Nimble Streamer

January 23, 2020

Mobile streaming to Dacast and Akamai

The Larix mobile SDK allows publishing live streams from mobile devices to a wide variety of destinations, such as media servers and streaming services. Some destinations require special handling due to authorization or other concerns.

The Dacast service provides a turn-key solution for live streaming. It uses the Akamai CDN for ingest and delivery, making it simple for an average user to get it working. However, Akamai has its own requirements for authorization and other stream parameters. Nimble Streamer already supports publishing to Akamai, so we've added the same support to Larix Broadcaster.

Here is how you can get it working.

You may also read Streaming with the Larix Broadcaster Mobile App article from Dacast knowledge base to see other details of app setup.

Set up Dacast stream


We assume you already have a Dacast account, so just open the dashboard and add a new stream.



Click on the stream name to see its full setup details. Click the Encoder tab on top to see the encoder setup details.

Click on Other RTMP encoder to see the full parameters of the RTMP connection.



Here you see Login and Password values which you will use later in Larix.

Now click the "Click if your encoder has one field" link to see a field with the full URL for publishing.


Copy this full URL for later use; it should look like this:
rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456
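
For reference, that URL breaks down into a server address, an application name and a stream name; a small sketch of the anatomy (the splitting logic is our own illustration, not something Larix requires):

```python
from urllib.parse import urlparse

# Split the Dacast/Akamai publishing URL into its parts for illustration.
url = "rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456"
parsed = urlparse(url)
app, stream = parsed.path.lstrip("/").split("/", 1)

print(parsed.hostname)  # p.ep123456.i.akamaientrypoint.net
print(app)              # EntryPoint
print(stream)           # dclive_1_150@123456
```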

While you're in the Dacast dashboard, check the Publish settings tab to get a player page so you can check the result of your setup later.

Now let's get to Larix setup.

Set up Larix Broadcaster


Larix Broadcaster is available for both Android and iOS platforms so just install it as usual.

Open the app and enter settings by tapping the gear icon.

Tap on Connections -> New connection to enter a form below.



  • The Name field can contain any alias you want for your connection. Larix Broadcaster allows streaming to multiple destinations simultaneously, so this is how you will distinguish them from one another.
  • URL field defines the target destination. Insert the URL which you copied in the previous section.
  • Target type must be set to Akamai/Dacast.
  • Login and Password need to be exactly as you've seen them in the Dacast connection settings.

Save the connection, get back to the connections list and make sure you select this new connection.
Now return to the preview screen and just hit the red button to start streaming.

Now check the Dacast player page from the previous section to watch the results.

Akamai


This setup procedure applies the same way to publishing to the Akamai CDN via RTMP. The publishing URL will have the same structure with the same type of credentials. The Larix Broadcaster target type is also "Akamai/Dacast". Please refer to the Akamai website to learn more about its setup.



If you have any issues with this feature set, just let us know.

Related documentation


Larix mobile apps and SDK, Nimble Streamer RTMP feature set, Publishing to Akamai from Nimble Streamer