March 30, 2020

2020 Q1 summary

This first quarter of 2020 brought a lot of disruption into the lives of billions of people. The unprecedented global measures to reduce the harm from the pandemic required a lot of businesses to move online, work remotely and use live streaming more intensively.

Softvelum is fully committed to providing the best support to our customers, as always. Since the early days of our company, we have worked remotely full-time, building and adjusting our business processes to stay efficient while expanding the team. Now that we have to self-isolate in order to stay healthy, we keep doing what we have done through all these years, maintaining the same high level of support. Feel free to contact our helpdesk and take a look at the list of our social networks at the bottom of this message to stay in touch with us.

With the extreme rise of interest in live streaming, we keep working on new features. Here are the updates from this quarter which we'd like to share with you.



Mobile products

Mobile streaming is on the rise now, so we keep improving it.




SRT

The SRT protocol is being deployed into more products across the industry. Our company was among the first to implement it, and now we see more people building their delivery networks based on this technology. So we've documented this approach:
  • Glass-to-Glass Delivery with SRT: The Softvelum Way - a post for the SRT Alliance blog about building delivery from a mobile device through Nimble Streamer media server into a mobile player.
  • Glass-to-glass SRT delivery setup - a post in our blog describing the setup in full detail.
  • All of our products - Nimble Streamer, Larix Broadcaster and SLDP Player - now use the latest SRT library version 1.4.1.
  • In case you missed it, watch the vMix video tutorial for streaming from Larix Broadcaster to vMix via SRT, which can also be used as the source for such a delivery chain.


Live Transcoder

We are continuously improving Live Transcoder, so this quarter we made a number of updates to make it more robust and efficient. Here are the latest features.

  • You can now create transcoding pipelines based only on NVENC hardware acceleration which works for Ubuntu 18.04+. Read this setup article for more details.
  • FFmpeg custom builds are now supported. This allows using additional libraries that are not supported by Transcoder at the moment. Read this article for setup details.
  • Transcoder control API is now available as part of WMSPanel API. It's a good way to automate some basic control operations.


Nimble Streamer

Read the SVG News article about how Riot Games built their streaming infrastructure with various products, including Nimble Streamer.

A number of updates are available for Nimble Streamer this quarter.

Also, take a look at the State of Streaming Protocols for 2020 Q1.


If you'd like to get our future news and updates, please consider following our social networks. We've launched a Telegram channel recently, and we now make more videos for our YouTube channel. As always, our Twitter, Facebook and LinkedIn feeds keep showing our news.



Stay healthy and safe, our team will help you carry on!

The State of Streaming Protocols - 2020 Q1

Softvelum team keeps tracking the state of streaming protocols. It's based on stats from WMSPanel reporting service which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data about more than 17.8 billion views. Total view time for our server products is 3.04 billion hours this quarter, or 33+ million view hours per day.

The State of Streaming Protocols - Q1 2020

You can compare these numbers with metrics from Q4 2019:

The State of Streaming Protocols - Q4 2019

You can see a slight decrease of HLS share, with views shifting to progressive download and MPEG-DASH.

We'll keep tracking protocols to see the dynamics. Check our updates on Facebook, Twitter, Telegram and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

March 22, 2020

Building NVENC-only pipeline with Nimble Transcoder

Live Transcoder for Nimble Streamer provides a wide feature set for transforming live content using both software libraries and hardware acceleration.

NVidia NVENC has always been fully supported in Live Transcoder for decoding and encoding, but all filtering operations were performed using the CPU. That caused extra resource usage to transfer processed frames between CPU, GPU and RAM.

Nimble Live Transcoder now allows building transcoding pipelines which run entirely on NVidia GPU hardware acceleration. This is done using specific FFmpeg libraries which we use in addition to our own code.

We'll show you how to set up this NVENC-powered processing chain.

1. Installation and initial setup


We assume you've already set up Nimble Streamer, configured it to receive an incoming live stream and tested basic streaming. In our example we'll use a stream whose application name is "input" and stream name is "source".

If you're not familiar with Live Transcoder, take a look at Transcoder documentation reference.

Notice that the described functionality is available on Ubuntu 18.04, 20.04 and 22.04 only. We'll support other upcoming LTS Ubuntu releases as well.

The basic steps to make NVENC working are as follows:

  1. Install the latest NVidia drivers on your server.
  2. Create a transcoder license and subscribe for it.
  3. Install Live Transcoder add-on.
  4. Create some simple scenario with CPU transcoding (e.g. downscale your stream to 240p). This way you'll make sure the transcoder was set up properly.
If you already have Transcoder installed, please run these commands to upgrade the package:
sudo apt-get update
sudo apt-get install nimble-transcoder
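Before creating the scenario, it's worth confirming the driver and the package are actually in place. A quick sanity check (output will vary per system):

```shell
# Check that the NVidia driver is loaded and responding,
# and that the transcoder package is installed.
nvidia-smi
dpkg -s nimble-transcoder | grep -i '^Version'
```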

Now create a new scenario to start a new pipeline setup.

2. Decoder setup


Once you create a new scenario, drag and drop a blue decoder element onto the dashboard. There you need to specify "NVENC-ffmpeg" in Decoder field.


Once the incoming stream is received, Nimble Transcoder will use the proper NVDEC/CUVID FFmpeg decoder: h264_cuvid, hevc_cuvid or mpeg2_cuvid. Each decoder has its own set of options in case you'd like to fine-tune it or use its extended feature set.

The GPU core number from the GPU field will be used throughout the pipeline you create, so all further filters and encoders will recognize the source GPU core and execute their transformations there.

One feature available in all these decoders is the ability to resize the frame during decoding. This operation is highly optimized, and you can use it to reduce further resource usage. It is available via the "resize" parameter as shown in the picture below. Notice that the value is set as <width>x<height>.


This feature is especially helpful when you have a FullHD input stream and need to downscale it further. That resolution requires a lot of resources to handle, so if you make an initial downscale to HD or an even lower resolution, all further operations will consume less RAM and processing power on the GPU.
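For reference, the same decoder-side resize can be reproduced with a standalone FFmpeg command (a sketch only; the input/output addresses and bitrate are assumptions, and Nimble configures all of this for you through the UI):

```shell
# Sketch: NVDEC decoding with hardware-side resize to 720p.
# The -resize option is consumed by the h264_cuvid decoder itself,
# so every later filter and encoder already works on the smaller frame.
ffmpeg -hwaccel cuvid -c:v h264_cuvid -resize 1280x720 \
       -i "srt://127.0.0.1:2020" \
       -c:v h264_nvenc -b:v 3M -c:a copy \
       -f mpegts "udp://127.0.0.1:5000"
```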

Notice that the forwarding features (subtitles and SCTE-35 markers forwarding) mentioned at the bottom of the dialog will work regardless of the decoding option you choose.

If you change the decoder settings of a scenario which is active and running, you need to restart the scenario.

Now let's set up filtering.

3. Filtering


Once a frame is decoded, you can process it via a set of FFmpeg filters which can work with NVENC-decoded frames.

Nimble Transcoder supports a number of those; here are the most frequently used.

"split" - allows creating several identical outputs from the input video. It's available as a filter element in the toolbox of the Transcoder UI.

"scale_npp" performs frame scaling. You add a custom filter to your scenario, set its name to "scale_npp" and its value to resolution, e.g. "854:480" or "640:360".

Notice that scale_npp can have only one output.


"fps" is a filter which sets the frames per second value. It's also defined via custom filter.

Picture filter allows setting a static image overlay for a video. Once you add it into your scenario, choose "CUDA" in Encoding hardware dropdown.



Notice that the regular Scale filter from the UI toolbox, like other regular FFmpeg filters, will not work with GPU-decoded frames, because the processing is done internally on the GPU.

However, you can take the frame out of the GPU and process it separately using the "hwdownload" and "hwupload_cuda" filters. To add them, add a custom filter, set its name as mentioned and leave the value field empty. Your steps will be as follows:

  1. Add "hwdownload" to get the frame from GPU.
  2. Add "format" custom filter with "nv12" value to set proper frame format.
  3. After that you can use regular FFmpeg filters.
  4. Then add "hwupload_cuda" filter to put it back into GPU processing pipe.

Notice that this will increase RAM/CPU usage, so use it only if you need to do something you cannot do on the GPU.
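In plain FFmpeg terms, the download/process/upload steps above correspond to a filter chain like the following (a sketch; drawtext merely stands in for "any CPU-only filter" and requires a libfreetype-enabled FFmpeg build, and file names are placeholders):

```shell
# Sketch: download frames from GPU memory, run a CPU-only filter, upload back.
ffmpeg -hwaccel cuvid -c:v h264_cuvid -i input.ts \
       -vf "hwdownload,format=nv12,drawtext=text='demo':x=10:y=10,hwupload_cuda" \
       -c:v h264_nvenc -b:v 3M output.ts
```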

Let us know if you need information about other filters.

4. Encoder setup


Having the content transformed via filters, you can now encode it. Add an encoder element to your scenario and select "FFmpeg" in the "Encoder" field.

Then set the "Codec" field to either h264_nvenc or hevc_nvenc - for H.264/AVC or H.265/HEVC output respectively.


You can use any parameters applicable for h264_nvenc or hevc_nvenc encoders.

For h264_nvenc most popular parameters would be these:
  • "b" defines the bitrate. Example: "4.5M" for 4.5 Mbps.
  • "profile" defines encoding profile, its possible values are "baseline", "main", "high", "high444p".
  • "preset" stands for encoding preset, its values are "default", "slow", "medium", "fast", "hp", "hq", "bd", "ll", "llhq", "llhp", "lossless", "losslesshp".
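As a rough command-line equivalent, the same parameters map onto the h264_nvenc FFmpeg encoder like this (a sketch; note that on the FFmpeg CLI the "b" parameter is spelled -b:v, and the file names are placeholders):

```shell
# Sketch: NVENC encoding with the parameters listed above.
ffmpeg -hwaccel cuvid -c:v h264_cuvid -i input.ts \
       -c:v h264_nvenc -b:v 4.5M -profile:v high -preset llhq \
       -c:a copy output.ts
```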

If your input stream is anamorphic you might need to preserve its SAR parameter in the output as well, especially if you're using a 'scale' filter in your Transcoder pipeline, since DAR = SAR x Width / Height. For example, a 1440x1080 stream with SAR 4:3 has DAR = (4/3) x 1440/1080 = 16:9. Nimble supports keeping the input SAR by setting the keep-sar parameter to true for the encoder in its 'Video output' section. SAR/DAR/PAR correlation is described in this article.


For more encoder settings, refer to FFmpeg documentation.

Just like in the decoder element, all forwarding features listed under Expert setup at the bottom of the dialog will work properly.

5. Audio setup


When you have the video pipeline set up, you need to define the audio part. If you don't need any sound transformation, you can add a passthrough just like described in other setup examples.

6. Example


We've made a video showing the example of setup process, take a look at it:



Here's what we set up there:

  • A decoder has a downscale to 720p as described in section 2 above.
  • A split filter which has 3 equal outputs.
  • One output goes directly to the encoder. It takes the downscaled frame and simply encodes it into live/stream_720 output. The encoding parameters are similar to what you see in section 4.
  • Another output is processed via the scale_npp filter which scales it to 480p. That filter is described in section 3. Its output is encoded into the live/stream_480 output stream.
  • One more output of split filter goes through "Scale_npp" (to scale to 360p) to "Fps" filter which sets its "fps" value to "25". Then it's encoded into live/stream_360 output.
  • Audio input is passed through for all 3 available output renditions.

This scenario uses only NVENC capabilities for video processing. The output streams are then transmuxed into the output streaming protocols which you select in global server settings or in specific settings for the "live" application.
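For readers who think in FFmpeg terms, the whole scenario roughly corresponds to a single command like the following (a sketch under several assumptions: an FFmpeg build with CUDA and libnpp support, and made-up addresses and bitrates; in Nimble all of this is configured in the scenario UI instead):

```shell
# Sketch: decode with hardware resize to 720p, split into three branches,
# scale two of them on the GPU with scale_npp, cap one branch at 25 fps,
# then encode every branch with h264_nvenc.
ffmpeg -hwaccel cuvid -c:v h264_cuvid -resize 1280x720 -i "srt://127.0.0.1:2020" \
  -filter_complex "[0:v]split=3[v720][v480in][v360in]; \
                   [v480in]scale_npp=854:480[v480]; \
                   [v360in]scale_npp=640:360,fps=25[v360]" \
  -map "[v720]" -c:v h264_nvenc -b:v 3M   -f mpegts "udp://127.0.0.1:10720" \
  -map "[v480]" -c:v h264_nvenc -b:v 1.5M -f mpegts "udp://127.0.0.1:10480" \
  -map "[v360]" -c:v h264_nvenc -b:v 800k -f mpegts "udp://127.0.0.1:10360"
```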

If you have any questions or issues, please feel free to contact us.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder documentation reference, Zabbix monitoring of Nimble Streamer with NVidia GPU status.

March 16, 2020

Larix Player setup for Android

Larix Player is an application which allows playing SLDP, SRT, Icecast, RTMP, HLS and MPEG-DASH streams on Android devices.

In this article we'll go over all the settings of Larix Player.

You can install it from Google Play, and once it's installed, you'll see the connections menu, which will be empty.


Tap the plus button to enter a dialog to create a new connection.


The Name field sets the name of the current connection in the connections list.

URL is the field where you define your connection. Each connection has a common URI structure like protocol://server:port/app/stream/some-suffix, where the port can be omitted to use the protocol's default and the suffix may not be needed:

  • HLS, MPEG-DASH and Icecast streams will have familiar URLs like http://servername:8081/app/stream/playlist.m3u8, https://servername/app/stream/manifest.mpd or https://servername/app/stream/icecast.stream
  • SLDP will have name like sldp://servername:8081/app/stream
  • RTMP will have URL like rtmp://servername:1935/app/stream - notice that stream name is appended to the end of the URL.
  • SRT address will look like srt://servername:1234/ . If you use streamid - see its description below.

Source type for HTTP is used when you play an HTTP-based protocol but the protocol cannot be determined from URL parts like the playlist or manifest name. E.g. a URL like https://servername/live/stream.index can mean both HLS and MPEG-DASH, so in that case you should specify it explicitly. In other cases just leave it as Auto.

SLDP offset parameter allows decreasing start time for streams, read this article for more details.

Buffering defines the size of buffer used before playback starts. It's used to avoid re-buffering during connection issues.

Synchronized playback is described in this article.

Bitrate for Internet radio is for cases when you use SLDP for transmitting online radio and it has adaptive bitrate (ABR). This parameter defines the default bitrate which is used for starting the playback.

SRT passphrase and SRT pbkeylen are specific to your use case security settings so refer to your server admin for more details.

SRT latency and SRT maxbw are related to data re-transmission in SRT connection. Read this article to understand that better.

SRT streamid field is used only if your data source uses that field for identifying streams.


Once you tap Save, you'll see a new entry in streams list. In the example below we've saved SRT playback example from glass-to-glass SRT delivery article.


Now you can just tap on the name and start watching the stream.

You can also take screenshots. To enable them, long-tap your connection and then select "Play (enable FX)". If asked for more permissions, just enable them too. When playback starts, you can double-tap the screen to take a screenshot.


You can also see that in action in this video.



Take a look at Softvelum Playback solutions and let us know if you have any questions.

March 9, 2020

Glass-to-glass SRT delivery setup

SRT delivery of live streams is gaining momentum as more companies add support for this protocol into their products. Being an SRT Alliance member, Softvelum provides extensive SRT support across its products.

Currently it's possible to create a glass-to-glass delivery with SRT using Softvelum products.


This article describes detailed setup of the following streaming scenario:
  • Content creator is streaming from Larix Broadcaster to Nimble Streamer server instance.
  • Nimble Streamer takes incoming SRT and provides output for playback.
  • Viewer uses Larix Player for pulling live stream from Nimble Streamer.
To make this work, we'll set up each element of this chain. We'll show the setup within a local network, but you can make it work across any network.

1. Set up Nimble Streamer


Before moving forward you need to complete the following steps:


We'll use an instance available via the 192.168.0.104 IP address in a local network; all mobile devices will be connected to the same local network.

Here's what we'll set up on Nimble Streamer side:

  • Larix Broadcaster app will use SRT in Push (Caller) mode to deliver the content, so we'll set up Nimble in "Listen" mode to receive it.
  • Larix Player will work in "Pull" mode to retrieve live stream for playback, so we'll set up Nimble output in "Listen" mode to take those requests and respond with content.

So let's set up both these elements.

1.1 Receiving input SRT via Listen


In WMSPanel, go to Nimble Streamer -> Live streams settings top menu, then choose "MPEGTS In" tab.


Now click on Add SRT stream to see a new dialog.


Here you need to choose Listen from the Receive mode drop-down box, enter 0.0.0.0 in Local IP and use a port that is available on your server, like 2020 in this case. The Alias is used for further reference in the UI.

Check the Add outgoing stream checkbox and define the Application name and Stream name; this will create the proper output which we'll use in the next step.

Once you click Save, you'll see this setting being synced to the Nimble Streamer instance.


Now if you click on MPEGTS Out tab, you'll see that proper output has also been described for further use.


This is required because SRT uses MPEG-TS as its media transport, which requires this distinction due to the nature of that protocol. You can read more about MPEGTS setup in this article.

There are more options for controlling input SRT streams via the SRT Publisher Assistance Security Set - SRT PASSet - including user/password authorization and per-stream parameters. Read the PASSet overview article for more details.

From this moment you'll be able to publish a stream into Nimble Streamer, so we have one more setup step left.
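If you'd like to sanity-check the listener before involving a phone, you can push a test stream into it from any machine (a sketch; it assumes an FFmpeg build with libsrt, and test.mp4 is a placeholder file):

```shell
# Sketch: loop a local file into the SRT listener configured in step 1.1.
ffmpeg -re -stream_loop -1 -i test.mp4 -c copy \
       -f mpegts "srt://192.168.0.104:2020?mode=caller"
```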

1.2 Providing the SRT output via Listen


Go to UDP streaming tab.


Click on Add SRT setting button to see the following dialog.


Set the Mode field to Listen. Local port is selected from the available ports. Local IP is set to "0.0.0.0"; this will allow getting requests on all interfaces.

If you want to use Nimble Streamer with connections from outside of your network, you need to make sure that your firewall and network in general are set up properly to make those connections.

Source application name and Source stream name are defined as the app and stream names from the MPEGTS Out section above - "srt" and "output". This will redirect the source content into SRT output.

In addition, you may define the maxbw and latency parameters in case you use an uncontrolled network for delivery. Read this article for more details.
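On the pulling side, caller tools such as FFmpeg or srt-live-transmit usually take these parameters as URL query values. A minimal sketch of assembling such a URL (the numbers are illustrative, not recommendations):

```shell
# Sketch: build a caller-side SRT URL with latency and maxbw parameters.
SERVER="192.168.0.104"
PORT="2021"
LATENCY_MS="2000"     # retransmission buffer, in milliseconds
MAXBW="12500000"      # bandwidth cap, in bytes per second
PLAY_URL="srt://${SERVER}:${PORT}?latency=${LATENCY_MS}&maxbw=${MAXBW}"
echo "${PLAY_URL}"
```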

Now that we have an instance of Nimble Streamer ready to work, we can set up a streaming app.

2. Set up Larix Broadcaster SRT streaming


Larix Broadcaster is a free app; you can install it from Google Play and from the App Store. You can find out about all Larix features on the Android page and the iOS page, which have full lists of capabilities.

Let's use Larix Broadcaster for Android to set up live streaming to Nimble Streamer instance. Once you install and launch it, you'll see preview screen.


Tap the gear icon to enter the settings dialog.


You may keep the default settings, or change some parameters like the resolution in the Video menu. You can discover them by browsing the menus.

Now let's set up SRT output stream. Go to Connections menu.


We've previously set up some connections for testing purposes - you can see RTMP and RTSP both checked. This allows streaming simultaneously to two destinations. We need to add a new one, so tap the New connection menu.


Here you need to add a name for your connection and then enter the publishing URL. The URL consists of the srt:// prefix, the address of the server and the port number. In our case it will be srt://192.168.0.104:2020/ - the IP of the server and the port which we used during the SRT setup in step 1.1. You may leave other options as they are.

After saving a setting you will see it in the list. You can un-check other connections if you want to stream only via the new one.


Now return to preview screen. You can push the big red circle button to start streaming.


The button will change its shape, and you will see FPS and streaming duration at the top and per-stream stats at the bottom.

Now let's watch this stream on another device.

3. Set up Larix Player SRT playback


Larix Player is a solution which allows playing multiple live protocols and provides wide playback capabilities via Android and iOS apps. You can install it from Google Play and AppStore.

We'll use the Android app to demonstrate SRT playback. Once you install it, you'll see the connections menu, which will be empty.


Tap the plus button to enter a dialog to create a new connection.


Here you will enter a connection Name and a URL. The URL in our case will be srt://192.168.0.104:2021/ where the IP and port are taken from step 1.2 above.

Also read more about additional settings of SRT streaming and general overview of Larix Player.

Once you tap Save, you'll see a new entry in streams list.


Now you can just tap on the name and start watching the stream.




That's it. You can change any of the described components for your streaming scenario as well as combine them with other products and features of our company.


You can take a look at another similar example: Set up OBS Studio SRT streaming with Larix Broadcaster and Nimble Streamer.

Let us know if you have any questions about the described products and setup.

Related documentation


SRT support overview for Nimble Streamer, SRT setup in Nimble Streamer, SRT playback stats and protection, Larix Broadcaster, Larix Broadcaster docs reference, Larix Player

March 4, 2020

Using Certbot with Nimble Streamer working on port 80

When you start using Certbot with Nimble Streamer, you may face the case where Nimble Streamer is running on port 80. We'll show how this can be handled.

If you follow this instruction for Certbot, on step "4. Choose how you'd like to run Certbot" you need to choose the "No, I need to keep my web server running." option.

This setup has Nimble Streamer serve Certbot's '.well-known' folder.

To make it available do the following.

1. Create a folder /pub/.well-known and assign nimble as its owner using the following commands:
sudo mkdir /pub/.well-known
sudo chown nimble:nimble /pub/.well-known
sudo chmod 775 /pub/.well-known
You can use any folder location, but please change it accordingly in other steps.

2. Go to the Nimble Streamer -> HTTP origin applications top menu. Click Add origin application and set .well-known as shown on the screenshot:



3. Choose the Nimble Streamer -> Edit Nimble Routes menu, click Add VOD streaming route and set it up as shown on the screenshot.


4. Execute the following command to get certificates, with your_domain_name replaced with your domain name:
certbot certonly --webroot -w /pub -d your_domain_name
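Before running Certbot you can verify that Nimble actually serves the folder over HTTP (a sketch; probe.txt is a throwaway file and your_domain_name is still a placeholder for your domain):

```shell
# Sketch: place a probe file into the webroot and fetch it back through Nimble.
echo ok | sudo tee /pub/.well-known/probe.txt
curl -s "http://your_domain_name/.well-known/probe.txt"   # expect "ok" back if the route works
sudo rm /pub/.well-known/probe.txt
```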

That's it. Now you may proceed with Certbot setup instructions from our original article.

Related documentation


Using Certbot with Nimble Streamer, SSL support for HLS, MPEG-DASH, Icecast, MPEG-TS and SLDP, Paywall feature set