September 27, 2024
AMD Alveo U30 support
July 31, 2024
Larix Tuner is alive!
Softvelum is excited to introduce Larix Tuner, a new web service designed to simplify the management of multiple Larix Broadcaster instances.
Learn more about Larix Tuner, its features, workflow, and pricing.
June 25, 2024
Convert DVB subtitles into WebVTT for HLS
June 24, 2024
The blog is moving
Starting this month, all our new articles are published in our new blog on the Softvelum.com website.
Old articles will remain in this blog, and we'll occasionally update them as we make changes to our products.
We encourage you to follow us on our social networks: Facebook, Twitter and LinkedIn, join our Telegram channel and subscribe to our YouTube channel. All our updates from the website, the blog and other sources are announced there.
June 5, 2024
DVR in SLDP HTML5 Player
Read this recent article to learn more about the setup and usage:
Adding DVR into SLDP HTML5 Player
ABR for WHEP WebRTC
As always, it's highly optimized for delivery to multiple simultaneous viewers to make it cost-efficient.
WebRTC adaptive bitrate WHEP in Nimble Streamer
May 20, 2024
New case study: Nimble Streamer powers LiveX and VVCR cloud production
Explore our latest case study to see how Nimble Streamer and LiveX are shaping the future of live video production.
LiveX, a leading full-service production company, is revolutionizing live production using Nimble Streamer. LiveX creates and broadcasts live events for global brand leaders, leveraging the power of cloud-based tools like Virtual Video Control Room (VVCR).
Built around Nimble Streamer, VVCR enables fully remote production processes with unparalleled flexibility and scalability. From the NYC Times Square experience to coordinating over 1000 gamers in Call of Duty II events, Nimble Streamer is at the core of LiveX's remarkable projects.
Learn more about the benefits of Nimble Streamer and LiveX collaboration in cloud production.
May 16, 2024
WHEP Load Tester tool to test WebRTC performance
While adding WHEP ABR playback support to Nimble Streamer, we had to create additional tools in order to test our own solution.
March 26, 2024
libaom-av1 and SVT-AV1 transcoding setup with Nimble Streamer
With Nimble Streamer now supporting AV1, streaming services can easily adopt this advanced format to deliver high-quality content. Adding another significant piece to AV1 support, here we describe the transcoding options available in Nimble Streamer Live Transcoder: the SVT-AV1 and libaom-av1 encoders are now supported via the respective FFmpeg encoder option.
Prerequisites
1. Decoding AV1 received streams
AV1 streams are supported as video input, and no extra setup steps are required. Just drag a Video Source block to a timeline and specify the name of the stream to decode. After that, any of the Transcoder's filters can be used on the stream.
2. Encoding with libaom-av1
Specifying libaom-av1 for encoding is not much different from specifying any other encoder. Add a Video output block to the timeline, name the output application and stream, and then select FFmpeg as the encoder. Next, type libaom-av1 into the Codec field. Once you click OK and then save the scenario, the settings will be applied and you will get the output stream in AV1 format.
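For a quick reference outside of the Transcoder UI, the same encoder can be exercised with a standalone ffmpeg binary. This is only a hedged sketch with placeholder file names, not the Transcoder's own command line:

    # verify that libaom-av1 is available and produces AV1 output (cpu-used trades encoding speed for quality)
    ffmpeg -i input.mp4 -c:v libaom-av1 -cpu-used 6 -c:a copy test_av1.mkv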
2.1 libaom-av1 constant quality
2.2 libaom-av1 constrained quality
2.3 libaom-av1 keyframe control
3. Encoding with SVT-AV1
3.1 SVT-AV1 CRF
3.2 SVT-AV1 Presets
3.3 Additional options with SvtAv1Params
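The subsections above cover rate control, keyframes and speed/quality presets in the full article. Purely as a hedged reference, the corresponding options of a standalone ffmpeg binary look roughly like this (all values and file names are illustrative placeholders):

    # libaom-av1: constant quality (CRF only), constrained quality (CRF plus a bitrate cap), keyframe interval
    ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 0 out_cq.mp4
    ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 2M out_constrained.mp4
    ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 0 -g 60 out_keyframes.mp4

    # SVT-AV1: CRF, preset (higher is faster), extra encoder options via svtav1-params
    ffmpeg -i input.mp4 -c:v libsvtav1 -crf 35 -preset 8 -svtav1-params "tune=0:film-grain=8" out_svt.mp4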
Live video overlays and videowall with Nimble Live Transcoder
Prerequisites
Before using the Transcoder for Nimble Streamer, you must have a basic subscription to WMSPanel, as the Transcoder does not work for trial servers. The Transcoder license costs 50 USD per server per month; there is no trial version for it.
Please find more details on installing and registering an instance of Transcoder on the following installation page.
Basics of operation
First, you must define a base layer that determines the output resolution of the overlaid video. This can be a live video or an image.
Then all other input sources are stacked on top of this base layer via the Overlay filter. You may also need the Scale filter to resize the other sources so they fit and can be placed on the base layer.
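Under the hood these blocks correspond to FFmpeg-style filters. Purely as an illustration (the labels and coordinates below are made up and are not the Transcoder's internal names), a base layer with one scaled overlay maps to a filter graph like this:

    # resize the secondary source to 640x360, then place it 10 pixels from the top-left corner of the base layer
    [second]scale=640:360[small];
    [base][small]overlay=x=10:y=10[out]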
The next concept is that you must define one of the sources as the Main stream. Once the scenario is started, the Transcoder will start equalizing all other streams’ timestamps to the Main stream to provide a stable source for Overlays.
Please also check the extended video tutorial about creating a mosaic video wall using overlays at the end of this article.
Let's go through the details of this scenario setup below.
Sources for overlays: live streams
Put a Video source decoder block on a timeline, and the Add video source dialogue will appear.
This will allow choosing the source type: Stream, File, NDI. First, we describe the regular live stream setup.
Type in an application name and choose the stream to auto-fill the stream name. If the streams are registered in the panel, they will appear for selection; choose the required one.
To align any other stream's timestamps with the Main stream's timestamps, check the PTS adjustment enabled checkbox in that source's settings.
Once the Main stream and PTS adjustment are defined, you can safely chain several Overlays (or any other filters) in a Transcoder scenario. This versatility helps create distinctive visual effects for your video output.
As for audio, the Main stream is the only one that will have perfectly synced audio. We do not advise taking audio from other streams, as we cannot guarantee audio sync in that case.
If the source live stream ends or is stopped, the decoder will hold the previously available frame in its output, keeping the next Overlay filter in the chain operational.
If one of the live sources participating in the Overlay chain is missing when the Transcoder scenario is starting, the scenario will fail to start. We advise using hot-swap failover to ensure the source always has viable frames to process.
Sources for overlays: video files, static or animated image files
Let's move on to the next type of decoder: the File source decoder. Although we already have an article about decoding from files, here we will highlight the crucial points.
As the name implies, it takes files located on the same server where Nimble Streamer is running as input. Once you specify the path to a file and the scenario is running, the File decoder generates looped output with stable timestamps, which can be assigned as Main. This is true for both video and image files. As you might have spotted, there is no PTS adjustment checkbox: if such a source is not specified as the Main stream, it is automatically treated as PTS-adjusted.
The supported video file formats are MP4, MOV and MKV. The codecs are listed in our codecs reference. Videos with an alpha channel are not supported at the moment; feel free to ask our support team if you plan to use them.
Supported image formats include PNG, APNG, JPEG, GIF, TIFF and BMP. Alpha channel and GIF transparency are supported, as are GIF and APNG animations.
In the case of a single-frame image, the file will be reloaded as soon as it is modified or overwritten. Note that reloading will not work for video files or files with a sequence of frames, such as APNG animations.
All the above makes the File Decoder a handy source for animated or updated Overlays like moving logos or bar ads.
Note that path conventions differ between Windows and Linux.
For more information on the File decoder, please read the Binding un-synced video and audio sources article.
Single Overlay filter setup
Now let's move on to a practical setup that will serve as a basis for cases like a video wall or mosaic.
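As a hedged example in plain FFmpeg terms (the stream labels, logo size and margins are made up for illustration), a single logo overlay pinned to the bottom-right corner of the base layer could look like this:

    # scale the logo to 200px wide (keeping aspect ratio) and place it 20px from the bottom-right corner
    [logo]scale=200:-1[lg];
    [base][lg]overlay=x=W-w-20:y=H-h-20[out]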
Multiple Overlay filter setup (videowall)
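As a sketch only (the quadrant positions and the 1280x720 base resolution are assumptions for illustration), a 2x2 mosaic built from chained Scale and Overlay filters maps to a graph like this:

    # scale four sources to 640x360 each and place them into the quadrants of a 1280x720 base layer
    [s0]scale=640:360[q0]; [s1]scale=640:360[q1]; [s2]scale=640:360[q2]; [s3]scale=640:360[q3];
    [base][q0]overlay=0:0[t1]; [t1][q1]overlay=640:0[t2];
    [t2][q2]overlay=0:360[t3]; [t3][q3]overlay=640:360[out]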
Hardware accelerated Overlay and other filters
Is hardware acceleration available for this feature set? Absolutely, but only for Ubuntu Linux users. Both NVENC and QuickSync have hardware-accelerated versions of the overlay filters. For performance reasons, it's better to use the NVENC-ffmpeg or QuickSync decoders with these filters: this avoids copying frames to and from system RAM through additional hwupload and hwdownload filters and wasting CPU cycles.
However, these decoders come with certain limitations. While they allow you to decode H.264 or HEVC (if supported by the GPU) MP4/MKV video files, they are not suitable for handling static images or animated files such as GIFs.
Remember to consider the hardware-accelerated scale_npp (scale_cuda) and vpp_qsv (scale_qsv) filters for resizing a video source. The NVENC-ffmpeg decoder also allows scaling in its settings via the resize option. Explore additional information about NVENC-only scenarios in the following article.
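For illustration only, the GPU path in plain FFmpeg terms keeps both inputs as GPU frames and uses the hardware filter variants. The filter names below are FFmpeg's (overlay_cuda is our assumption for the CUDA overlay variant), and how they map to the Transcoder UI may differ:

    # both inputs must already be decoded into GPU memory (e.g. by the NVENC/NVDEC decoder)
    [ovl]scale_npp=640:360[small];
    [base][small]overlay_cuda=x=10:y=10[out]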
If you require more details on creating an NVENC-only scenario, please let us know. We have plans to publish an article specifically addressing this topic.
Troubleshooting
Here we'll describe some tips and common errors you may face while using the Transcoder.
Although we did an extensive job of improving the stability of the Overlay filter, we still recommend using the out-of-process mode to provide better stability for the Nimble Streamer server. This mode ensures that a single scenario with an issue won't crash Nimble Streamer. Read about it in this article.
If you're changing a source type, let's say from File to Live, consider restarting the scenario. It can be done while editing the scenario using the corresponding icons near its name.
Some common messages in the log:
- [video_decoder] E: failed to open stream /some/path/file.ext - You'll get this error in the log if the specified file is not found or Nimble has issues accessing it (e.g. due to permission restrictions).
- [video_filter] E: failed to parse filter graph - This means that something is wrong with the filter chain specified in the parameters, or that there is a filter name incompatibility between the decoder and the filter. Check the surrounding log messages for details.
- E: encoder too slow - Whether reported for an encoder or a decoder, in most cases this means you are running out of processing power. If a multicore CPU is used and you observe that some cores are not loaded while the error is present, this may be improved by allowing more cores for decoding or encoding.
- [video_filter] E: reset video filter on pts gap(prev=XXX, cur=YYY, timescale=1/1000), filter=XXXXX, stream=larix/2 - This is not an error but an indication that the filter was re-created due to timestamp jumps.
- [video_decoder] E: decode interval gap detected, decode_interval=XXXX for [_some_path_] - This is most likely a decoding performance issue or inconsistent system time.
Mosaic video wall tutorial
We've made an extended video tutorial about creating a mosaic video wall using overlays in Live Transcoder.