
February 27, 2017

Stress-testing NVidia GPU with IBM

Recently we finished extensive testing of the latest NVidia Tesla M60 graphics card in the IBM Bluemix Cloud Platform to see how much it increases the performance of Live Transcoder for Nimble Streamer.

We got excellent results. Please read this article for more details:

Stress-testing NVidia GPU for live transcoding

February 16, 2017

FDK AAC encoder and decoder in Nimble Transcoder

Live Transcoder for Nimble Streamer has full support for AAC decoding and encoding, along with various audio filters such as re-sampling, transrating and audio channel manipulation.

Now we've added FDK AAC support for both decoding and encoding. It allows adding HE-AAC and HE-AACv2 to your transcoding scenarios. It's also an alternative to the ffmpeg decoder for audio streams, with decent quality.

Let's see how you can set up FDK usage in your scenarios.

First, create a new scenario or modify an existing one. If you only need to perform audio transformation, you can add a passthrough for the video stream.
Minimum scenario for audio transformation.
As mentioned, you can use FDK for both decoding and encoding. Here is how the decoder will look in this case:

Using FDK as decoder.
So you just select libfdk_aac in the Decoder drop-down list instead of Default.

If you'd like to encode using libfdk, open the encoder dialog and choose libfdk_aac from the Encoder drop-down list.

Using FDK as encoder

This also allows you to select HE-AAC and HE-AACv2 profiles. Type "profile" in the property edit box to get a drop-down list of profiles:
  • aac_low
  • aac_he
  • aac_he_v2
  • aac_ld
  • aac_eld

Choose aac_he or aac_he_v2 for HE-AAC or HE-AACv2 respectively.
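For comparison outside Nimble's UI, the same kind of libfdk_aac encode with video passthrough can be expressed as an ffmpeg command line. The sketch below builds such a command in Python; it assumes an ffmpeg build compiled with --enable-libfdk-aac, and the file names are just examples:

```python
import shlex

def fdk_encode_command(src, dst, profile="aac_he_v2", bitrate="48k"):
    """Build an ffmpeg command similar to the scenario above:
    video passthrough plus libfdk_aac audio encoding."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "copy",            # video passthrough
        "-c:a", "libfdk_aac",      # FDK AAC encoder
        "-profile:a", profile,     # aac_low, aac_he, aac_he_v2, aac_ld, aac_eld
        "-b:a", bitrate,
        dst,
    ]

print(shlex.join(fdk_encode_command("in.mp4", "out.mp4")))
```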


Feel free to visit the Live Transcoder webpage for descriptions of other transcoding features, and contact us if you have any questions.

Related documentation



February 14, 2017

Forward CEA-708 subtitles with Nimble Streamer

Providing subtitles as part of live streaming is important and is required by law in some countries. So people have asked us to add that capability to Nimble Streamer in addition to its VOD subtitles support.

There are cases when the source stream coming into Nimble Streamer already contains subtitles metainformation. So Nimble now allows forwarding CEA-708 subtitles. This means that all outgoing streams for all supported protocols will include subtitles.

This works for both transmuxing and transcoding of H.264 (AVC) content.

Transmuxing supports this forwarding by default. Whatever metainformation is inserted into the original stream is passed through to all other protocols.

To make this work in Live Transcoder scenarios, you need to enable the feature for outgoing streams. Live Transcoder is a premium add-on for our media server with an easy-to-use web UI to control transcoding behavior; to install it and get a license, visit this page.
To enable forwarding for a particular encoded stream, edit the encoder block of the stream you want subtitles to be forwarded for.

Transcoder scenario

Click on encoder details icon to open encoder details dialog.



Check the Forward CEA-708 subtitles box and save settings to close the dialog. Then click Save on the scenario page to apply it on the server.

That's it - forwarding will start working right after the scenario is saved on the server.


Also take a look at DVB subtitles processing and SCTE-35 processing which can also be passed through Live Transcoder.

Please also check Subtitles digest page to see what else Nimble can do for you.


Feel free to visit the Live Transcoder webpage for descriptions of other transcoding features, and contact us if you have any questions.

Related documentation


Handling live streams timing errors in Nimble Streamer DVR

Sometimes when an MPEG-TS stream is received from a media source, it may have glitches in video or audio. This is caused by third-party encoders assigning incorrect timestamps to media fragments - they may jump back and forth in some unpredictable range. This happens even when the source stream is transmuxed into other protocols, e.g. RTMP.

This may bother viewers and also cause media servers to malfunction when recording the stream. Nimble Streamer allows compensating for those timing issues and performs correct recording of video and audio in DVR. If compensation can't help, Nimble just removes the chunk and resets the recording period.

Go to the Nimble Streamer top menu, select the Live Streams Settings menu and open the DVR tab to see its settings.



Choose the designated stream's properties, find the Error correction section and check the Drop invalid segments checkbox. This performs the required correction on the recorded media, so playback will be smooth from the player's point of view.

Keep protocol timestamps. If the original stream has timestamp issues, Nimble Streamer tries to compensate by re-calculating correct values for DVR. This option disables that compensation: the original timestamps are saved into the database and the recording period is reset.

Check segment sizes on load. This is a debugging option which validates segment sizes in addition to getting them from the database. It's added for debugging purposes only and increases load time for the DVR archive, so you should not enable it by default.

Align segment time (PROGRAM-DATE-TIME) enables PROGRAM-DATE-TIME alignment for HLS segments based on stream timestamps, to avoid drift between PDT and segment durations.
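The timestamp compensation described above can be sketched in general terms. This is an illustrative model of monotonic-timestamp repair, not Nimble's actual implementation; the 90 kHz units, frame duration and jump threshold are assumptions:

```python
def repair_timestamps(pts_list, frame_duration=3600, max_jump=90000):
    """If a 90 kHz PTS goes backwards or jumps further than max_jump,
    substitute prev + frame_duration so the recorded track stays
    monotonic - roughly what a DVR must do when an upstream encoder
    emits bad timestamps. frame_duration=3600 corresponds to 25 fps."""
    repaired = []
    prev = None
    for pts in pts_list:
        if prev is not None:
            delta = pts - prev
            if delta <= 0 or delta > max_jump:
                # Timestamp went back or jumped too far: re-calculate it.
                pts = prev + frame_duration
        repaired.append(pts)
        prev = pts
    return repaired
```

A real DVR would additionally reset the recording period when repair is not possible, as the article notes.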

Troubleshooting other issues


Please read Troubleshooting section in DVR setup article to see what else you can do to fix DVR-related issues.

Watch our DVR video tutorial: DVR recording and playback in Nimble Streamer

Also notice that HLS DVR streams can be added to SLDP HTML5 Player for rewinding low latency streams. Read this article for details.

If you have any further questions, contact our team.

Related documentation


February 9, 2017

Viewing ASN statistics for streaming connections

A number of our large customers build and maintain their own media content delivery networks. A common layout includes origin servers which process the content from its sources, and edge servers which handle connections from end-users who watch and listen to the media.

It's important to locate edges as close to your viewers as possible to reduce latency and improve overall user experience, so you need a way to determine the optimal physical location for each edge. This is why it's important to know which ASNs your viewers come from: it allows placing your edges in a proper hosting location with proper network peers.

WMSPanel can show ASN statistics for your viewers, i.e. how many connections were made from the most active ASNs. It's part of our media servers reporting framework.
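The idea behind such a report can be sketched as follows. The prefix-to-ASN table below is a made-up stand-in for a real ASN database (such as MaxMind's GeoLite2 ASN data), and `top_asns` is a hypothetical helper, not a WMSPanel API:

```python
from collections import Counter

# Hypothetical mapping from IP prefixes to ASNs; a real report would
# resolve each viewer IP against an ASN database.
ASN_BY_PREFIX = {
    "203.0.113.": "AS64500 (ExampleNet)",
    "198.51.100.": "AS64501 (DemoISP)",
}

def asn_for_ip(ip):
    """Resolve a viewer IP to its ASN using the toy prefix table."""
    for prefix, asn in ASN_BY_PREFIX.items():
        if ip.startswith(prefix):
            return asn
    return "unknown"

def top_asns(connection_ips, n=5):
    """Count connections per ASN and return the most active ones."""
    return Counter(asn_for_ip(ip) for ip in connection_ips).most_common(n)
```

A list like this, aggregated per edge, shows which networks most of your viewers sit in and therefore where new edges would pay off.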