July 27, 2017

VP8 and VP9 support in SLDP HTML5 player

Our customers keep adding our new real-time low latency protocol SLDP into their streaming workflows, as it allows sub-second delay for live streaming delivery. Basic usage of the SLDP protocol and typical usage scenarios are described in this article.

People ask us about adding new capabilities to SLDP technology. One of them is support for the VP8 and VP9 codecs, open and royalty-free video coding formats developed by Google. Nimble Streamer Live Transcoder now supports transcoding these codecs in addition to the already supported VP8/VP9 transmuxing feature set.

SLDP is a codec-agnostic protocol, so in order to have full support for VP8/VP9 from source to viewer, we only needed to add it to our HTML5 player.

So today we'd like to announce that VP8 and VP9 support has been added to the SLDP HTML5 web player. This type of playback fully depends on platform support, so it can currently be played in Chrome, Firefox and Opera, and possibly in Microsoft browsers over time.

You can use SLDP for streaming in various bitrate modes. First, you can stream single-bitrate VP8/VP9 if you know your target audience can play it.

Another option is to create an ABR group that includes streams encoded with both H.264 and VP8/VP9. In this case our SLDP HTML5 player will choose the H.264 stream if the target browser or platform doesn't support VP8/VP9 playback. That gives you flexible delivery of your content to multiple platforms.
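The fallback behavior described above can be sketched in a few lines. This is a hypothetical illustration (pick_rendition and the rendition list are made-up names, not the actual SLDP player API): prefer VP9/VP8 when the platform supports them, otherwise drop back to H.264.

```python
def pick_rendition(renditions, supported_codecs):
    """Pick a rendition dict from an ABR group given the codecs a platform can play."""
    preferred_order = ["vp9", "vp8", "h264"]
    for codec in preferred_order:
        if codec not in supported_codecs:
            continue
        candidates = [r for r in renditions if r["codec"] == codec]
        if candidates:
            # Start from the highest bitrate; a real ABR player adapts over time.
            return max(candidates, key=lambda r: r["bitrate"])
    return None

# ABR group mixing VP9 and H.264, as suggested above.
abr_group = [
    {"codec": "vp9", "bitrate": 2500},
    {"codec": "h264", "bitrate": 2500},
    {"codec": "h264", "bitrate": 1000},
]

# A browser without VP9 support transparently falls back to H.264.
print(pick_rendition(abr_group, {"h264"}))
```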

We'll be adding other codecs support per our customers' requests so please contact us to get help or suggest new features.

July 24, 2017

Publishing stream to Periscope

RTMP republishing is widely used by Nimble Streamer users to deliver live streams to various types of destinations, including other media servers, CDNs, YouTube or Twitch.

The Periscope live streaming platform also takes RTMP streams as input, so our customers use it as a target too. As Periscope requires some additional stream setup, we've created a dialog in our UI to cover that use case.

As a source for your stream you can use any delivery method supported by Nimble Streamer.


Once the input stream is taken by Nimble, you can perform a setup for publishing it to Periscope.

July 18, 2017

Transcoding VP8 and VP9 in Nimble Streamer

VP8 and VP9 are open and royalty-free video coding formats developed by Google. Nimble Streamer Live Transcoder now supports transcoding these codecs in addition to the already supported VP8/VP9 transmuxing feature set.

Live Transcoder allows performing both decoding and encoding.

To receive VP8 and VP9 for transcoding, Nimble Streamer allows processing RTSP from published and pulled sources. The resulting stream can be delivered using the RTSP and SLDP protocols.

Decoding


The following methods are currently supported for decoding VP8/VP9 content for further transformation:
  • Software decoder
  • Intel® Quick Sync technology for hardware decoding. VP8 is supported on Windows and Linux; VP9 is supported only on Windows.
  • NVidia® NVENC hardware decoding for Windows and Linux.
You can specify the decoding method in the decoder block of any transcoding scenario, just like you do for other codecs.

Encoding


Currently the encoding is performed only via a software encoder. To use it for VP8 and VP9, open the encoder block in your transcoding scenario and select "libvpx" from the drop-down menu.

Setting encoder for VP9 and VP8.

You will then be able to select the Codec and specify the other parameters listed below.


libvpx VP8/VP9 encoder parameters


quality

Quality Deadline

  • best - use the Best Quality Deadline;
  • good - use the Good Quality Deadline;
  • rt (default) - use the Real Time Quality Deadline;

threads

Number of threads that will be allocated to the encoding process.

profile

Sets the encoder profile. Currently only value 0 is supported; values 1-3 will be supported in future versions of the Transcoder.

lag_in_frames

Defines an upper limit on the number of frames into the future that the encoder can look. Values range: 0 to 25.

bitrate/b

Bitrate in kbps.

rc_mode

Rate control mode.

  • vbr - variable bitrate mode
  • cbr - constant bitrate mode
  • cq - constrained quality mode
  • q - constant quality mode
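Since some of these parameters only take effect in particular rate-control modes, a small sanity check can save debugging time. This is an illustrative sketch (check_rc_params is a made-up helper, not part of Nimble Streamer):

```python
def check_rc_params(rc_mode, params):
    """Return warnings for parameter combinations that don't match the rc_mode."""
    warnings = []
    # cq_level is a CQ-mode setting and is ignored in plain vbr/cbr modes.
    if "cq_level" in params and rc_mode not in ("cq", "q"):
        warnings.append("cq_level is only used in cq/q modes")
    # CBR without a target bitrate makes no sense.
    if rc_mode == "cbr" and "bitrate" not in params:
        warnings.append("cbr mode needs a target bitrate")
    return warnings

print(check_rc_params("vbr", {"bitrate": 2000, "cq_level": 30}))
```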


cq_level

Constrained Quality Level. In CQ mode the encoder will try to encode normal frames (all frames apart from key frames, golden frames and alternative reference frames) at a quantizer/quality level of cq_level. Values range: 0 to 63.

min_q

Minimum (Best Quality) Quantizer.

max_q

Maximum (Worst Quality) Quantizer.

buf_sz

Decoder Buffer Size indicates the amount of data that may be buffered by the decoding application. Note that this value is expressed in units of time (milliseconds). For example, a value of 5000 indicates that the client will buffer (at least) 5000ms worth of encoded data.

buf_initial_sz

Decoder Buffer Initial Size indicates the amount of data that will be buffered by the
decoding application prior to beginning playback. This value is expressed in units of time (milliseconds).

buf_optimal_sz

Decoder Buffer Optimal Size indicates the amount of data that the encoder should try to maintain in the decoder's buffer. This value is expressed in units of time (milliseconds).
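Since all three buf_* parameters are expressed in milliseconds rather than bytes, it can help to see what that means in data terms. The helper below is purely illustrative (not part of the Transcoder) and converts a buffer duration to an approximate size for a given target bitrate:

```python
def buffer_ms_to_bytes(buffer_ms, bitrate_kbps):
    """Approximate bytes held by a decoder buffer of buffer_ms at bitrate_kbps."""
    bits = bitrate_kbps * 1000 * (buffer_ms / 1000.0)
    return int(bits / 8)

# A 5000 ms buffer (the example value above) at 2000 kbps holds ~1.25 MB.
print(buffer_ms_to_bytes(5000, 2000))  # 1250000
```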

undershoot_pct

Rate control adaptation undershoot control. This value, expressed as a percentage of the target bitrate, controls the maximum allowed adaptation speed of the codec. This factor controls the maximum amount of bits that can be subtracted from the target bitrate in order to compensate for prior overshoot.
Values range: 0 to 1000

overshoot_pct

Rate control adaptation overshoot control. This value, expressed as a percentage of the target bitrate, controls the maximum allowed adaptation speed of the codec.
This factor controls the maximum amount of bits that can be added to the target bitrate in order to compensate for prior undershoot. Values range: 0 to 1000.

kf_mode

Keyframe placement mode. This value indicates whether the encoder should place keyframes at a fixed interval, or determine the optimal placement automatically.
Values: auto/disabled

kf_min_dist

Keyframe minimum interval. This value, expressed as a number of frames, prevents the encoder from placing a keyframe nearer than kf_min_dist to the previous keyframe.
At least kf_min_dist non-keyframes will be coded before the next keyframe. Set kf_min_dist equal to kf_max_dist for a fixed interval.

kf_max_dist

Keyframe maximum interval. This value, expressed as a number of frames, forces the encoder to code a keyframe if one has not been coded in the last kf_max_dist frames.
A value of 0 implies all frames will be keyframes. Set kf_min_dist equal to kf_max_dist for a fixed interval.
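Keyframe distances are given in frames, so converting from a desired keyframe interval in seconds depends on the frame rate. A small illustrative helper (not part of Nimble) producing a fixed-interval configuration as described above:

```python
def fixed_keyframe_interval(fps, interval_seconds):
    """Build kf_* settings for a fixed keyframe interval."""
    frames = int(round(fps * interval_seconds))
    # Setting kf_min_dist equal to kf_max_dist gives a fixed interval.
    return {"kf_mode": "auto", "kf_min_dist": frames, "kf_max_dist": frames}

# A 2-second keyframe interval at 30 fps means a keyframe every 60 frames.
print(fixed_keyframe_interval(30, 2))
```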

drop_frame

The drop frame parameter specifies a buffer fullness threshold at which the encoder starts to drop frames as a percentage of the optimal value specified by buf_optimal_sz. If it is set to 0 then dropping of frames is disabled.
Values range: 0 to 100.

resize_allowed

Enable/disable spatial resampling, if supported by the codec.

resize_up, resize_down

The resize up and down parameters are high and low buffer fullness "watermark" levels at which we start to consider changing down to a smaller internal image size, if the buffer is being run down, or back up to a larger size if the buffer is filling up again. The numbers represent a percentage of buf_optimal_sz.
Values range: 0 to 100

error_resilient

Error resilient mode indicates to the encoder which features it should enable to take measures for streaming over lossy or noisy links.

  • 0 - disabled
  • 1 - Improve resiliency against losses of whole frames
  • 2 - The frame partitions are independently decodable by the bool decoder, meaning that partitions can be decoded even though earlier partitions have been lost. Note that intra prediction is still done over the partition boundary.
  • 3 - Both features

auto_alt_ref

Codec control function to enable automatic setting and use of alt-ref frames.

  • 0 - disable
  • 1 - enable

sharpness

Codec control function to set sharpness.

static_tresh

Codec control function to set the threshold for macroblocks (MBs) to be treated as static.

arnr_max_frames

Codec control function to set the maximum number of frames used to create an alt-ref frame (ARF).

arnr_strength

Codec control function to set the filter strength for the alt-ref frame (ARF).

tune

Optimize output for PSNR or SSIM quality measurement.
Values: psnr/ssim(default)

max_intra_bitrate_pct

Codec control function to set Max data rate for Intra frames.

libvpx VP8-specific parameters


speed

Codec control function to set encoder internal speed settings.
Values range: -16 to 16

token_parts

Codec control function to set the number of token partitions.

screen_content_mode

Codec control function to set encoder screen content mode.

  • 0 - off;
  • 1 - on;
  • 2 - on with more aggressive rate control;


noise_sensitivity

Codec control function to set noise sensitivity.

  • 0 - off;
  • 1 - OnYOnly;
  • 2 - OnYUV;
  • 3 - OnYUVAggressive;
  • 4 - Adaptive;


gf_cbr_boost

Boost percentage for Golden Frame in CBR mode.


libvpx VP9-specific parameters


speed

Codec control function to set encoder internal speed settings.
Values range: -8 to 8

max_inter_bitrate_pct

Codec control function to set max data rate for Inter frames.

gf_cbr_boost

Boost percentage for Golden Frame in CBR mode.

lossless

Lossless encoding mode.

  • 0 - lossy coding mode;
  • 1 - lossless coding mode;


tile_cols

Number of tile columns

  • 0 - 1 tile column;
  • 1 - 2 tile columns;
  • 2 - 4 tile columns;
  • n - 2**n tile columns;

tile_rows

Number of tile rows

  • 0 - 1 tile row;
  • 1 - 2 tile rows;
  • 2 - 4 tile rows;
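Both tile parameters are log2 values: a setting of n yields 2**n tiles. The sketch below makes that explicit and, as a purely illustrative assumption, picks tile_cols so that no tile column exceeds 4096 pixels (a constraint libvpx is commonly described as having; check the libvpx docs for the exact limits):

```python
def tiles_from_log2(n):
    """tile_cols / tile_rows are log2 values: n means 2**n tiles."""
    return 2 ** n

def pick_tile_cols(width, max_tile_width=4096):
    """Smallest tile_cols value keeping each column under max_tile_width px."""
    n = 0
    while width / tiles_from_log2(n) > max_tile_width:
        n += 1
    return n

print(tiles_from_log2(2))    # tile_cols=2 -> 4 tile columns
print(pick_tile_cols(3840))  # a 3840 px frame fits in one column here
```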

aq_mode

Adaptive quantization mode.

frame_boost

Periodic Q boost.

  • 0 = off ;
  • 1 = on;

noise_sensitivity

Noise sensitivity.

  • 0: off
  • 1: On(YOnly)

tune_content

Content type

  • default - Regular video content (Default);
  • screen - Screen capture content;

min_gf_interval

Minimum interval between GF/ARF frames

max_gf_interval

Maximum interval between GF/ARF frames

level

Target level

  • 255: off (default);
  • 0: only keep level stats;
  • 10: target for level 1.0;
  • 11: target for level 1.1;
  • ...
  • 62: target for level 6.2

row_mt

Row level multi-threading

  • 0 : off;
  • 1 : on;

alt_ref_aq

Special mode for altref adaptive quantization

  • 0 - disable
  • 1 - enable


Easy control


Live Transcoder has easy to use Web UI which provides drag-n-drop workflow editor to apply transcoding scenarios across various servers in a few clicks.
With FFmpeg filters you can transform content in various ways, e.g. resize video, add graphic overlays and picture-in-picture, align key frames, re-sample audio, etc.
Take a look at our videos to see Transcoder UI in action.


Feel free to visit Live Transcoder webpage for other transcoding features description and contact us if you have any question.

Related documentation


  • Live Transcoder for Nimble Streamer
  • Build streaming infrastructure
  • Transcoder web UI preview
  • Live Streaming features

July 7, 2017

Vote for Softvelum

The Streaming Media European Readers' Choice Awards 2017 voting has started to gather industry opinions on the best solutions on the market. Last year we were selected as a finalist for Best Streaming Innovation with Nimble Streamer. Let's see what we get this year.


Our company is represented in 4 nominations and we hope to get your votes. Here are brief instructions on how to proceed.

1. Find and vote


Go to voting page here, enter your name and contacts to see the full list of nominees.

Find Softvelum products in the following nominations:

  1. Best Streaming Innovation: Nimble Live Transcoder
  2. Mobile Video App or Solution: Larix Broadcaster
  3. Server Hardware/Software: Nimble Streamer
  4. Transcoding Solution: Nimble Live Transcoder

We hope you enjoy our products and will choose them in the list.

2. Confirm your vote


Voting closes on August 1st, and at that point all voters will receive an email asking them to confirm their votes; only these confirmed votes will be counted.

The finalists will be announced on 15 August, and the winners will be announced in the Autumn issue of the European edition of Streaming Media magazine.


Thanks for being our loyal customers, looking forward to getting your votes.

July 6, 2017

Setting SRT in Nimble Streamer

Secure Reliable Transport (SRT) is a transport technology that optimizes streaming performance across unpredictable networks. It's applied to contribution and distribution endpoints as part of a video stream workflow to deliver the best quality and lowest latency video at all times.

As audio/video packets are streamed from a source to a destination device, SRT detects and adapts to the real-time network conditions between the two endpoints. SRT helps compensate for jitter and bandwidth fluctuations due to congestion over noisy networks, such as the Internet. Its error recovery mechanism minimizes the packet loss typical of Internet connections. AES 128/256 bit encryption is also supported for end-to-end security, keeping your streams safe from prying eyes.

Our company, Softvelum LLC, became a member of SRT Alliance which is a group dedicated to managing and supporting the open source implementation of SRT. This alliance is accelerating interoperability of video streaming solutions and fostering collaboration with industry leaders to achieve lower latency Internet video transport. Softvelum is actively contributing to the community to improve the protocol and its ecosystem.

Nimble Streamer has full support for SRT:
  • Both Push (Caller) and Listen delivery modes are supported.
  • Both Listen and Pull receive modes can be used to get SRT streams for further processing.
  • Rendezvous mode for sending and receiving is supported.
  • Custom "latency" and "maxbw" parameters are supported for fine tuning.
  • Forward error correction filter support.
  • Multiple streams can be sent via single SRT channel.
  • Multiple Pull-mode clients can be supported when working in "Listen" mode, see the "Set up sending via SRT" section for details.
  • Deliver and receive SRT bonded streams with Nimble Streamer using libsrt 1.5.
  • AES 128 bit encryption provides stream security.
  • IPv4 and IPv6 support.
  • SRT playback stats and protection.
"streamid" parameter support:
  • Nimble supports streamid in Push and Pull modes.
  • In addition, Nimble supports streamid in receiver Listen mode as part of SRT PASSet feature set.
  • Besides, RTMP-style SRT streamid in Listen mode for publisher is supported for easier output generation of live streams.

Using the Nimble Streamer transmuxing engine, you may create any supported live protocol output from SRT, as well as receive any of them to create SRT output.

Let's see how SRT can be enabled and used in Nimble Streamer.

Besides reading this full guide, you can also take a look at our video tutorials.

Install SRT package


SRT is available via separate package for Nimble Streamer. You need to install it in order to use this protocol.
  1. Install Nimble Streamer or upgrade it to latest version.
  2. Follow this instruction to install SRT package. For macOS, use this Docker-based approach.
Now you may proceed with the setup. You may define both transmission and reception settings, so Nimble Streamer can be used on both ends of the delivery chain.

A. Set up sending via SRT 


Having any input via RTMP, RTSP, MPEG-TS or HLS, you may create outgoing SRT streams.

Go to Nimble Streamer -> Live streams settings menu. Click on UDP streaming tab.


Click on Add SRT setting button to see the following dialog for creating a new SRT channel.


IP and Port fields define the destination of your channel.

Mode field may be set to either Push, Listen or Rendezvous - those are the modes supported by SRT.

  • Push will initiate the active sending of the selected channel to the destination IP/Port regardless of the recipient state. This is also referred to as "Caller" mode.
  • Listen will make Nimble Streamer bind to the specified IP/Port and wait for a Pull request from the remote server.
  • Rendezvous mode is described below.

Listen mode allows processing streamid using "RTMP style", e.g. obtain existing live streams available in Nimble Streamer using "app/stream" notation. Read this article for more details. 


Mux rate is typically used for precise testing of channel throughput. Read this article for all details regarding mux rate setup.

Source application names, stream names and their respective PMT PIDs, Video PIDs and Audio PIDs describe where the content is taken from for further transmission. You may define multiple sources - in this case the SRT channel will carry multiple streams.

MPEGTS In Source option can be used if you'd like to stream some existing MPEGTS input intact without any transmuxing, e.g. to pass through PIDs, multiple tracks, subtitles or any metadata that is not supported by Nimble. Read this article for more details.

In addition to mandatory parameters you may specify custom parameters and their values.

Custom parameters


latency parameter is defined in milliseconds. We recommend setting it to no less than 120. If you don't specify it, Nimble will use the SRT library default of 120. This delay adds some extra "wasted time" when a packet reaches the destination quickly (well ahead of its "time to play"), but when a packet is lost, it provides extra time to recover it before its "time to play" comes.

maxbw parameter (measured in bytes per second) defines the maximum bandwidth that a single outgoing stream is allowed to use. If the network connection is not stable, many SRT packets will be re-transmitted within the period specified by latency. The more streams from the same server run over an unstable network, the more packets are lost and need to be re-transmitted; as a result, the entire network becomes saturated. If the maxbw parameter is set, each stream will not consume more than it's allowed to, even under bad network conditions.
So maxbw needs to cover the original stream bandwidth plus possible re-transmission traffic. We highly recommend setting the maxbw parameter to avoid excessive bandwidth usage. Please read this article for more details.
If you don't know your expected bandwidth, set maxbw to 0 to let the SRT library auto-select bandwidth efficiently and handle re-transmission of lost packets.

In general, we highly encourage using both "latency" and "maxbw" parameters. Please read this article for all details.
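To make the maxbw recommendation concrete, here is a back-of-the-envelope estimate, illustrative only and not an official formula: take the stream bitrate, add headroom for re-transmissions, and convert to bytes per second as maxbw expects.

```python
def estimate_maxbw(stream_kbps, retransmit_overhead=0.25):
    """Rough maxbw (bytes/sec) = stream bitrate plus re-transmission headroom."""
    bits_per_sec = stream_kbps * 1000 * (1 + retransmit_overhead)
    return int(bits_per_sec / 8)

# A 4000 kbps stream with 25% headroom for re-transmitted packets:
print(estimate_maxbw(4000))  # 625000 bytes/sec
```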

passphrase and pbkeylen parameters are available to enable encryption for processed streams. The passphrase must be at least 10 characters long - this is your password. The pbkeylen may be set to 16, 24 or 32, with 16 used by default. If these parameters are set during transmission, you need to re-start the stream for the encryption parameters to apply. The same parameters and values must be defined on both the sender and receiver sides.
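The pbkeylen values correspond to the AES key length in bytes: 16, 24 and 32 map to AES-128, AES-192 and AES-256 respectively. A small illustrative check (a made-up helper, not Nimble's code) of the constraints mentioned above:

```python
def validate_encryption(passphrase, pbkeylen=16):
    """Validate SRT encryption settings and name the resulting AES variant."""
    if len(passphrase) < 10:
        raise ValueError("passphrase must be at least 10 characters")
    aes_variant = {16: "AES-128", 24: "AES-192", 32: "AES-256"}
    if pbkeylen not in aes_variant:
        raise ValueError("pbkeylen must be 16, 24 or 32")
    return aes_variant[pbkeylen]

print(validate_encryption("s3cretPhrase", 32))  # AES-256
```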

local_ip and local_port parameters define which interface will be used for sending the stream in Push mode.

stream_id can also be specified in case your receiving side supports it.

Once you save the settings, they will be added to the list and will be synced with the server.


One-to-many multipoint listener


Multiple Pull-mode clients, such as TechEx or mwEdge software, may request a stream when Nimble Streamer provides SRT in "Listen" mode. In order to provide multiple streams in that case, you need to enable the multipoint listener feature by adding this parameter into Nimble config:
srt_multipoint_listener_enabled = true
Read configuration parameters reference for more details about managing Nimble Streamer parameters.


Now you may receive content via SRT on your destination. As we already mentioned, Nimble can be used to receive the SRT streams so let's see how you can do that.

B. Set up receiving of SRT


Having SRT input, you may transmux it into RTMP, RTSP, MPEG-TS, HLS, MPEG-DASH or even Icecast outgoing streams.

Add incoming stream


Go to Nimble Streamer -> Live streams settings menu. Click on MPEGTS In tab.


Click on Add SRT stream button to see the following dialog for creating a new incoming SRT channel.




Receive mode specifies the mode used for obtaining the stream.

  • Listen sets Nimble to wait for incoming data and process it as soon as it arrives. Your source needs to be set to "Push" mode - see the previous section.
  • Pull sets Nimble to initiate the source server to start sending the data. Your source needs to be set to Listen mode.
  • Rendezvous mode is described below.

IP Address and Port fields specify which interface will be used for connection.

Alias is the name that will be used in incoming streams list.

You should also use the latency parameter; specify it in custom parameters. Read this article to get more details about its usage.

Once you save the setting, it will be started for sync-up with your server.



You've specified the incoming stream; now you need to add it to outgoing streams, like it's done for any other MPEG-TS stream. This needs to be done because each SRT channel may carry several streams. You may check the Add outgoing stream checkbox and enter app and stream names to do it automatically.

Go to MPEGTS Out tab to see the list of outgoing streams. Click on Add outgoing stream.


Here you will define the names of applications and streams which will be used for playback and other use cases. You will select video and audio sources and if they have multiple streams in a single channel, then you'll select the required ones. Read this article for more details about adding outgoing streams.

To play the SRT stream, go to Nimble Streamer -> Live Streams menu, select the outgoing streams to see what your server now has to offer for playback.

C. Setting up Rendezvous mode


Rendezvous mode is an addition to the modes described above. It allows streaming over SRT in case one or both sides of the send-receive pair are behind firewalls.
Each side needs local and remote interfaces to be set up: Local IP with Local Port and Remote IP with Remote Port.

These interfaces are used for initiating the transmission and exchanging the data.

  • Sender side will send data to Receiver Local IP/port.
  • Receiver side will send data to Sender Local IP/port during initial negotiation.


You can refer to SRT Deployment Guide (the link will start download) to get more details about it.

The description below shows how you can set up the Rendezvous mode.

Sending in Rendezvous mode


To set up SRT sending, go to the Nimble Streamer -> Live Streams Settings menu. On the settings page, click on the UDP streaming tab. You will see a page similar to the one in the Set up sending via SRT section above. Click on the Add SRT setting button to open the dialog, then choose Rendezvous from the Mode dropdown list.


Here, the pair of 192.168.0.2 and its port is the Local interface of our server, which is the source of the stream to send. The Remote pair is the destination.

Receiving in Rendezvous mode


Go to Nimble Streamer -> Live streams settings menu. Click on MPEGTS In tab to see a page similar to what you saw in Set up receiving of SRT section above. Click on Add SRT stream to see the dialog below. Select Rendezvous from the Mode drop-down.



Here, the pair of Remote IP and Port points to the source of the stream, and the pair of Local IP and Port points to the current receiving server. The rest of the setup is done the same way as in the Set up receiving of SRT section.

That's it - your Nimble Streamer instance may now get live streams via SRT.

We keep improving this feature set, so let us know if you have any questions or concerns about it.

SRT Publisher Assistance Security Set


Nimble Streamer has flexible security and management feature set for SRT receiver Listen mode:
  • Accept streamid parameter with "application/stream" format.
  • Make per-application and per-stream authentication with user and password.
  • Apply any SRT parameters to each individual stream and even individual publisher.
  • Apply allow and deny lists for IP addresses on server and stream level.
  • Manage published streams via publish control framework.

Read more details in feature overview.


Related documentation





Follow us on social media to get updates about our new features and products: YouTube, Twitter, Facebook, LinkedIn, Reddit, Telegram

July 2, 2017

2017Q2 news

Second quarter of 2017 is over so it's time to see what our company has been doing. As you can see below, we've released some new products that might be interesting for you.

Before getting to the updates, here are a couple of announcements we'd like you to take a look at.

First, check our team's opinion on the business competition in our industry. It's an answer to a frequently asked question, so please read it and form your own opinion.

If you plan to attend IBC 2017, keep in mind that our team representatives will be visiting the trade show. So if you'd like to meet us, just drop us a note.

As always, take a look at The State of Streaming Protocols for 2017Q2. You'll see a couple of interesting points there.


Also, if you find our products useful, please find some time to vote for Softvelum solutions in European Readers' Choice Awards 2017.

SLDP


Softvelum Low Delay Protocol is a new technology for reducing last-mile delivery latency for end-users. It's based on WebSockets for better accessibility. The core features are:
  • Sub-second delay between origin and player
  • Codec-agnostic
  • ABR support
  • HTTP and HTTPS on top of TCP
  • Buffer offset support

Take a look at SLDP usage description and also most frequent questions that show the protocol best practices.
In addition to low latency, SLDP provides a great level of live stream protection; read this article for more details.

Server-side support is available in Nimble Streamer, while the client side is covered by SLDP Players. The Web player works in any MSE-enabled web browser on desktop and connected devices. The Android native player is coming soon, while the iOS player and SDK are already available and are described below.

Mobile solutions


We've released a playback SDK for iOS. You can use RTMP and SLDP playback in your iOS apps. Use this page to subscribe, obtain the SDK and get upcoming updates.

The SLDP Player sample app is also available in the App Store; you can use it to play any RTMP and SLDP links. Its sources are also available in the iOS SDK.

Larix streaming SDK also has several updates.

First, take a look at mobile SDK FAQ to see answers you might be looking for. It's being updated so if you miss anything - just let us know. 

Limelight has released a how-to about using Larix Broadcaster with its delivery network.

Android and iOS streaming SDKs have several important updates.

iOS got the following:

  • Bluetooth support
  • afreeca.tv publishing fix; the service requires a max bitrate of 2000 Kbps.
  • Connectivity improvements

Android has the same set of updates and features.

You can check all releases' description in these release notes and proceed with subscription here.


Nimble Streamer and Transcoder


Before getting to the big updates, here's a feature you've been waiting for: HLS input for processing. So now you can pull an HLS stream to transmux it to other protocols like RTMP, or even use it for further transcoding.

Here are a few big updates.

HEVC transcoding is now available in Live Transcoder in addition to HEVC transmuxing.

  • You can decode HEVC using NVENC and QuickSync as well as software decoder.
  • Encoding can be done with NVENC and QuickSync.

Read more about NVENC encoding and QuickSync encoder parameters. Also, QuickSync setup description was updated with Media Server Studio 2017.

Speaking of new codecs, Live Transcoder now can encode audio to MP3. So now it's possible to transcode your live audio from AAC, MP3, Speex, PCM and MP2 into MP3.

Read more about codec support in Nimble Streamer and its Live Transcoder.

Another major update for the transcoder is streams' hot swap with Live Transcoder. It now allows the following opposite scenarios:

  • Streams failover hot swap where main stream is backed up by secondary stream in case of publication fail.
  • Emergency swap where main stream is replaced with some replacement stream when it becomes available. This is used in cases like US Emergency Alert System (EAS).

The streams are swapped without interruption, so user playback will not be affected.

Last but not least: we've added new capabilities to give granular control to end-users. We've separated control features so you can give access to a particular capability. Read this article for details.



Stay tuned for more features - follow us on Facebook, Twitter or Google+ to get the latest news and updates on our products and services.