
November 28, 2021

NETINT encoder support in Nimble Streamer Transcoder via custom FFmpeg build

Live Transcoder for Nimble Streamer supports a number of encoding libraries off-the-shelf, covering the most widely used technologies. Besides native integration with SDKs and specific libraries, Live Transcoder uses some FFmpeg libraries for certain tasks. This allows Nimble to use external libraries to extend its capabilities: you can re-build them with FFmpeg as necessary.

NETINT provides its hardware encoders API via FFmpeg-based libraries. This means you can make a custom FFmpeg build for Nimble Streamer and thus give it access to NETINT encoding. Once you do that, you'll be able to add it into your transcoder scenarios.

Let's see how you can accomplish that.

We assume you already have Live Transcoder installed and registered. If not, please check the Transcoder installation instruction and watch the introductory video tutorial from our playlist.
Before proceeding further, make sure you've upgraded your Nimble Streamer instance to the latest version.

First, you need to follow QuickStartGuideT408_T432_FW2_5.pdf documentation file:
  1. install necessary tools
  2. build libxcoder library
  3. check out FFmpeg 4.3.1
  4. apply Netint patch FFmpeg-n4.3.1_t4xx_patch
Second, the instruction says you need to run "bash build_ffmpeg.sh", but instead you need to run this command:
bash build_ffmpeg.sh --shared --custom_flags "--build-suffix=-nimble --disable-ffmpeg"
This will build shared libraries with Nimble suffix.

The third step is to run these commands from the regular instruction:
make install
sudo ldconfig
Finally, restart Nimble Streamer using a command for your OS as described here.

After that Nimble will be able to use the new libraries.

The encoder settings in your transcoder scenario will need to use the following parameters:
  • Encoder must be set to FFmpeg
  • Specify the h264 codec as "h264_ni_enc" for T408 and T432 Video Transcoders, or "h264_ni_quadra_enc" for Quadra VPUs. Use the "h264_ni_logan_enc" codec for Logan Video Server.
  • The custom parameter xcoder-params allows controlling the encoder behavior
An example is shown here:
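As a rough sketch (the exact values are hypothetical and depend on your device and NETINT's documentation), the encoder element for a T408/T432 could be filled in like this:

Encoder: FFmpeg
Codec: h264_ni_enc
Custom parameter: xcoder-params = <encoder parameters as described in NETINT documentation>

For Quadra or Logan devices, replace the codec name accordingly.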

Once you save settings in your scenario, your Nimble Streamer instance will use NETINT encoder. At the moment you cannot use NETINT hardware filters, as Nimble Transcoder does not support hardware NETINT decoding. If you're interested in this feature, please contact us.

If you have any questions about custom FFmpeg builds and their integration with NETINT please contact our helpdesk.




March 22, 2020

Building NVENC-only pipeline with Nimble Transcoder

Live Transcoder for Nimble Streamer provides wide feature set for transforming live content using both software libraries and hardware acceleration.

NVidia NVENC has always been fully supported in Live Transcoder for decoding and encoding, but all filtering operations were performed on the CPU. That caused extra resource usage for transferring processed frames between CPU, GPU and RAM.

Nimble Live Transcoder now allows building transcoding pipelines which are executed entirely with NVidia GPU hardware acceleration. This is done using specific FFmpeg libraries which we use in addition to our own code.

We'll show you how to set up this NVENC-powered processing chain.

1. Installation and initial setup


We assume you've already set up Nimble Streamer, it's configured to receive an incoming live stream, and you've tested basic streaming. In our example we'll use a stream whose application name is "input" and stream name is "source".

If you're not familiar with Live Transcoder, take a look at Transcoder documentation reference.

Notice that the described functionality is available on Ubuntu 18.04, 20.04 and 22.04 only. We'll support other upcoming LTS Ubuntu releases as well.

The basic steps to make NVENC working are as follows:

  1. Install the latest NVidia drivers on your server.
  2. Create a transcoder license and subscribe for it.
  3. Install Live Transcoder add-on.
  4. Create some simple scenario with CPU transcoding (e.g. downscale your stream to 240p). This way you'll make sure the transcoder was set up properly.
If you already have Transcoder installed, please run these commands to upgrade the package:
sudo apt-get update
sudo apt-get install nimble-transcoder

Now create a new scenario to start a new pipeline setup.

2. Decoder setup


Once you create a new scenario, drag and drop a blue decoder element onto the dashboard. There you need to specify "NVENC-ffmpeg" in the Decoder field.


Once the incoming stream is received, Nimble Transcoder will use proper NVDEC/CUVID FFmpeg decoder: h264_cuvid, hevc_cuvid or mpeg2_cuvid. Each decoder has its set of options in case you'd like to fine-tune them or if you want to use extended feature set.

The GPU number from the GPU field will be used throughout the pipeline you create, so all further filters and encoders will recognize the source GPU and execute their transformations there.

One of those features, available for all decoders, is the ability to resize the frame during decoding. This operation is highly optimized and you can use it to reduce further resource usage. It is available via the "resize" parameter as shown in the picture below. Notice that the value is set as <width>x<height>.
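Under the hood this corresponds to the "resize" private option of the NVDEC/CUVID decoders in FFmpeg. A command-line sketch of the same operation (file names are hypothetical) would be:

ffmpeg -c:v h264_cuvid -resize 1280x720 -i input.ts -c:v h264_nvenc output.ts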


This feature is especially helpful when you have a FullHD input stream that you need to downscale further. Full HD requires a lot of resources to handle, so if you perform an initial downscale to HD or an even lower resolution, all further operations will consume less RAM and GPU processing power.

Notice that all forwarding features (subtitles and SCTE-35 markers forwarding) mentioned at the bottom of the dialog will work regardless of the decoding option you choose.

If you change the decoder settings of a scenario that is active and running, you need to restart the scenario.

Now let's set up filtering.

3. Filtering


Once the frame is decoded, you can process it via a set of FFmpeg filters that operate within the GPU pipeline.

Nimble Transcoder supports a number of them; here are the most frequently used.

"split" - allows creating several identical outputs from input video. It's available as a filter element in a tool box of Transcoder UI.

"scale_npp" performs frame scaling. You add a custom filter to your scenario, set its name to "scale_npp" and its value to resolution, e.g. "854:480" or "640:360".

Notice that scale_npp can have only one output.
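For reference, this is the same scale_npp filter you would use in an FFmpeg command line with CUDA frames; a sketch of a 480p downscale (file names are hypothetical) looks like this:

ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i input.ts -vf scale_npp=854:480 -c:v h264_nvenc output.ts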


"fps" is a filter which sets the frames per second value. It's also defined via custom filter.

The Picture filter allows setting a static image overlay over a video. Once you add it into your scenario, choose "CUDA" in the Encoding hardware dropdown.



Notice that the regular Scale filter from the UI toolbox, as well as other regular FFmpeg filters, will not work with GPU-decoded frames, because the processing is done internally on the GPU.

However, you can take the frame out of the GPU and process it separately using the "hwdownload" and "hwupload_cuda" filters. To add them, add a custom filter, set its name as mentioned and leave the value field empty. Your steps will be as follows:

  1. Add "hwdownload" to get the frame from GPU.
  2. Add "format" custom filter with "nv12" value to set proper frame format.
  3. After that you can use regular FFmpeg filters.
  4. Then add "hwupload_cuda" filter to put it back into GPU processing pipe.

Notice that it will increase RAM/CPU usage so use it only if you need to do something you cannot do on GPU.
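In FFmpeg filter-graph notation, the chain of custom filters described above corresponds roughly to the following (here "hflip" is just a placeholder for whatever CPU-only filter you actually need):

hwdownload,format=nv12,hflip,hwupload_cuda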

Let us know if you need information about other filters.

4. Encoder setup


Having the content transformed via filters, you can now encode it. Add an encoder element to your scenario and select "FFmpeg" in the "Encoder" field.

Then define "Codec" field as either h264_nvenc or hevc_nvenc - for H.264/AVC or H.265/HEVC codecs respectively.


You can use any parameters applicable for h264_nvenc or hevc_nvenc encoders.

For h264_nvenc the most popular parameters are these:
  • "b" defines the bitrate. Example: "4.5M" for 4.5 Mbps.
  • "profile" defines encoding profile, its possible values are "baseline", "main", "high", "high444p".
  • "preset" stands for encoding preset, its values are "default", "slow", "medium", "fast", "hp", "hq", "bd", "ll", "llhq", "llhp", "lossless", "losslesshp".

If your input stream is anamorphic, you might need to preserve its SAR in the output as well, especially if you're using a 'scale' filter in your Transcoder pipeline, since DAR = SAR x Width / Height. Nimble supports keeping the input SAR via the keep-sar parameter set to true in the encoder's 'Video output' section. SAR/DAR/PAR correlation is described in this article.
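For example, a 720×576 anamorphic stream with SAR 64:45 has DAR = (64/45) × (720/576) = 16:9, even though the pixel dimensions alone suggest 5:4; losing the SAR in the output would make players display the picture at the wrong aspect ratio.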


For more encoder settings, refer to FFmpeg documentation.

Just like in the decoder element, all forwarding features listed under Expert setup at the bottom of the dialog will work properly.

5. Audio setup


When you have the video pipeline set up, you need to define the audio part. If you don't need any sound transformation, you can add an audio passthrough just like described in other setup examples.

6. Example


We've made a video showing an example of the setup process; take a look:



Here's what we set up there:

  • A decoder has a downscale to 720p as described in section 2 above.
  • A split filter which has 3 equal outputs.
  • One output goes directly to an encoder. It takes the downscaled frame and simply encodes it into the live/stream_720 output. The encoding parameters are similar to what you see in section 4.
  • Another output is processed via a scale_npp filter which scales it to 480p. That filter is described in section 3. Its output is encoded into the live/stream_480 output stream.
  • One more output of the split filter goes through "scale_npp" (scaling to 360p) and then an "fps" filter which sets the fps value to 25. Then it's encoded into the live/stream_360 output.
  • Audio input is passed through for all 3 available output renditions.

This scenario uses only NVENC capabilities for video processing. The output streams are then transmuxed into the output streaming protocols which you select in global server settings or in specific settings for the "live" application.

If you have any questions or issues, please feel free to contact us.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder documentation reference, Zabbix monitoring of Nimble Streamer with NVidia GPU status.

January 20, 2020

FFmpeg custom build support in Live Transcoder

Live Transcoder for Nimble Streamer supports a variety of decoding, filtering and encoding libraries. All the libraries which we have there were checked for reliability, performance and proper licensing before being added into the deployment package.

Our customers ask us to add new libraries into the Transcoder deployment package so they could be available by default in the UI. Mostly those are existing open-source encoders, commercial encoder libraries, or even custom encoders built by our customers themselves. However, we couldn't add every library we were asked for, which kept the door closed for new functionality and gave a bad experience to our customers.

To solve this problem, it's now possible to use custom builds of FFmpeg libraries to utilize any video and audio encoders, as well as filters, which are not supported in the default Transcoder package. Live Transcoder uses FFmpeg and its libraries for certain tasks under the LGPL license, which allows re-building it as necessary. So now you can just add more libraries if you need them.

Linux packages of Live Transcoder can pick up custom libraries and use them for further encoding.
Re-building FFmpeg on Windows is also possible. If you are familiar with building FFmpeg for Windows you may try it; however, we do not provide support for this case.

Here's how you may re-build FFmpeg and use it further.

1. Legal disclaimer


This article describes the process of building custom third-party FFmpeg libraries and using them in Softvelum Live Transcoder in addition to the libraries which are deployed as part of Live Transcoder package.

Every custom library which is a result of building FFmpeg has its own licensing terms. So every library must be examined for its licensing terms prior to any usage or distribution, including but not limited to the patent licensing terms.

Softvelum, LLC is not responsible for any license or patent infringement which can occur as a result of any FFmpeg custom build usage by Live Transcoder users.

2. Building FFmpeg


This section describes how you can build FFmpeg for further usage in Transcoder.

We strongly recommend trying the custom build approach in a testing environment first. Once you get consistent results there, you may apply it to your production environment.

If something goes wrong after any of the steps and you'd like to revert it, just re-install Live Transcoder. This will overwrite all libraries with their default copies.

2.1 Making default FFmpeg build


To make sure your environment is ready for custom builds, let's start with building FFmpeg with the default libraries for Live Transcoder.

First, download the FFmpeg package. As required by the FFmpeg license, we've uploaded the FFmpeg package and its build script to our website.

Second, run the shell script in the same directory where you've just downloaded FFmpeg. It has all commands needed for getting a working copy of FFmpeg. Its compiled libraries can be used with Live Transcoder as is.

You may get errors related to missing packages, like the Freetype or Speex libraries. Just install the respective packages using this command on Ubuntu
sudo apt install libfreetype6-dev libspeex-dev
and this command for CentOS
yum install freetype-devel speex-devel bzip2

You'll be able to proceed with building after that.

2.2 Making and using custom FFmpeg build


Now that you have FFmpeg ready for builds, you may add a third-party encoder. The steps depend on which encoder you'd like to add, so refer to your library's documentation for installation details.

Once the encoder is installed, you need to modify your build script to include it. Change the following line:
--enable-encoder=aac,png,mjpeg,customvideocodec,customaudiocodec \
Append your custom encoder's name to that line. This is the name which is used within FFmpeg and which will later be used in Live Transcoder. In this case you can see "customvideocodec" and "customaudiocodec". You may also need to append additional lines for other parameters, so check the library documentation for more information.
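For illustration only: if you were adding an encoder which is already known to FFmpeg's configure script, such as the Kvazaar HEVC encoder, the relevant lines might look like this (check your library's documentation for the exact flags):

--enable-encoder=aac,png,mjpeg,libkvazaar \
--enable-libkvazaar \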

You can find examples of other custom build scripts on our GitHub.

Once the build is over, you can use the new library.

2.3 Using libraries


You can make the libraries available to Live Transcoder by copying them from the "build/lib/" subdirectory of your build directory into the proper location.

Run this command on Ubuntu to see where Transcoder libraries are located:
dpkg -L nimble-transcoder
Most probably your directory will be /usr/lib/x86_64-linux-gnu/.

On CentOS you can run this command to see where it is:
rpm -ql nimble-transcoder

Once you find the location, you can replace the libraries by copying them from your build directory to the Transcoder location.
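Assuming the default Ubuntu location mentioned above, the copy step might look like this (paths may differ on your system):

sudo cp build/lib/*.so* /usr/lib/x86_64-linux-gnu/
sudo ldconfig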

2.4 Re-start Nimble Streamer


The last step to make the new libraries work is to restart Nimble Streamer using the command required by your specific OS.

For Ubuntu it's this one:
sudo service nimble restart
You can find the commands for other OSes on the installation page.

3. Using custom libraries in Live Transcoder


Now that you have the custom library available, you can start using it from your Live Transcoder web UI.

Create a Transcoder scenario as usual and add a new encoder element. You can watch this tutorial video to see how transcoding scenarios are created.

For a custom video codec follow these steps:
  1. Drop a video encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" dropdown menu to "FFmpeg".
  3. In the "Codec" field, specify the encoder name as defined in the video encoder library, e.g. "customvideocodec" in our example. See section 2.2 regarding the codec name in build parameters.


A custom audio codec is added the same way:

  1. Drop an audio encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" field to "FFmpeg".
  3. In the "Codec" field, specify the encoder name as defined in the audio encoder library, e.g. "customaudiocodec" in our example. See section 2.2 regarding the codec name in build parameters.



Besides that, you can specify whatever parameters your custom encoder supports.

That's it. Using this technique you may use third-party libraries which are not yet available in Live Transcoder out-of-the-box.

Also check the example of a custom FFmpeg build for NETINT encoder support.

If you have any questions regarding this feature set usage, don't hesitate to contact us and show us your use case.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder tutorial videos, Transcoder documentation reference, NETINT encoder support via custom build.


March 23, 2018

NVENC shared context usage

Our Live Transcoder has full support for NVidia GPU decoding and encoding. As you can see from our stress-test article and EC2 tests, Nimble works very well with NVENC.

Usually customers decode and encode a relatively small number of input and output streams, which doesn't noticeably affect the performance of either GPU or CPU. However, there are cases when our customers use the full power of hardware acceleration to process dozens of streams. In that case, some additional performance tuning should be done.

Nimble Live Transcoder allows re-using a shared NVENC context to optimize resource usage. You may enable it by adding the nvenc_context_share_enable parameter to the Nimble Streamer config (nimble.conf):
nvenc_context_share_enable = true
nvenc_context_share_lock_enable = true
The second parameter above, nvenc_context_share_lock_enable, will prevent errors related to NVENC decoding.

Nimble config control is described in this article.

Adding these parameters will enable NVENC context sharing, which will increase the performance of Live Transcoder under high load and reduce the number of NVENC-related issues.

If you'd like to create the context cache manually, you may follow this article.
You may also find useful our Nimble Streamer performance tuning guide as well as Transcoder troubleshooting tips.

Feel free to visit Live Transcoder webpage for other transcoding features description and contact us if you have any question.

Zabbix monitoring of Nimble Streamer allows tracking server status, SRT streams and NVidia GPU status.

Contact us if you have any other questions or issues with Live Transcoder.

Related documentation

Live Transcoder for Nimble Streamer, NVidia GPU support, HEVC support in Nimble Streamer, Live Streaming features, Using Amazon EC2 for HEVC transcoding, Nimble Streamer performance tuning.

July 18, 2017

Transcoding VP8 and VP9 in Nimble Streamer

VP8 and VP9 are open and royalty-free video coding formats developed by Google. Nimble Streamer Live Transcoder now supports transcoding these formats in addition to the already supported VP8/VP9 transmuxing feature set.

Live Transcoder allows performing both decoding and encoding.

To receive VP8 and VP9 for transcoding, Nimble Streamer allows processing RTSP from published and pulled sources. The resulting stream can be delivered using the RTSP and SLDP protocols.

Decoding


The following methods are currently supported for decoding VP8/VP9 content for further transformation:
  • Software decoder
  • Intel® Quick Sync technology for hardware decoding. VP8 is supported on Windows and Linux, VP9 is supported only on Windows. 
  • NVidia® NVENC hardware decoding for Windows and Linux.
You can specify the decoding method in the decoder block of any transcoding scenario just like you specify it for other codecs.

Encoding


Currently, encoding is performed only via the software encoder. To use it for VP8 and VP9, open an encoder block in your transcoding scenario and select "libvpx" from the dropdown menu.

Setting encoder for VP9 and VP8.

You will then be able to select Codec and specify other parameters listed below.
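As a quick illustration (the values are hypothetical; the parameters themselves are described below), a typical set of libvpx encoder parameters might be:

b = 2000
rc_mode = cbr
quality = rt
threads = 4
kf_max_dist = 60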


libvpx VP8/VP9 encoder parameters


quality

Quality Deadline

  • best - use the Best Quality Deadline;
  • good - use the Good Quality Deadline;
  • rt (default) - use the Real Time Quality Deadline;

threads

Number of threads that will be allocated to the encoding process.

profile

Sets the encoder profile. Supported value: 1. Values 1-3 will be supported in future versions of the Transcoder.

lag_in_frames

Defines an upper limit on the number of frames into the future that the encoder can look. Values range: 0 to 25.

bitrate/b

Bitrate in kbps.

rc_mode

Rate control mode.

  • vbr - variable bitrate mode
  • cbr - constant bitrate mode
  • cq - constrained quality mode
  • q - constant quality mode


cq_level

Constrained Quality Level, in CQ mode the encoder will try to encode normal frames (all frames apart from key frames, golden frames and alternative reference frames) at a quantizer / quality level of cq_level. Values range: 0 to 63.

min_q

Minimum (Best Quality) Quantizer.

max_q

Maximum (Worst Quality) Quantizer.

buf_sz

Decoder Buffer Size indicates the amount of data that may be buffered by the decoding application. Note that this value is expressed in units of time (milliseconds). For example, a value of 5000 indicates that the client will buffer (at least) 5000ms worth of encoded data.

buf_initial_sz

Decoder Buffer Initial Size indicates the amount of data that will be buffered by the
decoding application prior to beginning playback. This value is expressed in units of time (milliseconds).

buf_optimal_sz

Decoder Buffer Optimal Size indicates the amount of data that the encoder should try to maintain in the decoder's buffer. This value is expressed in units  of time (milliseconds).

undershoot_pct

Rate control adaptation undershoot control. This value, expressed as a percentage of the target bitrate, controls the maximum allowed adaptation speed of the codec. This factor controls the maximum amount of bits that can be subtracted from the target bitrate in order to compensate for prior overshoot.
Values range: 0 to 1000

overshoot_pct

Rate control adaptation overshoot control. This value, expressed as a percentage of the target bitrate, controls the maximum allowed adaptation speed of the codec.
This factor controls the maximum amount of bits that can be added to the target bitrate in order to compensate for prior undershoot. Values range: 0 to 1000.

kf_mode

Keyframe placement mode. This value indicates whether the encoder should place keyframes at a fixed interval, or determine the optimal placement automatically.
Values: auto/disabled

kf_min_dist

Keyframe minimum interval. This value, expressed as a number of frames, prevents the encoder from placing a keyframe nearer than kf_min_dist to the previous keyframe.
At least kf_min_dist non-keyframe frames will be coded before the next keyframe. Set kf_min_dist equal to kf_max_dist for a fixed interval.

kf_max_dist

Keyframe maximum interval. This value, expressed as a number of frames, forces the encoder to code a keyframe if one has not been coded in the last kf_max_dist frames.
A value of 0 implies all frames will be keyframes. Set kf_min_dist equal to kf_max_dist for a fixed interval.

drop_frame

The drop frame parameter specifies a buffer fullness threshold at which the encoder starts to drop frames as a percentage of the optimal value specified by buf_optimal_sz. If it is set to 0 then dropping of frames is disabled.
Values range: 0 to 100.

resize_allowed

Enable/disable spatial resampling, if supported by the codec.

resize_up, resize_down

The resize up and down parameters are high and low buffer fullness "watermark" levels at which we start to consider changing down to a smaller internal image size, if the buffer is being run down, or back up to a larger size if the buffer is filling up again. The numbers represent a percentage of buf_optimal_sz.
Values range: 0 to 100

error_resilient

Error resilient mode indicates to the encoder which features it should enable to take measures for streaming over lossy or noisy links.

  • 0 - disabled
  • 1 - Improve resiliency against losses of whole frames
  • 2 - The frame partitions are independently decodable by the bool decoder, meaning that partitions can be decoded even though earlier partitions have been lost. Note that intra prediction is still done over the partition boundary.
  • 3 - Both features

auto_alt_ref

Codec control function to enable automatic creation and use of alternate reference (ARF) frames.

  • 0 - disable
  • 1 - enable

sharpness

Codec control function to set sharpness.

static_tresh

Codec control function to set the threshold for macroblocks treated as static.

arnr_max_frames

Codec control function to set the maximum number of frames used to create an ARF.

arnr_strength

Codec control function to set the filter strength for the ARF.

tune

Optimize output for PSNR or SSIM quality measurement.
Values: psnr / ssim (default)

max_intra_bitrate_pct

Codec control function to set Max data rate for Intra frames.


libvpx VP8-specific parameters


speed

Codec control function to set encoder internal speed settings.
Values range: -16 to 16

token_parts

Codec control function to set the number of token partitions.

screen_content_mode

Codec control function to set encoder screen content mode.

  • 0 - off;
  • 1 - on;
  • 2 - on with more aggressive rate control;


noise_sensitivity

Codec control function to set noise sensitivity.

  • 0 - off;
  • 1 - OnYOnly;
  • 2 - OnYUV;
  • 3 - OnYUVAggressive;
  • 4 - Adaptive;


gf_cbr_boost

Boost percentage for Golden Frame in CBR mode.


libvpx VP9-specific parameters


speed

Codec control function to set encoder internal speed settings.
Values range: -8 to 8

max_inter_bitrate_pct

Codec control function to set max data rate for Inter frames.

gf_cbr_boost

Boost percentage for Golden Frame in CBR mode.

lossless

Lossless encoding mode.

  • 0 - lossy coding mode ;
  • 1 - lossless coding mode;


tile_cols

Number of tile columns

  • 0 - 1 tile column ;
  • 1 - 2 tile columns;
  • 2 - 4 tile columns;
  • n - 2**n tile columns;

tile_rows

Number of tile rows

  • 0 - 1 tile row ;
  • 1 - 2 tile rows;
  • 2 - 4 tile rows;

aq_mode

Adaptive quantization mode.

frame_boost

Periodic Q boost.

  • 0 = off ;
  • 1 = on;

noise_sensitivity

Noise sensitivity.

  • 0: off
  • 1: On(YOnly)

tune_content

Content type

  • default - Regular video content (Default);
  • screen - Screen capture content;

min_gf_interval

Minimum interval between GF/ARF frames

max_gf_interval

Maximum interval between GF/ARF frames

level

Target level

  • 255: off (default);
  • 0: only keep level stats;
  • 10: target for level 1.0;
  • 11: target for level 1.1;
  • ...
  • 62: target for level 6.2

row_mt

Row level multi-threading

  • 0 : off;
  • 1 : on;

alt_ref_aq

Special mode for altref adaptive quantization

  • 0 - disable
  • 1 - enable


Easy control


Live Transcoder has an easy-to-use web UI which provides a drag-n-drop workflow editor to apply transcoding scenarios across various servers in a few clicks.
With FFmpeg filters you can transform content in various ways, e.g. resize video, make graphic overlays, picture-in-picture, key frame alignment, audio re-sampling etc.
Take a look at our videos to see Transcoder UI in action.


Feel free to visit Live Transcoder webpage for other transcoding features description and contact us if you have any question.

Related documentation


Live Transcoder for Nimble Streamer, Build streaming infrastructure, Transcoder web UI preview, Live Streaming features.

March 15, 2017

VA API (libVA) support in Nimble Streamer

Video Acceleration API (VA API) is a royalty-free API along with its implementation as a free and open-source library (libVA). This API provides access to hardware-accelerated video processing, using hardware such as graphics processing units (GPUs) to accelerate video encoding and decoding by offloading processing from the CPU.

Supported codecs are H.264 and VP8.

Nimble Streamer supports VA API and allows using libVA in Live Transcoder as one of the encoding options among other libraries and SDKs.

Let's see how you can start using libVA in Nimble Streamer Live Transcoder.

Open your transcoding scenario or create a new one.

Sample scenario
Click on the encoding block's "gear" icon to open the details dialog.

Encoder settings dialog with vaapi as Encoder
Here you need to choose the "vaapi" option from the "Encoder" drop-down and use the Codec dropdown list to select either h264 or vp8.
Now you can fill in library-specific parameters like profile etc. Once you save the encoder settings and save the scenario, libVA will start working.

Check the description of all supported parameters below.

H.264 encoding parameters


profile

Specifies the codec profile. The values are:

  • high (this one is default)
  • main
  • constrained baseline

level

Specifies the codec level (level_idc value * 10).
Default: 51 (Level 5.1, up to 4K30)

g, keyint

Number of pictures within the current GOP (Group of Pictures).
1 - only I-frames are used.
Default: 120

bf

Maximum number of B frames between non-B-frames.

  • 0 - no B frames (default)
  • 1 - IBPBP...
  • 2 - IBBPBBP... etc.

rate_control

Sets bitrate control methods.

  • cbr - Use the constant bitrate control algorithm. "bitrate", "init_bufsize", "bufsize", "max_bitrate" - might be specified.
  • cqp -  Use the constant quantization parameter algorithm; "qpi", "qpp", "qpb" might be specified.

Default: cbr if bitrate is set, cqp otherwise.

b, bitrate

Maximum bit-rate to be constrained by the rate control implementation. Sets bitrate in kbps.
Must be specified for cbr.

target_percentage

The bit-rate the rate control is targeting, as a percentage of the maximum bit-rate. For example, if target_percentage is 95, the rate control will target a bit-rate that is 95% of the maximum bit-rate.
Default: 66%

windows_size_ms

Window size in milliseconds. For example, if this is set to 500, then the rate control will guarantee the target bit-rate over a 500 ms window.
Default: 1000

initial_qp

Initial QP for the first I frames, 0 - encoder chooses the best QP according to rate control;
Default: 0

min_qp

Minimal QP for frames, 0 - encoder chooses the best QP according to rate control;
Default: 0

bufsize

Sets the size of the rate buffer in bytes. If it is equal to zero, the value is calculated using bitrate, frame rate, profile, level, and so on.

init_bufsize

Sets how full the rate buffer must be (in bytes) before playback starts. If it is equal to zero, the value is calculated using bitrate, frame rate, profile, level, and so on.

qpi, qpp, qpb

Quantization Parameters for I, P and B frames, must be specified for CQP mode.
It's a value from 1…51 range, where 1 corresponds to the best quality.
Default: 0

quality

Encoding quality - higher is worse and faster, 0 - use driver default.
Default: 0

fps_n, fps_d

Set output FPS numerator and denominator. It only affects num_units_in_tick and time_scale fields in SPS.

  • If fps_n=30 and fps_d=1 then it's 30 FPS
  • If fps_n=60000 and fps_d=2002 then it's 29.97 FPS

Source stream FPS or filter FPS is used if fps_n and fps_d are not set.
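As an illustration only (the values are hypothetical), a typical H.264 VAAPI encoder setup using the parameters above might be:

profile = main
level = 41
g = 60
bf = 0
rate_control = cbr
b = 3000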

VP8 encoding parameters


The following parameters can be used if you select VP8 as your target codec.

g, keyint

Number of pictures within the current GOP (Group of Pictures).
1 - only I-frames are used.
Default: 120

rate_control

Sets bitrate control methods.


  • cbr - Use the constant bitrate control algorithm. "bitrate", "init_bufsize", "bufsize", "max_bitrate" - might be specified.
  • cqp -  Use the constant quantization parameter algorithm; "qpi", "qpp", "qpb" might be specified.

Default: cbr if bitrate is set, cqp otherwise.

b, bitrate

Maximum bit-rate to be constrained by the rate control implementation. Sets bitrate in kbps.
Must be specified for cbr.

target_percentage

The bit-rate the rate control is targeting, as a percentage of the maximum bit-rate. For example, if target_percentage is 95, the rate control will target a bit-rate that is 95% of the maximum bit-rate.
Default: 66%

windows_size_ms

Window size in milliseconds. For example, if this is set to 500, then the rate control will guarantee the target bit-rate over a 500 ms window.
Default: 1000

initial_qp

Initial QP for the first I frames, 0 - encoder chooses the best QP according to rate control;
Default: 0

min_qp

Minimal QP for frames, 0 - encoder chooses the best QP according to rate control;
Default: 0

bufsize

Sets the size of the rate buffer in bytes. If it is equal to zero, the value is calculated using bitrate, frame rate, profile, level, and so on.

init_bufsize

Sets how full the rate buffer must be (in bytes) before playback starts. If it is equal to zero, the value is calculated using bitrate, frame rate, profile, level, and so on.

qpi, qpp

Quantization Parameters for I and P frames, must be specified for CQP mode.
It's a value from 1…51 range, where 1 corresponds to the best quality.
Default: 0

quality

Encoding quality - higher is worse and faster, 0 - use driver default.
Default: 0

error_resilient 

Enables error resilience features.

  • 0 - disable(default)
  • 1 - enable


kf_auto

Auto keyframe placement; a non-zero value enables auto keyframe placement.

  • 0 - disable
  • 1 - enable(default) 


kf_min_dist

Keyframe minimum interval.

kf_max_dist

Keyframe maximum interval.

recon_filter

Reconstruction Filter type

  • 0: bicubic,
  • 1: bilinear,
  • other: none


loop_filter_type

Loop filter type

  • 0: no loop filter,
  • 1: simple loop filter


loop_filter_level

Loop filter level value. When loop_filter_level is 0, the loop filter is disabled.

sharpness

Controls the deblocking filter sensitivity

iqi

I-frame quantization index
Range: 0..127

pqi

P-frame quantization index
Range: 0..127






Feel free to visit Live Transcoder webpage for other transcoding features description and contact us if you have any question.

Related documentation