June 10, 2017

FAQ: Larix mobile broadcasting SDK

Since we provide a mobile broadcasting SDK, we receive a number of typical questions about its capabilities and use cases. Let's take a look at the most frequent ones and answer them.

Q1: How do I stream to YouTube Live?

Let's say you have a streaming URL rtmp://a.rtmp.youtube.com/live2 and a stream key abcd-efgh-abcd-efgh. To start streaming, create a connection with the following connection URL: rtmp://a.rtmp.youtube.com/live2/abcd-efgh-abcd-efgh. Enter it into the corresponding field and start streaming.
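In code terms, the connection URL is simply the stream URL and the stream key joined with a slash. A minimal Java sketch (the buildConnectionUrl helper is our own illustration, not part of the SDK):

```java
public class IngestUrl {
    // Join the server's stream URL and the stream key with a single slash,
    // tolerating a trailing slash on the stream URL.
    static String buildConnectionUrl(String streamUrl, String streamKey) {
        String base = streamUrl.endsWith("/")
                ? streamUrl.substring(0, streamUrl.length() - 1)
                : streamUrl;
        return base + "/" + streamKey;
    }

    public static void main(String[] args) {
        // prints rtmp://a.rtmp.youtube.com/live2/abcd-efgh-abcd-efgh
        System.out.println(buildConnectionUrl(
                "rtmp://a.rtmp.youtube.com/live2", "abcd-efgh-abcd-efgh"));
    }
}
```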

Q2: How can I do authenticated streaming via RTMP?

RTMP has several authentication methods; the default one is Adobe authentication. Unfortunately, we don't support it: it's a proprietary technology that may be subject to patent claims, and we avoid it to prevent any infringement.
Instead, we propose parameter-based authentication. Nimble Streamer supports URL parameters out of the box, and Wowza has ModuleSecureURLParams for this. Check this article to see an example of parameter usage.
Another option is to use RTSP: its authentication is fully supported both by our SDK and by all major media servers.
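As an illustration of parameter-based authentication, the credentials travel as query parameters appended to the connection URL. The parameter names below (user, password) are placeholders for this sketch; the actual names depend on what your server configuration expects:

```java
public class AuthUrl {
    // Append name=value query parameters to a connection URL. Illustrative only:
    // real parameter names come from your server's configuration.
    static String withAuthParams(String url, String user, String password) {
        char sep = url.contains("?") ? '&' : '?';
        return url + sep + "user=" + user + "&password=" + password;
    }

    public static void main(String[] args) {
        // prints rtmp://example.com/live/mystream?user=alice&password=s3cret
        System.out.println(withAuthParams(
                "rtmp://example.com/live/mystream", "alice", "s3cret"));
    }
}
```

In a real deployment, remember to URL-encode the values if they can contain reserved characters.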

Q3: Can I specify exact frame rate for my stream?

On iOS it's fully supported.
On Android you can't set an exact frame rate like 15 fps or 30 fps. Instead, you select an FPS range from the list of ranges supported by the device's camera. If the minimum FPS equals the maximum FPS, you get a fixed frame rate. The list of supported FPS ranges depends on the camera hardware. Please refer to this documentation.
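That selection logic can be sketched in plain Java. On a real device the ranges come from Camera.Parameters.getSupportedPreviewFpsRange(), which reports values scaled by 1000; here a sample list is hardcoded for illustration:

```java
import java.util.Arrays;
import java.util.List;

public class FpsRangePicker {
    // Pick a range whose min equals max (i.e. a fixed frame rate) matching the
    // desired FPS; values are scaled by 1000 as in the Android camera API.
    static int[] pickFixedRange(List<int[]> supported, int desiredFps) {
        int target = desiredFps * 1000;
        for (int[] range : supported) {
            if (range[0] == target && range[1] == target) {
                return range;
            }
        }
        return null; // this camera cannot fix the frame rate at desiredFps
    }

    public static void main(String[] args) {
        // Example ranges a camera might report: variable 15-30 fps, fixed 30 fps.
        List<int[]> supported = Arrays.asList(
                new int[]{15000, 30000}, new int[]{30000, 30000});
        // prints [30000, 30000]
        System.out.println(Arrays.toString(pickFixedRange(supported, 30)));
    }
}
```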

Q4: Can I set input gain (incoming audio volume)?

On iOS you can set input gain using the standard AVAudioSession API. Please refer to this article and this article.
Android doesn't provide an API to set the mic input gain.

Q5: Can I set specific profile and level for output stream encoding?

iOS supports the following video formats: H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1. Please refer to this article and this article for details.
On Android, please refer to this article on profile and this article on level.
You need Android 5.0 for profile and Android 6.0 for level. Also note that support for a given profile/level combination depends on the device's hardware.
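As a rough sketch of those API-level checks (device-only code, so it won't run outside Android, and whether the encoder honors the request still depends on the hardware):

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Build;

// Illustrative sketch: request High profile / Level 4.1 when the OS allows it.
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
if (Build.VERSION.SDK_INT >= 21) { // Android 5.0: profile
    format.setInteger(MediaFormat.KEY_PROFILE,
            MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
}
if (Build.VERSION.SDK_INT >= 23) { // Android 6.0: level
    format.setInteger(MediaFormat.KEY_LEVEL,
            MediaCodecInfo.CodecProfileLevel.AVCLevel41);
}
```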

Q6: Can I make my application perform streaming from the background?

How can we do streaming when the app is closed, in the background, or when the device is locked?
On Android you can put the Streamer into a service so it survives the app closing or going to the background. Refer to the Larix Screencaster sources: they keep the Streamer instance in a background service.
On iOS, video recording is impossible in the background (the capture session is interrupted by the OS, and there is no way to survive this). Audio recording in the background needs additional investigation; we need some time to check how the streamer should be modified to work the way the Skype app does.

Q7: Can I use your SDK if my SDK subscription is cancelled?

Yes, you can use the SDK and release your apps even if the subscription was cancelled after one or more months of payments. However, you will not receive SDK updates, nor will you be able to get our technical support.

Q8: What languages do you use in your SDK?

On Android we use pure Java.
On iOS, the Larix Broadcaster sample application is written in Swift 3 with a static streaming library. To use our library with Objective-C, you need only the AVFoundation and CoreImage frameworks (CoreImage is used only to implement the live rotation feature). You can convert the AVFoundation-related Swift logic to Objective-C one-to-one; Apple has an example on this page.

Q9: Where can I find saved files after video recording?

On iOS we rely on iOS File Sharing, see this article. Until iOS 11 is released, you'll need iTunes to download or delete individual files.
If you want to customize the path, refer to Streamer.swift / startRecord() and adjust the code block below:
// Locate the app's Documents directory, which is exposed via iTunes File Sharing
let documents = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
// Build a timestamp-based file name, e.g. MVI_20170610153000.mp4
let df = DateFormatter()
df.dateFormat = "yyyyMMddHHmmss"
let fileName = "MVI_" + df.string(from: Date()) + ".mp4"
let fileUrl = documents.appendingPathComponent(fileName)
On Android we use DCIM/LarixBroadcaster on the internal storage. DCIM on the external storage is not supported due to an Android limitation. See this article for more details.

Q10: Does the iOS SDK support the flashlight?

No. iOS doesn't provide this capability to third-party apps.

Q11: How can I apply an image, text or animation overlay on the outgoing stream?

On iOS you can apply any CoreImage filter to the outgoing video stream. You implement CoreImage filters directly, the same way you would apply them to a photo. Please refer to this Apple article.

On Android it's possible to stream any picture: see Larix Screencaster as an example. You render with an OpenGL Surface, and the result is encoded and streamed. It's also possible to implement custom camera image post-processing: you get the preview from the camera, then apply a filter using OpenGL, and finally render it to a Surface.

Related documentation

Mobile Broadcasting SDK, Larix Broadcaster, Mobile player SDKs


  1. Any instructions for posting on Facebook?

    1. We plan to release an article about Facebook publishing in a few days. It will cover Nimble Streamer, but the same instructions can be applied to Larix.