Broadcasting
The main goals of the Iris broadcast SDKs for Android and iOS are, in order:
- effortless to integrate, with a sensible default configuration
- low broadcasting latency
- solid audio broadcasting
- high video frame rate
In order to reach these goals, the SDKs offer fully automatic adaptive scaling of the broadcast video resolution, bitrate and frame rate. By automatically scaling according to the available upload bandwidth, the live broadcast quality will always be as high as the environment permits.
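The adaptive scaling described above can be pictured as a simple feedback loop: measure the available upload bandwidth, then pick the highest quality tier it can sustain. A minimal sketch — the tiers, thresholds and class names below are invented for illustration, not the SDK's actual internals:

```java
// Illustrative sketch of bandwidth-based quality selection.
// All tiers and thresholds are hypothetical; the real SDK adapts
// resolution, bitrate and frame rate automatically.
public class AdaptiveQuality {
    enum Tier {
        LOW(320, 240, 250_000),       // minimum uplink in bits/s (made up)
        MEDIUM(640, 360, 600_000),
        HIGH(1280, 720, 1_500_000);

        final int width, height, minUplinkBps;
        Tier(int width, int height, int minUplinkBps) {
            this.width = width; this.height = height; this.minUplinkBps = minUplinkBps;
        }
    }

    /** Pick the highest tier whose bandwidth requirement is met;
     *  falls back to the lowest tier when the uplink is very poor. */
    static Tier selectTier(int measuredUplinkBps) {
        Tier best = Tier.LOW;
        for (Tier t : Tier.values()) {
            if (measuredUplinkBps >= t.minUplinkBps) best = t;
        }
        return best;
    }
}
```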
Broadcast lifecycle
1) A client integrating the broadcasting library starts a broadcast.
2) The Iris backend assigns a unique broadcastId to the broadcast and sends it to the client.
3) The broadcast is registered in the Iris backend and is now visible in the listing APIs. The broadcast's state is now live.
The live broadcast can now be viewed by other clients or on the web.
4) The client stops broadcasting.
5) The backend switches the broadcast's state from live to archived.
The archived broadcast can now be viewed on-demand using the same method as when the broadcast was live.
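The lifecycle above amounts to a small state machine. A minimal model — the state names live and archived come from the text; the class shape and method names are our own sketch:

```java
// Illustrative model of the broadcast lifecycle described above;
// not the SDK's actual classes.
public class BroadcastLifecycle {
    enum State { CREATED, LIVE, ARCHIVED }

    private State state = State.CREATED;
    private String broadcastId;   // assigned by the backend in step 2

    /** Steps 1-3: the backend assigns an id and the broadcast goes live. */
    void start(String assignedId) {
        if (state != State.CREATED) throw new IllegalStateException("already started");
        broadcastId = assignedId;
        state = State.LIVE;
    }

    /** Steps 4-5: stopping switches the state from live to archived. */
    void stop() {
        if (state != State.LIVE) throw new IllegalStateException("not live");
        state = State.ARCHIVED;
    }

    State state() { return state; }
    String broadcastId() { return broadcastId; }
}
```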
How to start a broadcast
1) Include the SDK in your app project.
2) Create a broadcaster instance and configure it with your applicationId.
3) Run your app and start broadcasting.
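In code, these steps might look roughly as follows. The `Broadcaster` type and its methods are stand-ins invented for this sketch; consult the SDK's reference for the real class and method names:

```java
// Hypothetical integration sketch; the Broadcaster class below is a
// stand-in for the SDK's real broadcaster type, whose API may differ.
public class QuickStart {
    static class Broadcaster {
        final String applicationId;
        boolean broadcasting;
        Broadcaster(String applicationId) { this.applicationId = applicationId; }
        void startBroadcast() { broadcasting = true; }
    }

    static Broadcaster integrate(String applicationId) {
        // Step 2: create a broadcaster instance configured with your applicationId.
        Broadcaster b = new Broadcaster(applicationId);
        // Step 3: start broadcasting.
        b.startBroadcast();
        return b;
    }
}
```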
Supported instruction set architectures
The Iris SDKs for Android and iOS contain native code, optimized and compiled to machine code for specific instruction set architectures.
Android
The Iris SDK contains native code built for the armeabi-v7a and arm64-v8a ABIs, compatible with all modern devices and ARM emulator images.
Make sure not to build apps targeting other architectures, since the native libs will be missing for those architectures. In particular, x86 emulator images are not supported. Real x86 devices are supported, as they can translate ARM machine code.
You may need to add abiFilters to the build.gradle file for your app. For example:
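A minimal sketch, using the standard Android Gradle `ndk.abiFilters` syntax (adjust to your own build setup):

```groovy
android {
    defaultConfig {
        ndk {
            // Only package native libraries for the ABIs the Iris SDK ships.
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
}
```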
iOS
The Iris SDK contains native code built for all architectures currently supported by Apple, on devices as well as in the simulator.
Options/features
Switching camera
The Iris broadcasting SDKs for Android and iOS let your app use any of the available cameras on a device.
Attaching metadata
When your app starts broadcasting, you can attach unique metadata to each broadcast. Any attached metadata can later be accessed through the API. Title, author and geotag can be set in the SDK. Any hashtags present in the title will automatically be parsed by the Iris backend and exposed as tags in the API. The metadata API supports filtering on author, tags and title.
You may also attach extra data, structured however you choose (for example JSON or XML), in a custom string.
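As an illustration of the hashtag handling described above, the parsing the backend performs on the title could look roughly like this. The regex and class below are our own sketch, not the backend's actual code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch of extracting hashtags from a broadcast title,
// as the Iris backend is described to do with title metadata.
public class TitleTags {
    private static final Pattern HASHTAG = Pattern.compile("#(\\w+)");

    /** Returns the tags found in a title, without the leading '#'. */
    static List<String> extractTags(String title) {
        List<String> tags = new ArrayList<>();
        Matcher m = HASHTAG.matcher(title);
        while (m.find()) {
            tags.add(m.group(1));
        }
        return tags;
    }
}
```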
Server-side archiving
By default, all broadcasts are stored on the Iris backend, to support later on-demand viewing. You may override this behaviour in the broadcasting apps if you are only interested in live content.
When server-side archiving is disabled, the Iris backend does not store the broadcast content at all. Such broadcasts can only be accessed live through the API and video player; then they are gone.
Local copy of broadcast
Due to constantly changing mobile network conditions around the broadcaster, a live broadcast will usually vary in both frame rate and video resolution. In parallel with the live broadcast, the broadcaster SDKs for Android and iOS can store a full-quality video file locally, independent of network conditions.
Taking pictures
In addition to live broadcasting, the SDKs for Android and iOS have basic functionality for taking still pictures.
Broadcast dimensions
The SDKs support setting the maximum broadcast dimensions and optionally enforcing a specific aspect ratio, e.g. 16:9. This is especially useful on Android, where manufacturers offer very different cameras with different supported resolutions.
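Choosing an actual camera resolution under a maximum size and a 16:9 constraint could be sketched like this (an illustrative helper, not the SDK's API):

```java
import java.util.List;

// Illustrative: choose the largest camera resolution that fits within a
// maximum size and exactly matches the requested aspect ratio, e.g. 16:9.
public class ResolutionPicker {
    static final class Size {
        final int w, h;
        Size(int w, int h) { this.w = w; this.h = h; }
    }

    static Size pick(List<Size> supported, int maxW, int maxH, int arW, int arH) {
        Size best = null;
        for (Size s : supported) {
            boolean fits = s.w <= maxW && s.h <= maxH;
            boolean matchesRatio = s.w * arH == s.h * arW;  // cross-multiplied ratio check
            if (fits && matchesRatio && (best == null || s.w > best.w)) {
                best = s;
            }
        }
        return best;  // null if the camera offers no matching mode
    }
}
```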
Uplink speed testing
Before starting a broadcast, you can optionally do an uplink speed test to determine whether the current network is fast enough for streaming to the Iris servers.
While broadcasting live, you can instead observe the stream health, measured automatically at regular intervals by the broadcast library.
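The go/no-go decision a pre-broadcast speed test feeds could look like this; the headroom factor and names are invented for illustration:

```java
// Illustrative uplink check: compare a measured upload speed against the
// bitrate the broadcast would need, plus headroom. The margin is made up.
public class UplinkCheck {
    /**
     * Returns true when the measured uplink can carry the target bitrate
     * with some safety margin for network jitter.
     */
    static boolean fastEnough(int measuredBps, int targetBitrateBps) {
        double headroom = 1.25;  // hypothetical 25% safety margin
        return measuredBps >= targetBitrateBps * headroom;
    }
}
```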
Extra camera features
The broadcast SDKs for Android and iOS offer simple APIs to use camera functionality such as focusing, zooming and toggling a LED torch.
Text and audio feedback
The Iris Live Editor lets moderators chat with, or initiate a voice talkback session to, the broadcaster. The broadcast SDKs for Android and iOS, which the Iris Reporter apps build upon, come with built-in support for both incoming text and audio talkback.
Incoming text can be shown in a broadcasting app either by using the provided example chat UI or by designing your own. Voice talkback can be integrated by implementing a simple API available in the SDKs, as shown in the example code included in the SDKs.