Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
Docs and guides at https://docs.livekit.io
LiveKit for Android is available as a Maven package.
...
dependencies {
    implementation "io.livekit:livekit-android:<version>"
    // Snapshots of the latest development version are available at:
    // implementation "io.livekit:livekit-android:<version>-SNAPSHOT"
}
You'll also need JitPack as one of your repositories.
subprojects {
    repositories {
        google()
        mavenCentral()
        // ...
        maven { url 'https://jitpack.io' }
        // For SNAPSHOT access
        // maven { url 'https://s01.oss.sonatype.org/content/repositories/snapshots/' }
    }
}
There are two sample apps with similar functionality.
LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera. These permissions must be requested at runtime. Reference the sample app for an example.
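A minimal sketch of requesting these permissions at runtime, assuming an Activity and the AndroidX Activity Result APIs (register the launcher unconditionally during initialization, e.g. as an instance val or in onCreate()):

val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    // grants maps each requested permission to whether the user granted it.
}
permissionLauncher.launch(
    arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
)

Once the permissions are granted, the camera and microphone can be published: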
room.localParticipant.setCameraEnabled(true)
room.localParticipant.setMicrophoneEnabled(true)
// create an intent launcher for screen capture
// this *must* be registered prior to onCreate(), ideally as an instance val
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}

// when it's time to enable the screen share, perform the following
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
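To stop sharing later, the same method can be called with false. A minimal sketch, assuming the permission data Intent is only needed when enabling:

// disable screen share when you're done
lifecycleScope.launch {
    room.localParticipant.setScreenShareEnabled(false)
}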
LiveKit uses SurfaceViewRenderer to render video tracks. A TextureView implementation is also provided through TextureViewRenderer. Subscribed audio tracks are automatically played.
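Depending on the SDK version, a renderer may also need to be initialized with the Room before video is drawn to it. A minimal sketch, assuming a Room instance and a SurfaceViewRenderer with the id renderer in your layout:

// Initialize the renderer with the Room's video context before attaching tracks to it.
val renderer = findViewById<SurfaceViewRenderer>(R.id.renderer)
room.initVideoRenderer(renderer)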
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        ...
        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {
            // Create Room object.
            val room = LiveKit.create(
                applicationContext,
                RoomOptions(),
            )

            // Setup event handling.
            launch {
                room.events.collect { event ->
                    when (event) {
                        is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
                        else -> {}
                    }
                }
            }

            // Connect to server.
            room.connect(
                url,
                token,
                ConnectOptions()
            )

            // Turn on audio/video recording.
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)
        }
    }

    private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
        val track = event.track
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        // viewBinding.renderer is a `io.livekit.android.renderer.SurfaceViewRenderer` in your
        // layout
        videoTrack.addRenderer(viewBinding.renderer)
    }
}
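When you're finished with a room, disconnecting releases the connection and stops publishing. A minimal sketch, assuming the Room instance is kept as a property of the Activity:

override fun onDestroy() {
    // Leave the room and tear down the connection.
    room.disconnect()
    super.onDestroy()
}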
Properties marked with @FlowObservable can be accessed as a Kotlin Flow to observe changes directly:
coroutineScope.launch {
    room::activeSpeakers.flow.collectLatest { speakersList ->
        /*...*/
    }
}
To develop the Android SDK or run the sample app, you'll need to:
- Ensure the protocol submodule repo is initialized and updated with
git submodule update --init
- Install Android Studio Arctic Fox 2020.3.1+
For those developing on Apple M1 Macs, please add the following to $HOME/.gradle/gradle.properties:
protoc_platform=osx-x86_64
- Download webrtc sources from https://webrtc.googlesource.com/src
- Add the sources to Android Studio by pointing at the webrtc/sdk/android folder.