Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
Docs and guides at https://docs.livekit.io
LiveKit for Android is available as a Maven package.

```groovy
...
dependencies {
    implementation "io.livekit:livekit-android:<version>"
}
```
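If your build scripts use the Gradle Kotlin DSL (`build.gradle.kts`) instead, the equivalent declaration looks like this (sketch only; the `<version>` placeholder is the same as above):

```kotlin
dependencies {
    implementation("io.livekit:livekit-android:<version>")
}
```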
LiveKit uses the WebRTC-provided `org.webrtc.SurfaceViewRenderer` to render video tracks. Subscribed audio tracks are played automatically.
```kotlin
class MainActivity : AppCompatActivity(), RoomListener {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        ...
        val url = "wss://your_host"
        val token = "your_token"

        launch {
            // Pass the activity as the RoomListener; inside `launch` a bare
            // `this` would refer to the CoroutineScope instead.
            val room = LiveKit.connect(
                applicationContext,
                url,
                token,
                ConnectOptions(),
                this@MainActivity
            )

            // Publish the local participant's audio and video.
            val localParticipant = room.localParticipant
            val audioTrack = localParticipant.createAudioTrack()
            localParticipant.publishAudioTrack(audioTrack)

            val videoTrack = localParticipant.createVideoTrack()
            localParticipant.publishVideoTrack(videoTrack)
            videoTrack.startCapture()

            attachVideo(videoTrack, localParticipant)
        }
    }

    override fun onTrackSubscribed(
        track: Track,
        publication: RemoteTrackPublication,
        participant: RemoteParticipant,
        room: Room
    ) {
        if (track is VideoTrack) {
            attachVideo(track, participant)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack, participant: Participant) {
        // viewBinding.renderer is an `org.webrtc.SurfaceViewRenderer` in your layout
        videoTrack.addRenderer(viewBinding.renderer)
    }
}
```
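Capturing from the camera and microphone requires the `CAMERA` and `RECORD_AUDIO` runtime permissions, so request them before publishing tracks. A minimal sketch using the standard AndroidX `ActivityCompat`/`ContextCompat` APIs (the helper name `ensureMediaPermissions` and the request code `0` are illustrative, not part of the SDK):

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private val requiredPermissions =
    arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)

// Returns true if both permissions are already granted; otherwise requests
// the missing ones and returns false so the caller can retry after the
// permission result arrives.
fun Activity.ensureMediaPermissions(): Boolean {
    val missing = requiredPermissions.filter {
        ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isNotEmpty()) {
        ActivityCompat.requestPermissions(this, missing.toTypedArray(), 0)
        return false
    }
    return true
}
```

You could call this at the top of `onCreate` and defer `LiveKit.connect` until it returns true.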
To develop the Android SDK itself, you'll need:

- Check out the protocol repo so that it sits at the same level as this repo. Your directory structure should look like this:

  ```
  parent/
  ├── protocol/
  └── client-sdk-android/
  ```

- Install Android Studio Arctic Fox Beta
- Download the webrtc sources from https://webrtc.googlesource.com/src
- Add the sources to Android Studio by pointing at the `webrtc/sdk/android` folder.