
Commit

Update documentation and add a basic quickstart sample app. (livekit#124)

* update readme and docs

* clean up interface methods leaking into docs

* AudioSwitchHandler documentation

* Update package descriptions

* Include link to API reference in README

* update dependencies

* Add in basic one participant sample app

* clean up debug keys

* update readme sample

* Update README and add readmes for each project folder

* Assemble all modules for compile test safety

* Disconnect on destroy

* update compile/target version

* fix build errors

* fix tests
davidliu authored Jul 29, 2022
1 parent 65f76bd commit e6c9d9b
Showing 53 changed files with 792 additions and 107 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/android.yml
@@ -36,7 +36,7 @@ jobs:
run: chmod +x gradlew

- name: Build with Gradle
run: ./gradlew clean livekit-android-sdk:assembleRelease livekit-android-sdk:testRelease
run: ./gradlew clean assembleRelease livekit-android-sdk:testRelease

- name: Import video test keys into gradle properties
if: github.event_name == 'push'
93 changes: 64 additions & 29 deletions README.md
@@ -1,10 +1,15 @@
# Android Kotlin SDK for LiveKit

Official Android Client SDK for [LiveKit](https://github.com/livekit/livekit-server). Easily add video & audio capabilities to your Android apps.
Official Android Client SDK for [LiveKit](https://github.com/livekit/livekit-server). Easily add
video & audio capabilities to your Android apps.

## Docs

Docs and guides at [https://docs.livekit.io](https://docs.livekit.io)
Docs and guides at [https://docs.livekit.io](https://docs.livekit.io).

API reference can be found
at [https://docs.livekit.io/client-sdk-android/index.html](https://docs.livekit.io/client-sdk-android/index.html).

## Installation

@@ -35,13 +40,6 @@ subprojects {
}
```
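
The installation snippet itself is collapsed in this diff. For orientation, the dependency declaration looks roughly like the sketch below; the Kotlin DSL is used here for consistency with the other samples, the coordinate is `io.livekit:livekit-android`, and the version string is a placeholder to fill in from the releases page.

```kt
// build.gradle.kts (app module) — a sketch; replace the placeholder with the current release.
dependencies {
    implementation("io.livekit:livekit-android:<latest-version>")
}
```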

## Sample App

There are two sample apps with similar functionality:

* [Compose app](https://github.com/livekit/client-sdk-android/tree/master/sample-app-compose/src/main/java/io/livekit/android/composesample)
* [Standard app](https://github.com/livekit/client-sdk-android/tree/master/sample-app)

## Usage

### Permissions
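
The body of this section is collapsed in the diff. As a rough sketch of what the sample apps do — assuming the standard AndroidX Activity Result API, with camera and microphone being the permissions needed before publishing video and audio — a request might look like this (class and method names here are illustrative):

```kt
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class PermissionsExampleActivity : AppCompatActivity() {

    // Ask for camera + microphone together; both must be granted before publishing tracks.
    private val requestPermissions =
        registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { grants ->
            if (grants.values.all { it }) {
                // All permissions granted; safe to connect and publish.
            }
        }

    fun requestNeededPermissions() {
        requestPermissions.launch(
            arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)
        )
    }
}
```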
@@ -86,36 +84,47 @@ LiveKit uses `SurfaceViewRenderer` to render video tracks. A `TextureView` implementation is
provided through `TextureViewRenderer`. Subscribed audio tracks are automatically played.

```kt
class MainActivity : AppCompatActivity(), RoomListener {
class MainActivity : AppCompatActivity() {

lateinit var room: Room

override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
...
val url = "wss://your_host";

setContentView(R.layout.activity_main)

// Create Room object.
room = LiveKit.create(applicationContext)

// Setup the video renderer
room.initVideoRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))

connectToRoom()
}

private fun connectToRoom() {

val url = "wss://your_host"
val token = "your_token"

lifecycleScope.launch {
// Create Room object.
val room = LiveKit.create(
applicationContext,
RoomOptions(),
)


// Setup event handling.
launch {
room.events.collect { event ->
when(event){
when (event) {
is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
else -> {}
}
}
}

// Connect to server.
room.connect(
url,
token,
ConnectOptions()
)

// Turn on audio/video recording.
val localParticipant = room.localParticipant
localParticipant.setMicrophoneEnabled(true)
@@ -124,22 +133,27 @@ class MainActivity : AppCompatActivity(), RoomListener {
}

private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
if (event.track is VideoTrack) {
val track = event.track
if (track is VideoTrack) {
attachVideo(track)
}
}

private fun attachVideo(videoTrack: VideoTrack) {
// viewBinding.renderer is a `io.livekit.android.renderer.SurfaceViewRenderer` in your
// layout
videoTrack.addRenderer(viewBinding.renderer)
videoTrack.addRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))
findViewById<View>(R.id.progress).visibility = View.GONE
}
}
```

See
the [basic sample app](https://github.com/livekit/client-sdk-android/blob/main/sample-app-basic/src/main/java/io/livekit/android/sample/basic/MainActivity.kt)
for the full implementation.
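
The commit message also lists "Disconnect on destroy". In the activity above, that amounts to tearing the connection down alongside the activity; a minimal sketch using the `Room.disconnect()` call:

```kt
override fun onDestroy() {
    // Leave the room so the server and remote participants are notified promptly.
    room.disconnect()
    super.onDestroy()
}
```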

### `@FlowObservable`

Properties marked with `@FlowObservable` can be accessed as a Kotlin Flow to observe changes directly:
Properties marked with `@FlowObservable` can be accessed as a Kotlin Flow to observe changes
directly:

```kt
coroutineScope.launch {
@@ -149,12 +163,33 @@
}
```

## Sample App

We have a basic quickstart sample
app [here](https://github.com/livekit/client-sdk-android/blob/main/sample-app-basic), showing how to
connect to a room, publish your device's audio/video, and display the video of one remote participant.

There are two more full featured video conferencing sample apps:

* [Compose app](https://github.com/livekit/client-sdk-android/tree/main/sample-app-compose/src/main/java/io/livekit/android/composesample)
* [Standard app](https://github.com/livekit/client-sdk-android/tree/main/sample-app/src/main/java/io/livekit/android/sample)

They both use
the [`CallViewModel`](https://github.com/livekit/client-sdk-android/blob/main/sample-app-common/src/main/java/io/livekit/android/sample/CallViewModel.kt),
which handles the `Room` connection and exposes the data needed for a basic video conferencing app.

The respective `ParticipantItem` class in each app is responsible for displaying each
participant's UI:

* [Compose `ParticipantItem`](https://github.com/livekit/client-sdk-android/blob/main/sample-app-compose/src/main/java/io/livekit/android/composesample/ParticipantItem.kt)
* [Standard `ParticipantItem`](https://github.com/livekit/client-sdk-android/blob/main/sample-app/src/main/java/io/livekit/android/sample/ParticipantItem.kt)

## Dev Environment

To develop the Android SDK or running the sample app, you'll need:
To develop the Android SDK or run the sample apps directly from this repo, you'll need to:

- Ensure the protocol submodule repo is initialized and updated with `git submodule update --init`
- Install [Android Studio Arctic Fox 2020.3.1+](https://developer.android.com/studio)

For those developing on Apple M1 Macs, please add below to $HOME/.gradle/gradle.properties

20 changes: 10 additions & 10 deletions build.gradle
@@ -2,9 +2,9 @@

buildscript {
ext {
compose_version = '1.1.1'
compose_compiler_version = '1.1.1'
kotlin_version = '1.6.10'
compose_version = '1.2.0'
compose_compiler_version = '1.2.0'
kotlin_version = '1.7.0'
java_version = JavaVersion.VERSION_1_8
dokka_version = '1.5.0'
}
@@ -13,8 +13,8 @@ buildscript {
mavenCentral()
}
dependencies {
classpath 'com.android.tools.build:gradle:7.1.2'
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
classpath 'com.android.tools.build:gradle:7.1.3'
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:1.7.0"
classpath "org.jetbrains.kotlin:kotlin-serialization:$kotlin_version"
classpath "org.jetbrains.dokka:dokka-gradle-plugin:$dokka_version"
classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
@@ -45,15 +45,15 @@ nexusStaging {

ext {
androidSdk = [
compileVersion: 31,
targetVersion : 31,
compileVersion: 32,
targetVersion : 32,
minVersion : 21,
]
versions = [
androidx_core : "1.7.0",
androidx_lifecycle: "2.4.0",
androidx_core : "1.8.0",
androidx_lifecycle: "2.5.1",
autoService : '1.0.1',
dagger : "2.27",
dagger : "2.43",
groupie : "2.9.0",
junit : "4.13.2",
junitJupiter : "5.5.0",
8 changes: 4 additions & 4 deletions livekit-android-sdk/build.gradle
@@ -49,7 +49,7 @@ android {
kotlinCompilerExtensionVersion compose_compiler_version
}
kotlinOptions {
freeCompilerArgs = ["-Xinline-classes", "-Xopt-in=kotlin.RequiresOptIn"]
freeCompilerArgs = ["-Xinline-classes", "-opt-in=kotlin.RequiresOptIn"]
jvmTarget = java_version
}
}
@@ -114,13 +114,13 @@ dependencies {
api 'com.github.webrtc-sdk:android:104.5112.01'
api "com.squareup.okhttp3:okhttp:4.10.0"
api "com.twilio:audioswitch:1.1.5"
implementation "androidx.annotation:annotation:1.3.0"
implementation "androidx.annotation:annotation:1.4.0"
implementation "androidx.core:core:${versions.androidx_core}"
implementation "com.google.protobuf:protobuf-javalite:${versions.protobuf}"
implementation "androidx.compose.ui:ui:$compose_version"

implementation 'com.google.dagger:dagger:2.38'
kapt 'com.google.dagger:dagger-compiler:2.38'
implementation "com.google.dagger:dagger:${versions.dagger}"
kapt "com.google.dagger:dagger-compiler:${versions.dagger}"

implementation deps.timber
implementation 'com.vdurmont:semver4j:3.1.0'
11 changes: 10 additions & 1 deletion livekit-android-sdk/module.md
@@ -6,6 +6,15 @@ Android Client SDK to [LiveKit](https://github.com/livekit/livekit-server).

This package contains the initial `connect` function.

# Package io.livekit.android.compose

Utilities and composables for use with Jetpack Compose.

# Package io.livekit.android.room

Room is the primary class that manages the connection to the LiveKit Room. It exposes listeners that lets you hook into room events
Room is the primary class that manages the connection to the LiveKit Room. It exposes listeners that let you hook into room events.

# Package io.livekit.android.room.track

`AudioTrack` and `VideoTrack` are the classes that represent the types of media streams that can be
published and subscribed to.
@@ -10,13 +10,43 @@ import com.twilio.audioswitch.AudioSwitch
import javax.inject.Inject
import javax.inject.Singleton

/**
* An [AudioHandler] built on top of [AudioSwitch].
*
* The various settings should be set before connecting to a [Room] and before [start] is called.
*/
@Singleton
class AudioSwitchHandler
@Inject
constructor(private val context: Context) : AudioHandler {
/**
* Toggle whether logging is enabled for [AudioSwitch]. By default, this is set to false.
*/
var loggingEnabled = false

/**
* Listen to changes in the available and active audio devices.
*
* @see AudioDeviceChangeListener
*/
var audioDeviceChangeListener: AudioDeviceChangeListener? = null

/**
* Listen to changes in audio focus.
*
* @see AudioManager.OnAudioFocusChangeListener
*/
var onAudioFocusChangeListener: AudioManager.OnAudioFocusChangeListener? = null

/**
* The preferred priority of audio devices to use. The first available audio device will be used.
*
* By default, the preferred order is set to:
* 1. BluetoothHeadset
* 2. WiredHeadset
* 3. Earpiece
* 4. Speakerphone
*/
var preferredDeviceList: List<Class<out AudioDevice>>? = null

private var audioSwitch: AudioSwitch? = null
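
A configuration sketch for the handler documented above — the device classes come from the AudioSwitch library, the listener body is illustrative, and how the configured handler is wired into a `Room` is left to the SDK's own plumbing:

```kt
import android.content.Context
import com.twilio.audioswitch.AudioDevice

fun configureAudioHandler(context: Context): AudioSwitchHandler {
    val handler = AudioSwitchHandler(context)
    handler.loggingEnabled = true
    // Prefer headsets over the speakerphone; devices are tried in list order.
    handler.preferredDeviceList = listOf(
        AudioDevice.BluetoothHeadset::class.java,
        AudioDevice.WiredHeadset::class.java,
        AudioDevice.Speakerphone::class.java,
    )
    handler.audioDeviceChangeListener = { availableDevices, selectedDevice ->
        // React to the devices AudioSwitch found and the one it selected.
    }
    // Everything above should be set before connecting to a Room, which is when start() runs.
    return handler
}
```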
@@ -602,6 +602,7 @@ internal constructor(
LKLog.e { "error setting remote description for answer: ${outcome.value} " }
return@launch
}
else -> {}
}
}

@@ -621,6 +622,7 @@
LKLog.e { "error setting local description for answer: ${outcome.value}" }
return@launch
}
else -> {}
}
}

