This repository has been archived by the owner on Oct 10, 2023. It is now read-only.

Commit

Add a bit more explanation
jgh- committed Oct 30, 2019
1 parent f54afa7 commit 48bbd54
Showing 3 changed files with 24 additions and 14 deletions.
7 changes: 5 additions & 2 deletions Examples/Mixing/Sources/Mixing/main.swift
@@ -131,10 +131,13 @@ let onConnection: LiveOnConnection = { pub, sub in
 // Here we are pulling samples off of our audio and video buses that have been generated by the composer.
 // There is a GPU barrier to download the texture from the GPU introduced here. This must be used
 // even if using the Composer class because the composer makes no assumptions about where you want
-// the texture to live after it's finished with it.
+// the texture to live after it's finished with it.
 //
 // We must also filter the assets coming from the bus so that we only get the samples we want rather than
-// all of them.
+// all of them. You'll notice as well that when we compose the rtmp publisher at the end of the
+// composition here we use a standard composition operator for video (>>>) and for audio we use a
+// mapping composition operator (|>>). The mapping operator will take a list of samples and map them to the
+// publisher, returning a list of the result type.
 rtmpPublisher = (audioBus <<| assetFilter("composer") >>> FFmpegAudioEncoder(.aac, bitrate: 96000) |>> txn,
                  pictureBus <<| assetFilter("composer") >>> GPUBarrierDownload(compute) >>> FFmpegVideoEncoder(.avc,
                      bitrate: 3_000_000, frameDuration: manifest.video.frameDuration) >>> txn)
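The comment added in this hunk describes the two composition operators without showing their shapes. Below is a minimal, self-contained Swift sketch of the idea — the `Stage` type, the precedence groups, and the `double`/`show`/`split` stages are hypothetical illustrations, not SwiftVideo's real API: `>>>` chains single-sample stages, while `|>>` maps a stage over a list of samples, returning a list of the result type.

```swift
// Hypothetical illustration of the two composition styles; not SwiftVideo's API.
infix operator >>>: AdditionPrecedence
infix operator |>>: AdditionPrecedence

// A processing stage: transforms an input value into an output value.
struct Stage<A, B> {
    let run: (A) -> B
}

// Standard composition: the output of `lhs` feeds the input of `rhs`.
func >>> <A, B, C>(lhs: Stage<A, B>, rhs: Stage<B, C>) -> Stage<A, C> {
    Stage { rhs.run(lhs.run($0)) }
}

// Mapping composition: apply `rhs` to every sample in the list produced by
// `lhs`, returning a list of the result type.
func |>> <A, B, C>(lhs: Stage<A, [B]>, rhs: Stage<B, C>) -> Stage<A, [C]> {
    Stage { input in lhs.run(input).map(rhs.run) }
}

let double = Stage<Int, Int> { $0 * 2 }
let show = Stage<Int, String> { "v\($0)" }
let split = Stage<Int, [Int]> { [$0, $0 + 1] }

let single = double >>> show   // Stage<Int, String>
let mapped = split |>> show    // Stage<Int, [String]>
print(single.run(3))           // prints "v6"
print(mapped.run(3))           // prints ["v3", "v4"]
```

This mirrors why the audio path above ends in `|>>`: the encoder stage can emit a list of samples per input, and each one is mapped onto the publisher.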
16 changes: 12 additions & 4 deletions Examples/Transcoding/Sources/Transcoding/main.swift
@@ -14,7 +14,9 @@
 limitations under the License.
 */
 
-// This example is going to use the Transcode class to transcode a file and output it to an RTMP endpoint.
+// This example is going to use the transcoder functions to transcode a file and output it to an RTMP endpoint.
+// The transcoder functions return compositions containing the appropriate operations to transcode a CodedMediaSample
+// into one (or more) different CodedMediaSamples of the same media type (audio, video).
 
 import SwiftVideo
 import Foundation
@@ -44,13 +46,19 @@ let onEnded: LiveOnEnded = {
 // RTMP connection established, here we create the encoders and compose the outputs.
 let onConnection: LiveOnConnection = { pub, sub in
     if let pub = pub,
-       let txn = pub as? Terminal<CodedMediaSample> {
+       let publisher = pub as? Terminal<CodedMediaSample> {
         do {
             let src = try FileSource(clock, url: inputFile, assetId: "file", workspaceId: "sandbox")
 
+            // Here we are composing transformations: taking the encoded media from the file source, filtering based
+            // on the media type (audio or video), and transcoding the samples. You'll notice that when we compose
+            // the rtmp publisher at the end of the composition here we use a standard composition operator for
+            // video (>>>) and for audio we use a mapping composition operator (|>>). The mapping operator
+            // will take a list of samples and map them to the publisher, returning a list of the result type.
             videoTranscoder = try codedBus <<| mediaTypeFilter(.video) >>> makeVideoTranscoder(.avc,
-                bitrate: 3_000_000, keyframeInterval: TimePoint(2000, 1000), newAssetId: "new") >>> txn
+                bitrate: 3_000_000, keyframeInterval: TimePoint(2000, 1000), newAssetId: "new") >>> publisher
             audioTranscoder = try codedBus <<| mediaTypeFilter(.audio) >>>
-                makeAudioTranscoder(.aac, bitrate: 96_000, sampleRate: 48000, newAssetId: "new") |>> txn
+                makeAudioTranscoder(.aac, bitrate: 96_000, sampleRate: 48000, newAssetId: "new") |>> publisher
             fileSource = src >>> codedBus
         } catch {
             print("Exception loading file \(error)")
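The filter-then-transcode shape in this hunk can be sketched in isolation. Everything below — the `Sample` type, `mediaTypeFilter`, and `makeTranscoder` — is a hypothetical stand-in to show the data flow (keep only one media type from a mixed bus, then rewrite each surviving sample), not SwiftVideo's actual types or signatures.

```swift
// Hypothetical stand-ins for the filter + transcode pipeline shape; not SwiftVideo's API.
enum MediaType { case audio, video }

struct Sample {
    let type: MediaType
    let codec: String
}

// Keep only samples of the requested media type.
func mediaTypeFilter(_ wanted: MediaType) -> ([Sample]) -> [Sample] {
    { samples in samples.filter { $0.type == wanted } }
}

// Stand-in "transcoder": tags each surviving sample with the target codec.
func makeTranscoder(to codec: String) -> ([Sample]) -> [Sample] {
    { samples in samples.map { Sample(type: $0.type, codec: codec) } }
}

// A mixed bus carrying both media types.
let codedBus: [Sample] = [
    Sample(type: .video, codec: "h264"),
    Sample(type: .audio, codec: "mp3"),
]

let videoOut = makeTranscoder(to: "avc")(mediaTypeFilter(.video)(codedBus))
let audioOut = makeTranscoder(to: "aac")(mediaTypeFilter(.audio)(codedBus))
print(videoOut.map { $0.codec })  // prints ["avc"]
print(audioOut.map { $0.codec })  // prints ["aac"]
```

The filter is what keeps the two pipelines independent even though both subscribe to the same `codedBus`: each branch sees only the samples of its own media type.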
15 changes: 7 additions & 8 deletions README.md
@@ -6,13 +6,19 @@ Video streaming and processing framework for Linux, macOS, and iOS/iPadOS (maybe
 This is very much a WIP. Support is currently better for Linux than macOS, and even less so for iOS devices. Work needs to be done on the Podfile for better Xcode interop (the Swift package manager doesn't do such a great job there).
+
+### Getting Started
+
+For now, check the Examples directory for some hints about how to use this framework. I promise I will be creating
+documentation to clarify how it all fits together and how you can do useful, interesting things with this framework.
 
 ### Current Features
 
 - RTMP Client and Server
 - "Flavor" Client and Server (toy protocol, see flavor.md)
 - OpenCL Support
 - Metal Support
-- Audio Decode/Encode (via FFmpeg)
+- Audio Decode/Encode (via FFmpeg and CoreAudio)
+- Video Decode/Encode (via FFmpeg and CoreVideo)
 - Camera capture (macOS/iOS)
 - Text rendering (via FreeType)
@@ -21,11 +27,4 @@ This is very much a WIP. Support is currently better for Linux than macOS and e
 - Audio Resampler (via FFmpeg+SOX)
 
-### Short ToDo
-
-- Audio Capture and Playback (macOS/iOS)
-- Audio Encode/Decode (via CoreAudio for macOS/iOS)
-- More Metal colorspace options, use Metal as default on macOS/iOS/tvOS
-
 FFmpeg support is thanks to https://github.com/sunlubo/SwiftFFmpeg
