Media Frameworks and
Swift:
This is Fine
Chris Adamson • @invalidname
CocoaConf Chicago, April 2017
Slides available at slideshare.net/invalidname
Code available at github.com/invalidstream
Who the what, now?
@invalidname
import Cocoa
import AVFoundation
import CoreMediaIO

if let devices = AVCaptureDevice.devices(),
    let avDevices = devices.filter(
        {$0 is AVCaptureDevice}) as? [AVCaptureDevice] {
    for device in avDevices {
        print("\(device.description)")
    }
}
<AVCaptureHALDevice: 0x100b16ab0 [Loopback Simulator][com.rogueamoeba.Loopback:E8577B20-0806-4472-A5E6-426CABCD6C8E]>
<AVCaptureHALDevice: 0x100c1a7c0 [Loopback Line-In][com.rogueamoeba.Loopback:A00F38FD-C2B6-43FD-98B7-23BAA6FACB03]>
<AVCaptureHALDevice: 0x100c16910 [iMic USB audio system][AppleUSBAudioEngine:Griffin Technology, Inc:iMic USB audio system:220000:2,1]>
<AVCaptureHALDevice: 0x100d13900 [Loopback Keynote][com.rogueamoeba.Loopback:1936D2A3-6D0B-428E-899E-0ABE46628EA4]>
<AVCaptureHALDevice: 0x100a26850 [Soundflower (64ch)][SoundflowerEngine:1]>
<AVCaptureHALDevice: 0x100a26310 [HD Pro Webcam C920][AppleUSBAudioEngine:Unknown Manufacturer:HD Pro Webcam C920:1218B05F:3]>
<AVCaptureHALDevice: 0x100d13660 [Soundflower (2ch)][SoundflowerEngine:0]>
<AVCaptureDALDevice: 0x100a348f0 [iGlasses][iGlasses]>
<AVCaptureDALDevice: 0x100a28d00 [HD Pro Webcam C920][0x244000046d082d]>
Program ended with exit code: 0
CMIOObjectPropertyAddress prop =
    { kCMIOHardwarePropertyAllowScreenCaptureDevices,
      kCMIOObjectPropertyScopeGlobal,
      kCMIOObjectPropertyElementMaster };
UInt32 allow = 1;
CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
    &prop, 0, NULL, sizeof(allow), &allow );

var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(
        kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(
        kCMIOObjectPropertyElementMaster))
var allow : UInt32 = 1
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                          &prop,
                          0,
                          nil,
                          UInt32(MemoryLayout<UInt32>.size),
                          &allow)
This is fine
public typealias CMIOObjectPropertySelector = UInt32
public typealias CMIOObjectPropertyScope = UInt32
public typealias CMIOObjectPropertyElement = UInt32

public struct CMIOObjectPropertyAddress {
    public var mSelector: CMIOObjectPropertySelector
    public var mScope: CMIOObjectPropertyScope
    public var mElement: CMIOObjectPropertyElement
    public init()
    public init(mSelector: CMIOObjectPropertySelector,
                mScope: CMIOObjectPropertyScope,
                mElement: CMIOObjectPropertyElement)
}
extension CMIOObjectPropertySelector {
    static let allowScreenCaptureDevices = CMIOObjectPropertySelector(
        kCMIOHardwarePropertyAllowScreenCaptureDevices)
}
extension CMIOObjectPropertyScope {
    static let global = CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal)
}
extension CMIOObjectPropertyElement {
    static let master = CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster)
}
var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(
        kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(
        kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(
        kCMIOObjectPropertyElementMaster))

var prop = CMIOObjectPropertyAddress(
    mSelector: .allowScreenCaptureDevices,
    mScope: .global,
    mElement: .master)
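With those extensions in scope, the whole property write can live behind one small helper. Here's a hedged sketch (the function name is made up, not from the talk's code):

import CoreMediaIO

// Hypothetical helper: flip the CoreMediaIO system property that makes
// screen-capture devices visible to AVCaptureDevice.devices().
// Returns the OSStatus from the property setter.
func setAllowScreenCaptureDevices(_ allow: Bool) -> OSStatus {
    var prop = CMIOObjectPropertyAddress(
        mSelector: .allowScreenCaptureDevices,
        mScope: .global,
        mElement: .master)
    var allowValue: UInt32 = allow ? 1 : 0
    return CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                                     &prop, 0, nil,
                                     UInt32(MemoryLayout<UInt32>.size),
                                     &allowValue)
}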
Demo
http://github.com/invalidstream/audio-reverser
Reversing Audio
1. Decode the MP3/AAC to LPCM
2. Grab a buffer from the end
3. Reverse its samples in memory
4. Write it to the front of a new file
5. Repeat until fully baked
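As a toy illustration of that loop (plain Swift arrays standing in for the real packet buffers and Audio Toolbox calls; this is not the talk's code):

let source = Array(0..<10)   // stand-in for the decoded LPCM packets
let chunkSize = 4
var reversed: [Int] = []
var remaining = source.count
while remaining > 0 {
    let count = min(chunkSize, remaining)
    let chunk = source[(remaining - count)..<remaining]  // 2. grab a buffer from the end
    reversed.append(contentsOf: chunk.reversed())        // 3. reverse it  4. write it to the front
    remaining -= count                                   // 5. repeat until fully baked
}
// reversed == [9, 8, 7, 6, 5, 4, 3, 2, 1, 0]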
API Needs
• Convert from MP3/AAC to LPCM
• Write sequentially to audio file (.caf, .aif, .wav)
• Random-access read from audio file
Plan A (Swift)
• AV Foundation
• AVAssetReader/Writer can do format conversion while reading/writing audio files
• Can’t (easily) read from arbitrary packet offsets; meant to process everything forward
Plan B (C, Swift?)
• Audio Toolbox (part of Core Audio)
• ExtAudioFile can do format conversions while reading/writing audio files (sketched below)
• AudioFile can read from arbitrary packet offset
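Here's a hedged sketch of Plan B's first step (not the talk's actual convertAndReverse implementation): open the compressed source with ExtAudioFile and set a client data format so every read hands back LPCM.

import AudioToolbox

// Hypothetical helper: open a compressed source file and configure ExtAudioFile
// to decode to the given LPCM client format on every ExtAudioFileRead call.
func openSourceAsLPCM(_ sourceURL: CFURL,
                      clientFormat: inout AudioStreamBasicDescription) -> ExtAudioFileRef? {
    var sourceFile: ExtAudioFileRef?
    guard ExtAudioFileOpenURL(sourceURL, &sourceFile) == noErr,
        let file = sourceFile else { return nil }
    let err = ExtAudioFileSetProperty(file,
                                      kExtAudioFileProperty_ClientDataFormat,
                                      UInt32(MemoryLayout<AudioStreamBasicDescription>.size),
                                      &clientFormat)
    return err == noErr ? file : nil
}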
// declare LPCM format we are converting to
AudioStreamBasicDescription format = {0};
format.mSampleRate = 44100.0;
format.mFormatID = kAudioFormatLinearPCM;
format.mFormatFlags = kAudioFormatFlagIsPacked |
                      kAudioFormatFlagIsSignedInteger;
format.mBitsPerChannel = 16;
format.mChannelsPerFrame = 2;
format.mBytesPerFrame = 4;
format.mFramesPerPacket = 1;
format.mBytesPerPacket = 4;

// declare LPCM format we are converting to
var format = AudioStreamBasicDescription(
    mSampleRate: 44100.0,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kAudioFormatFlagIsPacked +
                  kAudioFormatFlagIsSignedInteger,
    mBytesPerPacket: 4,
    mFramesPerPacket: 1,
    mBytesPerFrame: 4,
    mChannelsPerFrame: 2,
    mBitsPerChannel: 16,
    mReserved: 0)
// open AudioFile for output
AudioFileID forwardAudioFile;
err = AudioFileCreateWithURL(forwardURL,
                             kAudioFileCAFType,
                             &format,
                             kAudioFileFlags_EraseFile,
                             &forwardAudioFile);
IF_ERR_RETURN

#define IF_ERR_RETURN if (err != noErr) { return err; }

// open AudioFile for output
var forwardAudioFile: AudioFileID?
err = AudioFileCreateWithURL(forwardURL,
                             kAudioFileCAFType,
                             &format,
                             AudioFileFlags.eraseFile,
                             &forwardAudioFile)
if err != noErr { return err }
1. Uses a free function, rather than a method on AudioFile
2. Errors are communicated via the return value, rather than throws
3. Some parameters are UInt32 constants, some are enums
4. Audio format is passed as an UnsafePointer<AudioStreamBasicDescription>
5. Created object is returned via an in-out parameter
To say nothing of…
Pointer arithmetic!
// swap packets inside transfer buffer
for i in 0..<packetsToTransfer/2 {
    let swapSrc = transferBuffer.advanced(by: Int(i) * Int(format.mBytesPerPacket))
    let swapDst = transferBuffer.advanced(by: transferBufferSize -
                                              (Int(i+1) * Int(format.mBytesPerPacket)))
    memcpy(swapBuffer, swapSrc, Int(format.mBytesPerPacket))
    memcpy(swapSrc, swapDst, Int(format.mBytesPerPacket))
    memcpy(swapDst, swapBuffer, Int(format.mBytesPerPacket))
}
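One way to tame the byte math, as a hedged sketch: assuming transferBuffer is an UnsafeMutableRawPointer and every packet is a fixed 4 bytes (16-bit stereo LPCM, per the format declared earlier), you can bind the memory to a 4-byte element type and swap whole packets instead of copying bytes around:

// Hedged sketch, not the project's code; assumes 4-byte packets.
let packetCount = Int(packetsToTransfer)
let packets = transferBuffer.bindMemory(to: UInt32.self, capacity: packetCount)
for i in 0..<(packetCount / 2) {
    let j = packetCount - 1 - i
    let temp = packets[i]      // swap packet i with its mirror-image packet j
    packets[i] = packets[j]
    packets[j] = temp
}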
Couldn’t you just…
extension AudioFileID {
    init? (url: URL, fileType: UInt32,
           format: AudioStreamBasicDescription, flags: AudioFileFlags) {
        var fileId : AudioFileID?
        var format = format
        let err = AudioFileCreateWithURL(url as CFURL,
                                         fileType,
                                         &format,
                                         flags,
                                         &fileId)
        guard err == noErr, let createdFile = fileId else { return nil }
        self = createdFile
    }
}
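A call site for that extension might then look like this (a sketch; the output path is made up, and format is the AudioStreamBasicDescription declared earlier):

// Hypothetical usage of the extension above.
let outputURL = URL(fileURLWithPath: "/tmp/forward.caf")
guard let forwardAudioFile = AudioFileID(url: outputURL,
                                         fileType: kAudioFileCAFType,
                                         format: format,
                                         flags: .eraseFile) else {
    fatalError("couldn't create output file")  // real code would propagate an error
}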
Been there, done that
• The Amazing Audio Engine
• Novocaine
• EZAudio
• AudioKit
• Superpowered
• etc…
/**
Convert a source audio file (using any Core Audio-supported codec) and create LPCM .caf
files for its forward and backward versions.
- parameter sourceURL: A file URL containing the source audio to be read from
- parameter forwardURL: A file URL with the destination to write the decompressed (LPCM) forward file
- parameter backwardURL: A file URL with the destination to write the backward file
*/
OSStatus convertAndReverse(CFURLRef sourceURL, CFURLRef forwardURL, CFURLRef backwardURL);
AudioReversingC.h
//
// Use this file to import your target's public headers that you would like to expose to Swift.
//
#import <CoreFoundation/CoreFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
OSStatus convertAndReverse(CFURLRef sourceURL, CFURLRef forwardURL, CFURLRef backwardURL);
AudioReverser-Bridging-Header.h
if USE_SWIFT_CONVERTER {
    err = convertAndReverseSwift(sourceURL: source as CFURL,
                                 forwardURL: self.forwardURL as! CFURL,
                                 backwardURL: self.backwardURL as! CFURL)
} else {
    err = convertAndReverse(source as! CFURL,
                            self.forwardURL as! CFURL,
                            self.backwardURL as! CFURL)
}
C APIs on iOS/macOS
• Core Foundation
• Core Audio
• Core Media
• Video Toolbox
• Keychain
• IOKit
• OpenGL
• SQLite
• Accelerate
• OpenCV
• BSD, Mach
• etc…
Going deeper…
Audio Units
• Discrete software objects for working with audio
• Generators, I/O, Filters/Effects, Mixers, Converters
• Typically combined in a “graph” model (see the sketch below)
• Used by GarageBand, Logic, etc.
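For a flavor of the graph model without dropping to C, here's a minimal AVAudioEngine sketch (not from the talk) that wires a player node through a delay effect to the output:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()   // generator
let delay  = AVAudioUnitDelay()    // effect/filter
engine.attach(player)
engine.attach(delay)
engine.connect(player, to: delay, format: nil)               // player feeds the effect
engine.connect(delay, to: engine.mainMixerNode, format: nil) // effect feeds the mixer/output
do {
    try engine.start()
    // schedule a file or buffer on player and call player.play() in a real app
} catch {
    print("engine failed to start: \(error)")
}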
Demo
R(t) = C(t) x M(t)
https://github.com/invalidstream/ring-modulator-v3audiounit
Ring Modulator
• Multiplication of two signals
• One is usually a sine wave
• Originally implemented as a ring-shaped circuit
[Plots: R(t) = C(t) x M(t)]
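In code, the modulation is just a per-sample multiply; a toy offline sketch (frequencies chosen arbitrarily, not the plug-in's real parameters):

import Foundation

let sampleRate: Float = 44100
let carrierFreq: Float = 440   // C(t): the input signal (a sine here, for simplicity)
let modulatorFreq: Float = 30  // M(t): the modulator
let output: [Float] = (0..<4410).map { (n: Int) -> Float in
    let t = Float(n) / sampleRate
    let carrier = sinf(2 * .pi * carrierFreq * t)
    let modulator = sinf(2 * .pi * modulatorFreq * t)
    return carrier * modulator   // R(t) = C(t) x M(t)
}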
Modulate! Modulate!
• Ring modulator best known as the “Dalek” voice effect on Doctor Who (circa 1963)
• Also used in early electronic music
Wait, what?
-(instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription
                                     options:(AudioComponentInstantiationOptions)options
                                       error:(NSError **)outError {
    self = [super initWithComponentDescription:componentDescription
                                       options:options
                                         error:outError];
    if (self == nil) {
        return nil;
    }
    // ...
    return self;
}
MyAudioUnit.m
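For comparison, the Swift rendering of that initializer (a hedged sketch; the subclass name just mirrors the Obj-C example) turns the NSError out-parameter into throws:

import AudioToolbox

class MyAudioUnit: AUAudioUnit {
    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)
        // ...
    }
}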
XPC (macOS only)
[Diagrams: Swift 3 / Swift 3, Swift 4 / Swift 4]
https://github.com/apple/swift/blob/master/docs/ABIStabilityManifesto.md

“Given the importance of getting the core ABI and the related fundamentals correct, we are going to defer the declaration of ABI stability out of Swift 4 while still focusing the majority of effort to get to the point where the ABI can be declared stable.”
—Ted Kremenek, “Swift 4, stage 2 starts now”, Feb. 16, 2017
https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20170213/032116.html
// Block which subclassers must provide to implement rendering.
- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid Obj-C member lookups.
    // If "self" is captured in render, we're doing it wrong. See sample code.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AVAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        // Do event handling and signal processing here.
        return noErr;
    };
}
Don’t do this
• An audio unit’s render block is called on a realtime thread
• Therefore it cannot perform any action that could block:
  • I/O (file or network)
  • Waiting on a mutex or semaphore
Also, don’t do this
• Call objc_msgSend()
• Capture any Objective-C or Swift object
• Allocate memory
Basically, if you touch anything in the block other than a pre-allocated C struct or primitive, you’re asking for trouble.
// Block which subclassers must provide to implement rendering.
- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid Obj-C member lookups.
    // If "self" is captured in render, we're doing it wrong. See sample code.
    AUValue *frequencyCapture = &frequency;
    AudioStreamBasicDescription *asbdCapture = &asbd;
    __block UInt64 *totalFramesCapture = &totalFrames;
    AudioBufferList *renderABLCapture = &renderABL;
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AVAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        // Do event handling and signal processing here.
        // BLOCK IMPLEMENTATION ON NEXT SLIDE
        return noErr;
    };
// pull in samples to filter
pullInputBlock(actionFlags, timestamp, frameCount, 0, renderABLCapture);

// copy samples from ABL, apply filter, write to outputData
size_t sampleSize = sizeof(Float32);
for (int frame = 0; frame < frameCount; frame++) {
    *totalFramesCapture += 1;
    for (int renderBuf = 0; renderBuf < renderABLCapture->mNumberBuffers; renderBuf++) {
        Float32 *sample = renderABLCapture->mBuffers[renderBuf].mData +
                          (frame * asbdCapture->mBytesPerFrame);
        // apply modulation
        Float32 time = *totalFramesCapture / asbdCapture->mSampleRate;
        *sample = *sample * fabs(sinf(M_PI * 2 * time * *frequencyCapture));
        memcpy(outputData->mBuffers[renderBuf].mData +
               (frame * asbdCapture->mBytesPerFrame),
               sample,
               sampleSize);
    }
}
return noErr;
Obj-C Instance Variables!
❌
AUValue *frequencyCapture = &frequency;
AudioStreamBasicDescription *asbdCapture = &asbd;
__block UInt64 *totalFramesCapture = &totalFrames;
AudioBufferList *renderABLCapture = &renderABL;
https://github.com/apple/swift/blob/master/docs/OwnershipManifesto.md

Certain kinds of low-level programming require stricter performance guarantees. Often these guarantees are less about absolute performance than predictable performance. For example, keeping up with an audio stream is not a taxing job for a modern processor, even with significant per-sample overheads, but any sort of unexpected hiccup is immediately noticeable by users.
—“Swift Ownership Manifesto”, February 2017
We believe that these problems can be addressed with an opt-in set of features that we collectively call ownership. […]
Swift already has an ownership system, but it's “under the covers”: it's an implementation detail that programmers have little ability to influence. What we are proposing here is easy to summarize:
• We should add a core rule to the ownership system, called the Law of Exclusivity […]
• We should add features to give programmers more control over the ownership system […]
• We should add features to allow programmers to express types with unique ownership […]
And yet…
“[Swift] is the first industrial-quality systems programming language that is as expressive and enjoyable as a scripting language.”
https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/
So… when?
Waiting…
• ABI stability — will not be in Swift 4
• Ownership — unclear
• Are these traits sufficient?
Strategies
• Use AV Foundation if you can
• Learn to balance C and Swift
• “Render unto C-sar what is C-sar’s…”
• The goal is to have idiomatic Swift, not Swift that may work but looks like C
Media Frameworks and
Swift:
This is Fine
Chris Adamson • @invalidname
CocoaConf Chicago, April 2017
Slides available at slideshare.net/invalidname
Code available at github.com/invalidstream