Core Audio in iOS 6
                            Chris Adamson • @invalidname
                                  CocoaConf Raleigh
                                  December 1, 2012



                            Slides and code available on my blog:
                               http://www.subfurther.com/blog

Sunday, December 2, 12
Plug!




The Reviews Are In!




Legitimate copies!
• Amazon (paper or Kindle)
• Barnes & Noble (paper or Nook)
• Apple (iBooks)
• Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle)
  • 35% off with code COREAUDIO3174
What You’ll Learn

                   • What Core Audio does and doesn’t do
                   • When to use and not use it
                   • What’s new in Core Audio for iOS 6


Simple things should be simple,
                         complex things should be possible.
                                      –Alan Kay




AV Foundation,
      Media Player

                           Simple things should be simple,
                         complex things should be possible.
                                      –Alan Kay
                                                   Core Audio




Core Audio

• Low-level C framework for processing audio
  • Capture, play-out, real-time or off-line processing
• The “complex things should be possible” part of audio on OS X and iOS



Chris’ CA Taxonomy
• Engines: process streams of audio
  • Capture, play-out, mixing, effects processing
• Helpers: deal with formats, encodings, etc.
  • File I/O, stream I/O, format conversion, iOS “session” management


Helpers: Audio File

• Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way
• Get metadata (data format, duration, iTunes/ID3 info)




Helpers: Audio File Stream

• Read audio from a non-random-access source, such as a network stream
• Discover encoding and encapsulation on the fly, then deliver audio packets to the client application




Helpers: Converters

• Convert buffers of audio to and from different encodings
• One side must be in an uncompressed format (i.e., Linear PCM)




Helpers: ExtAudioFile

                   • Combine file I/O and format conversion
                   • Read a compressed file into PCM buffers
                   • Write PCM buffers into a compressed file


Helpers: Audio Session
• iOS-only API to negotiate use of audio resources with the rest of the system
• Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in the background, etc.
• Get notified of audio interruptions
• See also AVAudioSession
Engines: Audio Units

• Low-latency (~10ms) processing of capture/play-out audio data
• Effects, mixing, etc.
• Connect units manually or via an AUGraph
• Much more on this topic momentarily…

Engines: Audio Queue
• Convenience API for recording or play-out, built atop audio units
• Rather than processing on-demand and on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you)
• Higher latency, naturally
• Supports compressed formats (MP3, AAC)
Engines: OpenAL

• API for 3D spatialized audio, implemented atop audio units
• Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.), and OpenAL renders what it sounds like to the listener from that location



Engines and Helpers
• Engines: Audio Units, Audio Queue, OpenAL
• Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session




Audio Units



Audio Unit


                           AUSomething




Types of Audio Units
                   • Output (which also do input)
                   • Generator
                   • Converter
                   • Effect
                   • Mixer
                   • Music
Pull Model


                          AUSomething
                                        AudioUnitRender()




Pull Model



                         AUSomethingElse   AUSomething




Buses (aka, Elements)

                            AUSomethingElse




                                              AUSomething




                            AUSomethingElse




AUGraph

                         AUSomethingElse




                                           AUSomething




                         AUSomethingElse




Render Callbacks
OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;

    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;

    return noErr;
}



                                                                                                     AUSomething




                                                                             AUSomethingElse




AURemoteIO
• Output unit used for play-out, capture
• A Core Audio thread repeatedly and automatically calls AudioUnitRender()
• Must set the EnableIO property to explicitly enable capture and/or play-out
  • Capture requires setting an appropriate AudioSession category


Create AURemoteIO
CheckError(NewAUGraph(&_auGraph),
           "couldn't create au graph");

CheckError(AUGraphOpen(_auGraph),
           "couldn't open au graph");

AudioComponentDescription componentDesc;
componentDesc.componentType = kAudioUnitType_Output;
componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

AUNode remoteIONode;
CheckError(AUGraphAddNode(_auGraph,
                          &componentDesc,
                          &remoteIONode),
           "couldn't add remote io node");



Getting an AudioUnit from AUNode

CheckError(AUGraphNodeInfo(self.auGraph,
                           remoteIONode,
                           NULL,
                           &_remoteIOUnit),
           "couldn't get remote io unit from node");




AURemoteIO Buses

     bus 1                                     bus 1
from input H/W                                to app
                              AURemoteIO
        bus 0                                   bus 0
      from app                             to output H/W




EnableIO
UInt32 oneFlag = 1;
UInt32 busZero = 0;
CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioOutputUnitProperty_EnableIO,
                                kAudioUnitScope_Output,
                                busZero,
                                &oneFlag,
                                sizeof(oneFlag)),
           "couldn't enable remote io output");

UInt32 busOne = 1;
CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioOutputUnitProperty_EnableIO,
                                kAudioUnitScope_Input,
                                busOne,
                                &oneFlag,
                                sizeof(oneFlag)),
           "couldn't enable remote io input");



Pass Through

                              bus 1
                         from input H/W
                                          AURemoteIO
                                                            bus 0
                                                       to output H/W




Connect In to Out
UInt32 busZero = 0;
UInt32 busOne = 1;
CheckError(AUGraphConnectNodeInput(self.auGraph,
                                   remoteIONode,
                                   busOne,
                                   remoteIONode,
                                   busZero),
           "couldn't connect remote io bus 1 to 0");




Pass-Through with Effect

                                           AUEffect




                              bus 1
                         from input H/W
                                          AURemoteIO
                                                            bus 0
                                                       to output H/W



Demo: Delay Effect
                               New in iOS 6!




Creating the AUDelay
componentDesc.componentType = kAudioUnitType_Effect;
componentDesc.componentSubType = kAudioUnitSubType_Delay;
componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

AUNode effectNode;
CheckError(AUGraphAddNode(self.auGraph,
                          &componentDesc,
                          &effectNode),
           "couldn't create effect node");

AudioUnit effectUnit;
CheckError(AUGraphNodeInfo(self.auGraph,
                           effectNode,
                           NULL,
                           &effectUnit),
           "couldn't get effect unit from node");



The problem with effect units

• Audio Units available since iPhone OS 2.0 prefer int formats
• Effect units arrived with iOS 5 (ARMv7 era) and only work with float formats
• Have to set the AUEffect unit’s format on AURemoteIO



Setting formats
AudioStreamBasicDescription effectDataFormat;
UInt32 propSize = sizeof (effectDataFormat);
CheckError(AudioUnitGetProperty(effectUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output,
                                busZero,
                                &effectDataFormat,
                                &propSize),
           "couldn't read effect format");

CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output,
                                busOne,
                                &effectDataFormat,
                                propSize),
           "couldn't set bus one output format");

           Then repeat AudioUnitSetProperty() for input scope / bus 0
AUNewTimePitch

• New in iOS 6!
• Allows you to change pitch independent of time, or time independent of pitch
• How do you use it?


AUTimePitch
AudioComponentDescription effectcd = {0};
effectcd.componentType = kAudioUnitType_FormatConverter;
effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

AUNode effectNode;
CheckError(AUGraphAddNode(self.auGraph,
                          &effectcd,
                          &effectNode),
           "couldn't get effect node [time/pitch]");




        Notice the type is AUFormatConverter, not AUEffect
AudioUnitParameters.h
// Parameters for AUNewTimePitch
enum {
    // Global, rate, 1/32 -> 32.0, 1.0
    kNewTimePitchParam_Rate              = 0,
    // Global, Cents, -2400 -> 2400, 1.0
    kNewTimePitchParam_Pitch             = 1,
    // Global, generic, 3.0 -> 32.0, 8.0
    kNewTimePitchParam_Overlap           = 4,
    // Global, Boolean, 0->1, 1
    kNewTimePitchParam_EnablePeakLocking = 6
};



  This is the entire documentation for the AUNewTimePitch parameters


AUNewTimePitch parameters

• Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed
  • Use powers of 2: 1/32, 1/16, …, 2, 4, 8…
• Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone


Pitch shifting


• Pitch can vary, time does not
• Suitable for real-time sources, such as audio capture




Demo: Pitch Shift
                              New in iOS 6!




Rate shifting
• Rate can vary, pitch does not
  • Think of the 1.5x and 2x speed modes in the Podcasts app
• Not suitable for real-time sources, as data will be consumed faster; files work well
  • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput


Demo: Rate Shift
                              New in iOS 6!




AUSplitter

                                            AUSomethingElse




                         AUSplitter



                                            AUSomethingElse




                                  New in iOS 6!
AUMatrixMixer
                         AUSomethingElse



                                                              AUSomethingElse




                         AUSomethingElse      AUMatrixMixer



                                                              AUSomethingElse




                         AUSomethingElse




                                           New in iOS 6!
Audio Queues
                    (and the APIs that help them)




AudioQueue
• Easier than AURemoteIO: provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC)
• Recording queue: receive buffers of captured audio in a callback
• Play-out queue: enqueue buffers of audio to play, optionally refill in a callback


AudioQueue


                            2   1   0




Common AQ scenarios
• File player: read from the file and “prime” the queue buffers, start the queue, and when called back with a used buffer, refill it from the next part of the file
• Synthesis: maintain state in your own code, write raw samples into buffers during callbacks


Web Radio

• Thursday class’ third project
• Use Audio File Stream Services to pick out audio data from a network stream
• Enqueue these packets as new AQ buffers
• Dispose of used buffers in the callback

Parsing web radio

NSURLConnection delivers NSData buffers containing audio and framing info; we pass these to Audio File Stream Services.

Audio File Stream Services calls us back with parsed packets of audio data.

We create an AudioQueueBuffer with those packets and enqueue it for play-out.
A complex thing!

• What if we want to see that data after it’s been decoded to PCM and is about to be played?
  • e.g., spectrum analysis, effects, visualizers
• AudioQueue design is “fire-and-forget”

AudioQueue Tap!




http://www.last.fm/music/Spinal+Tap
AudioQueueProcessingTap

• Set as a property on the Audio Queue
• Calls back to your function with decoded (PCM) audio data
• Three types: pre-effects, post-effects (that the AQ performs), or siphon; the first two can modify the data
• Only documentation is in AudioQueue.h
Creating an AQ Tap
// create the tap
UInt32 maxFrames = 0;
AudioStreamBasicDescription tapFormat = {0};
AudioQueueProcessingTapRef tapRef;
CheckError(AudioQueueProcessingTapNew(audioQueue,
                                      tapProc,
                                      (__bridge void *)(player),
                                      kAudioQueueProcessingTap_PreEffects,
                                      &maxFrames,
                                      &tapFormat,
                                      &tapRef),
           "couldn't create AQ tap");



            Notice that you receive maxFrames and tapFormat. These do not appear to be settable.


AQ Tap Proc
void tapProc (void *                     inClientData,
              AudioQueueProcessingTapRef inAQTap,
              UInt32                     inNumberFrames,
              AudioTimeStamp *           ioTimeStamp,
              UInt32 *                   ioFlags,
              UInt32 *                   outNumberFrames,
              AudioBufferList *          ioData) {
    CCFWebRadioPlayer *player =
        (__bridge CCFWebRadioPlayer*) inClientData;
    UInt32 getSourceFlags = 0;
    UInt32 getSourceFrames = 0;
    AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                          inNumberFrames,
                                          ioTimeStamp,
                                          &getSourceFlags,
                                          &getSourceFrames,
                                          ioData);
    // then do something with ioData
    // ...
So what should we do
                            with the audio?


                           Let’s apply our pitch-shift effect




Shouldn’t this work?


                                AUEffect
                                           AudioUnitRender()




AudioUnitRender()
• Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers
  • If mData != NULL, the audio unit does its thing with those samples
  • If mData == NULL, the audio unit pulls from whatever it’s connected to
• So we just call with the AudioBufferList ioData we got from the tap callback, right?


Psych!

• AQ tap provides data as signed ints
• Effect units only work with floating point
• We need to do an on-the-spot format conversion




invalidname’s convert-and-effect recipe

OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;

    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;

    return noErr;
}

        AUConverter → AUEffect → AUConverter → AUGenericOutput

Note: red arrows are float format, yellow arrows are int

How it works

• AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput
• Top AUConverter is connected to a render callback function




The trick!
• Copy the mData pointer to a state variable and NULL it in ioData
• Call AudioUnitRender() on the output unit; the NULL mData makes it pull from the graph
• Top of the graph pulls on the render callback, which gives it back the mData we copied off


Yes, really
                         This is the rest of tapProc()
// copy off the ioData so the graph can read from it
// in render callback
player.preRenderData = ioData->mBuffers[0].mData;
ioData->mBuffers[0].mData = NULL;

OSStatus renderErr = noErr;
AudioUnitRenderActionFlags actionFlags = 0;
renderErr = AudioUnitRender(player.genericOutputUnit,
                            &actionFlags,
                            player.renderTimeStamp,
                            0,
                            inNumberFrames,
                            ioData);
NSLog (@"AudioUnitRender, renderErr = %ld", (long)renderErr);
}
Yes, really
           This is the render callback that supplies data to the int→float converter

OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player =
        (__bridge CCFWebRadioPlayer*) inRefCon;

    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;

    return noErr;
}


Demo: AQ Tap +
                         AUNewTimePitch
                              New in iOS 6!




Other new stuff



Multi-Route
• Ordinarily, one input or output is active: earpiece, speaker, headphones, dock-connected device
  • “Last in wins”
• With the AV Session “multi-route” category, you can use several at once
• WWDC 2012 session 505
Utility classes moved again

• C++ utilities, including the CARingBuffer
  • Before Xcode 4.3: installed into /Developer
  • Xcode 4.3–4.4: optional download from developer.apple.com
  • Xcode 4.5 and later: sample code project “Core Audio Utility Classes”


Takeaways
• Core Audio fundamentals never change
• New stuff is added as properties, typedefs, enums, etc.
• Watch the SDK API diffs document to find the new stuff
• Hope you like header files and experimentation


Q&A
• Slides will be posted to slideshare.net/invalidname
• Code will be linked from there and my blog
• Watch the CocoaConf RDU Glassboard, @invalidname on Twitter/ADN, or the [Time code]; blog for announcements
• Thanks!

Core Audio in iOS 6 (CocoaConf Raleigh, Dec. '12)

  • 1. Core Audio in iOS 6 Chris Adamson • @invalidname CocoaConf Raleigh December 1, 2012 Slides and code available on my blog: http://www.subfurther.com/blog
  • 3. The Reviews Are In! Sunday, December 2, 12
  • 7. Legitimate copies! • Amazon (paper or Kindle) • Barnes & Noble (paper or Nook) • Apple (iBooks) • Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle) • 35% off with code COREAUDIO3174 Sunday, December 2, 12
  • 8. What You’ll Learn • What Core Audio does and doesn’t do • When to use and not use it • What’s new in Core Audio for iOS 6 Sunday, December 2, 12
  • 10. Simple things should be simple, complex things should be possible. –Alan Kay Sunday, December 2, 12
  • 11. AV Foundation, Media Player Simple things should be simple, complex things should be possible. –Alan Kay Sunday, December 2, 12
  • 12. AV Foundation, Media Player Simple things should be simple, complex things should be possible. –Alan Kay Core Audio Sunday, December 2, 12
  • 13. Core Audio • Low-level C framework for processing audio • Capture, play-out, real-time or off-line processing • The “complex things should be possible” part of audio on OS X and iOS Sunday, December 2, 12
  • 14. Chris’ CA Taxonomy • Engines: process streams of audio • Capture, play-out, mixing, effects processing • Helpers: deal with formats, encodings, etc. • File I/O, stream I/O, format conversion, iOS “session” management Sunday, December 2, 12
  • 15. Helpers: Audio File • Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way • Get metadata (data format, duration, iTunes/ID3 info) Sunday, December 2, 12
  • 16. Helpers: Audio File Stream • Read audio from non-random-access source like a network stream • Discover encoding and encapsulation on the fly, then deliver audio packets to client application Sunday, December 2, 12
  • 17. Helpers: Converters • Convert buffers of audio to and from different encodings • One side must be in an uncompressed format (i.e., Linear PCM) Sunday, December 2, 12
  • 18. Helpers: ExtAudioFile • Combine file I/O and format conversion • Read a compressed file into PCM buffers • Write PCM buffers into a compressed file Sunday, December 2, 12
  • 19. Helpers: Audio Session • iOS-only API to negotiate use of audio resources with the rest of the system • Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in the background, etc. • Gets notified of audio interruptions • See also AVAudioSession
  • 20. Engines: Audio Units • Low-latency (~10ms) processing of capture/play-out audio data • Effects, mixing, etc. • Connect units manually or via an AUGraph • Much more on this topic momentarily… Sunday, December 2, 12
  • 21. Engines: Audio Queue • Convenience API for recording or play-out, built atop audio units • Rather than processing on-demand and on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you) • Higher latency, naturally • Supports compressed formats (MP3, AAC) Sunday, December 2, 12
  • 22. Engines: Open AL • API for 3D spatialized audio, implemented atop audio units • Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.), OpenAL renders what it sounds like to the listener from that location Sunday, December 2, 12
  • 23. Engines and Helpers • Audio Units • Audio File • Audio Queue • Audio File Stream • Open AL • Audio Converter • ExtAudioFile • Audio Session Sunday, December 2, 12
  • 25. Audio Unit AUSomething Sunday, December 2, 12
  • 26. Types of Audio Units • Output (which also do input) • Generator • Converter • Effect • Mixer • Music Sunday, December 2, 12
  • 27. Pull Model AUSomething Sunday, December 2, 12
  • 28. Pull Model AUSomething AudioUnitRender() Sunday, December 2, 12
  • 29. Pull Model AUSomethingElse AUSomething Sunday, December 2, 12
  • 30. Buses (aka, Elements) AUSomethingElse AUSomething AUSomethingElse Sunday, December 2, 12
  • 31. AUGraph AUSomethingElse AUSomething AUSomethingElse Sunday, December 2, 12
  • 32. Render Callbacks OSStatus converterInputRenderCallback (void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * ioData) { CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon; // read from buffer ioData->mBuffers[0].mData = player.preRenderData; return noErr; } AUSomething AUSomethingElse Sunday, December 2, 12
  • 33. AURemoteIO • Output unit used for play-out, capture • A Core Audio thread repeatedly and automatically calls AudioUnitRender() • Must set EnableIO property to explicitly enable capture and/or play-out • Capture requires setting appropriate AudioSession category Sunday, December 2, 12
  • 34. Create AURemoteIO CheckError(NewAUGraph(&_auGraph), ! ! "couldn't create au graph"); ! CheckError(AUGraphOpen(_auGraph), ! ! "couldn't open au graph"); ! AudioComponentDescription componentDesc; componentDesc.componentType = kAudioUnitType_Output; componentDesc.componentSubType = kAudioUnitSubType_RemoteIO; componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple; ! AUNode remoteIONode; CheckError(AUGraphAddNode(_auGraph, ! ! ! ! ! ! &componentDesc, ! ! ! ! ! ! &remoteIONode), ! ! "couldn't add remote io node"); Sunday, December 2, 12
  • 35. Getting an AudioUnit from AUNode ! CheckError(AUGraphNodeInfo(self.auGraph, ! ! ! ! ! ! ! remoteIONode, ! ! ! ! ! ! ! NULL, ! ! ! ! ! ! ! &_remoteIOUnit), ! ! ! "couldn't get remote io unit from node"); Sunday, December 2, 12
  • 36. AURemoteIO Buses AURemoteIO Sunday, December 2, 12
  • 37. AURemoteIO Buses AURemoteIO bus 0 to output H/W Sunday, December 2, 12
  • 38. AURemoteIO Buses AURemoteIO bus 0 bus 0 from app to output H/W Sunday, December 2, 12
  • 39. AURemoteIO Buses bus 1 from input H/W AURemoteIO bus 0 bus 0 from app to output H/W Sunday, December 2, 12
  • 40. AURemoteIO Buses bus 1 bus 1 from input H/W to app AURemoteIO bus 0 bus 0 from app to output H/W Sunday, December 2, 12
  • 41. EnableIO ! UInt32 oneFlag = 1; ! UInt32 busZero = 0; ! CheckError(AudioUnitSetProperty(self.remoteIOUnit, ! ! ! ! ! ! ! ! ! kAudioOutputUnitProperty_EnableIO, ! ! ! ! ! ! ! ! ! kAudioUnitScope_Output, ! ! ! ! ! ! ! ! ! busZero, ! ! ! ! ! ! ! ! ! &oneFlag, ! ! ! ! ! ! ! ! ! sizeof(oneFlag)), ! ! ! "couldn't enable remote io output"); ! UInt32 busOne = 1; ! CheckError(AudioUnitSetProperty(self.remoteIOUnit, ! ! ! ! ! ! ! ! ! kAudioOutputUnitProperty_EnableIO, ! ! ! ! ! ! ! ! ! kAudioUnitScope_Input, ! ! ! ! ! ! ! ! ! busOne, ! ! ! ! ! ! ! ! ! &oneFlag, ! ! ! ! ! ! ! ! ! sizeof(oneFlag)), ! ! ! "couldn't enable remote io input"); Sunday, December 2, 12
  • 42. Pass Through bus 1 from input H/W AURemoteIO bus 0 to output H/W Sunday, December 2, 12
  • 43. Connect In to Out ! UInt32 busZero = 0; ! UInt32 busOne = 1; ! CheckError(AUGraphConnectNodeInput(self.auGraph, ! ! ! ! ! ! ! ! ! remoteIONode, ! ! ! ! ! ! ! ! ! busOne, ! ! ! ! ! ! ! ! ! remoteIONode, ! ! ! ! ! ! ! ! ! busZero), ! ! ! "couldn't connect remote io bus 1 to 0"); Sunday, December 2, 12
  • 44. Pass-Through with Effect AUEffect bus 1 from input H/W AURemoteIO bus 0 to output H/W Sunday, December 2, 12
  • 45. Demo: Delay Effect New in iOS 6! Sunday, December 2, 12
  • 46. Creating the AUDelay ! componentDesc.componentType = kAudioUnitType_Effect; ! componentDesc.componentSubType = kAudioUnitSubType_Delay; ! componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple; ! ! AUNode effectNode; ! CheckError(AUGraphAddNode(self.auGraph, ! ! ! ! ! ! ! &componentDesc, ! ! ! ! ! ! ! &effectNode), ! ! ! "couldn't create effect node"); ! AudioUnit effectUnit; ! CheckError(AUGraphNodeInfo(self.auGraph, ! ! ! ! ! ! ! effectNode, ! ! ! ! ! ! ! NULL, ! ! ! ! ! ! ! &effectUnit), ! ! ! "couldn't get effect unit from node"); Sunday, December 2, 12
  • 47. The problem with effect units • Audio Units available since iPhone OS 2.0 prefer int formats • Effect units arrived with iOS 5 (armv7 era) and only work with float format • Have to set the AUEffect unit’s format on AURemoteIO
  • 48. Setting formats ! AudioStreamBasicDescription effectDataFormat; ! UInt32 propSize = sizeof (effectDataFormat); ! CheckError(AudioUnitGetProperty(effectUnit, ! ! ! ! ! ! ! ! ! kAudioUnitProperty_StreamFormat, ! ! ! ! ! ! ! ! ! kAudioUnitScope_Output, ! ! ! ! ! ! ! ! ! busZero, ! ! ! ! ! ! ! ! ! &effectDataFormat, ! ! ! ! ! ! ! ! ! &propSize), ! ! ! "couldn't read effect format"); ! CheckError(AudioUnitSetProperty(self.remoteIOUnit, ! ! ! ! ! ! ! ! ! kAudioUnitProperty_StreamFormat, ! ! ! ! ! ! ! ! ! kAudioUnitScope_Output, ! ! ! ! ! ! ! ! ! busOne, ! ! ! ! ! ! ! ! ! &effectDataFormat, ! ! ! ! ! ! ! ! ! propSize), ! ! ! "couldn't set bus one output format"); Then repeat AudioUnitSetProperty() for input scope / bus 0 Sunday, December 2, 12
  • 49. AUNewTimePitch • New in iOS 6! • Allows you to change pitch independent of time, or time independent of pitch • How do you use it? Sunday, December 2, 12
  • 50. AUTimePitch ! AudioComponentDescription effectcd = {0}; ! effectcd.componentType = kAudioUnitType_FormatConverter; ! effectcd.componentSubType = kAudioUnitSubType_NewTimePitch; ! effectcd.componentManufacturer = kAudioUnitManufacturer_Apple; ! ! AUNode effectNode; ! CheckError(AUGraphAddNode(self.auGraph, ! ! ! ! ! ! ! &effectcd, ! ! ! ! ! ! ! &effectNode), ! ! ! "couldn't get effect node [time/pitch]"); Notice the type is AUFormatConverter, not AUEffect Sunday, December 2, 12
  • 51. AudioUnitParameters.h // Parameters for AUNewTimePitch enum { ! ! // Global, rate, 1/32 -> 32.0, 1.0 ! kNewTimePitchParam_Rate! ! ! ! ! ! = ! 0, ! ! // Global, Cents, -2400 -> 2400, 1.0 ! kNewTimePitchParam_Pitch! ! ! ! ! ! = 1, ! ! // Global, generic, 3.0 -> 32.0, 8.0 ! kNewTimePitchParam_Overlap! ! ! ! ! ! = 4, ! ! // Global, Boolean, 0->1, 1 ! kNewTimePitchParam_EnablePeakLocking! ! ! = 6 }; This is the entire documentation for the AUNewTimePitch parameters Sunday, December 2, 12
  • 52. AUNewTimePitch parameters • Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed. • Use powers of 2: 1/32, 1/16, …, 2, 4, 8… • Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone Sunday, December 2, 12
  • 53. Pitch shifting • Pitch can vary, time does not • Suitable for real-time sources, such as audio capture Sunday, December 2, 12
  • 54. Demo: Pitch Shift New in iOS 6! Sunday, December 2, 12
  • 55. Rate shifting • Rate can vary, pitch does not • Think of 1.5x and 2x speed modes in Podcasts app • Not suitable for real-time sources, as data will be consumed faster. Files work well. • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput Sunday, December 2, 12
  • 56. Demo: Rate Shift New in iOS 6! Sunday, December 2, 12
  • 57. AUSplitter AUSomethingElse AUSplitter AUSomethingElse New in iOS 6! Sunday, December 2, 12
  • 58. AUMatrixMixer AUSomethingElse AUSomethingElse AUSomethingElse AUMatrixMixer AUSomethingElse AUSomethingElse New in iOS 6! Sunday, December 2, 12
  • 59. Audio Queues (and the APIs that help them) Sunday, December 2, 12
  • 60. AudioQueue • Easier than AURemoteIO - provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC) • Recording queue - receive buffers of captured audio in a callback • Play-out queue - enqueue buffers of audio to play, optionally refill in a callback Sunday, December 2, 12
  • 61. AudioQueue 2 1 0 Sunday, December 2, 12
  • 62. Common AQ scenarios • File player - Read from file and “prime” queue buffers, start queue, when called back with used buffer, refill from next part of file • Synthesis - Maintain state in your own code, write raw samples into buffers during callbacks Sunday, December 2, 12
  • 63. Web Radio • Thursday class’ third project • Use Audio File Stream Services to pick out audio data from a network stream • Enqueue these packets as new AQ buffers • Dispose used buffers in callback Sunday, December 2, 12
  • 64. Parsing web radio Sunday, December 2, 12
  • 65. Parsing web radio NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to NSData NSData Audio File Services. Packets Packets Packets Packets Packets Sunday, December 2, 12
  • 66. Parsing web radio NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to NSData NSData Audio File Services. Packets Packets Packets Packets Packets Packets Packets Audio File Services calls us back with parsed packets of audio data. Packets Packets Packets Sunday, December 2, 12
  • 67. Parsing web radio NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to NSData NSData Audio File Services. Packets Packets Packets Packets Packets Packets Packets Audio File Services calls us back with parsed packets of audio data. Packets Packets Packets We create an AudioQueueBuffer Packets Packets with those packets and enqueue it Packets 2 Packets 1 0 for play-out. Packets Packets Sunday, December 2, 12
  • 68. A complex thing! • What if we want to see that data after it’s been decoded to PCM and is about to be played? • e.g., spectrum analysis, effects, visualizers • AudioQueue design is “fire-and-forget” Sunday, December 2, 12
  • 69. AudioQueue Tap! http://www.last.fm/music/Spinal+Tap
  • 70. AudioQueueProcessingTap • Set as a property on the Audio Queue • Calls back to your function with decoded (PCM) audio data • Three types: pre- or post- effects (that the AQ performs), or siphon. First two can modify the data. • Only documentation is in AudioQueue.h Sunday, December 2, 12
  • 71. Creating an AQ Tap ! ! // create the tap ! ! UInt32 maxFrames = 0; ! ! AudioStreamBasicDescription tapFormat = {0}; ! ! AudioQueueProcessingTapRef tapRef; ! ! CheckError(AudioQueueProcessingTapNew(audioQueue, ! ! ! ! ! ! ! ! ! ! ! tapProc, ! ! ! ! ! ! ! ! ! ! ! (__bridge void *)(player), ! ! ! ! ! ! ! ! ! ! ! kAudioQueueProcessingTap_PreEffects, ! ! ! ! ! ! ! ! ! ! ! &maxFrames, ! ! ! ! ! ! ! ! ! ! ! &tapFormat, ! ! ! ! ! ! ! ! ! ! ! &tapRef), ! ! ! ! "couldn't create AQ tap"); Notice that you receive maxFrames and tapFormat. These do not appear to be settable. Sunday, December 2, 12
  • 72. AQ Tap Proc void tapProc (void * inClientData, ! ! ! AudioQueueProcessingTapRef inAQTap, ! ! ! UInt32 inNumberFrames, ! ! ! AudioTimeStamp * ioTimeStamp, ! ! ! UInt32 * ioFlags, ! ! ! UInt32 * outNumberFrames, ! ! ! AudioBufferList * ioData) { ! CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData; ! UInt32 getSourceFlags = 0; ! UInt32 getSourceFrames = 0; ! AudioQueueProcessingTapGetSourceAudio(inAQTap, ! ! ! ! ! ! ! ! ! ! inNumberFrames, ! ! ! ! ! ! ! ! ! ! ioTimeStamp, ! ! ! ! ! ! ! ! ! ! &getSourceFlags, ! ! ! ! ! ! ! ! ! ! &getSourceFrames, ! ! ! ! ! ! ! ! ! ! ioData); // then do something with ioData // ... Sunday, December 2, 12
  • 73. So what should we do with the audio? Sunday, December 2, 12
  • 74. So what should we do with the audio? Let’s apply our pitch-shift effect Sunday, December 2, 12
  • 75. Shouldn’t this work? AUEffect Sunday, December 2, 12
  • 76. Shouldn’t this work? AUEffect AudioUnitRender() Sunday, December 2, 12
  • 77. AudioUnitRender() • Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers • If mData != NULL, audio unit does its thing with those samples • If mData == NULL, audio data pulls from whatever it’s connected to • So we just call with AudioBufferList ioData we got from tap callback, right? Sunday, December 2, 12
  • 78. Psych! • AQ tap provides data as signed ints • Effect units only work with floating point • We need to do an on-the-spot format conversion Sunday, December 2, 12
  • 79. invalidname’s convert-and-effect recipe OSStatus converterInputRenderCallback (void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * ioData) { CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon; // read from buffer ioData->mBuffers[0].mData = player.preRenderData; return noErr; } AUConverter AUEffect AUConverter AUGenericOutput Note: red arrows are float format, yellow arrows are int
  • 80. How it works • AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput • Top AUConverter is connected to a render callback function Sunday, December 2, 12
  • 81. The trick! • Copy the mData pointer to a state variable and NULL it in ioData • Call AudioUnitRender() on the output unit. The NULL makes it pull from the graph. • Top of the graph pulls on the render callback, which gives it back the mData we copied off.
  • 82. Yes, really This is the rest of tapProc() ! // copy off the ioData so the graph can read from it // in render callback ! player.preRenderData = ioData->mBuffers[0].mData; ! ioData->mBuffers[0].mData = NULL; ! ! OSStatus renderErr = noErr; ! AudioUnitRenderActionFlags actionFlags = 0; ! renderErr = AudioUnitRender(player.genericOutputUnit, ! ! ! ! ! ! ! ! &actionFlags, ! ! ! ! ! ! ! ! player.renderTimeStamp, ! ! ! ! ! ! ! ! 0, ! ! ! ! ! ! ! ! inNumberFrames, ! ! ! ! ! ! ! ! ioData); ! NSLog (@"AudioUnitRender, renderErr = %ld",renderErr); } Sunday, December 2, 12
  • 83. Yes, really This is the render callback that supplies data to the int→float converter OSStatus converterInputRenderCallback (void *inRefCon, ! ! ! ! ! ! ! ! ! AudioUnitRenderActionFlags *ioActionFlags, ! ! ! ! ! ! ! ! ! const AudioTimeStamp *inTimeStamp, ! ! ! ! ! ! ! ! ! UInt32 inBusNumber, ! ! ! ! ! ! ! ! ! UInt32 inNumberFrames, ! ! ! ! ! ! ! ! ! AudioBufferList * ioData) { ! CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon; ! ! // read from buffer ! ioData->mBuffers[0].mData = player.preRenderData; ! return noErr; } Sunday, December 2, 12
  • 84. Demo: AQ Tap + AUNewTimePitch New in iOS 6! Sunday, December 2, 12
  • 86. Other new stuff Sunday, December 2, 12
  • 87. Multi-Route • Ordinarily, one input or output is active: earpiece, speaker, headphones, dock- connected device • “Last in wins” • With AV Session “multi-route” category, you can use several at once • WWDC 2012 session 505 Sunday, December 2, 12
  • 88. Utility classes moved again • C++ utilities, including the CARingBuffer • < Xcode 4.3, installed into /Developer • Xcode 4.3-4.4, optional download from developer.apple.com • ≧ Xcode 4.5, sample code project “Core Audio Utility Classes” Sunday, December 2, 12
  • 89. Takeaways • Core Audio fundamentals never change • New stuff is added as properties, typedefs, enums, etc. • Watch the SDK API diffs document to find the new stuff • Hope you like header files and experimentation Sunday, December 2, 12
  • 90. Q&A • Slides will be posted to slideshare.net/ invalidname • Code will be linked from there and my blog • Watch CocoaConf RDU glassboard, @invalidname on Twitter/ADN, or [Time code]; blog for announcement • Thanks! Sunday, December 2, 12