Core Image 
The Most Fun API 
You’re Not Using 
Chris Adamson • @invalidname 
CocoaConf Atlanta, December 2014
“Core Image is an image processing and 
analysis technology designed to provide near 
real-time processing for still and video images.”
Agenda 
• Images, Filters, and Contexts 
• The Core Image Filter Gallery 
• Neat Tricks with Built-In Filters 
• Core Image on OS X
Core Image, Core Concepts 
• Core Image is not about pixels… at least not most 
of the time 
• A chain of filters describes a “recipe” of processing 
steps to be applied to one or more images 
• “Stringly typed” 
• You only get pixels when you render
Typical Workflow 
• Start with a source CIImage 
• Apply one or more filters 
• Render resulting CIImage to a CIContext, or 
convert CIImage out to another type 
• A few filters take or produce types other than 
CIImage (CIQRCodeGenerator)
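A minimal sketch of this workflow, assuming a local image file and the built-in CISepiaTone filter (sourceURL is a placeholder NSURL, not from the talk's demo code): 
// Source CIImage -> one filter -> UIImage for display 
CIImage *input = [CIImage imageWithContentsOfURL:sourceURL]; 
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"]; 
[sepia setValue:input forKey:kCIInputImageKey]; 
[sepia setValue:@(0.8) forKey:kCIInputIntensityKey]; 
CIImage *output = [sepia outputImage];                 // still just a recipe, no pixels yet 
UIImage *rendered = [UIImage imageWithCIImage:output]; // pixels are produced when this is drawn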
CIImage 
• An image provided to or produced by Core Image 
• But no bitmap of pixel data! 
• Immutable 
• -imageByCroppingToRect, 
-imageByApplyingTransform 
• -extent — a CGRect of the image’s size
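For example, a quick sketch of those calls, assuming an existing CIImage named ciImage: 
// Geometry operations return new CIImages; no pixels are touched yet 
CGRect extent = [ciImage extent];   // the full size of the image recipe 
CIImage *cropped = [ciImage imageByCroppingToRect: 
    CGRectMake(0, 0, extent.size.width / 2.0, extent.size.height / 2.0)]; 
CIImage *scaled = [cropped imageByApplyingTransform: 
    CGAffineTransformMakeScale(0.5, 0.5)];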
CIImage sources 
• NSURL 
• CGImageRef 
• Bitmap or JPEG/PNG/TIFF in NSData 
• OpenGL texture 
• Core Video image/pixel buffer
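A few of the corresponding constructors, as a sketch (imageURL, cgImage, jpegData, and pixelBuffer are placeholders): 
CIImage *fromURL    = [CIImage imageWithContentsOfURL:imageURL]; 
CIImage *fromCG     = [CIImage imageWithCGImage:cgImage]; 
CIImage *fromData   = [CIImage imageWithData:jpegData];             // JPEG/PNG/TIFF bytes in an NSData 
CIImage *fromBuffer = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // Core Video pixel buffer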
CIContext 
• Rendering destination for a CIImage (-[drawImage:inRect:fromRect:]) 
• This is where you get pixels (also, this is the processor-intensive part) 
• On iOS, must be created from an EAGLContext. On 
Mac, can be created with CGContextRef 
• Can also produce output as a CGImageRef, bitmap 
data, or a CVPixelBuffer (iOS only)
????
CIFilter 
• Performs an image processing operation 
• Typically takes and produces a CIImage 
• All parameters are provided via -[setValue:forKey:] 
• Stringly-typed! 
• Output is retrieved via -[outputImage] (or -[valueForKey:kCIOutputImageKey])
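Because the parameters are just string keys, a filter can describe itself at runtime. A small sketch (someCIImage is a placeholder): 
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"]; 
NSLog(@"input keys: %@", [blur inputKeys]);   // e.g. inputImage, inputRadius 
NSLog(@"attributes: %@", [blur attributes]);  // per-key type, default, min/max 
[blur setValue:someCIImage forKey:kCIInputImageKey]; 
[blur setValue:@(8.0) forKey:@"inputRadius"]; 
CIImage *result = [blur valueForKey:kCIOutputImageKey]; // same as -outputImage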
“I can haz filters?” 
–Core Image Cat
Yes, you can has Filterz!
Core Image Filter Reference 
• Filter Name 
• Parameters (note the type & number to provide) 
• Categories (watch for CICategoryBuiltIn and CICategoryVideo) 
• Example Figure 
• Availability (watch for versioning and OS X-only filters)
Filter Categories 
• Group filters by functionality: CICategoryBlur, 
CICategoryGenerator, 
CICategoryCompositeOperation, etc. 
• Also group filters by availability and 
appropriateness: CICategoryBuiltIn, 
CICategoryVideo, CICategoryNonSquarePixels
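The category constants also work as runtime queries; a minimal sketch of listing what's available: 
NSArray *blurFilters = [CIFilter filterNamesInCategory:kCICategoryBlur]; 
NSArray *videoSafe   = [CIFilter filterNamesInCategories: 
                           @[kCICategoryBuiltIn, kCICategoryVideo]]; 
NSLog(@"Blur filters: %@", blurFilters); 
NSLog(@"Built-in, video-capable filters: %@", videoSafe);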
CICategoryGenerator 
• No input image, just produces an output 
• CICategoryGradient is also output-only 
• Example: CICheckerboardGenerator
CICategoryBlur 
• Algorithmically spreads/blends pixels 
• CICategorySharpen offers an opposite effect 
• Example: CIGaussianBlur
CICategoryColorAdjustment 
• Changes distribution of color throughout an image 
• Example: CIColorControls (adjusts saturation, 
brightness, contrast)
CICategoryColorEffect 
• Color changes that affect the subjective nature of 
the image 
• Example: CIPhotoEffectNoir
CICategoryDistortionEffect 
• Moves pixels to achieve an effect 
• Example: CITorusLensDistortion
CICategoryStylize 
• Various stylistic effects 
• Example: CIPointillize
CICategoryGeometryAdjustment 
• Moves pixels via cropping, affine transforms, etc. 
• Example: CICrop
CICategoryTileEffect 
• Repeatedly copies all or part of an image 
• Example: CIAffineTile
CICategoryCompositeOperation 
• Combines multiple images 
• Example: CISourceOverCompositing
Demo
Creating CIColorControls 
Filter 
_colorControlsFilter = [CIFilter 
filterWithName:@"CIColorControls"];
Setting input values 
[self.colorControlsFilter 
setValue:@(self.saturationSlider.value) 
forKey:kCIInputSaturationKey]; 
[self.colorControlsFilter 
setValue:@(self.brightnessSlider.value) 
forKey:kCIInputBrightnessKey]; 
[self.colorControlsFilter 
setValue:@(self.contrastSlider.value) 
forKey:kCIInputContrastKey];
Setting input image 
CIImage *ciImage = 
[CIImage imageWithCGImage: 
self.imageView.image.CGImage]; 
[self.colorControlsFilter 
setValue:ciImage 
forKey:kCIInputImageKey]; 
Note: source image is 3264 × 2448 pixels
Getting output image 
ciImage = [self.colorControlsFilter 
outputImage]; 
UIImage *filteredUIImage = 
[UIImage imageWithCIImage:ciImage]; 
self.imageView.image = filteredUIImage; 
Can also use CIFilter outputImage property instead of 
valueForKey:
API Modernizations 
• iOS 8 and Mac OS X 10.10 
• Can provide input parameters when creating a filter with +[CIFilter filterWithName:withInputParameters:] 
• Can apply a filter to an image in a one-off fashion 
with -[CIImage 
imageByApplyingFilter:withInputParameters:]
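A sketch of both conveniences, assuming an existing CIImage named ciImage: 
// Create a filter with its parameters in one call (iOS 8 / OS X 10.10) 
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone" 
                       withInputParameters:@{ kCIInputImageKey : ciImage, 
                                              kCIInputIntensityKey : @(1.0) }]; 
CIImage *viaFilter = sepia.outputImage; 
// Or skip the explicit CIFilter entirely 
CIImage *oneOff = [ciImage imageByApplyingFilter:@"CISepiaTone" 
                             withInputParameters:@{ kCIInputIntensityKey : @(1.0) }];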
Other output options 
• Use a CIContext 
• -[drawImage:inRect:fromRect:] draws pixels to 
the EAGLContext (iOS) or CGContextRef (OS X) 
that the CIContext was created from. 
• CIContext can also render to a void* bitmap 
• On iOS, can create a CVPixelBufferRef, typically 
used for writing to a file with AVAssetWriter
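A sketch of the non-drawing outputs (ciContext, ciImage, and pixelBuffer are placeholders): 
// Render into a CGImageRef; the caller owns it and must CGImageRelease() it 
CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:[ciImage extent]]; 
// Render into a Core Video pixel buffer, e.g. one vended by an 
// AVAssetWriterInputPixelBufferAdaptor when writing with AVAssetWriter 
[ciContext render:ciImage toCVPixelBuffer:pixelBuffer];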
Chaining filters 
• Use the output of one filter as the input to the next 
• This doesn’t cost anything, because the CIImages 
just hold state, not pixels
Demo
Creating CIContext 
if (self.context.API != kEAGLRenderingAPIOpenGLES2) { 
EAGLContext *eagl2Context = [[EAGLContext alloc] 
initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
self.context = eagl2Context; 
} 
// make CIContext from GL context, 
// clearing out default color space 
self.ciContext = 
[CIContext contextWithEAGLContext:self.context 
options: @{ 
kCIContextWorkingColorSpace : [NSNull null] 
}]; 
Note: This is in a subclass of GLKView
Set up Sepia Tone filter 
_sepiaToneFilter = [CIFilter 
filterWithName:@"CISepiaTone"]; 
[_sepiaToneFilter setValue:@(1.0) 
forKey:@"inputIntensity"];
Set up Hole Distortion Filter 
_holeDistortionFilter = [CIFilter 
filterWithName:@"CIHoleDistortion"]; 
[_holeDistortionFilter 
setValue:[CIVector vectorWithX:100.0 
Y:100.0] 
forKey:kCIInputCenterKey]; 
[_holeDistortionFilter 
setValue:@(50.0) 
forKey:kCIInputRadiusKey];
Set up Mask to Alpha filter 
UIImage *circleImageUI = [UIImage 
imageNamed:@"circle-mask-200x200"]; 
_circleMaskFilter = [CIFilter 
filterWithName:@"CIMaskToAlpha"]; 
CIImage *circleImageCI = [CIImage 
imageWithCGImage: circleImageUI.CGImage]; 
[_circleMaskFilter setValue:circleImageCI 
forKey:kCIInputImageKey]; 
_circleMask = [_circleMaskFilter 
valueForKey:kCIOutputImageKey]; 
circle-mask-200x200.png
Set up Blend with Mask filter 
_blendWithMaskFilter = [CIFilter 
filterWithName:@"CIBlendWithMask"]; 
[_blendWithMaskFilter setValue:circleImageCI 
forKey:kCIInputMaskImageKey]; 
[_blendWithMaskFilter setValue:_backgroundAlphaFill 
forKey:kCIInputBackgroundImageKey];
redrawAtOrigin (1/3) 
// Get CIImage from source image 
CGImageRef loupeImageCG = 
CGImageCreateWithImageInRect( 
self.sourceImage.CGImage, fromRect); 
loupeImage = [CIImage imageWithCGImage:loupeImageCG];
redrawAtOrigin (2/3) 
// Apply sepia filter 
[self.sepiaToneFilter setValue:loupeImage 
forKey:kCIInputImageKey]; 
loupeImage = [self.sepiaToneFilter outputImage]; 
// Apply hole distortion filter 
[self.holeDistortionFilter setValue:loupeImage 
forKey:kCIInputImageKey]; 
loupeImage = [self.holeDistortionFilter outputImage]; 
// Set double-filtered image as input to blend-with-mask 
[self.blendWithMaskFilter setValue:loupeImage 
forKey:kCIInputImageKey]; 
loupeImage = [_blendWithMaskFilter outputImage];
redrawAtOrigin (3/3) 
if ([EAGLContext currentContext] != self.context) { 
[EAGLContext setCurrentContext: self.context]; 
} 
[self bindDrawable]; 
// GL-on-Retina fix 
CGRect drawBoundsInPoints = self.glDrawBounds; 
drawBoundsInPoints.size.width /= self.contentScaleFactor; 
drawBoundsInPoints.size.height /= self.contentScaleFactor; 
// drawing to CIContext draws to the 
// EAGLContext it's based on 
[self.ciContext drawImage:loupeImage 
inRect:self.glDrawBounds 
fromRect:drawBoundsInPoints]; 
// Refresh GLKView contents immediately 
[self display];
Working with Video 
• AVFoundation AVCaptureVideoDataOutput and 
AVAssetReader deliver CMSampleBuffers 
• CMSampleBuffers have timing information and 
CVImageBuffers/CVPixelBuffers 
• +[CIImage imageWithCVPixelBuffer:]
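A sketch of the capture callback, assuming a filter (self.sepiaFilter here) created elsewhere: 
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection 
{ 
    // Pull the pixel buffer out of the sample buffer and wrap it in a CIImage 
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer]; 
    [self.sepiaFilter setValue:frame forKey:kCIInputImageKey]; 
    CIImage *filtered = [self.sepiaFilter outputImage]; 
    // ...then draw filtered to a CIContext, or render it to a CVPixelBuffer for AVAssetWriter 
}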
Demo
Chroma Key (“green screen”) recipe 
• Use a CIColorCube to map green-ish colors to 
transparent 
• Use CISourceOverCompositing to draw this 
alpha’ed image over another image
CIColorCube 
Maps colors from one RGB “cube” to another 
http://en.wikipedia.org/wiki/RGB_color_space
Using CIColorCube 
CIColorCube maps green(-ish) colors to 0.0 alpha, all other 
colors pass through
CISourceOverCompositing 
• Composites inputImage over inputBackgroundImage to produce outputImage
CIColorCube Data 
const unsigned int size = 64; 
size_t cubeDataSize = size * size * size * sizeof (float) * 4; 
float *keyCubeData = (float *)malloc (cubeDataSize); 
// float *alphaMatteCubeData = (float *)malloc (cubeDataSize); 
// float rgb[3], hsv[3], *keyC = keyCubeData, *alphaC = alphaMatteCubeData; 
float rgb[3], hsv[3], *keyC = keyCubeData; 
// Populate cube with a simple gradient going from 0 to 1 
for (int z = 0; z < size; z++){ 
rgb[2] = ((double)z)/(size-1); // Blue value 
for (int y = 0; y < size; y++){ 
rgb[1] = ((double)y)/(size-1); // Green value 
for (int x = 0; x < size; x ++){ 
rgb[0] = ((double)x)/(size-1); // Red value 
// Convert RGB to HSV 
// You can find publicly available rgbToHSV functions on the Internet 
RGBtoHSV(rgb[0], rgb[1], rgb[2], 
&hsv[0], &hsv[1], &hsv[2]); 
// RGBtoHSV uses 0 to 360 for hue, while UIColor (used above) uses 0 to 1. 
hsv[0] /= 360.0; 
// Use the hue value to determine which to make transparent 
// The minimum and maximum hue angle depends on 
// the color you want to remove 
bool keyed = (hsv[0] > minHueAngle && hsv[0] < maxHueAngle) && 
(hsv[1] > minSaturation && hsv[1] < maxSaturation) && 
(hsv[2] > minBrightness && hsv[2] < maxBrightness); 
float alpha = keyed ? 0.0f : 1.0f; 
// Recalculate the pointer into the cube: 4 floats (RGBA) per entry 
keyC = (((z * size * size) + (y * size) + x) * 4) + keyCubeData; 
// Calculate premultiplied alpha values for the cube 
keyC[0] = rgb[0] * alpha; 
keyC[1] = rgb[1] * alpha; 
keyC[2] = rgb[2] * alpha; 
keyC[3] = alpha; 
} 
} 
} 
See “Chroma Key Filter Recipe” in Core Image Programming Guide
Create CIColorCube from 
mapping data 
// build the color cube filter and set its data to above 
self.colorCubeFilter = [CIFilter 
filterWithName:@"CIColorCube"]; 
[self.colorCubeFilter setValue:[NSNumber numberWithInt:size] 
forKey:@"inputCubeDimension"]; 
NSData *data = [NSData dataWithBytesNoCopy:keyCubeData 
length:cubeDataSize 
freeWhenDone:YES]; 
[self.colorCubeFilter setValue:data 
forKey:@"inputCubeData"];
Create 
CISourceOverCompositing 
// source over filter 
self.backgroundImage = [UIImage imageNamed: 
@"img_washington_small_02.jpg"]; 
self.backgroundCIImage = [CIImage imageWithCGImage: 
self.backgroundImage.CGImage]; 
self.sourceOverFilter = [CIFilter filterWithName: 
@"CISourceOverCompositing"]; 
[self.sourceOverFilter setValue:self.backgroundCIImage 
forKeyPath:@"inputBackgroundImage"];
Apply Filters in Capture 
Callback 
CIImage *bufferCIImage = [CIImage 
imageWithCVPixelBuffer:cvBuffer]; 
[self.colorCubeFilter setValue:bufferCIImage 
forKey:kCIInputImageKey]; 
CIImage *keyedCameraImage = 
[self.colorCubeFilter outputImage]; 
[self.sourceOverFilter setValue:keyedCameraImage 
forKeyPath:kCIInputImageKey]; 
CIImage *compositedImage = 
[self.sourceOverFilter outputImage]; 
Then draw compositedImage to CIContext as before
Other Points of Interest 
• CIQRCodeGenerator filter — Converts data to a QR Code 
• CILenticularHaloGenerator filter — aka, lens flare 
• CIDetector — Class (not a filter) to find features in images. 
iOS 7 / Lion only support face finding (returned as an array 
of CIFeatures). Optionally detects smiles and eye blinks 
within faces. 
• iOS 8 / Yosemite add rectangle and QR code detection 
• CIImage has a red-eye enhancement that takes the array 
of face CIFeatures to tell it where to apply the effect
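A sketch of CIDetector face finding (ciImage stands in for your image): 
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace 
                                           context:nil 
                                           options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }]; 
NSArray *faces = [detector featuresInImage:ciImage 
                                   options:@{ CIDetectorSmile : @YES, 
                                              CIDetectorEyeBlink : @YES }]; 
for (CIFaceFeature *face in faces) { 
    NSLog(@"Face at %@, smiling? %d", NSStringFromCGRect(face.bounds), face.hasSmile); 
}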
Core Image on OS X 
• Core Image is part of QuartzCore (or Image Kit), so 
you don’t @import CoreImage 
• Many more filters are available 
• Filters can be set on CALayers
CALayer Filters on OS X 
• Views must be layer-backed (obviously) 
• Must also call -[NSView 
setLayerUsesCoreImageFilters:] on 10.9+ 
• CALayer has properties: filters, compositingFilter, 
backgroundFilters, minificationFilter, 
magnificationFilter 
• The CIFilter-based properties exist on iOS, but do nothing
Demo
Adding CIPixellate to layer’s 
filters 
self.pixellateFilter = [CIFilter filterWithName: 
@"CIPixellate"]; 
self.pixellateFilter.name = @"myPixellateFilter"; 
[self.pixellateFilter setValue: 
[CIVector vectorWithX:100.0 Y:100.0] 
forKey:@"inputCenter"]; 
[self.pixellateFilter setValue: 
@([self.pixellationScaleSlider floatValue]) 
forKey:@"inputScale"]; 
self.someTextField.layer.filters = 
@[self.pixellateFilter];
Updating a layer’s filters 
-(void) updatePixellationScale { 
[self.someTextField.layer setValue: 
@([self.pixellationScaleSlider floatValue]) 
forKeyPath: 
@"filters.myPixellateFilter.inputScale"]; 
}
Building Your Own Filter
CIKernel (new in iOS 8) 
• Write per-pixel image processing code in Core 
Image Kernel Language (subset of OpenGL + CI 
extensions) 
• +[CIKernel kernelWithString:] 
• Subclass CIFilter; in -outputImage, call the kernel's applyWithExtent:roiCallback:arguments: (or applyWithExtent:arguments: for a CIColorKernel) 
• The arguments array supplies the input image(s) and other parameters to your kernel
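A minimal sketch of a custom color filter, assuming iOS 8 (the class name CCInvertFilter and its property are made up for this example): 
// A CIFilter subclass whose kernel inverts each pixel's color 
@interface CCInvertFilter : CIFilter 
@property (nonatomic, strong) CIImage *inputImage; 
@end 

@implementation CCInvertFilter 

- (CIImage *)outputImage 
{ 
    static CIColorKernel *kernel; 
    static dispatch_once_t onceToken; 
    dispatch_once(&onceToken, ^{ 
        // Core Image Kernel Language: runs once per pixel 
        kernel = [CIColorKernel kernelWithString: 
                  @"kernel vec4 invertColor(__sample s) { return vec4(1.0 - s.rgb, s.a); }"]; 
    }); 
    return [kernel applyWithExtent:self.inputImage.extent 
                         arguments:@[self.inputImage]]; 
} 

@end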
Wrap Up: Stuff to Remember 
• Get psyched about filters, but remember to check 
that they’re on your targeted platform/version. 
• Drawing to a CIContext on iOS must be GL-backed 
(e.g., with a GLKView) 
• Not the only game in town: GPUImage offers an 
open-source alternative
Q&A 
Slides and code will be posted to: 
http://www.slideshare.net/invalidname/ 
@invalidname 
http://subfurther.com/blog
