Learning AV Foundation
Posted on 13-Jan-2015
About Bob McCune
‣ MN Developer and Instructor
‣ Owner of TapHarmonic, LLC.
‣ Founded Minnesota CocoaHeads in 2008
Agenda: What will I learn?
‣ AV Foundation Overview
‣ Decomposing AV Foundation
‣ Code Examples
AV Foundation Framework: Overview
‣ Apple's advanced Objective-C framework for working with timed media
  ‣ High-performance, asynchronous processing
  ‣ Hardware-accelerated handling of AV media
‣ Available in its current form since iOS 4
  ‣ Additions and enhancements in iOS 5 and 6
  ‣ Part of Mac OS X since 10.7 Lion
‣ Apple's focus for media apps on both iOS and Mac
  ‣ Should be yours too!
iOS Media Options: Where does it fit?
MediaPlayer
UIKit
AVFoundation
CoreAudio  CoreMedia  CoreVideo  CoreAnimation
Challenges and Prerequisites: Where do I start?
‣ Large and feature-rich framework
  ‣ Over 70 classes (as of iOS 6)
  ‣ Variety of functions, protocols, and constants
‣ Technical concepts
  ‣ Blocks
  ‣ Key-Value Observing
  ‣ Grand Central Dispatch
‣ Additional frameworks
  ‣ Core Animation
  ‣ Quartz & OpenGL ES
  ‣ Core Media
  ‣ Core Audio
Decomposing AV Foundation: What can I do with it?
‣ Inspect
‣ Playback
‣ Capture
‣ Compose
‣ Export
Media Inspection
Media Assets: Static Media Modeling
‣ AVAsset models the static aspects of a media resource
  ‣ Abstraction over the underlying format
  ‣ Models details common to the whole media resource
  ‣ Composed of one or more tracks
‣ AVAssetTrack models the static aspects of the individual media streams within an asset
  ‣ Tracks are of a uniform type (video, audio, etc.)
AVAssetTrack (Video)
AVAssetTrack (Audio)
Media Inspection: Timed-Media Challenges
‣ Processing media takes time
‣ Media resources can be large and possibly remote
‣ Need to keep the UI responsive
‣ Need to handle interruptions
Need to perform inspection asynchronously!
Asynchronous Inspection
‣ Creating an AVAsset does not load the resource
  ‣ Media is not loaded until properties are queried
  ‣ Standard property access happens synchronously
‣ Properties should be loaded asynchronously using the AVAsynchronousKeyValueLoading protocol
  - statusOfValueForKey:error:
  - loadValuesAsynchronouslyForKeys:completionHandler:
NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"mp3"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
Asynchronous Inspection
NSURL *url = ...
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = @[@"tracks"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            [self processTracks];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:asset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancellation.
            break;
    }
}];
Example
Media Processing
AVAssetExportSession: Transcoding and Export
‣ Export presets for transcoding to other formats
  + (NSArray *)allExportPresets;
  + (NSArray *)exportPresetsCompatibleWithAsset:(AVAsset *)asset;
‣ Can specify a time range to perform trimming
  @property (nonatomic) CMTimeRange timeRange;
‣ Can optionally specify metadata to be written
  @property (nonatomic, copy) NSArray *metadata;
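For instance, the timeRange property can be used to export only a portion of the source. A minimal sketch, assuming an export session named `session` and a hypothetical trim of seconds 5 through 15:

```objectivec
// Trim the export to a 10-second window starting at 5 seconds.
CMTime start = CMTimeMake(5, 1);      // 5 seconds (5/1)
CMTime duration = CMTimeMake(10, 1);  // 10 seconds (10/1)
session.timeRange = CMTimeRangeMake(start, duration);
```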
AVAssetExportSession: Example

NSURL *assetURL = ... // bundle URL for 'jam.mp3'
AVAsset *audioAsset = [AVURLAsset assetWithURL:assetURL];

AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:audioAsset
                                      presetName:AVAssetExportPresetAppleM4A];

session.outputURL = ... // Documents directory URL for 'jam.m4a'
session.outputFileType = AVFileTypeAppleM4A;

[session exportAsynchronouslyWithCompletionHandler:^{
    switch (session.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export Failed");
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export Cancelled");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Success!");
            break;
    }
}];
AVAssetImageGenerator: Image Generation
Generate thumbnail images for specified time periods
NSURL *assetURL = ... // Asset URL
AVAsset *asset = [AVAsset assetWithURL:assetURL];

NSArray *times = ... // NSValues wrapping the CMTimes to capture

// Generator must be retained!
self.imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];

[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                          completionHandler:
    ^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
      AVAssetImageGeneratorResult result, NSError *error) {
    switch (result) {
        case AVAssetImageGeneratorFailed:
            // Handle failure
            break;
        case AVAssetImageGeneratorCancelled:
            // Handle cancellation
            break;
        case AVAssetImageGeneratorSucceeded:
            // Process image
            break;
    }
}];
AVAssetImageGenerator: Example
Advanced Media Processing: Reading and Writing
‣ AVAssetReader
‣ AVAssetWriter
Media Playback
AVPlayer: Playback Controller
‣ AVPlayer is a controller for managing playback
  ‣ play
  ‣ pause
  ‣ seekToTime:
‣ Use KVO to observe playback readiness and state
  ‣ status
‣ Timed observations
  ‣ addPeriodicTimeObserverForInterval:queue:usingBlock:
  ‣ addBoundaryTimeObserverForTimes:queue:usingBlock:
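A minimal sketch of a periodic observation, assuming an existing `self.player` and a hypothetical `currentTimeLabel` to update. The returned token must be retained and later passed to removeTimeObserver::

```objectivec
// Invoked on the main queue every half second during playback.
CMTime interval = CMTimeMake(1, 2); // 0.5 seconds
self.timeObserver =
    [self.player addPeriodicTimeObserverForInterval:interval
                                              queue:dispatch_get_main_queue()
                                         usingBlock:^(CMTime time) {
        // currentTimeLabel is a hypothetical UILabel.
        self.currentTimeLabel.text =
            [NSString stringWithFormat:@"%.1f", CMTimeGetSeconds(time)];
    }];
```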
Playing Media: Static vs. Dynamic Models
‣ AV Foundation distinguishes between static and dynamic aspects of media
‣ Dynamic: AVPlayerItem and its AVPlayerItemTracks
‣ Static: AVAsset and its AVAssetTracks
Understanding Time: Core Media Essentials
‣ CMTime
  ‣ Rational number representing time
  ‣ 64-bit time value (numerator)
  ‣ 32-bit time scale (denominator)
‣ CMTimeRange
  ‣ Struct containing a start time and a duration

CMTime fiveSeconds = CMTimeMake(5, 1);
CMTime halfSecond = CMTimeMake(1, 2);
CMTime thirtyFPS = CMTimeMake(1, 30);
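CMTimeRange values are built the same way. A small sketch:

```objectivec
// A range covering seconds 5 through 15 (start + duration, not start + end).
CMTime start = CMTimeMake(5, 1);
CMTime duration = CMTimeMake(10, 1);
CMTimeRange range = CMTimeRangeMake(start, duration);

// A one-minute range starting at time zero.
CMTimeRange firstMinute = CMTimeRangeMake(kCMTimeZero, CMTimeMake(60, 1));
```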
AVPlayerLayer: Rendering Video Content
[Diagram: an AVPlayer plays an AVPlayerItem whose AVPlayerItemTracks are backed by an AVAsset and its AVAssetTracks; an AVPlayerLayer renders the player's video output.]
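Wiring the stack together might look like this minimal sketch, assuming an existing `asset` and a hypothetical `playerView` in a view controller:

```objectivec
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
self.player = [AVPlayer playerWithPlayerItem:playerItem];

// AVPlayerLayer renders the player's video output into the layer tree.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.playerView.bounds;
[self.playerView.layer addSublayer:playerLayer];

[self.player play];
```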
Demo
Media Capture
Media Capture: Overview
‣ Image capture
  ‣ Independent control of white balance, focus, exposure
  ‣ Ability to write EXIF metadata
  ‣ Uncompressed output
‣ Video capture
  ‣ Configurable formats and resolution
  ‣ Ability to write video metadata
‣ Ability to access and process input data
  ‣ Pixel buffers containing still and video frames
  ‣ Audio sample buffers containing PCM data
AVCaptureSession: Capture Sessions
[Diagram: AVCaptureDevice inputs feed an AVCaptureSession; its outputs include AVCaptureStillImageOutput, AVCaptureMovieFileOutput, AVCaptureAudioDataOutput, and AVCaptureVideoDataOutput, and an AVCaptureVideoPreviewLayer displays a live preview.]
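The preview layer attaches to a session rather than a player. A sketch assuming a configured `captureSession` and a hypothetical `previewView`:

```objectivec
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = self.previewView.bounds;
// Fill the view, cropping as needed, like a camera viewfinder.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.previewView.layer addSublayer:previewLayer];
```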
GPUImage from Brad Larson: https://github.com/BradLarson/GPUImage
Basic Capture: Example

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
NSError *error = nil;

AVCaptureDevice *device =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (device) {
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) {
        [captureSession addInput:input];
    }
}

AVCaptureDevice *audioDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
if (audioDevice) {
    AVCaptureDeviceInput *audioInput =
        [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (audioInput) {
        [captureSession addInput:audioInput];
    }
}
Basic Capture: Example (Continued)

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
NSURL *movieURL = ... // Write to URL in iOS Documents directory
[captureSession addOutput:movieOutput];

[captureSession startRunning];
[movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self];

// Record for a while, then stop recording before tearing down the session.
[movieOutput stopRecording];
[captureSession stopRunning];
Demo
Composing Media
AVComposition: Composing Assets
‣ Concrete extension of AVAsset
‣ Composes asset segments on a timeline
Composing Assets: Tracks and Segments

AVMutableComposition *composition = [AVMutableComposition composition];
CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;

AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:trackID];

AVMutableCompositionTrack *audioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:trackID];
[Diagram: the AVComposition contains a video AVCompositionTrack and an audio AVCompositionTrack. The video track holds AVCompositionTrackSegments for seconds 10-30 of "redpanda.m4v" and seconds 20-60 of "waves.m4v"; the audio track holds a segment for seconds 0-60 of "soundtrack.mp3".]
AVAssetTrack *srcVideoTrack1 = ... // source video track 1
[videoTrack insertTimeRange:timeRange ofTrack:srcVideoTrack1 atTime:startTime error:&error];

AVAssetTrack *srcVideoTrack2 = ... // source video track 2
[videoTrack insertTimeRange:timeRange ofTrack:srcVideoTrack2 atTime:startTime error:&error];

AVAssetTrack *srcAudioTrack = ... // source audio track
[audioTrack insertTimeRange:timeRange ofTrack:srcAudioTrack atTime:startTime error:&error];
Advanced Techniques: Powerful Editing
‣ Video transitions
  ‣ AVVideoComposition used to describe transitions between video tracks
  ‣ Provides adjustment of opacity and transform
‣ Audio mixing
  ‣ AVAudioMix provides dynamic volume control
  ‣ Used for crossfades, ducking, etc.
‣ Core Animation
  ‣ AVSynchronizedLayer
  ‣ AVVideoCompositionCoreAnimationTool
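For the Core Animation case, a minimal sketch: an AVSynchronizedLayer keeps its sublayers' animations in step with a player item's timeline (assuming an existing `playerItem` and a hypothetical `animationLayer` you have already built):

```objectivec
// Sublayer animations are timed against the item's timeline rather than
// host time, so they track seeking, pausing, and rate changes.
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
[syncLayer addSublayer:self.animationLayer];
[self.view.layer addSublayer:syncLayer];
```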
Demo
Summary: AV Foundation Rocks!
‣ Extremely impressive and capable
  ‣ Challenging, but fun and rewarding
‣ Steep learning curve
  ‣ Large framework with a broad set of features
  ‣ Requires understanding of advanced Objective-C
  ‣ Inadequate documentation
‣ Apple's current and future media direction
Resources
Presentation Materials
http://www.slideshare.net/bobmccune/
https://github.com/tapharmonic/AVFoundationDemos

WWDC 2011: Exploring AV Foundation
https://developer.apple.com/videos/wwdc/2011/?id=405

WWDC 2011: Working with Media in AV Foundation
https://developer.apple.com/videos/wwdc/2011/?id=415

WWDC 2011: Capturing from the Camera
https://developer.apple.com/videos/wwdc/2011/?id=419
BobMcCune.com @bobmccune