
Editing Assets

This chapter is translated from Apple's official AVFoundation Programming Guide and is the fourth article in my AVFoundation translation series. It covers editing assets with the AVFoundation framework. The complete series is available in my GitBook: AVFoundation Programming Guide.

The AVFoundation framework provides a feature-rich set of classes for editing audio and video. At the heart of these APIs are compositions. A composition is a collection of tracks from one or more media assets. The AVMutableComposition class provides an interface for inserting and removing tracks and for managing their temporal ordering. The figure below shows how a composition is assembled from existing assets. If all you want to do is merge several assets sequentially into a single file, that is as much as you need. If you want to perform any custom audio or video processing on the tracks, however, you need to incorporate an audio mix and a video composition, respectively.

AVMutableComposition assembles assets together

As shown in the figure below, the AVMutableAudioMix class lets you perform custom processing on the audio tracks in a composition. You can also specify a maximum volume for an audio track or set a volume ramp on it.

AVMutableAudioMix performs audio mixing

As shown in the figure below, the AVMutableVideoComposition class lets you work directly with the video tracks in a composition. When outputting video from a video composition, you can also specify its render size, scale, and frame rate. Through a video composition's instructions (AVMutableVideoCompositionInstruction), you can change the background color of the video and set layer instructions. Layer instructions (AVMutableVideoCompositionLayerInstruction) apply transforms, transform ramps, opacity, and opacity ramps to the video tracks. A video composition also lets you apply Core Animation effects to your video through its animationTool property.

AVMutableVideoComposition

As shown in the figure below, you combine your audio and video using an AVAssetExportSession object. You initialize an export session with your composition and then assign your audio mix and video composition to its audioMix and videoComposition properties, respectively.

Use AVAssetExportSession to combine media elements into an output file
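
As a rough sketch of that wiring (not part of the original example, and assuming a mutableComposition, mutableAudioMix, and mutableVideoComposition built as shown in the following sections, plus a placeholder output URL), the export step might look like this:

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
// Attach the custom audio and video processing to the export.
exportSession.audioMix = mutableAudioMix;
exportSession.videoComposition = mutableVideoComposition;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = <#A file URL for the exported movie#>;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Check exportSession.status (and exportSession.error) to see whether the export succeeded.
}];

A complete export example, including saving the file to the camera roll, appears at the end of this chapter.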

Creating a Composition

You create your own composition with the AVMutableComposition class, and you add one or more composition tracks to it with the AVMutableCompositionTrack class. The following example creates a composition with a video track and an audio track:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
// Create the video composition track.
AVMutableCompositionTrack *mutableCompositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// Create the audio composition track.
AVMutableCompositionTrack *mutableCompositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

Options for Initializing a Composition Track

When adding a new track to a composition, you must provide both a media type and a track ID. Besides audio and video, the most commonly used media types, you can also specify other media types such as AVMediaTypeSubtitle or AVMediaTypeText.

Every track has a unique identifier, its track ID. If you specify kCMPersistentTrackID_Invalid as the track ID, a unique identifier is automatically generated for the associated track.
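
For illustration only (the explicit track ID of 42 below is an arbitrary value chosen for this sketch), adding a track of another media type, or requesting a specific track ID, looks like this:

// Add a subtitle track and let the framework generate a unique track ID.
AVMutableCompositionTrack *subtitleTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeSubtitle preferredTrackID:kCMPersistentTrackID_Invalid];
// Request a specific persistent track ID instead (an arbitrary value for this example).
AVMutableCompositionTrack *textTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeText preferredTrackID:42];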

Adding Audiovisual Data to a Composition

To add media data to a composition track, you need access to the AVAsset objects that the media data belongs to. You can use the mutable composition track interface to add multiple tracks of the same media type to the same composition track. The following example shows how to add two different video asset tracks in sequence to the same composition track:

// You can retrieve AVAssets from a number of places, like the camera roll for example.
AVAsset *videoAsset = <#AVAsset with at least one video track#>;
AVAsset *anotherVideoAsset = <#another AVAsset with at least one video track#>;
// Get the first video track from each asset.
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *anotherVideoAssetTrack = [[anotherVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// Add them both to the composition.
[mutableCompositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
[mutableCompositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, anotherVideoAssetTrack.timeRange.duration) ofTrack:anotherVideoAssetTrack atTime:videoAssetTrack.timeRange.duration error:nil];

Retrieving Compatible Composition Tracks

Where possible, you should have only one composition track for each media type, since this reduces resource usage. When media data is presented serially, place media data of the same type on the same composition track. You can query a mutable composition to find out whether it already has a composition track compatible with a given asset track:

AVMutableCompositionTrack *compatibleCompositionTrack = [mutableComposition mutableTrackCompatibleWithTrack:<#the AVAssetTrack you want to insert#>];
if (compatibleCompositionTrack) {
    // Implementation continues.
}

Note: Placing multiple video segments on the same composition track can lead to dropped frames at the transitions between segments, especially on embedded devices. Choosing the right number of composition tracks for your video segments depends on the design of your app and its target devices.
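
Putting the pieces of this section together, a minimal sketch of the insertion pattern might look like the following (assetTrack is a placeholder for the AVAssetTrack you want to insert, and mutableComposition is the composition created earlier):

AVAssetTrack *assetTrack = <#the AVAssetTrack you want to insert#>;
NSError *insertError = nil;
// Reuse a compatible composition track if one exists; otherwise create a new one of the same media type.
AVMutableCompositionTrack *targetTrack = [mutableComposition mutableTrackCompatibleWithTrack:assetTrack];
if (!targetTrack) {
    targetTrack = [mutableComposition addMutableTrackWithMediaType:assetTrack.mediaType preferredTrackID:kCMPersistentTrackID_Invalid];
}
// Append the asset track's full time range at the current end of the composition.
[targetTrack insertTimeRange:assetTrack.timeRange ofTrack:assetTrack atTime:mutableComposition.duration error:&insertError];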

Generating a Volume Ramp

A single AVMutableAudioMix object can perform custom audio processing individually on every audio track in your composition.

You create an audio mix using the audioMix class method of AVMutableAudioMix, and you use the AVMutableAudioMixInputParameters class to associate the audio mix with specific tracks within your composition. An audio mix can be used to modify the volume of an audio track. The following example shows how to set a volume ramp on an audio track so that the sound fades out slowly over the duration of the composition:

AVMutableAudioMix *mutableAudioMix = [AVMutableAudioMix audioMix];
// Create the audio mix input parameters object.
AVMutableAudioMixInputParameters *mixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:mutableCompositionAudioTrack];
// Set the volume ramp to slowly fade the audio out over the duration of the composition.
[mixParameters setVolumeRampFromStartVolume:1.f toEndVolume:0.f timeRange:CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)];
// Attach the input parameters to the audio mix.
mutableAudioMix.inputParameters = @[mixParameters];
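
The audio mix only takes effect once it is attached to playback or export. For example, to preview the fade before exporting, you can hand it to a player item; this brief sketch assumes the mutableComposition and mutableAudioMix created above:

// Preview the composition with the volume ramp applied.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
playerItem.audioMix = mutableAudioMix;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
[player play];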

Performing Custom Video Processing

You use an AVMutableVideoComposition object to perform custom processing on the video tracks in your composition. With a video composition, you can also specify the render size, scale, and frame rate for the video tracks.

Changing the Composition's Background Color

A video composition must contain an array of AVVideoCompositionInstruction objects with at least one video composition instruction. You use the AVMutableVideoCompositionInstruction class to create your own video composition instructions. With video composition instructions you can, for example, change the composition's background color:

AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration);
mutableVideoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];

Applying Opacity Ramps

Video composition instructions can also be used to apply video composition layer instructions. An AVMutableVideoCompositionLayerInstruction object can apply transforms, transform ramps, opacity, and opacity ramps to a given video track. The order of the layer instructions in a video composition instruction's layerInstructions array determines how video frames from the source tracks are placed and composed. The following code fragment shows how to add an opacity ramp so that the first video fades out before transitioning to the second:

AVAssetTrack *firstVideoAssetTrack = <#AVAssetTrack representing the first video segment played in the composition#>;
AVAssetTrack *secondVideoAssetTrack = <#AVAssetTrack representing the second video segment played in the composition#>;
// Create the first video composition instruction.
AVMutableVideoCompositionInstruction *firstVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set its time range to span the duration of the first video track.
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration);
// Create the layer instruction and associate it with the composition video track.
AVMutableVideoCompositionLayerInstruction *firstVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack];
// Create the opacity ramp to fade out the first video track over its entire duration.
[firstVideoLayerInstruction setOpacityRampFromStartOpacity:1.f toEndOpacity:0.f timeRange:CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration)];
// Create the second video composition instruction so that the second video track isn't transparent.
AVMutableVideoCompositionInstruction *secondVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set its time range to span the duration of the second video track.
secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration);
// Create the second layer instruction and associate it with the composition video track.
AVMutableVideoCompositionLayerInstruction *secondVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack];
// Attach the first layer instruction to the first video composition instruction.
firstVideoCompositionInstruction.layerInstructions = @[firstVideoLayerInstruction];
// Attach the second layer instruction to the second video composition instruction.
secondVideoCompositionInstruction.layerInstructions = @[secondVideoLayerInstruction];
// Attach both of the video composition instructions to the video composition.
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = @[firstVideoCompositionInstruction, secondVideoCompositionInstruction];

Incorporating Core Animation Effects

Through its animationTool property, a video composition can bring the power of Core Animation into your composition, for example to add watermarks, titles, or animated overlays to a video. Core Animation can be used with video compositions in two different ways: you can add a Core Animation layer as its own individual composition track, or you can use a Core Animation layer to render animation effects directly into the video frames. The following code shows the latter approach, adding a watermark at the center of the video:

CALayer *watermarkLayer = <#CALayer representing your desired watermark image#>;
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, mutableVideoComposition.renderSize.width, mutableVideoComposition.renderSize.height);
videoLayer.frame = CGRectMake(0, 0, mutableVideoComposition.renderSize.width, mutableVideoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
watermarkLayer.position = CGPointMake(mutableVideoComposition.renderSize.width/2, mutableVideoComposition.renderSize.height/4);
[parentLayer addSublayer:watermarkLayer];
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

A Complete Example

The following brief code example shows how to combine two video asset tracks and an audio asset track into a single video file. It walks through creating the composition, adding the assets, checking the video orientations, applying the video composition layer instructions, setting the render size and frame duration, and exporting the composition and saving it to the camera roll.

Note: To focus on the most relevant code, this example omits several aspects of a complete app, such as memory management and removing notification observers. To make use of AVFoundation, you are expected to have enough experience with Cocoa to fill in the missing pieces.

Creating the Composition

You use an AVMutableComposition object to piece together tracks from separate assets. The following code creates a composition and adds an audio track and a video track to it:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

Adding the Assets

Add the two video asset tracks and the audio asset track to the composition:

AVAssetTrack *firstVideoAssetTrack = [[firstVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *secondVideoAssetTrack = [[secondVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration) ofTrack:firstVideoAssetTrack atTime:kCMTimeZero error:nil];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondVideoAssetTrack.timeRange.duration) ofTrack:secondVideoAssetTrack atTime:firstVideoAssetTrack.timeRange.duration error:nil];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration)) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];

Checking the Video Orientations

Once the audio and video tracks have been added to the composition, you need to ensure that the orientations of all of its video tracks are correct. By default, video tracks are assumed to be in landscape mode. If a video track was captured in portrait mode, the video will not be oriented correctly when it is exported. Likewise, you cannot combine a video shot in landscape mode with a video shot in portrait mode and export the result.

BOOL isFirstVideoAssetPortrait = NO;
CGAffineTransform firstTransform = firstVideoAssetTrack.preferredTransform;
// Check the first video track's preferred transform to determine if it was recorded in portrait mode.
if (firstTransform.a == 0 && firstTransform.d == 0 && (firstTransform.b == 1.0 || firstTransform.b == -1.0) && (firstTransform.c == 1.0 || firstTransform.c == -1.0)) {
    isFirstVideoAssetPortrait = YES;
}
BOOL isSecondVideoAssetPortrait = NO;
CGAffineTransform secondTransform = secondVideoAssetTrack.preferredTransform;
// Check the second video track's preferred transform to determine if it was recorded in portrait mode.
if (secondTransform.a == 0 && secondTransform.d == 0 && (secondTransform.b == 1.0 || secondTransform.b == -1.0) && (secondTransform.c == 1.0 || secondTransform.c == -1.0)) {
    isSecondVideoAssetPortrait = YES;
}
if ((isFirstVideoAssetPortrait && !isSecondVideoAssetPortrait) || (!isFirstVideoAssetPortrait && isSecondVideoAssetPortrait)) {
    UIAlertView *incompatibleVideoOrientationAlert = [[UIAlertView alloc] initWithTitle:@"Error!" message:@"Cannot combine a video shot in portrait mode with a video shot in landscape mode." delegate:self cancelButtonTitle:@"Dismiss" otherButtonTitles:nil];
    [incompatibleVideoOrientationAlert show];
    return;
}

Applying the Video Composition Layer Instructions

Once you know that the video segments have compatible orientations, you can apply the necessary layer instructions to each one and add those layer instructions to the video composition:

AVMutableVideoCompositionInstruction *firstVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set the time range of the first instruction to span the duration of the first video track.
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration);
AVMutableVideoCompositionInstruction *secondVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set the time range of the second instruction to span the duration of the second video track.
secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration);
AVMutableVideoCompositionLayerInstruction *firstVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
// Set the transform of the first layer instruction to the preferred transform of the first video track.
[firstVideoLayerInstruction setTransform:firstTransform atTime:kCMTimeZero];
AVMutableVideoCompositionLayerInstruction *secondVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
// Set the transform of the second layer instruction to the preferred transform of the second video track.
[secondVideoLayerInstruction setTransform:secondTransform atTime:firstVideoAssetTrack.timeRange.duration];
firstVideoCompositionInstruction.layerInstructions = @[firstVideoLayerInstruction];
secondVideoCompositionInstruction.layerInstructions = @[secondVideoLayerInstruction];
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = @[firstVideoCompositionInstruction, secondVideoCompositionInstruction];

All AVAssetTrack objects have a preferredTransform property that contains the orientation information for that asset track. This transform is applied whenever the asset track is displayed onscreen. In the code above, each layer instruction's transform is set to the corresponding asset track's transform so that, once you adjust the render size, the video in the new composition also displays correctly.

Setting the Render Size and Frame Duration

To complete the orientation fix, you must also adjust the renderSize property accordingly. You should also set a suitable frameDuration, such as 1/30 of a second (30 frames per second). By default, the renderScale property is 1.0.

CGSize naturalSizeFirst, naturalSizeSecond;
// If the first video asset was shot in portrait mode, then so was the second one if we made it here.
if (isFirstVideoAssetPortrait) {
// Invert the width and height for the video tracks to ensure that they display properly.
    naturalSizeFirst = CGSizeMake(firstVideoAssetTrack.naturalSize.height, firstVideoAssetTrack.naturalSize.width);
    naturalSizeSecond = CGSizeMake(secondVideoAssetTrack.naturalSize.height, secondVideoAssetTrack.naturalSize.width);
}
else {
// If the videos weren't shot in portrait mode, we can just use their natural sizes.
    naturalSizeFirst = firstVideoAssetTrack.naturalSize;
    naturalSizeSecond = secondVideoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
// Set the renderWidth and renderHeight to the max of the two videos widths and heights.
if (naturalSizeFirst.width > naturalSizeSecond.width) {
    renderWidth = naturalSizeFirst.width;
}
else {
    renderWidth = naturalSizeSecond.width;
}
if (naturalSizeFirst.height > naturalSizeSecond.height) {
    renderHeight = naturalSizeFirst.height;
}
else {
    renderHeight = naturalSizeSecond.height;
}
mutableVideoComposition.renderSize = CGSizeMake(renderWidth, renderHeight);
// Set the frame duration to an appropriate value (i.e. 30 frames per second for video).
mutableVideoComposition.frameDuration = CMTimeMake(1,30);

Exporting the Composition

The final step is to export the composition into a single video file and save that file to the camera roll. You use an AVAssetExportSession object to create the new video file and pass it the URL of the desired output file. You can then use the ALAssetsLibrary class to save the resulting video file to the camera roll:

// Create a static date formatter so we only have to initialize it once.
static NSDateFormatter *kDateFormatter;
if (!kDateFormatter) {
    kDateFormatter = [[NSDateFormatter alloc] init];
    kDateFormatter.dateStyle = NSDateFormatterMediumStyle;
    kDateFormatter.timeStyle = NSDateFormatterShortStyle;
}
// Create the export session with the composition and set the preset to the highest quality.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
// Set the desired output URL for the file created by the export process.
exporter.outputURL = [[[[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil] URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]] URLByAppendingPathExtension:CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension))];
// Set the output file type to be a QuickTime movie.
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mutableVideoComposition;
// Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
            if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
                [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
            }
        }
    });
}];