How to do slow motion video in iOS
I have to do "slow motion" between some frames in a video file (along with its audio), and I need to store the ramped video as a new video.
Reference: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10 seconds)
From my analysis, I found that the AVFoundation framework could be helpful.
Reference: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html
Copied and pasted from the link above:
"Editing — AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks, and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file. On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer."
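The workflow that paragraph describes — build a mutable composition from existing tracks, then collapse it to a file with an export session — can be sketched as follows. This is a minimal illustrative outline (not from the original post), assuming a hypothetical "source.mov" in the app bundle, with error handling omitted:

```swift
import AVFoundation

func exportComposition() {
    // Source asset (assumed to exist in the app bundle)
    let source = AVURLAsset(url: Bundle.main.url(forResource: "source", withExtension: "mov")!)

    // Mutable composition: add a video track and copy the source's video into it
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)
    try? track.insertTimeRange(CMTimeRangeMake(kCMTimeZero, source.duration),
                               of: source.tracks(withMediaType: AVMediaTypeVideo).first!,
                               at: kCMTimeZero)

    // An export session collapses the in-memory composition into a file
    let session = AVAssetExportSession(asset: composition,
                                       presetName: AVAssetExportPresetHighestQuality)!
    session.outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("out.mov")
    session.outputFileType = AVFileTypeQuickTimeMovie
    session.exportAsynchronously {
        print("export status: \(session.status.rawValue)")
    }
}
```

The answers below apply exactly this pattern, adding a `scaleTimeRange:toDuration:` call on the composition track to get the slow-motion effect.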
Question: Can I "slow motion" a video/audio file using the AVFoundation framework? Or is there another package available? If I want to handle audio and video separately, please guide me on how to do it.
Update: code for the AV export session:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outputURL = paths[0];
NSFileManager *manager = [NSFileManager defaultManager];
[manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];

// Remove existing file
[manager removeItemAtPath:outputURL error:nil];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset presetName:AVAssetExportPresetLowQuality];
exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; // output path
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        [self writeVideoToPhotoLibrary:[NSURL fileURLWithPath:outputURL]];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputURL]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Video could not be saved");
            }
        }];
    } else {
        NSLog(@"error: %@", [exportSession error]);
    }
}];
You can scale video using the AVFoundation and CoreMedia frameworks. Take a look at the AVMutableCompositionTrack method:
- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;
Sample:
AVURLAsset *videoAsset = nil; //self.inputAsset;

//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];

//export
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];
(The audio track from videoAsset should probably also be added to mixComposition.)
I know I'm late to this topic, but I have implemented slow motion in video, including audio, with the correct output orientation. Hope this helps someone.
- (void)SlowMotion:(NSURL *)URl
{
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; //self.inputAsset;
    AVAsset *currentAsset = [AVAsset assetWithURL:URl];
    AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    //create mutable composition
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *videoInsertError = nil;
    BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                             atTime:kCMTimeZero
                                                              error:&videoInsertError];
    if (!videoInsertResult || nil != videoInsertError) {
        //handle error
        return;
    }

    NSError *audioInsertError = nil;
    BOOL audioInsertResult = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                            ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                             atTime:kCMTimeZero
                                                              error:&audioInsertError];
    if (!audioInsertResult || nil != audioInsertError) {
        //handle error
        return;
    }

    CMTime duration = kCMTimeZero;
    duration = CMTimeAdd(duration, currentAsset.duration);

    //slow down whole video by 2.0
    double videoScaleFactor = 2.0;
    CMTime videoDuration = videoAsset.duration;

    [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
    [compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
    [compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];

    NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docsDir = [dirPaths objectAtIndex:0];
    NSString *outputFilePath = [docsDir stringByAppendingPathComponent:@"slowMotion.mov"];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];

    //export
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];
    assetExport.outputURL = _filePath;
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.shouldOptimizeForNetworkUse = YES; // fixed: was set on an undefined "exporter" variable
    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch ([assetExport status]) {
            case AVAssetExportSessionStatusFailed:
            {
                NSLog(@"Export session failed with error: %@", [assetExport error]);
                dispatch_async(dispatch_get_main_queue(), ^{
                    // completion(nil);
                });
            }
                break;
            case AVAssetExportSessionStatusCompleted:
            {
                NSLog(@"Successful");
                NSURL *outputURL = assetExport.outputURL;
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
                    [self writeExportedVideoToAssetsLibrary:outputURL];
                }
                dispatch_async(dispatch_get_main_queue(), ^{
                    // completion(_filePath);
                });
            }
                break;
            default:
                break;
        }
    }];
}

- (void)writeExportedVideoToAssetsLibrary:(NSURL *)url
{
    NSURL *exportURL = url;
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error) {
                    UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
                                                                        message:[error localizedRecoverySuggestion]
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                    [alertView show];
                }
                if (!error) {
                    // [activityView setHidden:YES];
                    UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Success"
                                                                        message:@"Video added to gallery successfully"
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                    [alertView show];
                }
#if !TARGET_IPHONE_SIMULATOR
                [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
            });
        }];
    } else {
        NSLog(@"Video could not be exported to assets library.");
    }
}
I would extract all frames from the initial video using ffmpeg, then collect them together again with AVAssetWriter, but at a lower frame rate. To get fuller slow motion, you may need to apply some blur effect, or even generate intermediate frames between existing frames, blended from the two neighbours.
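To illustrate the blending idea (this helper is not part of the original answer): an in-between frame can be produced by averaging the two neighbouring frames component-wise. A minimal sketch, assuming each decoded frame is a flat array of 8-bit RGBA components of equal length:

```swift
// Blend two frames into one intermediate frame by averaging each
// 8-bit component. Widening to UInt16 avoids overflow during the sum.
func blendFrames(_ a: [UInt8], _ b: [UInt8]) -> [UInt8] {
    precondition(a.count == b.count, "frames must have the same size")
    return zip(a, b).map { pair in UInt8((UInt16(pair.0) + UInt16(pair.1)) / 2) }
}
```

In a real pipeline the component arrays would come from the frames' pixel buffers, and the blended frame would be appended between its two sources via AVAssetWriter.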
Swift example:
I
var asset: AVAsset?

func configureAssets() {
    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4v")!)
    let audioAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4a")!)
    // let audioAsset2 = AVURLAsset(url: Bundle.main.url(forResource: "audio2", withExtension: "m4a")!)

    let comp = AVMutableComposition()
    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let audioAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack
    // let audioAssetSourceTrack2 = audioAsset2.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack

    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)
        try audioCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: audioAssetSourceTrack,
            at: kCMTimeZero)

        // try audioCompositionTrack.insertTimeRange(
        //     CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(3, 600)),
        //     of: audioAssetSourceTrack2,
        //     at: CMTimeMakeWithSeconds(7, 600))

        let videoScaleFactor = Int64(2.0)
        let videoDuration: CMTime = videoAsset.duration

        videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration),
                                             toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        audioCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration),
                                             toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform
    } catch {
        print(error)
    }

    asset = comp
}
II
func createFileFromAsset(_ asset: AVAsset) {
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL
    let filePath = documentsDirectory.appendingPathComponent("rendered-audio.m4v")
    deleteFile(filePath)

    if let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality) {
        exportSession.canPerformMultiplePassesOverSourceMediaData = true
        exportSession.outputURL = filePath
        exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.exportAsynchronously { _ in
            print("finished: \(filePath) : \(exportSession.status.rawValue)")
        }
    }
}

func deleteFile(_ filePath: URL) {
    guard FileManager.default.fileExists(atPath: filePath.path) else { return }
    do {
        try FileManager.default.removeItem(atPath: filePath.path)
    } catch {
        fatalError("Unable to delete file: \(error) : \(#function).")
    }
}