How can I reduce the file size of a video created with UIImagePickerController?
I have an app that lets the user record a video with UIImagePickerController and then upload it to YouTube. The problem is that the video file UIImagePickerController creates is huge, even when the video is only 5 seconds long. For example, a 5-second video is 16-20 megabytes. I want to keep the video at 540 or 720 quality, but I want to reduce the file size.
I have been experimenting with AVFoundation and AVAssetExportSession to try to get a smaller file size. I have tried the following code:
AVAsset *video = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1.mp4"];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
}];
But this did not reduce the file size at all. I know what I want to do is possible, because in Apple's Photos app, when you select "Share on YouTube", the video file is automatically processed so that it is small enough to upload. I want to do the same thing in my app.
How can I accomplish this?
With AVCaptureSession and AVAssetWriter you can set the compression settings:
NSDictionary *settings = @{AVVideoCodecKey:AVVideoCodecH264,
                           AVVideoWidthKey:@(video_width),
                           AVVideoHeightKey:@(video_height),
                           AVVideoCompressionPropertiesKey:
                               @{AVVideoAverageBitRateKey:@(desired_bitrate),
                                 AVVideoProfileLevelKey:AVVideoProfileLevelH264Main31, /* Or whatever profile & level you wish to use */
                                 AVVideoMaxKeyFrameIntervalKey:@(desired_keyframe_interval)}};

AVAssetWriterInput* writer_input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
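To pick a value for desired_bitrate, it can help to work backwards from a target file size. A rough sketch of that arithmetic (the target size, duration, and audio budget here are made-up example numbers, not values from the answer):

// Hypothetical sizing math: bits available / duration = average video bit rate.
double targetSizeMB    = 5.0;      // example target: ~5 MB for the whole clip
double durationSeconds = 5.0;      // clip length
double audioBitrate    = 128000.0; // reserve a budget for AAC audio

double totalBits       = targetSizeMB * 1024.0 * 1024.0 * 8.0;
double desired_bitrate = totalBits / durationSeconds - audioBitrate; // ≈ 8.26 Mbit/s here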
Edit: If you insist on using UIImagePicker to create the movie in the first place, you will have to use AVAssetReader's copyNextSampleBuffer and AVAssetWriter's appendSampleBuffer methods to do the transcode.
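The heart of that transcode is a small read/append pump. A minimal sketch (assuming a reader, reader output, writer, and writer input have already been configured, and covering only a single track) might look like the following; a full working version appears in a later answer:

[writer startWriting];
[reader startReading];
[writer startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("transcodeQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.isReadyForMoreMediaData) {
        CMSampleBufferRef buffer = [readerOutput copyNextSampleBuffer];
        if (buffer) {
            // Re-encoded according to the writer input's outputSettings
            [writerInput appendSampleBuffer:buffer];
            CFRelease(buffer);
        } else {
            [writerInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{ /* done */ }];
            break;
        }
    }
}];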
On UIImagePickerController you have a videoQuality property of type UIImagePickerControllerQualityType, and it will be applied to recorded movies as well as to the ones you pick from the library (during the transcoding phase).
Or, if you have to deal with existing assets (files) not coming from the library, you might want to look at these presets:

AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality

and

AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080
and pass one of them to the initializer of the AVAssetExportSession class. I'm afraid you will have to experiment with those for your particular content, since there is no precise description of what the low and medium quality are, or which quality is used for the 640x480 or the 1280x720 preset. The only useful information in the documentation is the following:
Export Preset Names for Device-Appropriate QuickTime Files. You use these export options to produce QuickTime .mov files with video size appropriate to the current device.
The export will not scale the video up from a smaller size. Video is compressed using H.264; audio is compressed using AAC.
Some devices cannot support some sizes.
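For illustration, a minimal export using one of the sized presets instead of Passthrough might look like this (the preset choice is arbitrary here, and the URLs are reused from the question):

AVAsset *video = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:video
                                      presetName:AVAssetExportPreset640x480]; // re-encodes, unlike Passthrough
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1-small.mp4"];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export finished: %@", exportSession.outputURL);
    } else {
        NSLog(@"export failed: %@", exportSession.error);
    }
}];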
Apart from that, I do not recall AVFoundation giving you precise control over quality such as frame rate or freeform size. I was wrong: there is a way to tune all the parameters you mention, and it is indeed AVAssetWriter: How do I export a UIImage array as a movie?
By the way, here is a link to a similar question with a code sample: iPhone: programmatically compress recorded video to share?
yourfriendzak is right: setting cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow;
isn't the solution here. The solution is to reduce the data rate, or bit rate, which is what jgh is suggesting.
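A quick back-of-the-envelope check shows why: 16 MB over 5 seconds works out to roughly 16 × 8 / 5 ≈ 26 Mbit/s coming off the camera, while re-encoding the same 5 seconds at about 2 Mbit/s would give around 5 × 2,000,000 / 8 ≈ 1.25 MB.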
I have three methods. The first method handles the UIImagePicker delegate callback:
// For responding to the user accepting a newly-captured picture or movie
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Handle movie capture
    NSURL *movieURL = [info objectForKey:UIImagePickerControllerMediaURL];

    NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self randomString]] stringByAppendingString:@".mp4"]];

    // Compress movie first
    [self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
}
The second method converts the video to a lower bit rate, not to lower dimensions.
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
                                   outputURL:(NSURL*)outputURL
{
    //setup video writer
    AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize videoSize = videoTrack.naturalSize;
    NSDictionary *videoWriterCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];
    NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];
    AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoWriterSettings];
    videoWriterInput.expectsMediaDataInRealTime = YES;
    videoWriterInput.transform = videoTrack.preferredTransform;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];
    [videoWriter addInput:videoWriterInput];

    //setup video reader
    NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];
    AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];
    [videoReader addOutput:videoReaderOutput];

    //setup audio writer
    AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
    audioWriterInput.expectsMediaDataInRealTime = NO;
    [videoWriter addInput:audioWriterInput];

    //setup audio reader
    AVAssetTrack* audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
    AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];
    [audioReader addOutput:audioReaderOutput];

    [videoWriter startWriting];

    //start writing from video reader
    [videoReader startReading];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);
    [videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:
     ^{
         while ([videoWriterInput isReadyForMoreMediaData]) {
             CMSampleBufferRef sampleBuffer;
             if ([videoReader status] == AVAssetReaderStatusReading &&
                 (sampleBuffer = [videoReaderOutput copyNextSampleBuffer])) {
                 [videoWriterInput appendSampleBuffer:sampleBuffer];
                 CFRelease(sampleBuffer);
             }
             else {
                 [videoWriterInput markAsFinished];
                 if ([videoReader status] == AVAssetReaderStatusCompleted) {
                     //start writing from audio reader
                     [audioReader startReading];
                     [videoWriter startSessionAtSourceTime:kCMTimeZero];
                     dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);
                     [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{
                         while (audioWriterInput.readyForMoreMediaData) {
                             CMSampleBufferRef sampleBuffer;
                             if ([audioReader status] == AVAssetReaderStatusReading &&
                                 (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {
                                 [audioWriterInput appendSampleBuffer:sampleBuffer];
                                 CFRelease(sampleBuffer);
                             }
                             else {
                                 [audioWriterInput markAsFinished];
                                 if ([audioReader status] == AVAssetReaderStatusCompleted) {
                                     [videoWriter finishWritingWithCompletionHandler:^(){
                                         [self sendMovieFileAtURL:outputURL];
                                     }];
                                 }
                             }
                         }
                     }];
                 }
             }
         }
     }];
}
On success, the third method, sendMovieFileAtURL:, is called, which uploads the compressed video at outputURL to the server.
Note that I have enabled ARC in my project, so if ARC is turned off in yours you will have to add a few release calls.
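The answer only names sendMovieFileAtURL: without showing it. As a purely hypothetical sketch (not part of the original answer), an upload step built on NSURLSession could look something like this; the endpoint URL is a placeholder:

// Hypothetical upload helper - the real implementation depends on your server.
- (void)sendMovieFileAtURL:(NSURL *)outputURL
{
    NSURL *endpoint = [NSURL URLWithString:@"https://example.com/upload"]; // placeholder endpoint
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:endpoint];
    request.HTTPMethod = @"POST";

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:outputURL
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        NSLog(@"upload finished, error: %@", error);
    }];
    [task resume];
}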
Erik's answer may have been correct at the time he wrote it - but now, with iOS 8, it just crashes left and right; I've spent a few hours on it myself.
You need a PhD to work with AVAssetWriter - it's non-trivial: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_Export.html#//apple_ref/doc/uid/TP40010188-CH9-SW1
There's an amazing library that does exactly what you want: it's a drop-in AVAssetExportSession replacement with more crucial features, like changing the bit rate: https://github.com/rs/SDAVAssetExportSession
Here's how to use it:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]]];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    self.myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                       [NSString stringWithFormat:@"lowerBitRate-%d.mov",arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:self.myPathDocs];
    encoder.outputURL=url;
    encoder.outputFileType = AVFileTypeMPEG4;
    encoder.shouldOptimizeForNetworkUse = YES;

    encoder.videoSettings = @
    {
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoCompressionPropertiesKey: @
        {
            AVVideoAverageBitRateKey: @2300000, // Lower bit rate here
            AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
        },
    };
    encoder.audioSettings = @
    {
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @2,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @128000,
    };

    [encoder exportAsynchronouslyWithCompletionHandler:^
    {
        int status = encoder.status;

        if (status == AVAssetExportSessionStatusCompleted)
        {
            AVAssetTrack *videoTrack = nil;
            AVURLAsset *asset = [AVAsset assetWithURL:encoder.outputURL];
            NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
            videoTrack = [videoTracks objectAtIndex:0];
            float frameRate = [videoTrack nominalFrameRate];
            float bps = [videoTrack estimatedDataRate];
            NSLog(@"Frame rate == %f",frameRate);
            NSLog(@"bps rate == %f",bps/(1024.0 * 1024.0));
            NSLog(@"Video export succeeded");
            // encoder.outputURL <- this is what you want!!
        }
        else if (status == AVAssetExportSessionStatusCancelled)
        {
            NSLog(@"Video export cancelled");
        }
        else
        {
            NSLog(@"Video export failed with error: %@ (%d)", encoder.error.localizedDescription, encoder.error.code);
        }
    }];
}
When you are about to open the UIImagePickerController, you can set the video quality to any one of the following:
UIImagePickerControllerQualityType640x480
UIImagePickerControllerQualityTypeLow
UIImagePickerControllerQualityTypeMedium
UIImagePickerControllerQualityTypeHigh
UIImagePickerControllerQualityTypeIFrame960x540
UIImagePickerControllerQualityTypeIFrame1280x720
Try this code for changing the quality type when opening the UIImagePickerController:
if (([UIImagePickerController isSourceTypeAvailable:
      UIImagePickerControllerSourceTypeCamera] == NO))
    return NO;
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects: (NSString *) kUTTypeMovie, nil];
cameraUI.allowsEditing = NO;
cameraUI.delegate = self;
cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow; //you can change the quality here
[self presentModalViewController:cameraUI animated:YES];
Erik Wegener's code rewritten for Swift 3:
class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: @escaping () -> ()) {
    //setup video writer
    let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
    let videoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
        AVVideoWidthKey : Int(videoSize.width) as AnyObject,
        AVVideoHeightKey : Int(videoSize.height) as AnyObject
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileTypeQuickTimeMovie)
    videoWriter.add(videoWriterInput)

    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let videoReader = try! AVAssetReader(asset: videoAsset)
    videoReader.add(videoReaderOutput)

    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)

    //setup audio reader
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)

    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSession(atSourceTime: kCMTimeZero)
    let processingQueue = DispatchQueue(label: "processingQueue1")
    videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
        while videoWriterInput.isReadyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .reading && sampleBuffer != nil {
                videoWriterInput.append(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSession(atSourceTime: kCMTimeZero)
                    let processingQueue = DispatchQueue(label: "processingQueue2")
                    audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                        while audioWriterInput.isReadyForMoreMediaData {
                            let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .reading && sampleBuffer != nil {
                                audioWriterInput.append(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .completed {
                                    videoWriter.finishWriting(completionHandler: {() -> Void in
                                        onDone();
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}
Erik Wegener's code rewritten for Swift:
class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: () -> ()) {
    //setup video writer
    let videoAsset = AVURLAsset(URL: inputURL, options: nil)
    let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings,
        AVVideoWidthKey : Int(videoSize.width),
        AVVideoHeightKey : Int(videoSize.height)
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    videoWriter.addInput(videoWriterInput)

    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let videoReader = try! AVAssetReader(asset: videoAsset)
    videoReader.addOutput(videoReaderOutput)

    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.addInput(audioWriterInput)

    //setup audio reader
    let audioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.addOutput(audioReaderOutput)

    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)
    let processingQueue = dispatch_queue_create("processingQueue1", nil)
    videoWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
        while videoWriterInput.readyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .Reading && sampleBuffer != nil {
                videoWriterInput.appendSampleBuffer(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .Completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSessionAtSourceTime(kCMTimeZero)
                    let processingQueue = dispatch_queue_create("processingQueue2", nil)
                    audioWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
                        while audioWriterInput.readyForMoreMediaData {
                            let sampleBuffer:CMSampleBufferRef? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .Reading && sampleBuffer != nil {
                                audioWriterInput.appendSampleBuffer(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .Completed {
                                    videoWriter.finishWritingWithCompletionHandler({() -> Void in
                                        onDone();
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}
There is a great custom class (SDAVAssetExportSession) for doing video compression. You can download it from this link.
After downloading, add the SDAVAssetExportSession.h and SDAVAssetExportSession.m files to your project, then use the code below to perform the compression. In the code below you can compress the video by specifying the resolution and the bit rate.
#import "SDAVAssetExportSession.h" - (void)compressVideoWithInputVideoUrl:(NSURL *) inputVideoUrl { /* Create Output File Url */ NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0]; NSFileManager *fileManager = [NSFileManager defaultManager]; NSString *finalVideoURLString = [documentsDirectory stringByAppendingPathComponent:@"compressedVideo.mp4"]; NSURL *outputVideoUrl = ([[NSURL URLWithString:finalVideoURLString] isFileURL] == 1)?([NSURL URLWithString:finalVideoURLString]):([NSURL fileURLWithPath:finalVideoURLString]); // Url Should be a file Url, so here we check and convert it into a file Url SDAVAssetExportSession *compressionEncoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:inputVideoUrl]]; // provide inputVideo Url Here compressionEncoder.outputFileType = AVFileTypeMPEG4; compressionEncoder.outputURL = outputVideoUrl; //Provide output video Url here compressionEncoder.videoSettings = @ { AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: @800, //Set your resolution width here AVVideoHeightKey: @600, //set your resolution height here AVVideoCompressionPropertiesKey: @ { AVVideoAverageBitRateKey: @45000, // Give your bitrate here for lower size give low values AVVideoProfileLevelKey: AVVideoProfileLevelH264High40, }, }; compressionEncoder.audioSettings = @ { AVFormatIDKey: @(kAudioFormatMPEG4AAC), AVNumberOfChannelsKey: @2, AVSampleRateKey: @44100, AVEncoderBitRateKey: @128000, }; [compressionEncoder exportAsynchronouslyWithCompletionHandler:^ { if (compressionEncoder.status == AVAssetExportSessionStatusCompleted) { NSLog(@"Compression Export Completed Successfully"); } else if (compressionEncoder.status == AVAssetExportSessionStatusCancelled) { NSLog(@"Compression Export Canceled"); } else { NSLog(@"Compression Failed"); } }]; }
To cancel the compression, use the line of code below:
[compressionEncoder cancelExport]; //Video compression cancel
I second etayluz's answer: SDAVAssetExportSession is a great custom class for doing video compression. Here is my working code. You can download SDAVAssetExportSession from this link.
After downloading, add the SDAVAssetExportSession.h and SDAVAssetExportSession.m files to your project, then use the code below to perform the compression. In the code below you can compress the video by specifying the resolution and the bit rate.
#import "SDAVAssetExportSession.h" - (void)compressVideoWithInputVideoUrl:(NSURL *) inputVideoUrl { /* Create Output File Url */ NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0]; NSFileManager *fileManager = [NSFileManager defaultManager]; NSString *finalVideoURLString = [documentsDirectory stringByAppendingPathComponent:@"compressedVideo.mp4"]; NSURL *outputVideoUrl = ([[NSURL URLWithString:finalVideoURLString] isFileURL] == 1)?([NSURL URLWithString:finalVideoURLString]):([NSURL fileURLWithPath:finalVideoURLString]); // Url Should be a file Url, so here we check and convert it into a file Url SDAVAssetExportSession *compressionEncoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:inputVideoUrl]]; // provide inputVideo Url Here compressionEncoder.outputFileType = AVFileTypeMPEG4; compressionEncoder.outputURL = outputVideoUrl; //Provide output video Url here compressionEncoder.videoSettings = @ { AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: @800, //Set your resolution width here AVVideoHeightKey: @600, //set your resolution height here AVVideoCompressionPropertiesKey: @ { AVVideoAverageBitRateKey: @45000, // Give your bitrate here for lower size give low values AVVideoProfileLevelKey: AVVideoProfileLevelH264High40, }, }; compressionEncoder.audioSettings = @ { AVFormatIDKey: @(kAudioFormatMPEG4AAC), AVNumberOfChannelsKey: @2, AVSampleRateKey: @44100, AVEncoderBitRateKey: @128000, }; [compressionEncoder exportAsynchronouslyWithCompletionHandler:^ { if (compressionEncoder.status == AVAssetExportSessionStatusCompleted) { NSLog(@"Compression Export Completed Successfully"); } else if (compressionEncoder.status == AVAssetExportSessionStatusCancelled) { NSLog(@"Compression Export Canceled"); } else { NSLog(@"Compression Failed"); } }]; }
To cancel the compression, use the line of code below:
[compressionEncoder cancelExport]; //Video compression cancel