iOS GPUImage short-video recording: pause/resume, filters, custom beauty filter, GIF/text watermarks, and video stitching

    xiaoxiao  2022-07-01

    Main features (with sample code):

    1. Custom beauty filter built on GPUImage
    2. Text and animated watermarks built on GPUImage
    3. Switch freely between filters, and between text and animated watermarks, while recording; a GIF can be loaded as the watermark
    4. Pause and resume during recording
    5. Video stitching with the AVFoundation framework
    6. Extracting the first frame of a short video (a method is included in the sample code)

    Recording screen:

    Pause feature:

    Audio/video synchronization depends on timestamps, and every frame written during recording carries one. Simply suspending the encoder during a pause does not work: when encoding resumes, the timestamps are no longer continuous. The approach taken here is to close out the current clip as a complete video each time recording is paused, and then stitch all the recorded clips together when recording finishes.
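The segment-per-pause idea can be sketched as follows. This is a minimal illustration, not code from the sample project; the class and method names here are hypothetical, and the actual movie-writer setup is omitted.

```objc
#import <Foundation/Foundation.h>

// Hypothetical helper illustrating the pause-as-separate-clip approach.
@interface SegmentedRecorder : NSObject
@property (nonatomic, strong) NSMutableArray<NSString *> *segmentPaths; // one finished file per segment
@end

@implementation SegmentedRecorder

- (NSString *)startNextSegment {
    // Each resume writes to a brand-new file, so every segment's
    // timestamps start at zero and stay internally consistent.
    if (!self.segmentPaths) self.segmentPaths = [NSMutableArray array];
    NSString *dir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES) firstObject];
    NSString *path = [dir stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"segment_%lu.mp4",
                       (unsigned long)self.segmentPaths.count]];
    [self.segmentPaths addObject:path];
    return path; // hand this to the movie writer for the new segment
}

- (void)pause {
    // Finish the current movie writer here; the segment file on disk
    // is now a complete, self-contained video.
}

- (void)finishWithOutputPath:(NSString *)outpath {
    // When recording ends, stitch self.segmentPaths together with
    // AVMutableComposition (see the merge method later in this post).
}

@end
```

The key point is that no timestamp bookkeeping is needed across pauses, because each segment is a valid standalone movie and the composition step re-bases the timelines when concatenating.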

    Beauty filter:

    If you don't want to download the sample code, you can read about it here: iOS GPUImage custom beauty filter
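The linked article builds a custom shader, which is not reproduced here. As a rough stand-in, a common lightweight approximation of a beauty filter chains GPUImage's stock filters in a `GPUImageFilterGroup`: a bilateral blur for skin smoothing plus a slight brightness lift. The parameter values below are illustrative.

```objc
#import <GPUImage/GPUImage.h>

// Simplified beauty-filter sketch using stock GPUImage filters.
GPUImageFilterGroup *beautyGroup = [[GPUImageFilterGroup alloc] init];

// Edge-preserving blur: smooths skin while keeping contours.
GPUImageBilateralFilter *bilateral = [[GPUImageBilateralFilter alloc] init];
bilateral.distanceNormalizationFactor = 4.0; // lower = stronger smoothing

// Subtle brightness lift; range is -1.0 ... 1.0, default 0.0.
GPUImageBrightnessFilter *brightness = [[GPUImageBrightnessFilter alloc] init];
brightness.brightness = 0.05;

// Wire the chain and expose it as a single filter.
[bilateral addTarget:brightness];
[beautyGroup addFilter:bilateral];
[beautyGroup addFilter:brightness];
beautyGroup.initialFilters = @[bilateral];
beautyGroup.terminalFilter = brightness;

// Then swap it in like any other filter:
// [videoCamera addTarget:beautyGroup];
```

Because the group behaves like a single `GPUImageOutput`, it can be switched in and out mid-recording the same way as the other filters in this post.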

    Animated watermark feature:

    This animated watermark is like TikTok's jittering watermark and can load a GIF. No third-party framework is used to play the GIF: all of its frames are extracted, and the watermark image is swapped dynamically. Driving the frames through UIImageView's animation properties has no effect here, so the watermark image is changed inside GPUImage's frame-processing callback:

```objc
// Load the GIF and split it into frames (changeGifToImage: is a helper in the sample code)
NSString *path = [[NSBundle mainBundle] pathForResource:@"video.gif" ofType:nil];
NSData *imageData = [NSData dataWithContentsOfFile:path];
NSArray *arrImage = [YHHelp changeGifToImage:imageData];
UIImageView *imageV = [[UIImageView alloc] initWithFrame:CGRectMake(80, 0, 100, 80)];
imageV.image = arrImage[0];

// Create the blend filter that composites the watermark over the video
GPUImageAlphaBlendFilter *gifFilter = [[GPUImageAlphaBlendFilter alloc] init];
gifFilter.mix = 0.8;

// Build the watermark view (GIF frame + text label)
UIView *watermarkView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 180, 380)];
UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
label.text = @"Record";
label.textColor = [UIColor whiteColor];
label.textAlignment = NSTextAlignmentCenter;
label.font = [UIFont systemFontOfSize:20 weight:UIFontWeightBold];
[watermarkView addSubview:imageV];
[watermarkView addSubview:label];

GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:watermarkView];
videoFilter = [[GPUImageFilter alloc] init];
[videoFilter addTarget:gifFilter];
[uiElement addTarget:gifFilter];

__block NSInteger imageIndex = 0;
__block GPUImageUIElement *weakElement = uiElement;
__block NSInteger timeCount = 0;
[videoFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    NSInteger tempCount = time.value / (time.timescale / 1000);
    // Throttle the watermark refresh rate: advance one GIF frame every ~100 ms
    if (tempCount - timeCount > 100) {
        imageIndex = (imageIndex + 1) % arrImage.count; // wrap around without going out of bounds
        dispatch_async(dispatch_get_main_queue(), ^{
            imageV.image = arrImage[imageIndex];
        });
        timeCount = tempCount;
    }
    [weakElement update]; // re-render the watermark view into the filter chain
}];
```

    Video stitching, using AVMutableComposition:

```objc
- (void)mergeAndExportVideos:(NSArray *)videosPathArray withOutPath:(NSString *)outpath {
    if (videosPathArray.count == 0) {
        [YHToastHUD showToast:@"No video files to process!"];
        return;
    }
    // Composition that will hold the combined audio and video
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // Audio track container
    AVMutableCompositionTrack *audioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    // Video track container
    AVMutableCompositionTrack *videoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime totalDuration = kCMTimeZero;
    for (int i = 0; i < videosPathArray.count; i++) {
        NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
        // The array holds file paths, so build a file URL (not URLWithString:)
        AVAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videosPathArray[i]]
                                             options:options];

        NSError *erroraudio = nil;
        // Pull the audio track out of the asset
        AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        // Append it to the composition's audio track at the running offset
        BOOL ba = [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                      ofTrack:assetAudioTrack
                                       atTime:totalDuration
                                        error:&erroraudio];
        NSLog(@"erroraudio:%@ %d", erroraudio, ba);

        NSError *errorVideo = nil;
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL bl = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                      ofTrack:assetVideoTrack
                                       atTime:totalDuration
                                        error:&errorVideo];
        NSLog(@"errorVideo:%@ %d", errorVideo, bl);

        totalDuration = CMTimeAdd(totalDuration, asset.duration);
    }
    NSLog(@"%@", NSHomeDirectory());

    CGSize videoSize = [videoTrack naturalSize];
    AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    videoComp.frameDuration = CMTimeMake(1, 30);

    AVMutableVideoCompositionInstruction *instruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVAssetTrack *mixVideoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mixVideoTrack];
    instruction.layerInstructions = @[layerInstruction];
    videoComp.instructions = @[instruction];

    NSURL *mergeFileURL = [NSURL fileURLWithPath:outpath];
    // Export session; the preset controls the output resolution/quality
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                         presetName:AVAssetExportPresetMediumQuality];
    exporter.videoComposition = videoComp;
    exporter.outputURL = mergeFileURL;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.shouldOptimizeForNetworkUse = YES;
    __weak typeof(self) weak_self = self;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (exporter.status != AVAssetExportSessionStatusCompleted) {
                NSLog(@"export failed: %@", exporter.error);
                return;
            }
            [YHToastHUD showToast:@"Done...." completion:^{
                [weak_self dismissViewControllerAnimated:YES completion:nil];
            }];
        });
    }];
}
```

    Inspecting the recorded files when debugging the sample code:

    The recorded videos are saved under the Documents directory, which is synced by iTunes, so you can export the Video folder directly through iTunes (just drag it out). Alternatively, open Window -> Devices and Simulators in Xcode, find the device and app, and export the sandbox contents from there. test.mp4 is the final merged file.
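For feature 6 (grabbing the first frame of a clip), the sample project ships its own method; the standard approach it likely follows uses AVAssetImageGenerator. A minimal sketch:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Returns the first frame of the video at the given file path, or nil on failure.
- (UIImage *)firstFrameOfVideoAtPath:(NSString *)path {
    AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL fileURLWithPath:path]];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the recording orientation
    // Ask for exactly time zero rather than the nearest keyframe
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;

    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:&error];
    if (!cgImage) {
        NSLog(@"first-frame extraction failed: %@", error);
        return nil;
    }
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // copyCGImageAtTime: follows the Create/Copy rule
    return image;
}
```

Setting `appliesPreferredTrackTransform` matters for clips shot in portrait; without it the extracted frame comes back rotated.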

    Sample code: https://github.com/huizai0705/VideoRecorder_iOS
