AVFoundation (AVPlayer) supported formats? No .vob or .mpg containers?

The AVURLAsset class has a static method that you can query for supported video UTIs: + (NSArray *)audiovisualTypes. On 10.9.1 it returns these system-defined UTIs: public.mpeg, public.mpeg-2-video, public.avi, public.aifc-audio, public.aac-audio, public.mpeg-4, public.au-audio, public.aiff-audio, public.mp2, public.3gpp2, public.ac3-audio, public.mp3, public.mpeg-2-transport-stream, public.3gpp, public.mpeg-4-audio. Here is an explanation of system UTIs. So it seems that at least the … Read more
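For a quick check of the same list from Swift, the class method is exposed as AVURLAsset.audiovisualTypes(); the transport-stream lookup below is only an illustrative example, not part of the original answer:

```swift
import AVFoundation

// Print every UTI that AVURLAsset reports as playable on this system.
let supportedTypes = AVURLAsset.audiovisualTypes()
for type in supportedTypes {
    print(type.rawValue) // e.g. public.mpeg-4, public.mp3, public.mpeg-2-transport-stream, ...
}

// Illustrative check for a single container type.
let supportsMPEG2TS = supportedTypes.contains(AVFileType("public.mpeg-2-transport-stream"))
print("MPEG-2 transport stream supported: \(supportsMPEG2TS)")
```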

iOS – Scale and crop CMSampleBufferRef/CVImageBufferRef

If you use vImage you can work directly on the buffer data without converting it to any image format. outImg contains the cropped and scaled image data. The relation between outWidth and cropWidth sets the scaling. int cropX0, cropY0, cropHeight, cropWidth, outWidth, outHeight; CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); CVPixelBufferLockBaseAddress(imageBuffer,0); void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); size_t bytesPerRow = … Read more
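A rough Swift sketch of the same approach, assuming a BGRA pixel buffer; the function name and the caller-owned output buffer are my own choices, not part of the original answer:

```swift
import Accelerate
import CoreMedia
import CoreVideo

// Crop a region of a BGRA sample buffer and scale it with vImage.
// The caller is responsible for freeing the returned buffer's data.
func cropAndScale(sampleBuffer: CMSampleBuffer,
                  cropX: Int, cropY: Int, cropWidth: Int, cropHeight: Int,
                  outWidth: Int, outHeight: Int) -> vImage_Buffer? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

    guard let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)

    // Point a vImage_Buffer at the crop region inside the source pixel data (4 bytes per pixel).
    let cropStart = baseAddress.advanced(by: cropY * bytesPerRow + cropX * 4)
    var inBuffer = vImage_Buffer(data: cropStart,
                                 height: vImagePixelCount(cropHeight),
                                 width: vImagePixelCount(cropWidth),
                                 rowBytes: bytesPerRow)

    // Allocate the destination buffer; the outWidth/cropWidth ratio determines the scale factor.
    let outBytesPerRow = outWidth * 4
    guard let outData = malloc(outHeight * outBytesPerRow) else { return nil }
    var outBuffer = vImage_Buffer(data: outData,
                                  height: vImagePixelCount(outHeight),
                                  width: vImagePixelCount(outWidth),
                                  rowBytes: outBytesPerRow)

    let error = vImageScale_ARGB8888(&inBuffer, &outBuffer, nil, vImage_Flags(kvImageNoFlags))
    guard error == kvImageNoError else {
        free(outData)
        return nil
    }
    return outBuffer
}
```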

create movie from [UIImage], Swift

I converted the Objective-C code posted by @Cameron E to Swift 3, and it's working. The answer's link: @Cameron E's CEMovieMaker. Following is the CXEImagesToVideo class: // // CXEImagesToVideo.swift // VideoAPPTest // // Created by Wulei on 16/12/14. // Copyright © 2016 wulei. All rights reserved. // import Foundation import AVFoundation import UIKit typealias CXEMovieMakerCompletion = … Read more
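One building block any such class needs (and which the truncated excerpt does not show) is turning a UIImage into a CVPixelBuffer that the writer's pixel buffer adaptor can append. A minimal sketch with a hypothetical helper name:

```swift
import AVFoundation
import UIKit

// Render a UIImage into a CVPixelBuffer suitable for AVAssetWriterInputPixelBufferAdaptor.
func pixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    let attributes: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(size.width), Int(size.height),
                                     kCVPixelFormatType_32ARGB,
                                     attributes as CFDictionary,
                                     &buffer)
    guard status == kCVReturnSuccess, let pixelBuffer = buffer, let cgImage = image.cgImage else {
        return nil
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Draw the image into the buffer's memory with Core Graphics.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: Int(size.width), height: Int(size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else {
        return nil
    }
    context.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pixelBuffer
}
```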

How can I know when users click the fast forward and fast rewind buttons on the playback controls on iPhone?

I found the answer myself: use UIApplication's beginReceivingRemoteControlEvents. In an appropriate place (like viewWillAppear:) put the following code: [[UIApplication sharedApplication] beginReceivingRemoteControlEvents]; [self becomeFirstResponder]; The view controller should also implement the following method, returning YES: - (BOOL)canBecomeFirstResponder { return YES; } You can then receive remote control events in the following method. … Read more
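The same pattern in Swift, sketched with a hypothetical view controller; fast forward and rewind arrive as the seeking subtypes of the remote-control event:

```swift
import UIKit

class PlayerViewController: UIViewController {

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start receiving remote control events and become first responder.
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }

    override var canBecomeFirstResponder: Bool { true }

    // Remote control events land here; fast forward/rewind show up as seeking subtypes.
    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlBeginSeekingForward:
            print("fast forward pressed")
        case .remoteControlBeginSeekingBackward:
            print("fast rewind pressed")
        default:
            break
        }
    }
}
```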

AVSpeechSynthesizer in background mode

You must enable "Audio and AirPlay" in Background Modes and configure the audio session: NSError *error = NULL; AVAudioSession *session = [AVAudioSession sharedInstance]; [session setCategory:AVAudioSessionCategoryPlayback error:&error]; if (error) { // Do some error handling } [session setActive:YES error:&error]; if (error) { // Do some error handling }
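For comparison, a minimal Swift sketch of the same session setup (the function name is hypothetical, and the background mode still has to be enabled in the target's capabilities):

```swift
import AVFoundation

// Configure the shared audio session for playback so speech keeps running in the background.
func configureAudioSessionForBackgroundSpeech() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        // Do some error handling
        print("Audio session error: \(error)")
    }
}
```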

Waveform on iOS

Thanks, all. I found this example here: Drawing waveform with AVAssetReader, changed it, and developed a new class based on it. This class returns a UIImageView. //.h file #import <UIKit/UIKit.h> @interface WaveformImageVew : UIImageView{ } -(id)initWithUrl:(NSURL*)url; - (NSData *) renderPNGAudioPictogramLogForAssett:(AVURLAsset *)songAsset; @end //.m file #import "WaveformImageVew.h" #define absX(x) (x<0?0-x:x) #define minMaxX(x,mn,mx) (x<=mn?mn:(x>=mx?mx:x)) #define noiseFloor (-50.0) #define … Read more
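A condensed Swift sketch of the sampling half of that idea, assuming a local file URL: read 16-bit PCM with AVAssetReader and reduce the samples to per-bucket peaks that a view could then draw. The function name and bucket count are my own, not from the linked class:

```swift
import AVFoundation

// Decode the first audio track to 16-bit PCM and return normalized peak values per bucket.
func waveformSamples(for url: URL, bucketCount: Int = 200) throws -> [Float] {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .audio).first else { return [] }

    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    // Pull every sample buffer and collect the raw Int16 samples.
    var allSamples: [Int16] = []
    while let sampleBuffer = output.copyNextSampleBuffer(),
          let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
        let length = CMBlockBufferGetDataLength(blockBuffer)
        var data = Data(count: length)
        data.withUnsafeMutableBytes { ptr in
            _ = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: length,
                                           destination: ptr.baseAddress!)
        }
        data.withUnsafeBytes { raw in
            allSamples.append(contentsOf: raw.bindMemory(to: Int16.self))
        }
    }
    guard !allSamples.isEmpty else { return [] }

    // Take the peak absolute value in each bucket, normalized to 0...1.
    let samplesPerBucket = max(1, allSamples.count / bucketCount)
    return stride(from: 0, to: allSamples.count, by: samplesPerBucket).map { start in
        let bucket = allSamples[start..<min(start + samplesPerBucket, allSamples.count)]
        let peak = bucket.map { abs(Int32($0)) }.max() ?? 0
        return Float(peak) / Float(Int16.max)
    }
}
```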

AVAssetWriterInput for making Video from UIImages on iPhone Issues

I found that for some reason I needed to append the buffer more than once. The timing in this example from a test app I made might not be proper, but since it works it should give you a good idea. + (void)writeImageAsMovie:(UIImage*)image toPath:(NSString*)path size:(CGSize)size duration:(int)duration { NSError *error = nil; AVAssetWriter *videoWriter = [[AVAssetWriter … Read more
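A compact Swift sketch of the overall flow, assuming a pixelBuffer(from:size:) helper like the one sketched under the "create movie from [UIImage]" entry above; the function name and frame timing are illustrative, and a production version would drive the writes with requestMediaDataWhenReady(on:using:) instead of busy-waiting:

```swift
import AVFoundation
import UIKit

// Write a sequence of UIImages to a QuickTime movie, one frame per 1/fps second.
func writeImagesAsMovie(_ images: [UIImage], to outputURL: URL,
                        size: CGSize, fps: Int32 = 30) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)

    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, image) in images.enumerated() {
        // Simple polling for a sketch; use requestMediaDataWhenReady(on:using:) in real code.
        while !input.isReadyForMoreMediaData { usleep(10_000) }
        // pixelBuffer(from:size:) is the UIImage-to-CVPixelBuffer helper sketched earlier.
        if let buffer = pixelBuffer(from: image, size: size) {
            let time = CMTime(value: CMTimeValue(index), timescale: fps)
            adaptor.append(buffer, withPresentationTime: time)
        }
    }

    input.markAsFinished()
    writer.finishWriting {
        print("Finished writing to \(outputURL)")
    }
}
```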