Old Dec 23, 2012, 03:53 AM   #1
appson
macrumors newbie
 
Join Date: Nov 2012
AV Foundation encoding options beyond presets

I've been trying to figure out how to explore encoding options beyond the presets available in AV Foundation. QTKit seemed to offer fine-grained encoding options for H.264 and the like, but I just read another thread here from around 12/2011 saying AV Foundation is the way to go when you want to do more than just play AV assets. Please forgive me if I'm overlooking something obvious. A Google search yields only a plist of what appears to be the detailed (albeit outdated) encoding parameters behind AV Foundation's presets. Further, I was under the impression that GPU-accelerated encoding could only be achieved through AV Foundation. At a higher level, the app I'm building encodes H.264 from what will be fairly large ProRes files. Very much appreciate any tips; I feel like I'm missing something obvious.
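For reference, the preset-based route I'm on now (via AVAssetExportSession) looks roughly like this. It's only a sketch: the file URLs are placeholders and error handling is trimmed.

Code:
- (void) exportWithPreset
{
    AVAsset *asset = [AVAsset assetWithURL: [NSURL fileURLWithPath: @"/path/to/source-prores.mov"]];

    // A preset name like AVAssetExportPreset1920x1080 picks the encoder settings for you;
    // there's no per-parameter control over bit rate, profile, etc.
    AVAssetExportSession *session =
        [[[AVAssetExportSession alloc] initWithAsset: asset
                                          presetName: AVAssetExportPreset1920x1080] autorelease];

    session.outputURL = [NSURL fileURLWithPath: @"/path/to/output-h264.mov"];
    session.outputFileType = AVFileTypeQuickTimeMovie;

    [session exportAsynchronouslyWithCompletionHandler: ^{
        if (session.status == AVAssetExportSessionStatusCompleted)
            NSLog(@"Export finished");
        else
            NSLog(@"Export failed: %@", session.error);
    }];
}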
Old Dec 23, 2012, 12:12 PM   #2
xStep
macrumors 68000
 
Join Date: Jan 2003
Location: Lost in Minneapolis
Not sure what you're looking for. It can be difficult to find good information on AV Foundation. Reviewing the documentation to learn the nomenclature could be handy for web searches. If you haven't done so already, you should review Apple's sample code and WWDC videos.

Below is some sample code ripped from my CameraTime iOS app. In it you can see that I'm setting the bit rate, profile, and size of the output (my input and output are the same size). This requires a couple of mutable dictionaries that hold the settings and then get added to the appropriate AV objects. I threw in the metadata stuff as a bonus.

I haven't yet dealt with converting from one source file to another, as it sounds like you are doing. I don't think that should be too difficult, unless you want to perform extra processing on the frames or need real-time behaviour. There's a rough sketch of that reader/writer loop at the end of this post.


Code:
- (void) startRecording
{
    NSError *error = nil;

    // Compression settings: average bit rate and H.264 profile/level.
    NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
    [compressionSettings setValue: [NSNumber numberWithDouble: 24*(1024.0*1024.0)] forKey: AVVideoAverageBitRateKey];
    [compressionSettings setValue: AVVideoProfileLevelH264Main41 forKey: AVVideoProfileLevelKey];

    // Top-level output settings: codec, dimensions, and the compression dictionary above.
    outputSettings = [[NSMutableDictionary alloc] init];
    [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
    [outputSettings setValue: [NSNumber numberWithInt: 1920] forKey: AVVideoWidthKey];
    [outputSettings setValue: [NSNumber numberWithInt: 1080] forKey: AVVideoHeightKey];
    [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

    inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];

    outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
    outputWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, theProject.frameRateVideoOutput.frameRateCMTime.timescale);
    if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];

    outputWriter.metadata = [self commonMetadata];

    [outputWriter startWriting];

    imageSourceTime = CMTimeMake(0, theProject.frameRateVideoOutput.frameRateCMTime.timescale); //CMTimeMake(1,600);
    [outputWriter startSessionAtSourceTime: imageSourceTime];
}



- (NSMutableArray *) commonMetadata
{
    // QuickTime X, at least in Snow Leopard 10.6, does not display any of this metadata.
    // To do so seems to require that metadata be written to the base container track instead of to the video track like I'm doing here.
    // Use the command line tool "mdls" to see what the file metadata looks like.

    // You cannot set this property after writing on the receiver's asset writer has started.
    // The array contains AVMetadataItem objects representing the collection of track-level metadata to be written in the output file.
    NSMutableArray *myMetadata = [[NSMutableArray new] autorelease];
    AVMutableMetadataItem *metadataItem;

    // QT 7 displays this in the CMD-J window only.
    metadataItem = [AVMutableMetadataItem metadataItem];
    metadataItem.keySpace = AVMetadataKeySpaceCommon;
    metadataItem.key = AVMetadataCommonKeySoftware;
    metadataItem.value = @"CameraTime 1.1";
    [myMetadata addObject: metadataItem];

    return myMetadata;
}
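For the file-to-file case, something along these lines should work. This is a rough, untested sketch: the URLs are placeholders and the writer settings are built the same way as in startRecording above.

Code:
- (void) transcodeFromURL: (NSURL *) sourceURL toURL: (NSURL *) destURL
{
    NSError *error = nil;

    AVAsset *asset = [AVAsset assetWithURL: sourceURL];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType: AVMediaTypeVideo] objectAtIndex: 0];

    // The reader hands back decompressed frames so the writer can re-encode them as H.264.
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset: asset error: &error];
    NSDictionary *readerSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt: kCVPixelFormatType_422YpCbCr8]
        forKey: (id)kCVPixelBufferPixelFormatTypeKey];
    AVAssetReaderTrackOutput *readerOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack: videoTrack outputSettings: readerSettings];
    [reader addOutput: readerOutput];

    // Same kind of settings dictionary as in startRecording (codec, size, compression properties).
    NSDictionary *compression = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble: 24*(1024.0*1024.0)], AVVideoAverageBitRateKey,
        AVVideoProfileLevelH264Main41, AVVideoProfileLevelKey, nil];
    NSDictionary *h264Settings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt: 1920], AVVideoWidthKey,
        [NSNumber numberWithInt: 1080], AVVideoHeightKey,
        compression, AVVideoCompressionPropertiesKey, nil];

    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL: destURL fileType: AVFileTypeQuickTimeMovie error: &error];
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: h264Settings];
    writerInput.expectsMediaDataInRealTime = NO;
    if ([writer canAddInput: writerInput]) [writer addInput: writerInput];

    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime: kCMTimeZero];

    // Pull samples from the reader and push them to the writer until the source runs out.
    dispatch_queue_t queue = dispatch_queue_create("transcodeQueue", NULL);
    [writerInput requestMediaDataWhenReadyOnQueue: queue usingBlock: ^{
        while ([writerInput isReadyForMoreMediaData])
        {
            CMSampleBufferRef buffer = [readerOutput copyNextSampleBuffer];
            if (buffer == NULL)
            {
                // No more frames: close out the input and the movie file.
                [writerInput markAsFinished];
                [writer finishWriting];
                break;
            }
            [writerInput appendSampleBuffer: buffer];
            CFRelease(buffer);
        }
    }];
}
If you need extra processing on each frame, you'd convert the sample buffers to pixel buffers and append via an AVAssetWriterInputPixelBufferAdaptor instead of appending the sample buffers directly.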
__________________
My App: CameraTime - Time lapse photography for novice and advanced users.