
jcl43

macrumors newbie
Original poster
Mar 9, 2008
Hi all, I am working on an application that involves real-time video editing with the Core Video framework in Cocoa. For testing we basically have a DV cam hooked up to a MacBook Pro and need to be able to edit the stream it produces. I successfully got the video into QuickTime using QTKit, but am having trouble acquiring the frames with a display link. The example in the Core Video guide shows it initializing the video source from a file with this method:

- (id)initWithFilePath:(NSString *)theFilePath


but I need to get the frames I can already access into the display link for OpenGL rendering and Core Image effects. If anyone has any ideas or advice, I would greatly appreciate it. Thanks in advance!
 

Sayer

macrumors 6502a
Jan 4, 2002
Austin, TX
In the Core Video Programming Guide, under Core Video Tasks, at the very end of "Obtaining Frames Using the Display Link" it talks about getting frames from QuickTime. Basically you grab a frame of video as an OpenGL texture and pass it back to Core Video for manipulation.

I think that is the direction you need to go (literally and figuratively).
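
Roughly, the display link side looks like this (a minimal sketch, assuming your NSOpenGLView subclass keeps a CVDisplayLinkRef ivar named displayLink; the MyVideoView class and -renderForTime: method are hypothetical names):

Code:
#import <Cocoa/Cocoa.h>
#import <CoreVideo/CoreVideo.h>

// Plain C callback: runs once per screen refresh on the display link's own
// high-priority thread, and hands off to the view for the actual drawing.
static CVReturn MyDisplayLinkCallback(CVDisplayLinkRef displayLink,
                                      const CVTimeStamp *now,
                                      const CVTimeStamp *outputTime,
                                      CVOptionFlags flagsIn,
                                      CVOptionFlags *flagsOut,
                                      void *displayLinkContext)
{
    return [(MyVideoView *)displayLinkContext renderForTime:outputTime];
}

// In the NSOpenGLView subclass
- (void)prepareOpenGL
{
    CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
    CVDisplayLinkSetOutputCallback(displayLink, &MyDisplayLinkCallback, self);

    // Tie the display link to this view's OpenGL context and pixel format
    CVDisplayLinkSetCurrentCGDisplayFromOpenGLContext(displayLink,
        [[self openGLContext] CGLContextObj],
        [[self pixelFormat] CGLPixelFormatObj]);

    CVDisplayLinkStart(displayLink);
}

Since the callback doesn't run on the main thread, whatever ivar it reads the current frame from needs locking.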
 

jcl43

macrumors newbie
Original poster
Mar 9, 2008
Thanks, I'm going to give it a try right now. When using this implementation, would I set the delegate to be an NSOpenGLView? Right now I have it output via a QTCaptureView. But if I want to do OpenGL rendering/Core Image effects on the frame, does it need to be an NSOpenGLView?
 

jcl43

macrumors newbie
Original poster
Mar 9, 2008
Maybe posting my code so far will help.

Code:
//
//  VideoDisplayController.m
//  Prototype492
//
//  Created on 3/7/08.
//

#import "VideoDisplayController.h"

@implementation VideoDisplayController 

/*
    start the video capture session if it's not running
*/
- (IBAction)startVideoCapture:(id)sender
{
    if (![captureSession isRunning])
    {
        [captureSession startRunning];
    }
}

/*
    stop the video capture session if it's running
*/
- (IBAction)stopVideoCapture:(id)sender
{
    if ([captureSession isRunning])
    {
        [captureSession stopRunning];
    }
}


/*
initialization of UI elements
*/
- (void)awakeFromNib
{
    NSLog(@"initializing video controller");

    if (!captureSession)
    {
        //creating the capture session
        captureSession = [[QTCaptureSession alloc] init];

        //flags for connecting inputs and outputs
        BOOL successConnect = NO;
        NSError *error = nil;

        //finding and creating the input device and adding it to the capture session
        //if unsuccessful, send a message to the error log

        //note: using QTMediaTypeMuxed for a DV camera... otherwise use QTMediaTypeVideo
        videoCaptureDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeMuxed];

        //if the video capture device is found, attempt to open it and add it to the capture session
        if (videoCaptureDevice)
        {
            successConnect = [videoCaptureDevice open:&error];

            if (!successConnect)
            {
                NSLog(@"error opening video capture device");
                videoCaptureDevice = nil;

                //other error handling code goes here
                //bail out rather than building a device input around nil
                return;
            }

            captureVideoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoCaptureDevice];

            //attempt to add the capture device input to the capture session
            successConnect = [captureSession addInput:captureVideoDeviceInput error:&error];

            if (!successConnect)
            {
                NSLog(@"error adding video capture device input to capture session");

                //other error handling code goes here
            }

            //adding the decompressed video output to the capture session
            decompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];

            //assigning the delegate of the decompressed video output
            [decompressedVideoOutput setDelegate:self];

            successConnect = [captureSession addOutput:decompressedVideoOutput error:&error];

            if (!successConnect)
            {
                NSLog(@"error adding decompressed video output to capture session");

                //other error handling code goes here
            }

            //linking the capture session to the capture view
            [captureView setCaptureSession:captureSession];

            //[captureSession startRunning];
        }
        else
        {
            NSLog(@"couldn't find video capture device");
        }
    }
}


/*
delegate method called when decompressedVideoOutput receives a frame
it is not called on the main thread, so a synchronized block is used
delegates of decompressedVideoOutput receive this message
and can then use the provided video frame for further image processing

captureOutput - the QTCaptureDecompressedVideoOutput instance that sent the frame (decompressedVideoOutput)
videoFrame - Core Video image buffer containing the decompressed frame
sampleBuffer - sample buffer containing additional info about the frame
connection - connection from which the video was received
*/
- (void)captureOutput:(QTCaptureOutput *)captureOutput
        didOutputVideoFrame:(CVImageBufferRef)videoFrame
        withSampleBuffer:(QTSampleBuffer *)sampleBuffer
        fromConnection:(QTCaptureConnection *)connection
{
    CVImageBufferRef releasedImageBuffer;

    CVBufferRetain(videoFrame);

    @synchronized (self)
    {
        //basically, have the frame to be released refer to the current frame,
        //then update the reference to the current frame with the next frame in the "video stream"
        releasedImageBuffer = currentImageBuffer;
        currentImageBuffer = videoFrame;
    }

    CVBufferRelease(releasedImageBuffer);
}





/*
event that's fired when the window is closed
stop the session and close any opened devices
*/
- (void)windowWillClose:(NSNotification *)notification
{
    if ([captureSession isRunning])
    {
        [captureSession stopRunning];
    }

    if ([videoCaptureDevice isOpen])
    {
        [videoCaptureDevice close];
    }
}

/*
deallocate memory when the object is destroyed
*/
- (void)dealloc
{
    //release the last retained frame so it doesn't leak
    CVBufferRelease(currentImageBuffer);

    [captureSession release];
    [captureVideoDeviceInput release];
    [decompressedVideoOutput release];

    [super dealloc];
}


@end


//
//  VideoDisplayController.h
//  Prototype492v1.1
//
//  Updated on 3/8/08.
//

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
#import "VideoView.h"

@interface VideoDisplayController : NSOpenGLView
{
    QTCaptureSession *captureSession;
    QTCaptureDevice *videoCaptureDevice;
    QTCaptureDeviceInput *captureVideoDeviceInput;
    QTCaptureDecompressedVideoOutput *decompressedVideoOutput;

    //stores the most recent frame grabbed, as a CVImageBufferRef
    CVImageBufferRef currentImageBuffer;

    //this is the panel on which the video from the DV camera will be displayed
    IBOutlet QTCaptureView *captureView;

    //IBOutlet VideoView *openGLVideoView;
}



- (IBAction)startVideoCapture:(id)sender;
- (IBAction)stopVideoCapture:(id)sender;


@end

Thanks again for any help.
 

Sayer

macrumors 6502a
Jan 4, 2002
981
0
Austin, TX
Yes, you use NSOpenGLView.

There is sample code called CIVideoDemoGL that does a lot of what you want to do and is easier to point to than copy/pasting it into a message board.
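
If it helps before you dig in, the drawing side boils down to something like this (a rough sketch in the spirit of CIVideoDemoGL, not the exact sample code; assumes a CIContext ivar named ciContext plus the latest captured frame in a CVImageBufferRef like the one above):

Code:
// One-time setup (e.g. in -prepareOpenGL): a CIContext built on the view's CGL context
ciContext = [[CIContext contextWithCGLContext:[[self openGLContext] CGLContextObj]
                                  pixelFormat:[[self pixelFormat] CGLPixelFormatObj]
                                      options:nil] retain];

// Per frame: wrap the capture buffer in a CIImage, run a filter, and draw
CIImage *inputImage = [CIImage imageWithCVImageBuffer:currentImageBuffer];

CIFilter *effect = [CIFilter filterWithName:@"CISepiaTone"]; // any Core Image filter works here
[effect setDefaults];
[effect setValue:inputImage forKey:@"inputImage"];
CIImage *outputImage = [effect valueForKey:@"outputImage"];

[ciContext drawImage:outputImage
             atPoint:CGPointZero
            fromRect:[outputImage extent]];

[[self openGLContext] flushBuffer];

Because the CIContext shares the view's OpenGL context, the frame never has to leave the GPU between the filter and the draw.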
 

jcl43

macrumors newbie
Original poster
Mar 9, 2008
Thanks for the info. That seriously helped me out a ton. I still need to figure out how to get the frames from the DV cam... in this example it opens from a file, I believe.
 

Sayer

macrumors 6502a
Jan 4, 2002
Austin, TX
QTKit now has video capture support in Leopard, and it's fairly easy to do.

Lemme look for the sample code I was studying earlier (for a big-time pro-level contract job with a rather large company).

Working...

The sample code is called "StillImage" but it does a lot of what you want, e.g. capturing from a live video source and doing something with the frames. Here's some of it:

Code:
        mCaptureSession = [[QTCaptureSession alloc] init];
        
        // Find a video device
        QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
        success = [device open:&error];
        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }
        
        // Add a device input for that device to the capture session
        mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
        success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];
        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }
        
        // Add a decompressed video output that returns raw frames to the session
        mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
        [mCaptureDecompressedVideoOutput setDelegate:self];
        success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];
        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }
        
        // Preview the video from the session in the document window
        [mCaptureView setCaptureSession:mCaptureSession];
        
        // Start the session
        [mCaptureSession startRunning];

Note that QTKit's video capture is Leopard only (for use in Cocoa apps anyway).

In this implementation you create a few delegate methods (callbacks) that hand off a frame of raw decompressed video that you can manipulate.

Code:
// This delegate method is called whenever the QTCaptureDecompressedVideoOutput receives a frame
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    // Store the latest frame
    // This must be done in a @synchronized block because this delegate method is not called on the main thread
    CVImageBufferRef imageBufferToRelease;
    
    CVBufferRetain(videoFrame);
    
    @synchronized (self) {
        imageBufferToRelease = mCurrentImageBuffer;
        mCurrentImageBuffer = videoFrame;
    }
    
    CVBufferRelease(imageBufferToRelease);
}
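
On the drawing side (your display link callback or -drawRect:) you read the frame back under the same lock, retaining it before use so this delegate can't release it out from under you. A sketch, reusing mCurrentImageBuffer from above:

Code:
CVImageBufferRef frameToDraw;

@synchronized (self) {
    // Retain inside the lock so the capture thread can't release it first
    frameToDraw = CVBufferRetain(mCurrentImageBuffer);
}

if (frameToDraw) {
    // Hand the frame to Core Image / OpenGL from here
    CIImage *image = [CIImage imageWithCVImageBuffer:frameToDraw];
    // ...apply filters and draw the image with your CIContext...
    CVBufferRelease(frameToDraw);
}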
 

fantastico

macrumors newbie
Mar 10, 2008
Sayer, from what I'm seeing in jcl43's code, he is already using QTKit to capture the frames from a DV cam, and even storing the current frame in a CVImageBufferRef... very similar to what you just posted.

In fact, in the original post, he states: "I successfully got the video into QuickTime using QTKit, but am having trouble acquiring the frames with a display link."
 