
Jakintosh™

macrumors member
Original poster
Jun 21, 2007
I have an app that uses a live feed from the camera (via a UIImagePickerController) with a cameraOverlay on top of it. I now want to take a "screen shot" of the screen, save it to a UIImage, and display that in a view on top of the live feed so that it appears the camera input has paused. I have everything else in place to do this, but I can't figure out how to grab the screen shot.
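For reference, one common way to grab such a snapshot is to render the view's layer into an image context. A minimal sketch (snapshotImage() is a hypothetical helper; note that the hardware camera preview itself often comes out black in a layer render, while overlay views capture fine):

[CODE]
import UIKit

extension UIView {
    // Render this view's layer into a UIImage, i.e. a basic "screen shot"
    // of the view hierarchy rooted at this view.
    func snapshotImage() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)
        defer { UIGraphicsEndImageContext() }
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        layer.render(in: context)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}

// Hypothetical usage: freeze the display by laying the snapshot over the live feed.
// let frozen = UIImageView(image: picker.view.snapshotImage())
// picker.cameraOverlayView?.addSubview(frozen)
[/CODE]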

Also, I'm using AVFoundation in another class to turn the flash torch on and off. Unfortunately, this freezes my camera input from the UIImagePickerController for the rest of the app's lifetime, until I kill it... does anyone know why this happens?
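For context, a minimal sketch of torch control with AVFoundation (assumed to roughly match the torch code described above; setTorch(on:) is a hypothetical helper). A plausible cause of the freeze is that whatever locks or runs a capture session on the device for the torch is contending with the camera that UIImagePickerController is already using:

[CODE]
import AVFoundation

// Toggle the torch by locking the default camera for configuration.
// Doing this (or running a separate AVCaptureSession just for the torch)
// while UIImagePickerController is also using the camera means two clients
// are contending for the same capture device, which can stall the preview.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Could not lock capture device for torch: \(error)")
    }
}
[/CODE]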

EDIT: It seems I might have to migrate all of my camera code to AVFoundation. Can I still have a live video feed if I use AVFoundation instead of UIImagePickerController?
 

This is the way to do it. If you look back through my posts, you'll even find helpful sample code showing how to get each and every frame...
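For anyone reading later, a minimal sketch of that AVFoundation route (an assumed setup, not the exact sample code referred to above; LiveFeedViewController is a hypothetical name): an AVCaptureSession feeds an AVCaptureVideoPreviewLayer for the live display, and an AVCaptureVideoDataOutput delegate receives every frame.

[CODE]
import UIKit
import AVFoundation

final class LiveFeedViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let videoQueue = DispatchQueue(label: "camera.frames")

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wire the default camera into the session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Deliver each captured frame to the delegate on a background queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: videoQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        // Live preview layer, the replacement for UIImagePickerController's feed.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process, display, or snapshot the frame here.
    }
}
[/CODE]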
 
Awesome, thanks. Is there an equivalent of the camera overlay when using AVFoundation?
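(A tiny sketch of the idea, building on a controller like the one above: AVFoundation has no cameraOverlayView property, but the overlay can simply be an ordinary subview placed above the view hosting the preview layer. MyCameraOverlayView stands in for your existing overlay view.)

[CODE]
// In the view controller that owns the AVCaptureVideoPreviewLayer:
let overlay = MyCameraOverlayView(frame: view.bounds)  // hypothetical: your existing overlay view
view.addSubview(overlay)  // subviews render above the preview layer added with addSublayer
[/CODE]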
 