Hi Guys,
I got an image rotation function working a while back,
and recently put it to the test in an app, and have an interesting problem.
The image I'm testing is this:
[image: a blue box surrounded by plain black]
Well, not really, but it might as well be.
The image is stretched to be larger than the screen, and the origin I'm
rotating about will always be inside the blue box, so I can rotate the
image to any angle without worrying about which part of the black
surround gets chopped off, since the background is also plain black.
Every time the image is rotated there appears to be some skew: the angle of
the result can be off by about 5 degrees at worst, depending on the angle of
rotation. Also, depending on the angle, the resulting image can be distorted
into a slight rhombus rather than staying perfectly square.
I am drawing the image with the exact same bounds I started with, 1024x1024.
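For reference, what I'd expect from a pure rotation about a point is the textbook mapping below, with no scale or shear; this is just a sketch for comparing where a corner of the blue box actually lands, not part of the app code:
Code:
#import <CoreGraphics/CoreGraphics.h>
#include <math.h>

// Where a pure rotation about centre c by 'angle' (radians) should send point p.
// Comparing this against the rendered output shows whether the result is a
// true rotation or has picked up scale/shear.
static CGPoint rotatePointAbout(CGPoint p, CGPoint c, CGFloat angle)
{
    CGFloat dx = p.x - c.x;
    CGFloat dy = p.y - c.y;
    return CGPointMake(c.x + dx * cos(angle) - dy * sin(angle),
                       c.y + dx * sin(angle) + dy * cos(angle));
}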
Here is the code (image rotation):
Code:
- (CGImageRef)CGImageRotatedByAngle:(CGImageRef)imgRef angle:(CGFloat)angle
{
    // originx/originy are set before this is called (see the drawing code below)
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bmContext = CGBitmapContextCreate(NULL, 1024, 1024, 8, 0, colorSpace,
                                                   kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace);

    // rotate about (originx, originy): translate there, rotate, translate back
    CGContextTranslateCTM(bmContext, +originx, +originy);
    CGContextRotateCTM(bmContext, angle);
    CGContextTranslateCTM(bmContext, -originx, -originy);

    CGContextDrawImage(bmContext, CGRectMake(0, 0, 1024, 1024), imgRef);
    CGImageRef rotatedImage = CGBitmapContextCreateImage(bmContext);
    CFRelease(bmContext);
    return rotatedImage;
}
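To see the distortion numerically, this sketch (not in the app; it just mirrors the same translate/rotate/translate sequence with the same originx, originy and angle) logs the composed matrix. For a pure rotation, a and d should both be cos(angle), b should be sin(angle) and c should be -sin(angle), with no shear:
Code:
// Mirrors the CTM sequence above and logs the resulting matrix (needs UIKit
// for NSStringFromCGAffineTransform). Sketch only, for checking the transform.
CGAffineTransform t = CGAffineTransformIdentity;
t = CGAffineTransformTranslate(t, +originx, +originy);
t = CGAffineTransformRotate(t, angle);
t = CGAffineTransformTranslate(t, -originx, -originy);
NSLog(@"composed transform: %@", NSStringFromCGAffineTransform(t));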
And here is the part that finds the origin and draws to the screen:
Code:
// These floats are already loaded with the values I was already using to
// print the image without rotation; they are current screen coordinates.
tscreenx, tscreeny   // top left of image
bscreenx, bscreeny   // bottom right of image
screenx,  screeny    // screen coords for image centre

// How many image pixels one screen point is worth while the image
// is printed stretched.
pixelvalx = 1024.0 / (bscreenx - tscreenx);
pixelvaly = 1024.0 / (bscreeny - tscreeny);

// Find the rotation origin (in image pixels) using the image centre...
originx = 512.0 + ((160 - screenx) * pixelvalx);
originy = 512.0 + ((screeny - centery) * pixelvaly);
// ...or using the image corners:
//originx = (160 - tscreenx) * pixelvalx;
//originy = (bscreeny - centery) * pixelvaly;

// Rotate (angle converted from degrees to radians), then draw stretched.
CGImageRef imageRef = [self CGImageRotatedByAngle:[boxImage CGImage]
                                            angle:(360 - rotation) / 57.295791];
rotImage = [UIImage imageWithCGImage:imageRef];
CGRect boximageRect = CGRectMake(tscreenx, tscreeny,
                                 bscreenx - tscreenx, bscreeny - tscreeny);
[rotImage drawInRect:boximageRect];
CGImageRelease(imageRef);
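As an aside, the 57.295791 divisor is just degrees-to-radians (180/pi is about 57.29578); here is the same call written with M_PI, assuming rotation is in degrees as above:
Code:
// Equivalent conversion without the magic constant (sketch only):
CGFloat radians = (360.0 - rotation) * M_PI / 180.0;
CGImageRef imageRef = [self CGImageRotatedByAngle:[boxImage CGImage] angle:radians];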
Everything is fine (even if I needlessly run the image through the rotation function)
when the angle I'm rotating by is zero, and the origin always appears to line up correctly.
Any ideas on what's happening?
Thanks, Art.