EAGLContext with Core Image Filters


I want to overlay images on a camera video feed in real time; the result only needs to be displayed, never saved. The standard Core Image procedure (rendering through a CGImage on the CPU) works fine, but I need a higher frame rate. However, if I uncomment the GPU code below and comment out the CPU code, nothing is drawn. OpenGLView is just a UIView subclass with this method:

+ (Class)layerClass {
    return [CAEAGLLayer class];
}
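
For completeness, this is a minimal sketch of the whole subclass; nothing else is overridden beyond +layerClass:

// OpenGLView.h
#import <UIKit/UIKit.h>

@interface OpenGLView : UIView
@end

// OpenGLView.m
#import "OpenGLView.h"
#import <QuartzCore/QuartzCore.h>

@implementation OpenGLView

// Back the view with a CAEAGLLayer so OpenGL ES / Core Image can render into it.
+ (Class)layerClass {
    return [CAEAGLLayer class];
}

@end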

//Working code
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
CIContext *context = [CIContext contextWithEAGLContext:myEAGLContext options:options];
UIImage* foto = [UIImage imageNamed:@"foto2.jpg"];
UIImage* foto2 = [UIImage imageNamed:@"foto_no_ok.png"];
CIImage *backgroundImage = [[CIImage alloc]initWithImage:foto];
CIImage *foregroundImage = [[CIImage alloc]initWithImage:foto2];
CIFilter *myFilter = [CIFilter filterWithName:@"CISourceOverCompositing"
                                keysAndValues:kCIInputImageKey, foregroundImage,
                                              kCIInputBackgroundImageKey, backgroundImage, nil];
CIImage* resultingImage = [myFilter outputImage];

/* Not working code, GPU processing
self.view = [[OpenGLView alloc]initWithFrame:self.view.frame];
[self.view setOpaque:YES];
[self.view setFrame:CGRectMake(0, 0, 500, 500)];
[context drawImage:resultingImage inRect:self.view.bounds fromRect:self.view.bounds];
 */

//Working code, CPU processing
CGRect extent = [resultingImage extent];
CGImageRef cgImage = [context createCGImage:resultingImage fromRect:extent];
UIImage* myImage = [[UIImage alloc]initWithCGImage:cgImage];
self.myImageView.image = myImage;
CFRelease(cgImage);

What is missing? What am I doing wrong? Thank you very much.


There is 1 answer below.

Carlos Pastor:

OK, I finally figured it out. Remember not to put this code in viewDidLoad:. The images will still need some work, but that's a different topic.

// One EAGL context shared by the Core Image context and the GLKView.
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
// Turn off color management; the result only needs to be displayed, not color-matched.
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
self.context = [CIContext contextWithEAGLContext:myEAGLContext options:options];
[EAGLContext setCurrentContext:myEAGLContext];
// visorOpenGLView is a GLKView; it must use the same EAGLContext.
self.visorOpenGLView.context = myEAGLContext;

UIImage *foto = [UIImage imageNamed:@"onepic.jpg"];
UIImage *foto2 = [UIImage imageNamed:@"anotherpic.png"];
CIImage *backgroundImage = [[CIImage alloc] initWithImage:foto];
CIImage *foregroundImage = [[CIImage alloc] initWithImage:foto2];
CIFilter *myFilter = [CIFilter filterWithName:@"CISourceOverCompositing"
                                keysAndValues:kCIInputImageKey, foregroundImage,
                                              kCIInputBackgroundImageKey, backgroundImage, nil];
self.resultingImage = [myFilter outputImage];

// Also show the composited image in a regular UIImageView.
UIImage *result = [UIImage imageWithCIImage:self.resultingImage];
self.resultingImageView.image = result;

// GPU path: draw the CIImage directly into the GLKView's drawable.
CGRect ext = [foregroundImage extent];
[self.visorOpenGLView bindDrawable];
[self.context drawImage:self.resultingImage inRect:self.visorOpenGLView.frame fromRect:ext];
[self.visorOpenGLView display];
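
As a sketch of where that code can live instead of viewDidLoad:, something like the following works; the controller name MyViewController, the split into setupCoreImagePipeline / drawCompositedFrame, and the choice of viewDidAppear: are just my own illustration, not something Core Image requires:

#import <GLKit/GLKit.h>
#import <CoreImage/CoreImage.h>

@implementation MyViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // The view hierarchy is laid out by now, so the GLKView can be drawn into.
    [self setupCoreImagePipeline];   // the EAGLContext / CIContext / CIFilter code above
    [self drawCompositedFrame];      // the bindDrawable / drawImage / display code above
}

@end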