AVCaptureVideoDataOutputSampleBufferDelegate is not triggering captureOutput


I am trying to build a card scanner app using the Vision framework on iOS. I am able to open the camera successfully, but the problem is that AVCaptureVideoDataOutputSampleBufferDelegate is not triggering the captureOutput function, where I will do the analysis for identifying the card.

Here is my .h file:

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface Trying : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) UIView *view;
- (void)setupCamera;
@end

And here is my .m file:

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import "Trying.h"

@interface Trying ()

@end

@implementation Trying

- (void)setupCamera {
    self.captureSession = [[AVCaptureSession alloc] init];
    
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    
    if (deviceInput) {
        [self.captureSession addInput:deviceInput];
        
        AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        previewLayer.frame = self.view.bounds;
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:previewLayer];
        
        self.previewLayer = previewLayer;
        
        // Create a video data output and set the delegate
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
        [self.captureSession addOutput:videoDataOutput];
        
        // Start the session
        [self.captureSession startRunning];
    } else {
        NSLog(@"Error setting up camera: %@", error.localizedDescription);
    }
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Process the live sample buffer here
    NSLog(@"Got a live sample buffer");
    
    // You can use the sampleBuffer for further processing, such as image recognition or analysis
}

- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"Dropped a sample buffer");
    // Handle dropped frames if needed
}
@end

Can anyone please explain why captureOutput is not being triggered by AVCaptureVideoDataOutputSampleBufferDelegate? I am able to see the live camera feed in the view, but captureOutput is never called.
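For reference, once captureOutput does fire, the analysis I have in mind would start from something like this minimal Vision sketch (assuming VNRecognizeTextRequest, available on iOS 13 and later, is enough to read text off the card; error handling is elided):

#import <Vision/Vision.h>

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Grab the pixel buffer backing this video frame
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    // Run text recognition on the frame; results arrive in the completion handler
    VNRecognizeTextRequest *request = [[VNRecognizeTextRequest alloc] initWithCompletionHandler:^(VNRequest *req, NSError *err) {
        for (VNRecognizedTextObservation *observation in req.results) {
            VNRecognizedText *top = [observation topCandidates:1].firstObject;
            NSLog(@"Recognized text: %@", top.string);
        }
    }];

    VNImageRequestHandler *handler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer options:@{}];
    NSError *error = nil;
    [handler performRequests:@[request] error:&error];
}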


1 Answer

Answered by Hao Liang

You need to hold the output object to prevent it from being released:

1. Declare a strong property for it: @property (nonatomic, strong) AVCaptureVideoDataOutput *cameraOutput;
2. Assign the output after creating it: _cameraOutput = videoDataOutput;

It is also recommended that you keep an AVCaptureConnection reference so you can distinguish the connection inside the AVCaptureVideoDataOutputSampleBufferDelegate callbacks:

3. Declare an instance variable: AVCaptureConnection *_videoConnection;
4. Assign it after adding the output: _videoConnection = [_cameraOutput connectionWithMediaType:AVMediaTypeVideo];
5. In captureOutput:didOutputSampleBuffer:fromConnection:, check if (connection == _videoConnection) before using the sampleBuffer.
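Putting these steps together, here is a minimal sketch of the suggested fix applied to the code from the question. The cameraOutput and videoConnection property names come from the steps above; everything else is unchanged from the asker's setupCamera and delegate methods.

@interface Trying ()
@property (nonatomic, strong) AVCaptureVideoDataOutput *cameraOutput;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
@end

@implementation Trying

- (void)setupCamera {
    // ... session, device, and input setup exactly as in the question ...

    self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
    [self.captureSession addOutput:videoDataOutput];

    // Steps 1 and 2: keep a strong reference to the output
    self.cameraOutput = videoDataOutput;
    // Steps 3 and 4: remember the video connection so the delegate can identify it
    self.videoConnection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];

    [self.captureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Step 5: only process buffers that arrive on the video connection
    if (connection == self.videoConnection) {
        NSLog(@"Got a live sample buffer");
    }
}

@end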