Welcome to the linkAR technical documentation.


Processing Preview Frames

This section shows how to process each frame that the camera preview receives.

In ViewController.m we have:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // 1.- Init the tracker without the default camera
    _imageTracker = [[ImageTracker alloc] initWithAppKey:@"YOUR API KEY" useDefaultCamera:NO];
    // 2.- Set filter
    [_cvView setEnableMedianFilter:YES];
}

1- Initializes the tracker. Notice that we set useDefaultCamera:NO because we will provide our own camera frames.
2- The median filter is recommended because we are processing continuous frames.

2.1.- Initializing our own camera:

In ViewController.h we have:

#import "myCameraCapture.h"

@property (nonatomic, strong) myCameraCapture *captureManager;

In ViewController.m we have:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // 1.- Initialize Capture Manager
    [self setCaptureManager:[[myCameraCapture alloc] init]];
    // 2.- Set ourselves as the video controller and add the video input and output
    [[self captureManager] setVideocontroller:self];
    [[self captureManager] addVideoInput];
    [[self captureManager] addVideoOutput];
    // 3.- Add Video Preview Layer and set the frame
    [[self captureManager] addVideoPreviewLayer];
    CGRect layerRect = [[[self view] layer] bounds];
    [[[self captureManager] previewLayer] setBounds:layerRect];
    [[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                                  CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[[self captureManager] previewLayer]];
    [[[self captureManager] captureSession] startRunning];
}
1- Initialize Capture Manager.
2- We set ourselves as the video controller so that we receive preview frame updates.
3- Add Video Preview Layer and set the frame.

The ViewController's processImageCamera method is called for every frame and forwards the captured image to the tracker:

- (void)processImageCamera
{
    [_imageTracker processNewCameraFrame:self.captureManager.imageReference];
}

2.2.- The file that manages the camera:

In myCameraCapture.h we have:

#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@protocol myCameraCapture
- (void)processNewCameraFrame:(CVImageBufferRef)cameraFrame;
@end

@interface myCameraCapture : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;
@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (weak, nonatomic) id<myCameraCapture> delegate;
@property (nonatomic, strong) UIViewController *videocontroller;
@property (nonatomic) CVImageBufferRef imageReference;

- (void)addVideoInput;
- (void)addVideoOutput;
- (void)addVideoPreviewLayer;

@end

In myCameraCapture.m we have:

// Initializes the AVCaptureSession.
- (id)init
{
    if ((self = [super init])) {
        [self setCaptureSession:[[AVCaptureSession alloc] init]];
    }
    return self;
}

// Add the preview layer.
- (void)addVideoPreviewLayer
{
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

// Add video as device input.
- (void)addVideoInput
{
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:videoIn]) {
                [[self captureSession] addInput:videoIn];
            } else {
                NSLog(@"Couldn't add video input");
            }
        } else {
            NSLog(@"Couldn't create video input");
        }
    } else {
        NSLog(@"Couldn't create video capture device");
    }
}

// Add video output. This function is called only if the mode is MODEVIDEO.
- (void)addVideoOutput
{
    // 1.- Create a video data output and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [[self captureSession] addOutput:output];
    // 2.- Specify the pixel format
    output.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    // 3.- Configure the output
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
}

1- Create a VideoDataOutput and add it to the session.
2- Specify the pixel format (32-bit BGRA).
3- Configure the output: create a serial dispatch queue and set ourselves as the sample buffer delegate.

// This function is called for every new frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // 1.- Get the image buffer from the output sample buffer
    self.imageReference = CMSampleBufferGetImageBuffer(sampleBuffer);
    // 2.- Launch the processImageCamera function of the ViewController on the main thread
    [self.videocontroller performSelectorOnMainThread:@selector(processImageCamera) withObject:nil waitUntilDone:YES];
}

1- Get the image buffer from the output sample buffer.
2- Launch the processImageCamera function of the ViewController on the main thread.
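
Note that the buffer returned by CMSampleBufferGetImageBuffer is only valid while the sample buffer is alive, which is why waitUntilDone:YES is used above. If you need to read the raw pixel data of the CVImageBufferRef yourself (for example, to inspect a frame before handing it to the tracker), its base address must be locked first. A minimal sketch; the inspectFrame: helper name is our own, and it assumes the 32BGRA pixel format configured in addVideoOutput:

// Hypothetical helper: logs the dimensions and first pixel of a 32BGRA frame.
- (void)inspectFrame:(CVImageBufferRef)cameraFrame
{
    // CPU access to a CVPixelBuffer requires locking its base address.
    CVPixelBufferLockBaseAddress(cameraFrame, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(cameraFrame);
    size_t height = CVPixelBufferGetHeight(cameraFrame);
    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(cameraFrame);
    NSLog(@"Frame %zux%zu, first pixel BGRA = %u %u %u %u",
          width, height, base[0], base[1], base[2], base[3]);
    CVPixelBufferUnlockBaseAddress(cameraFrame, kCVPixelBufferLock_ReadOnly);
}

Remember to unlock the base address as soon as you are done reading; holding the lock can stall the capture pipeline.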