Question

How can I simply scan barcodes on iPhone and/or iPad?

Solution

We produced the 'Barcodes' application for the iPhone. It can decode QR Codes. The source code is available from the zxing project; specifically, you want to take a look at the iPhone client and the partial C++ port of the core library. The port is a little old, from circa the 0.9 release of the Java code, but should still work reasonably well.

If you need to scan other formats, like 1D formats, you could continue the port of the Java code within this project to C++.

EDIT: Barcodes and the iPhone code in the project were retired around the start of 2014.

OTHER TIPS

Check out ZBar. It reads QR Codes and EAN/ISBN codes and is available under the LGPL v2 license.

As of the release of iOS 7 you no longer need to use an external framework or library. The iOS ecosystem with AVFoundation now fully supports scanning almost every code, from QR over EAN to UPC.

Just have a look at the Tech Note and the AVFoundation programming guide. AVMetadataObjectTypeQRCode is your friend.

Here is a nice tutorial which shows it step by step: iPhone QR code scan library iOS7

Just a little example of how to set it up (note that you still need to implement the AVCaptureMetadataOutputObjectsDelegate callback and call startRunning on the session):

#pragma mark -
#pragma mark AVFoundationScanSetup

- (void)setupScanner
{
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    self.session = [[AVCaptureSession alloc] init];

    self.output = [[AVCaptureMetadataOutput alloc] init];
    [self.session addOutput:self.output];
    [self.session addInput:self.input];

    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    self.output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];

    self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.preview.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);

    AVCaptureConnection *con = self.preview.connection;

    con.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;

    [self.view.layer insertSublayer:self.preview atIndex:0];
}

The iPhone 4 camera is more than capable of reading barcodes. The zebra crossing (zxing) barcode library has a fork on GitHub, zxing-iphone. It's open source.

liteqr is a "Lite QR Reader in Objective C ported from zxing" on GitHub and has support for Xcode 4.

There are two major libraries:

  • ZXing, a library written in Java and then ported to Objective-C / C++ (QR code only). Another Objective-C port has been done by TheLevelUp: ZXingObjC

  • ZBar, an open-source, C-based library for reading bar codes.

According to my experiments, ZBar is far more accurate and faster than ZXing, at least on the iPhone.

Below you can find another native iOS solution, using Swift 4 and Xcode 9. The native AVFoundation framework is used in this solution.

The first part is a subclass of UIViewController with the setup and handler functions for the AVCaptureSession.

import UIKit
import AVFoundation

class BarCodeScannerViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!
    var initialized = false

    let barCodeTypes = [AVMetadataObject.ObjectType.upce,
                        AVMetadataObject.ObjectType.code39,
                        AVMetadataObject.ObjectType.code39Mod43,
                        AVMetadataObject.ObjectType.code93,
                        AVMetadataObject.ObjectType.code128,
                        AVMetadataObject.ObjectType.ean8,
                        AVMetadataObject.ObjectType.ean13,
                        AVMetadataObject.ObjectType.aztec,
                        AVMetadataObject.ObjectType.pdf417,
                        AVMetadataObject.ObjectType.itf14,
                        AVMetadataObject.ObjectType.dataMatrix,
                        AVMetadataObject.ObjectType.interleaved2of5,
                        AVMetadataObject.ObjectType.qr]

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupCapture()
        // set observer for UIApplicationWillEnterForeground, so we know when to start the capture session again
        NotificationCenter.default.addObserver(self,
                                           selector: #selector(willEnterForeground),
                                           name: .UIApplicationWillEnterForeground,
                                           object: nil)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // this view is no longer topmost in the app, so we don't need a callback if we return to the app.
        NotificationCenter.default.removeObserver(self,
                                              name: .UIApplicationWillEnterForeground,
                                              object: nil)
    }

    // This is called when we return from another app to the scanner view
    @objc func willEnterForeground() {
        setupCapture()
    }

    func setupCapture() {
        var success = false
        var accessDenied = false
        var accessRequested = false

        let authorizationStatus = AVCaptureDevice.authorizationStatus(for: .video)
        if authorizationStatus == .notDetermined {
            // permission dialog not yet presented, request authorization
            accessRequested = true
            AVCaptureDevice.requestAccess(for: .video,
                                          completionHandler: { (granted: Bool) -> Void in
                                              // The completion handler may run off the main queue,
                                              // so hop back to it before calling setupCapture() again.
                                              DispatchQueue.main.async {
                                                  self.setupCapture()
                                              }
            })
            return
        }
        if authorizationStatus == .restricted || authorizationStatus == .denied {
            accessDenied = true
        }
        if initialized {
            success = true
        } else {
            let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera,
                                                                                        .builtInTelephotoCamera,
                                                                                        .builtInDualCamera],
                                                                          mediaType: .video,
                                                                          position: .unspecified)

            if let captureDevice = deviceDiscoverySession.devices.first {
                do {
                    let videoInput = try AVCaptureDeviceInput(device: captureDevice)
                    captureSession.addInput(videoInput)
                    success = true
                } catch {
                    NSLog("Cannot construct capture device input")
                }
            } else {
                NSLog("Cannot get capture device")
            }
        }
        if success {
            DispatchQueue.global().async {
                self.captureSession.startRunning()
                DispatchQueue.main.async {
                    let captureMetadataOutput = AVCaptureMetadataOutput()
                    self.captureSession.addOutput(captureMetadataOutput)
                    let newSerialQueue = DispatchQueue(label: "barCodeScannerQueue") // in iOS 11 you can use main queue
                    captureMetadataOutput.setMetadataObjectsDelegate(self, queue: newSerialQueue)
                    captureMetadataOutput.metadataObjectTypes = self.barCodeTypes
                    self.videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
                    self.videoPreviewLayer.videoGravity = .resizeAspectFill
                    self.videoPreviewLayer.frame = self.view.layer.bounds
                    self.view.layer.addSublayer(self.videoPreviewLayer)
                } 
            }
            initialized = true
        } else {
            // Only show a dialog if we have not just asked the user for permission to use the camera.
            // Asking for permission presents its own dialog to the user.
            if !accessRequested {
                // Generic message if we cannot figure out why we cannot establish a camera session
                var message = "Cannot access camera to scan bar codes"
                #if (arch(i386) || arch(x86_64)) && (!os(macOS))
                    message = "You are running on the simulator, which does not hae a camera device.  Try this on a real iOS device."
                #endif
                if accessDenied {
                    message = "You have denied this app permission to access to the camera.  Please go to settings and enable camera access permission to be able to scan bar codes"
                }
                let alertPrompt = UIAlertController(title: "Cannot access camera", message: message, preferredStyle: .alert)
                let confirmAction = UIAlertAction(title: "OK", style: .default, handler: { (action) -> Void in
                    self.navigationController?.popViewController(animated: true)
                })
                alertPrompt.addAction(confirmAction)
                self.present(alertPrompt, animated: true, completion: nil)
            }
        }
    }

    func handleCapturedOutput(metadataObjects: [AVMetadataObject]) {
        if metadataObjects.count == 0 {
            return
        }

        guard let metadataObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject else {
            return
        }

        if barCodeTypes.contains(metadataObject.type) {
            if let metaDataString = metadataObject.stringValue {
                captureSession.stopRunning()
                displayResult(code: metaDataString)
                return
            }
        }
    }

    func displayResult(code: String) {
        let alertPrompt = UIAlertController(title: "Bar code detected", message: code, preferredStyle: .alert)
        if let url = URL(string: code) {
            let confirmAction = UIAlertAction(title: "Launch URL", style: .default, handler: { (action) -> Void in
                UIApplication.shared.open(url, options: [:], completionHandler: { (result) in
                    if result {
                        NSLog("opened url")
                    } else {
                        let alertPrompt = UIAlertController(title: "Cannot open url", message: nil, preferredStyle: .alert)
                        let confirmAction = UIAlertAction(title: "OK", style: .default, handler: { (action) -> Void in
                        })
                        alertPrompt.addAction(confirmAction)
                        self.present(alertPrompt, animated: true, completion: {
                            self.setupCapture()
                        })
                    }
                })        
            })
            alertPrompt.addAction(confirmAction)
        }
        let cancelAction = UIAlertAction(title: "Cancel", style: .cancel, handler: { (action) -> Void in
            self.setupCapture()
        })
        alertPrompt.addAction(cancelAction)
        present(alertPrompt, animated: true, completion: nil)
    }

}

Second part is the extension of our UIViewController subclass for AVCaptureMetadataOutputObjectsDelegate where we catch the captured outputs.

extension BarCodeScannerViewController: AVCaptureMetadataOutputObjectsDelegate {

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        handleCapturedOutput(metadataObjects: metadataObjects)
    }

}

Update for Swift 4.2

.UIApplicationWillEnterForeground changes to UIApplication.willEnterForegroundNotification.
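
For example, the observer registration in viewDidAppear above becomes:

// Swift 4.2+: the notification names moved onto UIApplication.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(willEnterForeground),
                                       name: UIApplication.willEnterForegroundNotification,
                                       object: nil)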

Not sure if this will help, but here is a link to an open source QR Code library. As you can see, a couple of people have already used it to create apps for the iPhone.

Wikipedia has an article explaining what QR Codes are. In my opinion, QR Codes are much more fit for purpose than standard barcodes where the iPhone is concerned, as they were designed for this type of use.

If support for the iPad 2 or iPod Touch is important for your application, I'd choose a barcode scanner SDK that can decode barcodes in blurry images, such as our Scandit barcode scanner SDK for iOS and Android. Decoding blurry barcode images is also helpful on phones with autofocus cameras because the user does not have to wait for the autofocus to kick in.

Scandit comes with a free community price plan and also has a product API that makes it easy to convert barcode numbers into product names.

(Disclaimer: I'm a co-founder of Scandit)

The problem with the iPhone camera is that the first models (of which there are tons in use) have a fixed-focus camera that cannot take in-focus pictures at distances under 2 ft. The images are blurry and distorted, and if taken from a greater distance there is not enough detail/information in the barcode.

A few companies have developed iPhone apps that can compensate for that by using advanced de-blurring technologies. You can find those applications on the Apple App Store: pic2shop, RedLaser and ShopSavvy. All of these companies have announced that they also have SDKs available, some for free or on very preferential terms; check them out.
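
If you build your own scanner with AVFoundation on a device that does have autofocus, it can also help to opt into continuous autofocus so the user does not have to wait for a focus cycle. A minimal sketch (my addition, assuming you already hold a configured AVCaptureDevice):

import AVFoundation

// Sketch: prefer continuous autofocus when the device supports it, so the
// barcode comes into focus without waiting for a tap-to-focus cycle.
func enableContinuousAutofocus(on device: AVCaptureDevice) {
    guard device.isFocusModeSupported(.continuousAutoFocus) else { return }
    do {
        try device.lockForConfiguration()
        device.focusMode = .continuousAutoFocus
        device.unlockForConfiguration()
    } catch {
        print("Could not lock capture device for configuration: \(error)")
    }
}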

For a native iOS 7 bar code scanner take a look at my project on GitHub:

https://github.com/werner77/WECodeScanner

Here is some simple code (written against the older, Swift 2 era AVFoundation APIs):

func scanbarcode()
{
    view.backgroundColor = UIColor.blackColor()
    captureSession = AVCaptureSession()

    let videoCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    let videoInput: AVCaptureDeviceInput

    do {
        videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
    } catch {
        return
    }

    if (captureSession.canAddInput(videoInput)) {
        captureSession.addInput(videoInput)
    } else {
        failed();
        return;
    }

    let metadataOutput = AVCaptureMetadataOutput()

    if (captureSession.canAddOutput(metadataOutput)) {
        captureSession.addOutput(metadataOutput)

        metadataOutput.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
        metadataOutput.metadataObjectTypes = [AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypePDF417Code]
    } else {
        failed()
        return
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession);
    previewLayer.frame = view.layer.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    view.layer.addSublayer(previewLayer);
    view.addSubview(closeBtn)
    view.addSubview(backimg)

    captureSession.startRunning();

}
override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
    // Dispose of any resources that can be recreated.
}

func failed() {
    let ac = UIAlertController(title: "Scanning not supported", message: "Your device does not support scanning a code from an item. Please use a device with a camera.", preferredStyle: .Alert)
    ac.addAction(UIAlertAction(title: "OK", style: .Default, handler: nil))
    presentViewController(ac, animated: true, completion: nil)
    captureSession = nil
}

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)

    if (captureSession?.running == false) {
        captureSession.startRunning();
    }
}

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)

    if (captureSession?.running == true) {
        captureSession.stopRunning();
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
    captureSession.stopRunning()

    if let metadataObject = metadataObjects.first {
        let readableObject = metadataObject as! AVMetadataMachineReadableCodeObject;

        AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
        foundCode(readableObject.stringValue);
    }

   // dismissViewControllerAnimated(true, completion: nil)
}

func foundCode(code: String) {
    var createAccountErrorAlert: UIAlertView = UIAlertView()
    createAccountErrorAlert.delegate = self
    createAccountErrorAlert.title = "Alert"
    createAccountErrorAlert.message = code
    createAccountErrorAlert.addButtonWithTitle("ok")
    createAccountErrorAlert.addButtonWithTitle("Retry")
    createAccountErrorAlert.show()
    NSUserDefaults.standardUserDefaults().setObject(code, forKey: "barcode")
    NSUserDefaults.standardUserDefaults().synchronize()
    ItemBarcode = code
    print(code)
}

override func prefersStatusBarHidden() -> Bool {
    return true
}

override func supportedInterfaceOrientations() -> UIInterfaceOrientationMask {
    return .Portrait
}

If you are developing for iOS 10.2 or above with Swift 4, you can try my solution. I mixed this and this tutorial together and came up with a ViewController which scans a QR Code and print()s it out. I also have a Switch in my UI to toggle the camera light, which might be helpful as well. For now I have only tested it on an iPhone SE; please let me know if it doesn't work on newer iPhones.

Here you go:

import UIKit
import AVFoundation

class QRCodeScanner: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    let captureSession: AVCaptureSession = AVCaptureSession()
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    let qrCodeFrameView: UIView = UIView()
    var captureDevice: AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Get the back-facing camera for capturing videos
        let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera, .builtInDualCamera], mediaType: AVMediaType.video, position: .back)

        captureDevice = deviceDiscoverySession.devices.first
        if captureDevice == nil {
            print("Failed to get the camera device")
            return
        }

        do {
            // Get an instance of the AVCaptureDeviceInput class using the previous device object.
            let input = try AVCaptureDeviceInput(device: captureDevice!)

            // Set the input device on the capture session.
            captureSession.addInput(input)

            // Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
            let captureMetadataOutput = AVCaptureMetadataOutput()
            captureSession.addOutput(captureMetadataOutput)

            // Set delegate and use the default dispatch queue to execute the call back
            captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            captureMetadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]

            // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.

            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

            if let videoPreviewLayer = videoPreviewLayer {
                videoPreviewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
                videoPreviewLayer.frame = view.layer.bounds
                view.layer.addSublayer(videoPreviewLayer)

                // Start video capture.
                captureSession.startRunning()

                if let hasFlash = captureDevice?.hasFlash, let hasTorch = captureDevice?.hasTorch {
                    if hasFlash && hasTorch {
                        view.bringSubview(toFront: bottomBar)
                        try captureDevice?.lockForConfiguration()
                    }
                }
            }

            // QR Code Overlay
            qrCodeFrameView.layer.borderColor = UIColor.green.cgColor
            qrCodeFrameView.layer.borderWidth = 2
            view.addSubview(qrCodeFrameView)
            view.bringSubview(toFront: qrCodeFrameView)

        } catch {
            // If any error occurs, simply print it out and don't continue any more.
            print("Error: \(error)")
            return
        }
    }

    // MARK: Buttons and Switch

    @IBAction func switchFlashChanged(_ sender: UISwitch) {
        do {
            if sender.isOn {
                captureDevice?.torchMode = .on
            } else {
                captureDevice?.torchMode = .off
            }
        }
    }

    // MARK: AVCaptureMetadataOutputObjectsDelegate

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

        // Check if the metadataObjects array is not nil and it contains at least one object.
        if metadataObjects.count == 0 {
            qrCodeFrameView.frame = CGRect.zero
            return
        }

        // Get the metadata object.
        let metadataObj = metadataObjects[0] as! AVMetadataMachineReadableCodeObject

        if metadataObj.type == AVMetadataObject.ObjectType.qr {
            // If the found metadata is equal to the QR code metadata then update the status label's text and set the bounds
            let barCodeObject = videoPreviewLayer?.transformedMetadataObject(for: metadataObj)
            qrCodeFrameView.frame = barCodeObject!.bounds

            print("QR Code: \(metadataObj.stringValue)")
        }
    }
}

Sometimes it can also be useful to generate QR codes. There is a superb C library for this which works like a charm, called libqrencode. Writing a custom view to display the QR code is then not that difficult and can be done with a basic understanding of QuartzCore.
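
If you would rather avoid an external dependency, Core Image also ships a built-in "CIQRCodeGenerator" filter that can generate QR codes natively. Here is a small sketch of that alternative (not of libqrencode itself; the helper name is just illustrative):

import UIKit
import CoreImage

// Sketch: generate a QR code with Core Image's built-in CIQRCodeGenerator filter.
func makeQRCode(from string: String, scale: CGFloat = 10) -> UIImage? {
    guard let data = string.data(using: .utf8),
          let filter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    filter.setValue(data, forKey: "inputMessage")
    filter.setValue("M", forKey: "inputCorrectionLevel") // error correction: L, M, Q or H
    guard let output = filter.outputImage else { return nil }
    // The raw output is one point per module, so scale it up before displaying it.
    return UIImage(ciImage: output.transformed(by: CGAffineTransform(scaleX: scale, y: scale)))
}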

You can check out the ZBarSDK to read QR Codes and EAN/ISBN codes. It's simple to integrate; try the following code.

- (void)scanBarcodeWithZBarScanner
{
    // ADD: present a barcode reader that scans from the camera feed
    ZBarReaderViewController *reader = [ZBarReaderViewController new];
    reader.readerDelegate = self;
    reader.supportedOrientationsMask = ZBarOrientationMaskAll;

    ZBarImageScanner *scanner = reader.scanner;
    // TODO: (optional) additional reader configuration here

    // EXAMPLE: disable rarely used I2/5 to improve performance
    [scanner setSymbology: ZBAR_I25
                   config: ZBAR_CFG_ENABLE
                       to: 0];

    // Get the return value from the controller
    [reader setReturnBlock:^(BOOL value) {

    }];

    // Present the reader
    [self presentViewController:reader animated:YES completion:nil];
}

and in didFinishPickingMediaWithInfo we get the bar code value.

- (void)imagePickerController:(UIImagePickerController *)reader
    didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // ADD: get the decode results
    id<NSFastEnumeration> results = [info objectForKey:ZBarReaderControllerResults];
    ZBarSymbol *symbol = nil;
    for (symbol in results)
        // EXAMPLE: just grab the first barcode
        break;

    // EXAMPLE: do something useful with the barcode data
    barcodeValue = symbol.data;

    // EXAMPLE: do something useful with the barcode image
    barcodeImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    [_barcodeIV setImage:barcodeImage];

    // set the values for the text fields
    [self setBarcodeValue:YES];

    // ADD: dismiss the controller (NB dismiss from the *reader*!)
    [reader dismissViewControllerAnimated:YES completion:nil];
}

I believe this can be done using AVFoundation; here is the sample code to do this:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate
{

    @IBOutlet weak var lblQRCodeResult: UILabel!
    @IBOutlet weak var lblQRCodeLabel: UILabel!

    var objCaptureSession:AVCaptureSession?
    var objCaptureVideoPreviewLayer:AVCaptureVideoPreviewLayer?
    var vwQRCode:UIView?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.configureVideoCapture()
        self.addVideoPreviewLayer()
        self.initializeQRView()
    }

    func configureVideoCapture() {
        let objCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        var error:NSError?
        let objCaptureDeviceInput: AnyObject!
        do {
            objCaptureDeviceInput = try AVCaptureDeviceInput(device: objCaptureDevice) as AVCaptureDeviceInput

        } catch let error1 as NSError {
            error = error1
            objCaptureDeviceInput = nil
        }
        objCaptureSession = AVCaptureSession()
        objCaptureSession?.addInput(objCaptureDeviceInput as! AVCaptureInput)
        let objCaptureMetadataOutput = AVCaptureMetadataOutput()
        objCaptureSession?.addOutput(objCaptureMetadataOutput)
        objCaptureMetadataOutput.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
        objCaptureMetadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode]
    }

    func addVideoPreviewLayer() {
        objCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer(session: objCaptureSession)
        objCaptureVideoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        objCaptureVideoPreviewLayer?.frame = view.layer.bounds
        self.view.layer.addSublayer(objCaptureVideoPreviewLayer!)
        objCaptureSession?.startRunning()
        self.view.bringSubviewToFront(lblQRCodeResult)
        self.view.bringSubviewToFront(lblQRCodeLabel)
    }

    func initializeQRView() {
        vwQRCode = UIView()
        vwQRCode?.layer.borderColor = UIColor.redColor().CGColor
        vwQRCode?.layer.borderWidth = 5
        self.view.addSubview(vwQRCode!)
        self.view.bringSubviewToFront(vwQRCode!)
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
        if metadataObjects == nil || metadataObjects.count == 0 {
            vwQRCode?.frame = CGRectZero
            lblQRCodeResult.text = "QR Code wans't found"
            return
        }
        let objMetadataMachineReadableCodeObject = metadataObjects[0] as! AVMetadataMachineReadableCodeObject
        if objMetadataMachineReadableCodeObject.type == AVMetadataObjectTypeQRCode {
            let objBarCode = objCaptureVideoPreviewLayer?.transformedMetadataObjectForMetadataObject(objMetadataMachineReadableCodeObject as AVMetadataMachineReadableCodeObject) as! AVMetadataMachineReadableCodeObject
            vwQRCode?.frame = objBarCode.bounds;
            if objMetadataMachineReadableCodeObject.stringValue != nil {
                lblQRCodeResult.text = objMetadataMachineReadableCodeObject.stringValue
            }
        }
    }
}

With Swift 5 it's simple and super fast!

You just need to add the "BarcodeScanner" CocoaPod. Here is the full code.

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '12.0'

target 'Simple BarcodeScanner' do
  pod 'BarcodeScanner'
end

Make sure to add the camera permission to your .plist file:

<key>NSCameraUsageDescription</key>
<string>Camera usage description</string>

And add the scanner and handle the result in your ViewController this way:

import UIKit
import BarcodeScanner

class ViewController: UIViewController, BarcodeScannerCodeDelegate, BarcodeScannerErrorDelegate, BarcodeScannerDismissalDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()

        let viewController = BarcodeScannerViewController()
        viewController.codeDelegate = self
        viewController.errorDelegate = self
        viewController.dismissalDelegate = self

        present(viewController, animated: true, completion: nil)
    }

    func scanner(_ controller: BarcodeScannerViewController, didCaptureCode code: String, type: String) {
        print("Product's Bar code is :", code)
        controller.dismiss(animated: true, completion: nil)
    }

    func scanner(_ controller: BarcodeScannerViewController, didReceiveError error: Error) {
        print(error)
    }

    func scannerDidDismiss(_ controller: BarcodeScannerViewController) {
        controller.dismiss(animated: true, completion: nil)
    }
}

If you still have any questions or challenges, please check the sample application here with the full source code.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow