Sir, how would I pass the result image from CoreImageVideoFilter to the main view? #24

Open
burkaslarry opened this issue Oct 17, 2017 · 0 comments


I am currently implementing the filter in the capturePhoto action, as below.

When it runs, the filter is not working.

    @IBAction func capturePhoto(_ sender: Any) {
        
        // stop text recognition
        cameraSession.stopRunning()
        
        //start filter
        videoFilter = CoreImageVideoFilter(superview: view, applyFilterCallback: nil)
        // Simulate a tap on the mode selector to start the process

        if let videoFilter = videoFilter {
            videoFilter.stopFiltering()
            detector = prepareRectangleDetector()
            videoFilter.applyFilter = { image in
                // Run the detection once, cache the result, and return it
                // (the original ran it twice and force-unwrapped the first result).
                guard let filtered = self.performRectangleDetection(image) else { return nil }
                self.resultCIIMage = filtered
                return filtered
            }
            
            videoFilter.startFiltering(currentSession: cameraSession)
        }
        
        cameraSession.beginConfiguration()
        cameraSession.sessionPreset = AVCaptureSessionPresetPhoto
        let device : AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: device)
            if cameraSession.canAddInput(captureDeviceInput) {
                cameraSession.addInput(captureDeviceInput)
            }
        }
        catch {
            print("Error occurred: \(error)")
            return
        }
        if device.isFocusModeSupported(.continuousAutoFocus) {
            do {
                try device.lockForConfiguration()
                device.focusMode = .continuousAutoFocus
                device.unlockForConfiguration()
            } catch {
                print("Could not lock device for configuration: \(error)")
            }
        }
        runStillImageCaptureAnimation()
        if cameraSession.canAddOutput(cameraOutput) {
            cameraSession.addOutput(cameraOutput)
        }
        cameraSession.commitConfiguration()
        
        let photoSettings = AVCapturePhotoSettings()
        photoSettings.flashMode = .on
        photoSettings.isHighResolutionPhotoEnabled = true
        
        if let previewFormatType = photoSettings.__availablePreviewPhotoPixelFormatTypes.first {
            photoSettings.previewPhotoFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewFormatType]
        }
        
        cameraSession.startRunning()
        print("go this")
        cameraOutput.isHighResolutionCaptureEnabled = true
        cameraOutput.capturePhoto(with: photoSettings, delegate: self)
        //  print("go that")
        //   cameraSession.stopRunning()
    }
    
    
    func runStillImageCaptureAnimation(){
        DispatchQueue.main.async {
            self.preview.layer.opacity = 0.0
            print("opacity 0")
            UIView.animate(withDuration: 1.0) {
                self.preview.layer.opacity = 1.0
                print("opacity 1")
            }
        }
    }
    
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        
        print("go three")
        
        if let error = error {
            print("Error occurred: \(error.localizedDescription)")
        }
        
        if  let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let dataImage =  AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer:  sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            
            guard let dataProvider = CGDataProvider(data: dataImage as CFData),
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .defaultIntent) else {
                print("Could not decode JPEG data")
                return
            }
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)
            
            capturedImage = videoFilter?.resultImage == nil ? image : convert(cmage: resultCIIMage)
            user?.setScannedlist(list: self.scannerdText)
            user?.capImage(captured :  capturedImage!  )
            tesseract?.delegate = nil
            tesseract = nil
            
            
            self.dismiss(animated: false) { self.cameraSession.stopRunning() }
            
        } else {
            print("Could not get photo sample buffers")
        }
    }
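For reference, this is roughly what I expect convert(cmage:) to do, and the hand-off I am trying to achieve. Neither the conversion body nor the onCapture property is from the project; both are illustrative sketches.

```swift
import UIKit
import CoreImage

// A sketch of the CIImage-to-UIImage conversion that the convert(cmage:)
// call above presumably performs. The implementation is assumed, not taken
// from CoreImageVideoFilter.
func convert(cmage: CIImage) -> UIImage {
    let context = CIContext(options: nil)
    // Render through a CIContext so the result is backed by a CGImage,
    // which UIImageView displays reliably.
    if let cgImage = context.createCGImage(cmage, from: cmage.extent) {
        return UIImage(cgImage: cgImage)
    }
    // Fall back to a CIImage-backed UIImage if rendering fails.
    return UIImage(ciImage: cmage)
}

// One possible way to pass the captured image back to the presenting
// ("main") view controller: a completion closure set before presenting the
// camera controller. The names CameraViewController and onCapture are
// illustrative, not from the project.
class CameraViewController: UIViewController {
    var onCapture: ((UIImage) -> Void)?

    func finish(with image: UIImage) {
        // Hand the image to whoever presented us, then dismiss.
        onCapture?(image)
        dismiss(animated: false, completion: nil)
    }
}
```

The presenting view controller would set `onCapture` before calling `present(_:animated:)`, store the image in its own property inside the closure, and update its image view there.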