
MovieOutput with Camera resulted in black pixels #85

Open · barbayrak opened this issue Jun 5, 2020 · 5 comments

@barbayrak

Hi there,

I want to migrate from GPUImage2 to GPUImage3, but when I run a simple test that captures from the Camera and writes to a file, black pixels randomly appear in the output. I'm sharing my test view controller and one frame of the resulting video (the black pixels show up mostly near the top of the recorded video). I also tested this code on an iPhone X (iOS 13.5) and an iPhone 8 (iOS 12.1); the iPhone X shows far fewer black pixels than the iPhone 8.

When I run the same code with GPUImage2, everything is fine.

(Attached: IMG_0002, one frame of the recorded video showing the black pixels.)

import UIKit
import AVFoundation
import AVKit
import GPUImage

class ViewController: UIViewController {

    @IBOutlet var renderView: RenderView!
    
    var camera : Camera!
    var effectMovie : MovieInput!
    var movieOutput : MovieOutput!
    
    var isRecording = false
    var recordUrl = ""
    
    override func viewDidLoad() {
        super.viewDidLoad()
        setupViews()
    }
    
    func setupViews(){

        do {
            camera = try Camera(sessionPreset: .hd1280x720)
            camera --> renderView
            camera.startCapture()
        } catch {
            fatalError("Could not initialize rendering pipeline: \(error)")
        }
        
    }
    
    @IBAction func recordTapped(_ sender: Any) {
        if (!isRecording) {
            do {
                self.isRecording = true
                
                let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
                let filePath = documentDirectory.path + "/" + "recorded" + ".mov"
                self.recordUrl = filePath
                let fileURL = URL(fileURLWithPath: filePath)
                do {
                    try FileManager.default.removeItem(at: fileURL)
                } catch {
                    // Ignore the error if no previous recording exists at this path
                }
                
                movieOutput = try MovieOutput(URL:fileURL, size:Size(width:720.0, height:1280.0), liveVideo:true)
                
                camera --> movieOutput
                
                movieOutput!.startRecording()
                
            } catch {
                fatalError("Couldn't initialize movie, error: \(error)")
            }
        } else {
            movieOutput?.finishRecording{
                self.isRecording = false
                self.movieOutput = nil
                DispatchQueue.main.async {
                    self.performSegue(withIdentifier: "test", sender: nil)
                }
                
            }
        }
    }
 
    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if(segue.identifier == "test"){
            let dest = segue.destination as! TestVideoViewController
            dest.path = recordUrl
        }
    }
}
@luoser

luoser commented Aug 25, 2020

Do you have an audioEncodingTarget configured on the GPUImage Camera? I encountered this issue as well when trying to enable the audio track (by copying over bits from GPUImage2), but the black pixels went away when I removed the audio logic. Reference: BradLarson/GPUImage#1255
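
For context, the audio hookup in question follows the GPUImage2 recording pattern sketched below. This is only a sketch adapted from the recording code in this issue; camera.audioEncodingTarget comes from GPUImage2 and, per this thread, has to be ported into GPUImage3 by hand, so don't treat it as working GPUImage3 code as shipped:

// Sketch only: GPUImage2-style audio hookup, adapted from the recording code above.
// camera.audioEncodingTarget is the GPUImage2 property whose logic gets copied over,
// and it is the piece that appears to trigger the black pixels.
do {
    movieOutput = try MovieOutput(URL: fileURL,
                                  size: Size(width: 720.0, height: 1280.0),
                                  liveVideo: true)
    camera.audioEncodingTarget = movieOutput   // route microphone audio into the movie writer
    camera --> movieOutput
    movieOutput.startRecording()
} catch {
    fatalError("Couldn't initialize movie, error: \(error)")
}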

@eliot1019

@luoser I also re-added some of the audioEncodingTarget logic from GPUImage2 so that my video would have audio. I'm sometimes getting black squares on the top parts of my video. Which logic did you end up removing?

The reference you posted seems to suggest removing the first and last frame from the video, which isn't relevant in my case, since only parts of the video have these black squares.

@luoser

luoser commented Sep 24, 2020

@eliot1019 You're correct, that reference is about dropping frames rather than the black-pixel issue we're experiencing; I had followed some of the comments there to see if they would help. I actually haven't been able to resolve this issue consistently, so I'm still working on it. Unfortunately, reverting to GPUImage2 doesn't seem to be a viable option, since OpenGL is deprecated and my project targets iOS 13+. :/

@Darwel

Darwel commented Sep 25, 2020

In MovieOutput.swift, change this:

func renderIntoPixelBuffer(_ pixelBuffer:CVPixelBuffer, texture:Texture) {
        guard let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer) else {
            print("Could not get buffer bytes")
            return
        }

        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        
        let outputTexture:Texture
        
        if (Int(round(self.size.width)) != texture.texture.width) && (Int(round(self.size.height)) != texture.texture.height) {
            let commandBuffer = sharedMetalRenderingDevice.commandQueue.makeCommandBuffer()

            outputTexture = Texture(device:sharedMetalRenderingDevice.device, orientation: .portrait, width: Int(round(self.size.width)), height: Int(round(self.size.height)), timingStyle: texture.timingStyle)

            commandBuffer?.renderQuad(pipelineState: renderPipelineState, inputTextures: [0:texture], outputTexture: outputTexture)
            commandBuffer?.commit()
            commandBuffer?.waitUntilCompleted()
        } else {
            outputTexture = texture
        }
        
        let region = MTLRegionMake2D(0, 0, outputTexture.texture.width, outputTexture.texture.height)
        
        outputTexture.texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    }

to this:

func renderIntoPixelBuffer(_ pixelBuffer:CVPixelBuffer, texture:Texture) {
        guard let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer) else {
            print("Could not get buffer bytes")
            return
        }

        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        
        let outputTexture:Texture
        
        let commandBuffer = sharedMetalRenderingDevice.commandQueue.makeCommandBuffer()
        outputTexture = Texture(device:sharedMetalRenderingDevice.device,
                                orientation: .portrait,
                                width: Int(round(self.size.width)),
                                height: Int(round(self.size.height)),
                                timingStyle: texture.timingStyle)
        
        commandBuffer?.renderQuad(pipelineState: renderPipelineState, inputTextures: [0:texture], outputTexture: outputTexture)
        commandBuffer?.commit()
        commandBuffer?.waitUntilCompleted()
        
        let region = MTLRegionMake2D(0, 0, outputTexture.texture.width, outputTexture.texture.height)
        
        outputTexture.texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    }

This will fix it. The CPU read-back (getBytes) should wait until the GPU has finished rendering into the output texture; in the original code, when the sizes already match, the incoming texture is read back immediately without waiting for the GPU to complete.
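
For reference, a non-blocking variant of the same idea would be to defer the read-back to the command buffer's completion handler instead of blocking the encoding thread with waitUntilCompleted(). The following is only a sketch assuming the same surrounding MovieOutput context (sharedMetalRenderingDevice, renderPipelineState, self.size), and the caller would still need to append the pixel buffer to the asset writer only after the handler has run:

// Sketch: same fix, but using addCompletedHandler instead of waitUntilCompleted().
// Metal guarantees the handler runs after the GPU has finished this command buffer,
// so getBytes sees fully rendered pixels. The pixel buffer must stay locked until then.
func renderIntoPixelBufferAsync(_ pixelBuffer: CVPixelBuffer, texture: Texture, completion: @escaping () -> Void) {
    guard let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        print("Could not get buffer bytes")
        return
    }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    let outputTexture = Texture(device: sharedMetalRenderingDevice.device,
                                orientation: .portrait,
                                width: Int(round(self.size.width)),
                                height: Int(round(self.size.height)),
                                timingStyle: texture.timingStyle)

    guard let commandBuffer = sharedMetalRenderingDevice.commandQueue.makeCommandBuffer() else { return }
    commandBuffer.renderQuad(pipelineState: renderPipelineState,
                             inputTextures: [0: texture],
                             outputTexture: outputTexture)

    // Registered before commit(); runs once the GPU work for this buffer is done.
    commandBuffer.addCompletedHandler { _ in
        let region = MTLRegionMake2D(0, 0, outputTexture.texture.width, outputTexture.texture.height)
        outputTexture.texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
        completion()   // only now is it safe to hand the pixel buffer to the asset writer
    }
    commandBuffer.commit()
}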

@bhadresh13

bhadresh13 commented Mar 16, 2024

(Attached: Screenshot 2024-03-16 at 11.48.25 AM)

The actual video is green.

(Attached: Simulator Screenshot - iPhone 15 Pro - 2024-03-16 at 11.48.11)

I removed the green color using ChromaKeyBlend, but it leaves a black background. I want to remove that black background because I want to play the video in AR. How can I do that?
