adapting to live stream using LFLiveKit #1

@sgallo0692

Description

Your project is the best example I can find of implementing CVPixelBuffer and FPV extensions. I'm trying to use LFLiveKit to pass the CVPixelBuffer along to an RTMP destination. In your code, where would be the right place to pass the CVPixelBuffer frames? I'm attempting this:

func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) {
        if frame.pointee.cv_pixelbuffer_fastupload != nil {
            // Reinterpret cv_pixelbuffer_fastupload as a CVPixelBuffer
            pixelBuffer = unsafeBitCast(frame.pointee.cv_pixelbuffer_fastupload, to: CVPixelBuffer.self)
            print("pushed video1")
        } else {
            // Build the CVPixelBuffer manually; createPixelBuffer() is an extension function on VideoFrameYUV
            pixelBuffer = createPixelBuffer(fromFrame: frame.pointee)
            print("pushed video2")
        }
        // Unwrap before pushing so we never hand the session a nil buffer
        guard let cvBuf = pixelBuffer else { return }
        session?.pushVideo(cvBuf)
    }

The stream technically starts (though I'm getting an error on my server), but I don't know if I'm calling pushVideo in the right place to continuously push the CVPixelBuffer frames along.
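For context, here is a minimal sketch of how I'm setting up the session before pushing frames. This follows LFLiveKit's Objective-C headers as I understand them; the capture-type constant and the bridged Swift names may differ slightly, and the RTMP URL is a placeholder:

    import LFLiveKit

    // Assumption: the session must be created with a capture mask that uses
    // external (pushed) input rather than the built-in camera/microphone,
    // otherwise pushVideo frames are ignored.
    let audioConfig = LFLiveAudioConfiguration.defaultConfiguration(for: .high)
    let videoConfig = LFLiveVideoConfiguration.defaultConfiguration(for: .high3)
    let session = LFLiveSession(audioConfiguration: audioConfig,
                                videoConfiguration: videoConfig,
                                captureType: LFLiveCaptureTypeMask.inputMaskAll)

    let stream = LFLiveStreamInfo()
    stream.url = "rtmp://example.com/live/streamKey"  // placeholder destination
    session?.startLive(stream)

    // Then, for every decoded frame, videoProcessFrame above calls:
    // session?.pushVideo(cvBuf)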
