Video 360 in OpenGL (iOS) part 4 – Video Reader


In the previous posts we learned how to create a static 360° video player based on a sky sphere. In this post we will learn how to read a video from a local file, extract individual frames, and map them onto the sky sphere. The most common container for 360° video files is .mp4, and we will use that format as well.

We’ll start by creating an AVURLAsset object from the video’s URL:


let asset = AVURLAsset(URL: self.videoURL, options: nil)

Then we will create a player item and add a video output to it:


let pixelBufferAttributes = [ kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA)]
self.videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: pixelBufferAttributes)

self.playerItem = AVPlayerItem(asset: asset)
self.playerItem.addOutput(self.videoOutput)

Thanks to AVPlayerItemVideoOutput, we will be able to extract video frames as CVPixelBuffer objects.

The next step is to create an AVPlayer and start playback:


self.player = AVPlayer(playerItem: self.playerItem)
self.player.play()

The last piece of our VideoReader class will be a function that gives access to the raw data of the current video frame. We will achieve this by obtaining a CVPixelBuffer and reading its base address. We will use a frameHandler closure to pass the frame outside the VideoReader object (specifically, to our Skysphere object). We use a closure instead of a return value because we want to call CVPixelBufferLockBaseAddress and CVPixelBufferUnlockBaseAddress in the same scope:


func currentFrame(frameHandler: ((size: CGSize, frameData: UnsafeMutablePointer<Void>) -> Void)?)
{
    guard let playerItem = self.playerItem where playerItem.status == .ReadyToPlay else
    {
        return
    }

    let currentTime = playerItem.currentTime()
    guard let pixelBuffer = self.videoOutput.copyPixelBufferForItemTime(currentTime, itemTimeForDisplay: nil) else
    {
        return
    }

    // Lock the base address before touching the buffer's raw bytes.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly)

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    frameHandler?(size: CGSize(width: width, height: height), frameData: baseAddress)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly)
}

In the Scene3DViewController, we will create a VideoReader object and fill the update function:


func update()
{
    self.videoReader?.currentFrame(
    {
        [weak self] (size, frameData) -> Void in
        self?.skysphere.updateTexture(size, imageData: frameData)
    })
}

Optionally, we can loop the video by observing the AVPlayerItemDidPlayToEndTimeNotification notification:


NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidPlayToEndTime:", name: AVPlayerItemDidPlayToEndTimeNotification, object: self.playerItem)

func playerItemDidPlayToEndTime(notification: NSNotification)
{
    self.player.seekToTime(kCMTimeZero)
    self.player.play()
}

That’s it. Our spherical video player is ready!

Some last notes and remarks:

  • Note that some versions of OpenGL may require texture dimensions to be powers of 2. However, the sample video used in the project (1280×640) works fine at its original dimensions.
  • Fortunately, we can easily define the size of the output frames in AVPlayerItemVideoOutput’s pixelBufferAttributes (see the kCVPixelBufferWidthKey and kCVPixelBufferHeightKey keys).
  • Please mind that the format parameter of the glTexImage2D function has changed since the 3rd part of this tutorial (Sky sphere). Previously we used the GL_RGBA format because we loaded a standard .png image. Now we should use the GL_BGRA format, because that is the format we specified in the VideoReader (kCVPixelFormatType_32BGRA is the format Apple recommends on iOS).
  • The described video player was made as a proof of concept and has not been used in any commercial product yet.
  • You may want to check Aralekk’s repo to see how he has implemented a 360° video player using SceneKit (Swift).
  • Another approach to spherical video can be found in hanton’s GitHub repository (Objective-C).
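To make the note about output frame size concrete, here is a minimal sketch (in the same Swift 2 era syntax as the rest of this post) of a pixelBufferAttributes dictionary that also pins the buffer dimensions; the 1024×512 size is just an example that keeps both dimensions powers of 2:

```swift
let pixelBufferAttributes = [
    kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
    // Ask AVPlayerItemVideoOutput to deliver frames at an explicit size:
    kCVPixelBufferWidthKey as String : NSNumber(int: 1024),
    kCVPixelBufferHeightKey as String : NSNumber(int: 512)
]
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: pixelBufferAttributes)
```

With these attributes, copyPixelBufferForItemTime returns frames already scaled to the requested size, so no extra resizing is needed before the texture upload.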
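The GL_BGRA remark can be illustrated with the texture upload itself. This is a sketch rather than the exact code from the project: it assumes a 2D texture is already bound, and that size and frameData are the values delivered by the frame handler:

```swift
// Upload one BGRA frame into the currently bound sky sphere texture.
// The internal format stays GL_RGBA; GL_BGRA describes the layout of the
// incoming bytes, matching kCVPixelFormatType_32BGRA from the VideoReader.
glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(size.width), GLsizei(size.height),
             0, GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), frameData)
```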

Possible improvements and extensions:

  • Integrate the accelerometer to control the camera’s orientation.
  • Add a VR mode (lens distortion correction).
  • Integrate spatial sound and sync it manually with the video (e.g. check the 3Dception Spatial Workstation).
  • There are some OpenGL-related issues reported by Instruments in the sample project (see the screenshot below). Most of them relate to the glTexImage2D function. If you know how to fix them, or how to improve anything else in this project, please let us know in the comments. We’ll appreciate any hint.
  • Screenshot taken in Instruments’ OpenGL ES Analysis.

The code of a full video 360° player can be found here.

  • http://www.holdapp.com Artur Ozierański

    Great articles! I was looking for a proper solution for a 360 player. Thank you for the code snippets and the detailed description. I look forward to checking your code.

    • 黄世平

      Artur Ozierański, have you found a proper solution for the 360 player? Thank you.

  • Devin

    Great information. I was wondering what it would take to make this code OpenGL ES 2.0 compatible?

    • Pawel

      Hey Devin,
      To make this code support EAGLRenderingAPI.OpenGLES2, you will need to play with the pixelBufferAttributes while creating the AVPlayerItemVideoOutput. Try adding the kCVPixelBufferWidthKey and kCVPixelBufferHeightKey keys to limit the buffer size (the values 1024×512 work fine for me).

      • Devin

        Defining the correct width and height worked. Thanks for the help and these articles!

  • Job applicant

    It doesn’t run on my phone: I get only the background and audio, no sharks, etc.