Video 360 in OpenGL (iOS) part 3 – Sky Sphere


Since we now have an interactive camera, it’s time to replace our skybox with a sky sphere. Why use a sphere instead of a box? Because most of the 360° videos available on the net are prepared for spherical mapping. Facebook, however, uses skyboxes for its 360° videos (see Under the hood: Building 360 Video).

A rectangular frame with sphere mapping. Image taken from Discovery’s YT channel.

Let’s get started with the sky sphere. First, we will define a vertex format for our mesh. For our purposes, a vertex with a 3D position and UV texture coordinates is enough.


typealias VertexPositionComponent = (GLfloat, GLfloat, GLfloat)
typealias VertexTextureCoordinateComponent = (GLfloat, GLfloat)

struct TextureVertex
{
    var position: VertexPositionComponent = (0, 0, 0)
    var texture: VertexTextureCoordinateComponent = (0, 0)
}
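
As a side note, a rough sketch of how this vertex layout could be described to OpenGL ES with MemoryLayout is shown below; the attribute locations 0 and 1 are assumptions and have to match the shader’s attribute bindings:


import OpenGLES

// Hypothetical attribute setup for TextureVertex (locations 0 and 1 are assumptions).
let stride = GLsizei(MemoryLayout<TextureVertex>.stride)
let textureOffset = MemoryLayout<VertexPositionComponent>.stride

glEnableVertexAttribArray(0) // position: 3 floats
glVertexAttribPointer(0, 3, GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride, nil)

glEnableVertexAttribArray(1) // texture coordinates: 2 floats
glVertexAttribPointer(1, 2, GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride,
                      UnsafeRawPointer(bitPattern: textureOffset))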

Then we will generate the sphere’s vertices (vertex buffer) and indices (index buffer) and bind them to a vertex array. We will use polar coordinates for the vertices.

Polar coordinates (by Philip Rideout)

Geodesic coordinates (by Philip Rideout)

You may imagine polar coordinates as a grid of rows and columns that is mapped onto the sphere.


This way, defining texture coordinates is as trivial as a linear interpolation between the first and last row/column:


tu = column_index / number_of_columns
tv = row_index / number_of_rows

The code for generating vertices and indices for a sphere based on polar coordinates:


private func generateVertices()
{
    // beta (inclination) runs from 0 to π across the rows,
    // alpha (azimuth) runs from 0 to 2π across the columns.
    let deltaAlpha = Float(2.0 * M_PI) / Float(self.columns)
    let deltaBeta = Float(M_PI) / Float(self.rows)
    for row in 0...self.rows
    {
        let beta = Float(row) * deltaBeta
        let y = self.radius * cosf(beta)
        let tv = Float(row) / Float(self.rows)
        for col in 0...self.columns
        {
            let alpha = Float(col) * deltaAlpha
            let x = self.radius * sinf(beta) * cosf(alpha)
            let z = self.radius * sinf(beta) * sinf(alpha)

            let position = GLKVector3(v: (x, y, z))
            // Texture coordinates are a linear interpolation over the grid.
            let tu = Float(col) / Float(self.columns)

            let vertex = TextureVertex(position: position.v, texture: (tu, tv))
            self.vertices.append(vertex)
        }
    }
}

private func generateIndicesForTriangleStrip()
{
    for row in 1...self.rows
    {
        let topRow = row - 1
        let topIndex = (self.columns + 1) * topRow
        let bottomIndex = topIndex + (self.columns + 1)
        // Zig-zag between the top and bottom row of the current band.
        for col in 0...self.columns
        {
            self.indices.append(UInt32(topIndex + col))
            self.indices.append(UInt32(bottomIndex + col))
        }

        // Repeat the first indices of the band to produce degenerate
        // triangles that stitch this band to the next one.
        self.indices.append(UInt32(topIndex))
        self.indices.append(UInt32(bottomIndex))
    }
}
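
The buffer setup itself is not shown here; a rough sketch of how the generated vertices and indices could be uploaded and bound to a vertex array object might look like the following (in practice vertexArray, vertexBuffer and indexBuffer would be properties of the sphere class – the names are assumptions):


import OpenGLES

// Hypothetical buffer setup - vertexArray, vertexBuffer and indexBuffer are assumed names.
// vertices and indices are the arrays filled by the functions above.
var vertexArray: GLuint = 0
var vertexBuffer: GLuint = 0
var indexBuffer: GLuint = 0

glGenVertexArraysOES(1, &vertexArray)
glBindVertexArrayOES(vertexArray)

// Upload the vertex buffer.
glGenBuffers(1, &vertexBuffer)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
glBufferData(GLenum(GL_ARRAY_BUFFER),
             vertices.count * MemoryLayout<TextureVertex>.stride,
             vertices, GLenum(GL_STATIC_DRAW))

// The attribute pointers from the earlier sketch would be set here,
// while the vertex array and vertex buffer are still bound.

// Upload the index buffer.
glGenBuffers(1, &indexBuffer)
glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER),
             indices.count * MemoryLayout<UInt32>.stride,
             indices, GLenum(GL_STATIC_DRAW))

glBindVertexArrayOES(0)

With the buffers bound this way, the sphere can be drawn with glDrawElements(GLenum(GL_TRIANGLE_STRIP), GLsizei(indices.count), GLenum(GL_UNSIGNED_INT), nil). Note that 32-bit indices on OpenGL ES 2.0 rely on the OES_element_index_uint extension, which is available on iOS devices.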

Finally, we bind a static texture to the mesh and can dive into our sky sphere.
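
One simple way to load such a static texture is GLKTextureLoader. A minimal sketch, assuming an equirectangular image called panorama.jpg in the app bundle (the file name is a placeholder):


import GLKit
import OpenGLES

// Hypothetical texture loading - "panorama.jpg" is a placeholder asset name.
if let path = Bundle.main.path(forResource: "panorama", ofType: "jpg"),
    let textureInfo = try? GLKTextureLoader.texture(withContentsOfFile: path, options: nil)
{
    glActiveTexture(GLenum(GL_TEXTURE0))
    glBindTexture(textureInfo.target, textureInfo.name)
}

The bound texture is then sampled in the fragment shader using the tu/tv coordinates generated above.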

For the source code, check this link.

The next article will be about rendering live video on the sphere.

  • Jane Allen

    This is not working on Swift 3. Fix it please~

    • Pawel

Jane, we’ve just upgraded the code to Xcode 8, Swift 3. Have a try.