The lenses in Head Mounted Displays (HMDs) apply pincushion distortion to the image displayed. To compensate, the images we display on an HMD must be barrel distorted: the barrel distortion is cancelled by the pincushion distortion of the lenses, and we see an undistorted image. There are numerous ways to achieve barrel distortion, each striking a different balance between performance and image quality.
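To make this concrete, here is a minimal sketch of the two warps in Python. The polynomial model and the coefficients k1 and k2 are illustrative only (real HMDs use calibrated, per-lens values), and the function names are mine, but it shows the key relationship: the barrel warp we apply is simply the inverse of the pincushion warp the lens applies.

```python
# Illustrative radial distortion model. The coefficients are made up for the
# example, not taken from any SDK; positions are normalised so the lens centre
# is (0, 0).

def pincushion(x, y, k1=0.22, k2=0.24):
    """Roughly what the HMD lens does: push points away from the centre."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2   # grows with distance from the centre
    return x * scale, y * scale

def barrel(x, y, k1=0.22, k2=0.24, iterations=20):
    """The pre-distortion we render with: a numerical inverse of pincushion().

    The polynomial has no simple closed-form inverse, so we use a fixed-point
    iteration that pulls the point back toward the centre.
    """
    bx, by = x, y
    for _ in range(iterations):
        r2 = bx * bx + by * by
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        bx, by = x / scale, y / scale
    return bx, by

if __name__ == "__main__":
    # A point that is barrel distorted and then viewed through the lens
    # (pincushion distorted) ends up back where it started.
    bx, by = barrel(0.7, 0.5)
    print(pincushion(bx, by))   # approximately (0.7, 0.5)
```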

The traditional shader method for barrel distortion.

The barrel distortion method in early versions of the Oculus SDK examples is a post-processing shader. We usually render a rectangular grid of pixels. This makes sense, because almost all screens are rectangular grids of pixels. For each pixel of the output, the Oculus shader looks up the rendered image at a point displaced outward from the centre, which squeezes the image content inward into a barrel shape. This method is very fast, but the lookup points rarely land exactly on pixels of the rendered image, and we need the pixels of our image to line up with the pixels of our screen. Resampling is used to rebuild our distorted image as a regular grid of pixels. There are many different resampling algorithms. Some produce sharper images than others, but none are perfect; all resamplers blur the image a little. This is a huge downside, since resolution is already quite low on today’s HMDs. We need all the sharpness we can get. To combat the blurring, we render our original image at a higher resolution. Oversampling reduces (but does not eliminate) resampling blur, at the cost of performance. This method is easy to implement, as it only adds a post-processing step rather than altering the actual rendering process.
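Below is a CPU sketch of that idea in Python. It is not the Oculus SDK's actual shader: warp_factor, distort and bilinear are hypothetical names, and the coefficients are the same illustrative ones as above. It shows both halves of the argument: the outward lookup that creates the barrel distortion, and the bilinear interpolation that is the resampling (and hence the blur).

```python
# CPU sketch of the post-processing approach (not actual Oculus shader code).
# src is a row-major list of greyscale floats of size width * height.

def warp_factor(r2, k1=0.22, k2=0.24):
    # Illustrative distortion polynomial; real SDKs use calibrated coefficients.
    return 1.0 + k1 * r2 + k2 * r2 * r2

def bilinear(img, width, height, x, y):
    """Bilinear resampling; lookups outside the source image come back black."""
    x0, y0 = int(x), int(y)
    if x0 < 0 or y0 < 0 or x0 + 1 >= width or y0 + 1 >= height:
        return 0.0
    fx, fy = x - x0, y - y0
    top = img[y0 * width + x0] * (1 - fx) + img[y0 * width + x0 + 1] * fx
    bottom = img[(y0 + 1) * width + x0] * (1 - fx) + img[(y0 + 1) * width + x0 + 1] * fx
    return top * (1 - fy) + bottom * fy

def distort(src, width, height):
    """Return the barrel-distorted image as a new row-major list."""
    out = [0.0] * (width * height)
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    norm = max(cx, cy)
    for y in range(height):
        for x in range(width):
            # Normalised offset of this output pixel from the image centre.
            nx, ny = (x - cx) / norm, (y - cy) / norm
            s = warp_factor(nx * nx + ny * ny)
            # Look up the rendered image further out than the output pixel...
            sx, sy = cx + nx * s * norm, cy + ny * s * norm
            # ...and interpolate, because (sx, sy) rarely lands on a pixel centre.
            out[y * width + x] = bilinear(src, width, height, sx, sy)
    return out
```

Oversampling simply means src has more pixels than out: the interpolation still blurs, but over a smaller fraction of the scene.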

Another method, used in the SteamVR SDK, applies the barrel distortion with a mesh. The image is rendered normally, and then stretched over a 3D polygon mesh as its texture. The textured mesh is rendered, capturing a distorted image. Essentially, we place our flat image onto a curved virtual “screen” and then take a virtual “photograph” of the screen. This is similar to the shader method, only we utilise the fact that GPUs are very good at taking 3D scenes and turning them into 2D images. The pixel displacement is a natural consequence of imaging the warped mesh, and the resampling is inherent in the rasterisation process. Like the shader method, this causes some blurring, and if the polygon mesh is not fine enough it can cause other small distortions. Again, we compensate partially for the blurring with oversampling. Again, this method only adds an extra step to the rendering process, requiring no change to existing renderers.
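A rough Python sketch of what such a distortion mesh might look like is below. It is not the mesh SteamVR actually uses (the runtime supplies its own distortion data), and build_distortion_mesh and its coefficients are illustrative. Vertex positions stay on a regular grid, and the texture coordinates are pushed outward, so the GPU's rasteriser and texture filtering perform the warp and the resampling for us.

```python
# Illustrative distortion mesh builder (not SteamVR's actual mesh or API).
# Vertices lie on a regular grid in clip space; their texture coordinates are
# pushed outward, so texturing the mesh barrel distorts the rendered image.

def build_distortion_mesh(grid=32, k1=0.22, k2=0.24):
    vertices = []   # one (x, y, u, v) tuple per vertex
    for j in range(grid + 1):
        for i in range(grid + 1):
            # Vertex position in [-1, 1] clip space.
            x = 2.0 * i / grid - 1.0
            y = 2.0 * j / grid - 1.0
            # UVs sample the rendered eye texture further from the centre.
            r2 = x * x + y * y
            s = 1.0 + k1 * r2 + k2 * r2 * r2
            u = (x * s + 1.0) / 2.0   # UVs outside [0, 1] fall off the rendered
            v = (y * s + 1.0) / 2.0   # image and come out black in the corners
            vertices.append((x, y, u, v))
    # Two triangles per grid cell, as an index list.
    indices = []
    for j in range(grid):
        for i in range(grid):
            a = j * (grid + 1) + i
            b = a + 1
            c = a + grid + 1
            d = c + 1
            indices += [a, c, b, b, c, d]
    return vertices, indices
```

The coarser the grid, the cheaper the mesh but the larger the interpolation error between vertices, which is where the "other small distortions" mentioned above come from.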

Researchers at Intel recently detailed novel methods for barrel distortion. Instead of warping the image after rendering, they warped the world itself using a vertex shader. Polygon vertices were displaced in 3D, and the warped world was rendered in the traditional way. This has the advantage of optimal image clarity (1 pixel rendered = 1 pixel displayed), as no resampling is required. However, because the warp is applied only at vertices and edges between them remain straight, the distortion is only accurate when the geometry is sufficiently dense, meaning very high-poly models or tessellation shaders are required. This incurs a significant performance impact.
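The sketch below shows the idea; it is not Intel's implementation, and warp_vertex and its coefficients are illustrative. A vertex-shader version would do the same thing per vertex: divide by w, pull the vertex toward the centre in normalised device coordinates, and multiply back by w so the hardware's own perspective divide lands on the warped position.

```python
# Illustrative per-vertex warp (not Intel's actual shader). Applied to every
# vertex after the projection matrix, before rasterisation. Edges between
# vertices stay straight, so sparse geometry gives an inaccurate distortion.

def warp_vertex(clip, k1=0.22, k2=0.24):
    """clip is an (x, y, z, w) position after the projection transform."""
    x, y, z, w = clip
    if w == 0.0:
        return clip
    # Perspective divide to normalised device coordinates.
    nx, ny = x / w, y / w
    r2 = nx * nx + ny * ny
    s = 1.0 / (1.0 + k1 * r2 + k2 * r2 * r2)   # pull the vertex toward the centre
    # Multiply back by w so the GPU's own divide reproduces the warped position.
    return (nx * s * w, ny * s * w, z, w)
```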

Intel's custom rasteriser barrel distortion method.

Another method described by the researchers uses a custom rasteriser which renders images to a pincushion grid, as opposed to a rectangular one. Each pixel is then pulled in toward the centre of the image, creating the barrel distortion. Since the original grid had a pincushion pattern, the displacement lands the pixels exactly on a rectangular grid, without resampling! We get optimal image clarity with little performance overhead. Or do we? In reality, rendering to a non-rectangular grid is quite hard. GPUs have been heavily optimised to render to rectangular grids because this is what nearly every application requires. If future GPUs were to add support for custom render grids, they could provide pixel-perfect VR image quality with very little performance impact. For now, however, this method is restricted to painfully slow software rasterisers.
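As a sketch of where such a rasteriser would place its samples (pincushion_sample_grid is a hypothetical name and the coefficients are the same illustrative ones as before): each display pixel gets exactly one sample, generated further from the centre than the pixel itself, so writing the samples back to their display pixels applies the barrel distortion with no resampling at all.

```python
# Illustrative sample placement for a distortion-aware rasteriser (not Intel's
# code). The rasteriser would shade one sample per display pixel at these
# positions; writing each sample to its display pixel yields the barrel-
# distorted image directly, with no resampling step.

def pincushion_sample_grid(width, height, k1=0.22, k2=0.24):
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    norm = max(cx, cy)
    samples = []
    for y in range(height):
        for x in range(width):
            nx, ny = (x - cx) / norm, (y - cy) / norm
            r2 = nx * nx + ny * ny
            s = 1.0 + k1 * r2 + k2 * r2 * r2
            # The sample for display pixel (x, y) sits further from the centre
            # than the pixel itself, so the whole grid has a pincushion shape.
            samples.append((cx + nx * s * norm, cy + ny * s * norm))
    return samples
```

This is the same lookup the post-processing shader performs, but evaluated while rendering rather than against an already-rendered image, which is why no interpolation is involved.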

Teapot image by Dhatfield (Own work) CC BY-SA 3.0 or GFDL, via Wikimedia Commons.