It takes a unique facility plus customized hardware and software to create the content seen in the Las Vegas Sphere.
This FAQ looks at the production facility where the immersive content is developed, the custom 16K-resolution video camera used to record video content, and the customized software used to create the immersive audio.
Until the Big Dome was built at Sphere Studios, there was no screen capable of displaying the immersive visual content needed for the Sphere. The approximately 28,000-square-foot Big Dome features a large, curved LED display that is a quarter-scale, full-resolution replica of the screen in the Las Vegas venue (Figure 1). The complex also includes about 60,000 square feet of office and studio space.
Big Sky camera
The Big Sky camera was developed at Sphere Studios to produce ultra-high resolution 16K video content for the Sphere. Some of the details of Big Sky include (Figure 2):
- The camera’s single-lens optical system is claimed to be the sharpest cinema lens in the world and can hold focus on a subject while zooming, even if the subject is moving. It can capture images that fill the Sphere’s 16K x 16K immersive display plane from edge to edge.
- The camera’s single sensor is a 316-megapixel, 3” x 3” high dynamic range (HDR) CMOS image sensor with 92 current mode logic (CML) output ports with a total data rate of 515 Gbit/s. The sensor requires 23 W of power.
- The rolling shutter sensor can operate in high-frame-rate single-gain readout mode or in HDR mode at a lower frame rate. The HDR mode takes advantage of the dual-gain capability of the pixels to allow extended dynamic range within a single exposure.
- The camera’s media recorder uses 32 TB memory magazines that can handle full-resolution, uncompressed RAW footage at 60 frames per second (fps) and 30 gigabytes per second (GB/s), or at 120 fps and 50 GB/s. It also supports 600 Gb/s of network connectivity and has integrated media duplication to speed postproduction workflows. A back-of-the-envelope look at these data rates follows this list.
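Taken together, these figures give a sense of the scale involved. The short sketch below derives a few rough numbers (per-port sensor bandwidth and recording time per magazine) purely from the specifications quoted above; the derived values are estimates, not published specifications.

```python
# Back-of-the-envelope arithmetic for the Big Sky figures quoted above.
# All inputs come from the bullet list; derived numbers are rough estimates only.

SENSOR_RATE_GBIT_S = 515        # total sensor output, gigabits per second
CML_PORTS = 92                  # number of CML output ports

RECORD_RATE_60FPS_GB_S = 30     # recorder throughput at 60 fps, gigabytes per second
RECORD_RATE_120FPS_GB_S = 50    # recorder throughput at 120 fps, gigabytes per second
MAGAZINE_TB = 32                # capacity of one memory magazine, terabytes

# Average data rate each CML output port must sustain.
per_port_gbit_s = SENSOR_RATE_GBIT_S / CML_PORTS
print(f"~{per_port_gbit_s:.1f} Gbit/s per CML port")          # ~5.6 Gbit/s

# Approximate recording time for one 32 TB magazine (taking 1 TB = 1000 GB).
for fps, rate in ((60, RECORD_RATE_60FPS_GB_S), (120, RECORD_RATE_120FPS_GB_S)):
    seconds = MAGAZINE_TB * 1000 / rate
    print(f"{fps} fps: ~{seconds / 60:.0f} minutes per magazine")  # ~18 min and ~11 min
```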

Custom image-processing software called SphereLab was developed to handle the output of the Big Sky camera. It includes GPU-accelerated RAW processing to speed the capture and development of the ultra-high-resolution content needed by the Sphere. The Big Dome has a viewing/working platform for reviewing the output of Big Sky (Figure 3).
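SphereLab is proprietary and its internals are not public, but the basic idea of GPU-accelerated RAW development can be illustrated with a minimal sketch: demosaic a Bayer-pattern RAW frame by applying the same per-pixel interpolation across the whole image in parallel. The sketch below uses CuPy as a drop-in for NumPy when a GPU is available; the RGGB pattern and the half-resolution "quick look" approach are illustrative assumptions, not SphereLab's actual pipeline.

```python
import numpy as np
try:
    import cupy as xp          # GPU path if CuPy and a CUDA device are available
except ImportError:
    xp = np                    # otherwise fall back to CPU NumPy

def quick_look_demosaic(raw):
    """Half-resolution demosaic of an RGGB Bayer mosaic (illustrative only).

    Each 2x2 Bayer cell (R G / G B) becomes one RGB output pixel, so the whole
    operation is a handful of strided slices that a GPU evaluates in parallel
    across the frame.
    """
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    g  = (g1 + g2) / 2.0
    return xp.stack([r, g, b], axis=-1)

# Example: a synthetic 16-bit mosaic standing in for one RAW frame.
raw = xp.random.randint(0, 2**16, size=(4096, 4096)).astype(xp.float32)
rgb = quick_look_demosaic(raw)
print(rgb.shape)   # (2048, 2048, 3)
```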

Sound developments
HOLOPLOT, the company that developed the immersive audio system for the Sphere, also developed the tools used to produce audio content for it. The HOLOPLOT software suite supports the workflow from content production to presentation in an intuitive desktop environment.
The workflow starts with HOLOPLOT Plan, which is used to design, simulate, and set up the audio system. Matrix arrays with different loudspeaker configurations can be designed using the 3D graphical interface (Figure 4). Venue models can be imported from standard 3D modeling tools such as SketchUp. The software can design custom sound fields using 3D audio beamforming and wave field synthesis technologies.
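HOLOPLOT's actual algorithms are proprietary, but the principle behind audio beamforming is simple to sketch: each driver in the array is fed the same signal, delayed so that the wavefronts add coherently in the chosen steering direction. The array geometry, speed of sound, and delay calculation below are a minimal illustration of that delay-and-sum idea, not HOLOPLOT Plan's implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, roughly room temperature

def steering_delays(positions_m, azimuth_deg, elevation_deg):
    """Delay-and-sum steering delays (seconds) for a far-field plane wave.

    positions_m: (N, 3) array of driver positions in meters.
    Returns non-negative delays; drivers farthest "behind" the wavefront fire first.
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    # Unit vector pointing in the steering direction.
    direction = np.array([np.cos(el) * np.cos(az),
                          np.cos(el) * np.sin(az),
                          np.sin(el)])
    # Project each driver onto the steering direction and convert to time.
    projections = positions_m @ direction
    return (projections.max() - projections) / SPEED_OF_SOUND

# Example: a small 4 x 4 planar array with 10 cm driver spacing, steered 20 degrees off axis.
xs, ys = np.meshgrid(np.arange(4) * 0.1, np.arange(4) * 0.1)
positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
print(steering_delays(positions, azimuth_deg=20.0, elevation_deg=0.0) * 1e3)  # delays in ms
```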

Additional tools in the software suite include:
HOLOPLOT Create enables developers to model acoustic phenomena in a virtual environment without having to be in front of a physical HOLOPLOT audio system. It also lets designers move through the 3D space defined in HOLOPLOT Plan and listen to the resulting audio performance from any angle or position, maximizing the quality of the audio production and reducing the need for on-site optimization and tuning.
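The "listen from any position" capability can be approximated with a very simple free-field model: treat each driver as a point source and sum its signal at the listener after applying the propagation delay and 1/r spreading loss. The sketch below does exactly that, ignoring reflections, driver directivity, and air absorption, so it is only a conceptual stand-in for what a tool like HOLOPLOT Create simulates.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
FS = 48_000             # sample rate, Hz

def render_at_listener(driver_signals, driver_positions, listener_position):
    """Free-field mix of N driver signals at one listener position.

    driver_signals: (N, samples) per-driver signals.
    driver_positions: (N, 3) positions in meters; listener_position: (3,).
    Applies an integer-sample propagation delay and 1/r attenuation per driver.
    """
    n_drivers, n_samples = driver_signals.shape
    distances = np.linalg.norm(driver_positions - listener_position, axis=1)
    delays = np.round(distances / SPEED_OF_SOUND * FS).astype(int)
    out = np.zeros(n_samples + delays.max())
    for sig, delay, dist in zip(driver_signals, delays, distances):
        out[delay:delay + n_samples] += sig / max(dist, 1e-3)  # 1/r spreading loss
    return out

# Example: two drivers playing the same 1 kHz tone, heard a few meters off to one side.
t = np.arange(FS) / FS
tone = np.sin(2 * np.pi * 1000 * t)
signals = np.stack([tone, tone])
positions = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0]])
listener = np.array([2.0, 5.0, 0.0])
mix = render_at_listener(signals, positions, listener)
```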
HOLOPLOT Control supports managing and delivering the resulting immersive audio content. In addition to content delivery, it provides a diagnostic interface for all the loudspeakers in the venue and enables operators to control volume and equalization and monitor performance in real time. It also provides insights into component health and network status to support technical staff in maintaining the system.
Summary
A purpose-built production facility, a custom ultra-high-resolution video camera, and a comprehensive suite of immersive audio development software are required to support the content needs of the Sphere in Las Vegas. The Big Dome facility and the Big Sky camera were both developed by Sphere Entertainment, the owner of the Sphere, while the audio development software comes from HOLOPLOT, the maker of the venue's immersive audio system.
References
A 316MP, 120FPS, High Dynamic Range CMOS Image Sensor for Next Generation Immersive Displays, MDPI Sensors
HOLOPLOT Software Suite
Sphere Entertainment