Inside the Sphere in Las Vegas is a 15,000 m² (160,000 ft², 4 acres) 16K resolution wraparound LED screen with over 170 million pixels. It's 240 feet high and wraps around the audience to create fully immersive visual effects. The entire visual content generation and control system is fully redundant to ensure uninterrupted entertainment. This FAQ reviews the industry standards developed by the Society of Motion Picture and Television Engineers (SMPTE) for immersive displays like those used in the Sphere and looks at some of the hardware used to drive the LEDs.
With such a gigantic display, media storage and playback are significant challenges. The content at the Sphere is pre-recorded, stored on a network-attached storage (NAS) system, and streamed in real time to dozens of media servers that each deliver 4K video at 60 frames per second. The video streams are stitched together to form the 16K resolution of the display. The systems at the Sphere use the latest SMPTE ST 2110 IP video streaming standards, including:
- SMPTE ST 2110-10/-20/-30 — address system timing and definitions (-10), uncompressed video (-20), and uncompressed PCM audio (-30).
- SMPTE ST 2110-21 — specifies traffic shaping and delivery timing of the uncompressed video.
- SMPTE ST 2110-31 — specifies the real-time, RTP-based transport of AES3 signals over IP networks, referenced to a network reference clock.
- SMPTE ST 2110-40 — maps ancillary data packets (as defined in SMPTE ST 291-1) into Real-Time Transport Protocol (RTP) packets that are transported via User Datagram Protocol/Internet Protocol (UDP/IP) and enables those packets to be moved synchronously with associated video and audio essence streams. (A minimal sketch of the RTP wrapper that ST 2110-31 and -40 share follows this list.)
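Both ST 2110-31 and ST 2110-40 carry their payloads in RTP (RFC 3550). As a minimal illustration of that common wrapper, the Python sketch below packs the fixed 12-byte RTP header; the standard-specific payload formats (AES3 frames, ST 291-1 ancillary packets) are defined by the standards themselves and are not reproduced here.

```python
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int,
               payload_type: int = 96, marker: bool = False) -> bytes:
    """Pack a minimal 12-byte RTP header per RFC 3550.

    Illustrative only: real ST 2110 senders also follow the payload
    mappings and timing rules in the -21/-31/-40 documents.
    """
    version, padding, extension, csrc_count = 2, 0, 0, 0
    byte0 = (version << 6) | (padding << 5) | (extension << 4) | csrc_count
    byte1 = (int(marker) << 7) | payload_type
    # network byte order: 2 single bytes, 16-bit sequence number,
    # 32-bit timestamp, 32-bit synchronization source identifier
    return struct.pack("!BBHII", byte0, byte1, seq, timestamp, ssrc)

pkt = rtp_header(seq=1, timestamp=0, ssrc=0x1234ABCD) + b"payload bytes"
```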
The new standards support the replacement of the legacy serial digital interface (SDI) with Internet Protocol (IP). In addition to being used internally, the new standards are used for inter-facility transfers of content from the studio to the venue.
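To get a feel for the scale these IP flows imply, here's a back-of-the-envelope sketch (my assumptions, not published Sphere figures). It estimates how many 4K tiles cover a 170-million-pixel canvas and the uncompressed ST 2110-20-style bandwidth per tile, assuming 10-bit 4:2:2 sampling at 60 fps:

```python
# Back-of-the-envelope estimate, not published Sphere figures.
TILE_W, TILE_H = 3840, 2160       # one UHD "4K" tile
FPS = 60                          # playback rate cited in the article
BITS_PER_PIXEL = 20               # 10-bit 4:2:2, typical for ST 2110-20
TOTAL_PIXELS = 170_000_000        # "over 170 million pixels"

tiles = -(-TOTAL_PIXELS // (TILE_W * TILE_H))   # ceiling division
per_stream_gbps = TILE_W * TILE_H * FPS * BITS_PER_PIXEL / 1e9
total_gbps = tiles * per_stream_gbps

print(f"{tiles} x 4K tiles, ~{per_stream_gbps:.1f} Gb/s each, "
      f"~{total_gbps:.0f} Gb/s aggregate")
# -> 21 x 4K tiles, ~10.0 Gb/s each, ~209 Gb/s aggregate
```

The result of roughly two dozen 4K tiles is consistent with the "dozens of media servers" the article describes.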
Generative artificial intelligence (AI) video content is another important element of the immersive displays at the Sphere: the content is created in part using generative AI models. These models learn the underlying patterns and structures of the data they're trained on and then use this knowledge to generate new, similar data. For example, with generative AI video, developers can remove or add an object or change the background with simple commands instead of spending hours editing.
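As an illustration of that kind of prompt-driven editing, here's a minimal inpainting sketch using the open-source Hugging Face diffusers library and a public demo checkpoint. This is an analogy, not the Sphere's actual (proprietary) content pipeline:

```python
# Prompt-driven object/background replacement with the public
# diffusers library -- NOT the Sphere's production tooling.
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting"  # public demo checkpoint
)

frame = Image.open("frame.png").convert("RGB").resize((512, 512))
mask = Image.open("mask.png").convert("RGB").resize((512, 512))  # white = replace

# The "simple command": describe the replacement instead of hand-painting it.
result = pipe(prompt="a starfield night sky",
              image=frame, mask_image=mask).images[0]
result.save("edited_frame.png")
```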
The system
The completed system is built on a 3U-high media platform that can be used for video playback, pixel processing, and control, and is designed to run a variety of generative engines, including Unreal Engine, Unity, TouchDesigner, and Notch. Some of the hardware specifications for the 19-inch rack-mount media platform (Figure 2) include:
- Video outputs: 4 × DisplayPort 1.4, upgradeable
- General network: 2 × 10 Gb/s Ethernet
- High-speed network: 2 × 100 Gb/s Ethernet, upgradeable
- Data peripherals:
  - 1 × USB 3.2 Gen 2×2 port
  - 9 × USB 3.2 Gen 2 ports
- Memory: 128 GB DDR4 RAM, upgradeable
- Audio: 8 × unbalanced analog, upgradeable
- GPU: NVIDIA RTX™ A6000, upgradeable
- Motherboard: 7 × PCIe 4.0 x16 slots
- Media storage: 2 × 3.6 TB NVMe SSD (PCIe 4.0), upgradeable
- Power: 1600 W (dual redundant), upgradeable
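As a rough cross-check against these specifications, a few lines of arithmetic show which resource would bound the number of uncompressed 4K60 streams one platform could serve. The per-stream figure comes from the earlier estimate; the NVMe sustained-read rate is an assumed typical PCIe 4.0 x4 value, not a vendor number:

```python
# Rough sizing check against the specs above (assumed figures noted inline).
STREAM_GBPS = 10.0         # uncompressed 4K60, 10-bit 4:2:2 (earlier estimate)
NET_GBPS = 2 * 100         # dual 100 Gb/s Ethernet ports
NVME_GBPS = 2 * 7 * 8      # 2 drives x ~7 GB/s sustained read x 8 bits/byte

streams_by_net = int(NET_GBPS // STREAM_GBPS)
streams_by_disk = int(NVME_GBPS // STREAM_GBPS)
print(f"network budget: ~{streams_by_net} streams; "
      f"storage budget: ~{streams_by_disk} streams; "
      f"practical limit: ~{min(streams_by_net, streams_by_disk)}")
# -> network budget: ~20 streams; storage budget: ~11 streams; practical limit: ~11
```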
The media platform acts as the host for a variety of functions including:
- Media server that supports SMPTE ST 2110. It’s capable of 240 frames per second (fps) uncompressed playback, dynamic blends, and keyframing, and includes NotchLC video codec support.
- Dedicated generative content support for the media storage and playback system.
- Pixel processors, the critical infrastructure for getting the image to the display. They deliver a high degree of flexibility, high signal density, and extremely low processing latency.
- Workflow user interface to bring together the media server, generative content system, and pixel processors for a complete content development environment.
Centralized control software ties it all together at showtime, handling audio and video streaming, database links, and media management. The control system provides a bridge between the virtual and physical worlds: it supports a wide range of devices and effects, including lighting, audio, fans, scent generators, vibration transducers, and, of course, video, to create a unified immersive environment.
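As an architectural illustration only (a generic sketch, not the vendor's control API), show control of heterogeneous devices often reduces to a time-ordered cue list dispatched to per-device handlers against a common show clock:

```python
# Generic show-control sketch: one cue list, many device handlers.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Cue:
    time_s: float   # show-clock time at which the cue fires
    device: str     # e.g. "lighting", "audio", "fans", "scent"
    action: str
    params: dict

class ShowController:
    def __init__(self) -> None:
        self.handlers: Dict[str, Callable[[Cue], None]] = {}
        self.cues: List[Cue] = []

    def register(self, device: str, handler: Callable[[Cue], None]) -> None:
        self.handlers[device] = handler

    def add_cue(self, cue: Cue) -> None:
        self.cues.append(cue)

    def tick(self, prev_t: float, now_t: float) -> None:
        """Fire every cue whose time falls in (prev_t, now_t]."""
        for cue in self.cues:
            if prev_t < cue.time_s <= now_t:
                self.handlers[cue.device](cue)

ctl = ShowController()
ctl.register("lighting", lambda c: print(f"lighting: {c.action} {c.params}"))
ctl.add_cue(Cue(12.5, "lighting", "fade", {"level": 0.8, "seconds": 3}))
ctl.tick(12.0, 13.0)  # -> lighting: fade {'level': 0.8, 'seconds': 3}
```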
Summary
Presentation of the 15,000 m² 16K resolution wraparound LED screen at the Sphere takes a lot of technology. It starts with industry standards from SMPTE for immersive displays. It also requires an array of dedicated hardware and software solutions, including media servers, generative content support, pixel processors, and centralized control software to tie the display together with the other show elements, including audio and special effects.
References
- 7thSense Performer Range, 7thSense
- A disruptive new entertainment venue: inside the current-edge MSG Sphere, 7thSense
- High performance 3U server, 7thSense
- SMPTE ST 2110, Society of Motion Picture and Television Engineers