Inside the Computing Power Behind Spatial Filmmaking: Hugh Hou Goes Hands-On at GIGABYTE Suite During CES 2026
LOS ANGELES, Jan. 24, 2026 /PRNewswire/ — At CES 2026, VR filmmaker and educator Hugh Hou led a live spatial computing demonstration inside the GIGABYTE suite, showing how immersive video is created in real production environments, not in theory or controlled lab conditions.
The session gave attendees a close look at a complete spatial filmmaking pipeline, from capture through post-production and final playback. Instead of relying on pre-rendered content, the workflow was executed live on the show floor, reflecting the same processes used in commercial XR projects and placing clear demands on system stability, performance consistency, and thermal reliability. The experience culminated with attendees viewing a two-minute spatial film trailer across Meta Quest, Apple Vision Pro, and the newly launched Galaxy XR headsets, alongside a 3D tablet display offering an additional 180-degree viewing option.
Where AI Fits Into Real Creative Workflows
AI was presented not as a feature highlight, but as a practical tool embedded into everyday editing tasks. During the demo, AI-assisted enhancement, tracking, and preview processes helped speed up iteration without interrupting creative flow.
Footage captured on cinema-grade immersive cameras moved through industry-standard software including Adobe Premiere Pro and DaVinci Resolve. AI-based upscaling, noise reduction, and detail refinement were applied to meet the visual requirements of immersive VR, where any artifact or softness becomes immediately noticeable across a 360-degree viewing environment.
Why Platform Design Matters for Spatial Computing
Supporting the entire workflow was a custom-built GIGABYTE AI PC designed specifically for sustained spatial video workloads. The system combined an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, providing the memory bandwidth and continuous AI performance required for real-time 8K spatial video playback and rendering. Equally critical, the X870E AORUS MASTER X3D ICE motherboard delivered stable power and signal integrity, allowing the workflow to run predictably throughout the live demonstration.
By enabling a demanding spatial filmmaking workflow to operate live and repeatedly at CES, GIGABYTE demonstrated how platform-level system design turns complex immersive production into something creators can rely on, not just experiment with.