NVIDIA Omniverse Expands Worlds Using Apple Vision Pro

NVIDIA is bringing OpenUSD-based Omniverse enterprise digital twins to the Apple Vision Pro.

Announced today at NVIDIA GTC, a new software framework built on Omniverse Cloud application programming interfaces (APIs) lets developers easily send their Universal Scene Description (OpenUSD) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centers that can stream advanced 3D experiences to Apple Vision Pro.

In a demo unveiled at the global AI conference, NVIDIA presented an interactive, physically accurate digital twin of a car streamed in full fidelity to Apple Vision Pro’s high-resolution displays.


The demo featured a designer wearing the Vision Pro, using a car configurator application developed by CGI studio Katana on the Omniverse platform. The designer toggled through paint and trim options and even entered the vehicle — leveraging the power of spatial computing by blending photorealistic 3D environments with the physical world.

Bringing the Power of RTX Enterprise Cloud Rendering to Spatial Computing

Spatial computing has emerged as a powerful technology for delivering immersive experiences and seamless interactions between people, products, processes and physical spaces. Industrial enterprise use cases require incredibly high-resolution displays and powerful sensors operating at high frame rates to make manufacturing experiences true to reality.

This new Omniverse-based workflow combines Apple Vision Pro's groundbreaking high-resolution displays with NVIDIA's powerful RTX cloud rendering to deliver spatial computing experiences with just the device and an internet connection.

This cloud-based approach allows real-time, physically based renderings to be streamed seamlessly to Apple Vision Pro, delivering high-fidelity visuals without compromising the detail of massive, engineering-fidelity datasets.

“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple. “Spatial computing will redefine how designers and developers build captivating digital content, driving a new era of creativity and engagement.”

“Apple Vision Pro is the first untethered device which allows for enterprise customers to realize their work without compromise,” said Rev Lebaredian, vice president of simulation at NVIDIA. “We look forward to our customers having access to these amazing tools.”

The workflow also introduces hybrid rendering, a groundbreaking technique that combines local and remote rendering on the device. Users can render fully interactive experiences in a single application from Apple's native SwiftUI and RealityKit, with the Omniverse RTX Renderer streaming from GDN.
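To make the hybrid-rendering idea concrete, here is a minimal, illustrative Swift sketch of how a visionOS app might mix on-device content with a cloud-rendered stream. Only SwiftUI and RealityKit are named in the article; the `GDNStreamedEntity` type and the endpoint URL below are hypothetical placeholders standing in for whatever streaming API the framework exposes, not an actual NVIDIA or Apple interface.

```swift
import SwiftUI
import RealityKit

// Sketch of a hybrid-rendered configurator view for visionOS.
// Lightweight UI and geometry render locally via RealityKit, while
// the heavyweight OpenUSD model would arrive as an RTX-rendered
// stream from NVIDIA GDN (placeholder shown in comments below).
struct ConfiguratorView: View {
    @State private var paintColor: Color = .red

    var body: some View {
        RealityView { content in
            // Locally rendered content: a simple on-device entity,
            // e.g. a trim swatch the user can inspect up close.
            let trimSwatch = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .red, isMetallic: true)]
            )
            content.add(trimSwatch)

            // Remotely rendered content (hypothetical API): an entity
            // whose pixels are streamed from the Omniverse RTX Renderer
            // running on GDN, rather than rendered on the headset.
            // let streamed = try await GDNStreamedEntity(
            //     url: URL(string: "https://gdn.example.com/car-scene")!)
            // content.add(streamed)
        }
        .overlay(alignment: .bottom) {
            // Native SwiftUI controls drive configuration changes,
            // which would be forwarded to the cloud renderer.
            ColorPicker("Paint", selection: $paintColor)
                .padding()
        }
    }
}
```

The design point is that the app remains a single native application: SwiftUI handles interaction and local rendering at full responsiveness, while only the compute-heavy, full-fidelity scene is offloaded to the cloud.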

NVIDIA GDN, available in over 130 countries, taps NVIDIA’s global cloud-to-edge streaming infrastructure to deliver smooth, high-fidelity, interactive experiences. By moving heavy compute tasks to GDN, users can tackle the most demanding rendering use cases, no matter the size or complexity of the dataset.

Enhancing Spatial Computing Workloads Across Use Cases

The Omniverse-based workflow showed potential for a wide range of use cases. For example, designers could use the technology to see their 3D data in full fidelity, with no loss in quality or model decimation. This means designers can interact with trustworthy simulations that look and behave like the real physical product. This also opens new channels and opportunities for e-commerce experiences.

In industrial settings, factory planners can view and interact with their full engineering factory datasets, letting them optimize their workflows and identify potential bottlenecks.

For developers and independent software vendors, NVIDIA is building capabilities that will allow them to use the native tools on Apple Vision Pro to interact seamlessly with existing data in their applications.

Learn more about NVIDIA Omniverse and GDN.


Jack Boreham

Jack Boreham is the editorial director and account executive at the Digital Twin Insider, the leading digital twin publication globally. Jack has been at the forefront of the platform's growth as a digital twin specialist, writing on and advising projects in the digital twin space for over two years. [email protected]
