One of the key announcements at NVIDIA’s GTC (GPU Technology Conference) is that NVIDIA is bringing OpenUSD-based Omniverse enterprise digital twins to the Apple Vision Pro. Apple’s mixed-reality headset was first announced at the company’s Worldwide Developers Conference in June 2023 and became available for purchase in the US in February 2024.
NVIDIA sees spatial computing as a powerful technology for delivering immersive experiences and seamless interactions between people, products, processes and physical spaces. One of the areas where spatial computing can be a game changer is industrial enterprise use cases, which require incredibly high-resolution displays and powerful sensors operating at high frame rates to make manufacturing experiences true to reality.
A new software framework built on Omniverse Cloud APIs (application programming interfaces) lets developers easily send their OpenUSD (Universal Scene Description) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centres that can stream advanced 3D experiences to Apple Vision Pro. At the global AI conference, NVIDIA presented a demo featuring an interactive, physically accurate digital twin of a car streamed in full fidelity to Apple Vision Pro’s high-resolution displays.
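To make that data flow concrete, the sketch below shows, in Swift, what sending an OpenUSD scene file to a streaming backend might look like. The endpoint URL, authorization header and route are hypothetical placeholders, since the actual Omniverse Cloud API surface is not detailed here; only the standard URLSession upload pattern is assumed.

```swift
import Foundation

// Purely illustrative: push an OpenUSD scene file to a streaming backend.
// The endpoint URL, auth token and route are hypothetical placeholders,
// not a documented NVIDIA API; only the standard URLSession upload pattern
// is assumed.
func uploadScene(fileURL: URL, completion: @escaping (Result<Void, Error>) -> Void) {
    var request = URLRequest(url: URL(string: "https://gdn.example.com/v1/scenes")!) // hypothetical endpoint
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue("Bearer <API_TOKEN>", forHTTPHeaderField: "Authorization") // placeholder credential

    let task = URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, _, error in
        if let error = error {
            completion(.failure(error))
        } else {
            completion(.success(()))
        }
    }
    task.resume()
}

// Example usage: the file name is illustrative.
uploadScene(fileURL: URL(fileURLWithPath: "CarConfigurator.usd")) { result in
    print(result)
}
```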
The demo leverages the power of spatial computing by blending 3D photorealistic environments with the physical world. It featured a designer wearing the Vision Pro and using a car configurator application developed by CGI studio Katana on the Omniverse platform. The designer toggled through paint and trim options and even entered the vehicle.
The new Omniverse-based workflow combines Apple Vision Pro’s ground-breaking high-resolution displays with NVIDIA’s powerful RTX cloud rendering to deliver spatial computing experiences with just the device and an internet connection.
“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple.
The workflow also introduces hybrid rendering, a ground-breaking technique that combines local and remote rendering on the device. Users can render fully interactive experiences in a single application, combining content built with Apple’s native SwiftUI and RealityKit with the Omniverse RTX Renderer streaming from GDN. For developers and independent software vendors, NVIDIA is building capabilities that will allow them to use the native tools on Apple Vision Pro to seamlessly interact with existing data in their applications.
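As a rough illustration of the hybrid-rendering idea, the following visionOS sketch renders lightweight UI and placeholder geometry locally with SwiftUI and RealityKit; in a real application, a second, photorealistic layer would arrive as a stream from the Omniverse RTX Renderer on GDN. The streaming side is not shown, as its API is not documented in this article, and the view and state names here are illustrative only.

```swift
import SwiftUI
import RealityKit

// Minimal visionOS sketch of the hybrid-rendering idea: controls and
// placeholder geometry are rendered locally with SwiftUI and RealityKit,
// while photorealistic content would arrive from a remote RTX renderer
// (not shown; that streaming API is hypothetical here).
struct HybridConfiguratorView: View {
    @State private var paintColor: Color = .red   // local UI state, rendered on device

    var body: some View {
        VStack {
            // Locally rendered SwiftUI control (e.g. toggling paint options).
            ColorPicker("Paint colour", selection: $paintColor)
                .padding()

            // Locally rendered RealityKit content: a simple placeholder model.
            RealityView { content in
                let placeholder = ModelEntity(
                    mesh: .generateBox(size: 0.2),
                    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
                )
                content.add(placeholder)

                // In a full hybrid setup, a second entity would display frames
                // streamed from the remote Omniverse RTX Renderer on GDN.
            }
        }
    }
}
```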
The Omniverse-based workflow showed potential for a wide range of use cases and also opens new channels and opportunities for e-commerce experiences.
