Apple explains benefits of new foveated streaming support in visionOS 26.4
Per Apple’s official visionOS 26.4 release notes: “visionOS 26.4 supports NVIDIA CloudXR with foveated streaming, enabling apps to display high-resolution, low-latency immersive content on Apple Vision Pro.”
That sounds exciting, but what exactly does it mean for users?
In short, it’s a new tool that apps and games can use to deliver improved experiences on Apple Vision Pro.
From Apple’s developer documentation:
If you have an existing virtual reality game, experience, or application built for desktop computers or a cloud server, you can stream it to Apple Vision Pro with the Foveated Streaming framework.
Foveated Streaming allows your endpoint to stream high quality content only where necessary based on information about the approximate region where the person is looking, ensuring performance. On Apple Vision Pro, you can also layer native spatial content over the streamed content. For example, a racing game can render the gauges in the interior of the car with RealityKit, and stream the processor-intensive outdoor environment from a remote computer to the device.
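The layering Apple describes can be sketched with RealityKit alone. Below is a hypothetical Swift illustration of the racing-game example: a native, locally rendered "gauge" entity that stays crisp on-device, with the streamed environment left as a placeholder comment, since the article doesn't quote the Foveated Streaming framework's actual API calls.

```swift
import SwiftUI
import RealityKit

// Hypothetical sketch of the layering Apple describes: native
// RealityKit content rendered on-device, composited over a remotely
// streamed environment. The streaming session itself is only a
// placeholder comment here, because the Foveated Streaming API
// surface isn't shown in the article.
struct CockpitView: View {
    var body: some View {
        RealityView { content in
            // Native layer: a stand-in dashboard "gauge" (a simple box)
            // that remains sharp regardless of the streamed video's
            // foveated quality falloff.
            let gauge = ModelEntity(
                mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            gauge.position = SIMD3<Float>(0, -0.2, -0.5) // just in front of the viewer
            content.add(gauge)

            // The processor-intensive environment would be attached here
            // via the Foveated Streaming framework (API omitted; see
            // Apple's developer documentation for the real calls).
        }
    }
}
```

This mirrors the split Apple suggests: latency-sensitive UI stays native via RealityKit, while heavy rendering is offloaded to a remote machine and streamed back.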
Since NVIDIA CloudXR is a third-party technology already used by other VR and computing platforms, the hope is clearly that foveated streaming will make it easier for developers to bring existing apps and games over to Apple Vision Pro.
Beyond porting, though, the new feature should also unlock new experiences inside existing visionOS apps.
The racing example given above demonstrates that. Here’s another example Apple gives: “a flight simulator app can render a cockpit using RealityKit, and stream a processor-intensive landscape from a remote computer to the device.”
It’s unclear whether any of Apple’s native visionOS apps currently utilize this technology, but hopefully the company leads the way in showing developers best practices.