At Last, Apple Unveils Its AR Headset, Rebranded as ‘Spatial Computer’
Apple’s mixed-reality headset offers a plethora of technological innovations, including a new SoC designed to eliminate the processing lag that causes nausea in headset users.
Look for our continued WWDC23 coverage tomorrow in our article on Apple's M2 Ultra and Mac Studio.
For years, rumors have swirled about Apple’s potential entry into the AR/VR market. While other companies released headsets of their own, those devices met with mixed reviews and failed to spark widespread consumer adoption. Many in the industry waited optimistically for Apple to release a headset of its own.
Apple Vision Pro. Image courtesy of Apple
This week, the wait finally came to an end when Apple unveiled Apple Vision Pro, the company’s first “spatial computer,” at WWDC23. The new device is packed with technological innovations that set it apart from past headsets. In this piece, we’ll look at the Vision Pro headset at a high level and discuss the new silicon powering the company's first spatial computer.
Vision Pro: The High-level Overview
Apple Vision Pro is a mixed-reality headset, meaning that it operates both as a VR headset (complete immersion) and an AR headset (partial immersion). There is an abundance of technologies we can discuss when it comes to Apple Vision Pro, but since we’re EEs, we’ll look specifically at the hardware.
The device employs a myriad of sensors supported by sophisticated hardware. To interpret the physical world around the user, the headset features an array of sensors, including two main forward-facing cameras, four downward-facing cameras, two side-facing cameras, a TrueDepth (3D) camera, and a LiDAR scanner. To track the user’s eyes, the headset also carries an array of IR cameras and LED illuminators on its inside surface.
The outward-facing sensors on Vision Pro. Image courtesy of Apple
The Vision Pro relies on two processors to handle this sensing and run visionOS: the M2 and the all-new R1. For visuals, each eye has a custom-made micro-OLED display with a resolution greater than that of a 4K TV.
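To put the per-eye resolution claim in perspective, some back-of-the-envelope pixel math helps. The 23-million-pixel total below comes from Apple's WWDC23 announcement (it is not stated in this article), and a 4K UHD panel is 3,840 × 2,160 pixels:

```python
# Rough pixel comparison: Vision Pro per-eye display vs. a 4K UHD TV.
# Assumption: Apple's stated total of 23 million pixels is split
# roughly evenly across the two micro-OLED displays.

UHD_4K_PIXELS = 3840 * 2160           # 8,294,400 pixels on a 4K TV
TOTAL_PIXELS = 23_000_000             # Apple's figure for both displays
PER_EYE_PIXELS = TOTAL_PIXELS / 2     # ~11.5 million pixels per eye

print(f"4K TV:   {UHD_4K_PIXELS:,} pixels")
print(f"Per eye: {PER_EYE_PIXELS:,.0f} pixels")
print(PER_EYE_PIXELS > UHD_4K_PIXELS)  # True: each eye exceeds 4K
```

Under that assumption, each eye sees roughly 40% more pixels than an entire 4K television.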
Unlike competitors such as Microsoft’s HoloLens or the Meta Quest, which fully contain the hardware in the headset, the Vision Pro’s battery is separate from the headset. The battery is meant to sit in the user’s pocket and connect to the headset via a cable. By separating the battery from the headset, Apple made its headset lighter and less prone to overheating—both common complaints about AR/VR headsets.
The R1 Chip
One of the biggest barriers facing AR/VR headsets so far has been the feeling of nausea users experience because of the lag between processed data and visual data. Apple claims to have remedied this issue in the Apple Vision Pro with its new in-house R1 chip.
The Apple Vision Pro debuts the new R1 chip. Screenshot from Apple
Apple has not revealed many details about the R1 chip, but it is confirmed to be custom Apple silicon designed expressly for sensor processing and fusion. According to Apple, the R1 simultaneously processes input from 12 cameras, five sensors, and six microphones with extremely low latency to eliminate the lag between sensed and experienced data. The R1 is also said to stream new images to the user’s eyes within 12 milliseconds—eight times faster than the blink of an eye.
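A quick sanity check shows how the 12 ms figure lines up with the "eight times faster than a blink" claim. The ~100 ms blink duration below is an assumption (typical estimates range from roughly 100 to 150 ms), not a figure from Apple:

```python
# Sanity-checking Apple's latency claim: images streamed within 12 ms,
# described as eight times faster than the blink of an eye.
# Assumption: a human blink lasts roughly 100 ms.

BLINK_MS = 100        # assumed blink duration, milliseconds
R1_LATENCY_MS = 12    # Apple's stated photon-to-photon latency

speedup = BLINK_MS / R1_LATENCY_MS
print(f"~{speedup:.1f}x faster than a blink")  # ~8.3x
```

With a 100 ms blink, 12 ms works out to roughly 8.3 times faster, consistent with Apple's rounded claim.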
The R1 works alongside Apple's M2 SoC in the Apple Vision Pro, with the M2 doing the heavy lifting for non-sensor computing. The M2's Neural Engine delivers up to 15.8 TOPS of AI compute, and the chip offers 100 GB/s of memory bandwidth.
Is This the One?
AR/VR has long been touted as a potential next wave in computing, but to this point, no company has delivered a product that makes it a reality. It’s hard to say whether Apple’s Vision Pro will be the headset to change this trajectory, but if there were ever a company with the resources, ethos, and technology to make it possible, it’d be Apple.