Apple's upcoming mixed reality headset could include a number of sensors to track the eyes, gestures and even facial expressions of its users. The company applied for a patent to track these kinds of inputs, and combine them with information gathered from outward-facing sensors for mixed reality experiences.
The patent application in question, simply titled "Display System Having Sensors," was first filed in March of this year, and published last week. It describes in detail plans to use a range of sensors to gather data from the wearer of a mixed reality headset.
Such sensors would make it possible for Apple to more realistically reproduce a user's facial expression in mixed reality. Apple has already developed facial tracking software for Animoji, the company's animated AR emoji. Animoji make use of an iPhone's selfie camera to track facial expressions, and then translate those movements to animation.
The challenge with that approach is that you can't simply film a user's face if they're wearing a headset. That's why Apple is looking to combine data from separate sensors, including some used for eyebrow and jaw tracking, as well as eye tracking cameras. The latter could also be used for biometric authentication, as the patent application notes. The company also may use cameras for gesture tracking, according to the patent application.
Apple has been working on its own headset for a couple of years now. CNET reported last year that this headset would combine augmented and virtual reality, meaning that it would overlay virtual objects and worlds on a view of the real world.
The patent application published last week further outlines how the company is likely going to do that: by capturing the real world with outward-facing cameras and then displaying that footage on screens inside the headset, an approach that's very different from the way Microsoft's and Magic Leap's augmented reality headsets work. From the patent application:
"In some embodiments, the world sensors may include one or more 'video see through' cameras (e.g., RGB (visible light) video cameras) that capture high-quality video of the user's environment that may be used to provide the user with a virtual view of their real environment."
Apple has yet to publicly comment on its headset plans, including when it intends to ship any such product. In March, Apple analyst Ming-Chi Kuo estimated that the company may start to produce its headset by Q4 of this year, and then publicly introduce it in 2020.