Apple is developing its own rear-facing 3D sensor system for its 2019 iPhone lineup. Until now, mobile devices have only offered front-facing depth scanners, such as the TrueDepth sensor system introduced with the iPhone X. By moving this capability to the rear, Apple is positioning itself to become a leader in mobile augmented reality (AR).
At the September 2017 event, Phil Schiller introduces the new generation of iPhone
Currently, the TrueDepth system relies on structured light, projecting thousands of laser dots onto a subject for facial recognition and 3D reconstruction. This approach requires the dot pattern to land very precisely on the scanned object(s). The new time-of-flight system instead measures how long it takes laser pulses to reflect off whatever is being scanned and return to the sensor. This simpler measurement model will hopefully resolve the earlier difficulties of manufacturing large volumes of iPhones with precisely calibrated laser-scanning hardware. Already embraced by companies such as Sony and Panasonic, the time-of-flight approach is still in its early stages but has the potential to revolutionize the functionality of the modern smartphone.
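The time-of-flight principle itself is straightforward: distance is half the round-trip time of the light pulse multiplied by the speed of light. A minimal sketch (purely illustrative; not Apple's implementation) of that calculation:

```python
# Time-of-flight depth: a pulse is emitted, reflects off a surface,
# and returns. Distance = speed_of_light * round_trip_time / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A reflection that arrives ~6.67 nanoseconds after emission
# corresponds to a surface about one meter away.
print(round(tof_distance(6.67e-9), 3))  # → 1.0
```

The tiny timescales involved (single nanoseconds per meter) are why time-of-flight sensing demands very fast, specialized receiver circuitry rather than an ordinary camera sensor.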
This technology was first discussed in June by Patently Apple after an Apple engineer, Jawad Nawasra, patented a 3D time-of-flight depth-mapping scan-pattern system. The patent describes the concept as:
A processor [that] identifies an object in the image of the scene, defines the non-rectangular area so as to contain the identified object, and processes the output of the optical receiver so as to extract a three-dimensional (3D) map of the object.
These rear-facing scanners will be a stepping stone to more AR applications on the iPhone. Just this year, the company released ARKit, allowing developers to build AR apps with ease. However, the accuracy of the iPhone's laser-scanning technology has been hampered by its front-facing-only placement. Repositioning the 3D sensor to the rear may resolve these setbacks, and Apple seems to be getting a head start on integrating next-generation AR technology for mobile users. The company still plans for its phones to eventually carry both front- and rear-facing sensors.
Apple CEO Tim Cook has openly advocated for AR and has described how significant it could be for the user experience. In a past interview with Buzzfeed, Cook said he believed we will eventually adopt augmented reality in our daily lives much as we warmed to mobile apps and fully embraced that technology, and that with time AR will become our new normal:
I think AR will change the way people shop. It’ll change the way that entertainment and gaming is done. It’ll change the way people learn, change education. It will literally change everything.
Project Tango in action
Competition – Alphabet's Project Tango
This is exciting news amid rumors that Apple's AR headset could ship as early as 2020. Apple isn't the only company pursuing this new take on reality, however. Alphabet has teamed up with Infineon to develop similar technology, first seen in Project Tango in 2014. The Infineon chip already ships in the Phab 2 Pro and the ZenFone AR, both running Google's Android OS, and the company is moving ahead with full development for upcoming Android phones.
According to market reports, the overall 3D integrated-circuit market, which includes products such as memories, sensors, MEMS, and LEDs, is likely to grow at a CAGR above 79% from 2015 to 2019. And to meet customer demands such as indoor navigation, better image storage, and higher resolution, phone manufacturers are increasingly adopting 3D motion sensors.
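For readers unfamiliar with the metric, CAGR (compound annual growth rate) is the constant yearly growth rate that would carry a market from its starting size to its ending size over a period. A quick sketch of the formula, using hypothetical figures chosen only to illustrate the math (not taken from the report):

```python
# CAGR = (end / start) ** (1 / years) - 1
# The values below are made up purely to demonstrate the formula.

def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# A market growing from 1.0 to 10.5 (arbitrary units) over the four
# years 2015-2019 implies roughly 80% compound growth per year.
print(f"{cagr(1.0, 10.5, 4):.0%}")  # → 80%
```

A CAGR near 80% means the market roughly quadruples every two years, which puts the report's forecast in perspective.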
If you are familiar with the front-facing scanner, have you been satisfied with its performance thus far? What would you do differently if you had a rear-facing option? Let us know what you think.