Apple has been on a roll since its latest conference, where it announced the upcoming iPhone X and two other iPhone models. What people seemed to like most was the new Face ID, which the iPhone X uses to unlock the phone, and all the advanced technology behind it. But could Android ever reclaim first place in the race, as many once expected it would with Google's Project Tango?
Face ID and the ARKit platform in iOS 11 are two of many things Android phones still don't have. It seems they will always be trailing the competition, especially now that Apple has introduced a front-facing depth camera that lets people create even more sophisticated AR effects.
ARKit apps on the iPhone X use the TrueDepth camera to capture information about the user's face and turn it into a 3D map of hundreds of dots. This lets the phone understand the facial geometry and work out the user's pose, face shape, and expressions, which in turn enables Animoji, Portrait Lighting effects, masks, and avatars. These AR effects are a huge advance for the front camera.
Unlike a typical Snapchat filter, which simply overlays a layer on the user's face, Face Tracking in ARKit detects the depth and expression of the face using the camera array and the six-axis motion sensors. This enables impressive selfie effects, such as virtual makeup or tattoos, facial hair, glasses, jewelry, hats, and other accessories. The device can also take a "face capture" to build a character avatar, as funny as the user wants it to be.
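For developers, the face tracking described above is exposed through ARKit's face-tracking API, which reports a fitted face mesh plus per-expression coefficients. The class and property names below follow Apple's ARKit framework; the way the values are consumed is an illustrative sketch, not a complete app.

```swift
import UIKit
import ARKit

// Sketch: receiving TrueDepth face-tracking updates from ARKit (iOS 11+).
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking is only supported on devices with a TrueDepth
        // camera, such as the iPhone X.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The anchor carries a mesh fitted to the user's face...
            let vertexCount = faceAnchor.geometry.vertices.count
            // ...and blend shapes report expressions as 0-1 coefficients,
            // e.g. how far the jaw is open. Values like these are what
            // drive Animoji-style avatars and selfie effects.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), jawOpen: \(jawOpen)")
        }
    }
}
```

An app would typically attach virtual makeup, glasses, or an avatar to the face mesh and animate it from the blend-shape coefficients each frame.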
Apple is building a massive audience eager to try and own the new camera-based AR technology in the iPhone X, while Google's ARCore delivers only basic single-camera AR, and only to a select few premium-priced Android models.
Apple’s iPhone TrueDepth vs Google’s Project Tango
Of course, Google is another giant when it comes to impressive, advanced technology. Although its phones haven't been as widely adopted as iPhones, Google is not far behind Apple in developing AR software. Since the just-announced, upcoming Pixel 2, there have been rumors that the company is building another phone capable of supporting the camera software, which would attract plenty of attention in the market.
We just have to remember how ambitious Google proved to be when it introduced the experimental Project Tango, whose AR required a more substantial depth camera. The technology traces its lineage back to the same work at PrimeSense and FlyBy Media, both of which Apple acquired.
Like Apple's Visual Inertial Odometry, Google's Tango uses SLAM (Simultaneous Localization and Mapping). Just as the iPhone X maps the user's face, Tango's depth camera system "perceives" everything that surrounds the owner. The result would be an exact map of any home, building, park, school, or other part of the world, much as Google's camera cars create Street View imagery for Maps.
Source: Apple Insider