Today we have Face ID and animated emoji that dance along with us, but what will we see tomorrow? The TrueDepth camera technology in the iPhone X could be capable of much more.
A bigger screen is great and a better design is more than welcome, but if anything really makes the iPhone X stand out from the crowd, it is the TrueDepth technology. This is Apple's first serious attempt at a much more advanced kind of camera, along the lines of what we have seen with Google's Tango technology, Intel's RealSense, or Microsoft's Kinect.
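What makes these cameras "more advanced" is that they recover a depth value for every pixel, not just color. Once you have a depth map and the camera's calibration, each pixel can be turned back into a 3D point. Here is a minimal sketch of that back-projection using the standard pinhole model; the intrinsics (`fx`, `fy`, `cx`, `cy`) are illustrative numbers, not the actual TrueDepth calibration.

```python
# Back-project a depth pixel into a 3D camera-space point (pinhole model).
# fx, fy: focal lengths in pixels; cx, cy: principal point.
# These values are made up for the example.

def deproject(u, v, depth_m, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Map pixel (u, v) with a metric depth to a camera-space (X, Y, Z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the image center lands on the optical axis.
print(deproject(320, 240, 0.5))  # (0.0, 0.0, 0.5)
```

Run over a whole frame, this produces the point cloud that face tracking and AR effects are built on.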
I can see this technology being used in many different ways, from mixed-reality and virtual-reality headsets to smart car dashboards.
Mixed-reality devices will need cameras like these. Think of HoloLens, a mixed-reality computer with a special set of cameras and sensors that track and recognize the environment well enough to realistically place virtual objects. Or other emerging, experimental headsets like Meta's and Avegant's. Blending virtual and real objects in everyday space, realistically, is what Magic Leap promised but has not yet delivered.
Could more advanced sensors like TrueDepth begin to pave the way for high-end applications? It's hard to know TrueDepth's range, but it's a start. Or perhaps TrueDepth could be built into future headsets, using face tracking to handle input and follow the movement of the eyes.
Biometrics everywhere. Face ID is essentially hands-free technology. That could open the door to using it to lock and unlock doors or control our car dashboards. Maybe one day we will even use it to unlock our Apple Watch. It strikes me as a flexible technology that could be used in many places. Maybe we are heading toward a Minority Report future where we will have to scan our faces for everything.
Robots, drones, vehicles, and mapping devices. TrueDepth is meant for photos, Face ID, and AR tricks right now. But what if it were used to help robots navigate, or to let a vehicle see better inside a parking garage? Cameras and SLAM (simultaneous localization and mapping) technology are what allow robots like Kuri to find their way, or help an autonomous vehicle drive. TrueDepth may be short-range, but what if the sensors of the future could scan at longer ranges and help avoid obstacles?
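To give a feel for the obstacle-avoidance idea: the simplest thing a robot can do with a depth frame is scan the "corridor" of pixels straight ahead and stop if anything is closer than a safety threshold. A real SLAM pipeline does far more (pose tracking, map building, loop closure); this toy sketch, with made-up names and numbers, only shows the kind of signal a depth sensor provides.

```python
# Toy obstacle check on a depth map (rows of per-pixel depths in meters).
# 'corridor' picks the columns roughly in front of the robot.
# All function names, column indices, and thresholds are illustrative.

def nearest_in_corridor(depth_rows, corridor=(2, 6)):
    """Return the closest depth reading inside the column corridor."""
    lo, hi = corridor
    return min(d for row in depth_rows for d in row[lo:hi])

def obstacle_ahead(depth_rows, threshold_m=0.5):
    """True if anything in the corridor is closer than the threshold."""
    return nearest_in_corridor(depth_rows) < threshold_m

# 3 rows x 8 columns of depths; one close reading (0.4 m) in the corridor.
frame = [
    [2.0] * 8,
    [2.0, 2.0, 1.8, 0.4, 1.9, 2.0, 2.0, 2.0],
    [2.0] * 8,
]
print(obstacle_ahead(frame))  # True
```

A navigation loop would run a check like this every frame and steer toward the corridor with the largest clearance.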