ARCore presents Depth API

ARCore provides the ability to build augmented reality (AR) experiences, allowing your devices to display digital content immersively in the real environment and making it instantly accessible and useful.


Earlier this year, ARCore introduced Environmental HDR, which brings real-world lighting to AR objects and scenes. Today, ARCore is opening a call for collaborators to try another tool that improves immersion: the new Depth API, which enables experiences that are vastly more natural, interactive, and helpful.


The ARCore Depth API allows developers to use depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.
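As a rough illustration of what that looks like from an app's point of view, the Kotlin sketch below enables depth on an ARCore session and fetches a depth image each frame. It assumes the shape the Depth API takes in the ARCore Android SDK (Config.DepthMode, Frame.acquireDepthImage()); the names in the collaborator build may differ.

```kotlin
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Opt the session into depth, skipping devices that do not support it.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Call once per rendered frame; returns null until enough motion has been observed.
fun acquireLatestDepth(session: Session): Image? {
    val frame: Frame = session.update()
    return try {
        // A DEPTH16 image: one 16-bit distance estimate, in millimeters, per pixel.
        frame.acquireDepthImage()
    } catch (e: NotYetAvailableException) {
        null // Depth is estimated from motion, so the first frames may have no map yet.
    }
}
```

The caller is responsible for closing the returned Image once its depth values have been read or uploaded to a texture.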



One important application of depth is occlusion: the ability for augmented reality objects to accurately appear in front of or behind real-world objects. Occlusion helps objects feel as if they are actually in your space by blending them with the scene. Starting today, we will begin making occlusion available in Scene Viewer, the developer tool that powers AR in Search, on an initial set of over 200 million ARCore-enabled Android devices.
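Mechanically, occlusion comes down to a per-pixel comparison between the depth of the real scene and the depth of the virtual object being rendered. In a real renderer this comparison runs in a fragment shader with the depth map bound as a texture, but the idea can be sketched on the CPU; the snippet below assumes the depth image uses Android's DEPTH16 layout (16-bit millimeter values, 0 meaning no estimate) and is illustrative rather than the exact collaborator API.

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Read the estimated distance, in millimeters, at pixel (x, y) of a DEPTH16 depth image.
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}

// A virtual fragment is hidden when something real sits closer to the camera.
// A value of 0 means "no depth estimate here", so never occlude on it.
fun isOccluded(realDepthMm: Int, virtualDepthMm: Int): Boolean =
    realDepthMm in 1 until virtualDepthMm
```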


In addition to enabling occlusion, having a 3D understanding of the world on your device unlocks a myriad of other possibilities. The ARCore team has been exploring some of these, playing with realistic physics, path planning, surface interaction, and more.


When applications of the Depth API are combined, you can also create experiences in which objects accurately bounce and splash across surfaces and textures, as well as new interactive game mechanics that let players duck and hide behind real-world objects.
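To make that concrete, here is one hypothetical way a "bounce off the real world" mechanic could test for contact, reusing the depthMillimetersAt helper sketched above. The ball's depth-image coordinates and camera distance are assumed to be computed elsewhere from its world pose and the camera's projection; none of this is the Depth API itself, just an illustration of using the depth map for collisions.

```kotlin
import android.media.Image

// Hypothetical collision test: the virtual ball hits a real surface once it
// reaches, or passes behind, the real-world depth measured at its screen position.
fun hasHitRealSurface(
    depthImage: Image,
    ballPixelX: Int,
    ballPixelY: Int,
    ballDepthMm: Int
): Boolean {
    val surfaceDepthMm = depthMillimetersAt(depthImage, ballPixelX, ballPixelY)
    return surfaceDepthMm in 1..ballDepthMm // 0 means no estimate, so no collision.
}
```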



The Depth API is not dependent on specialized cameras and sensors, and it will only get better as hardware improves. For example, the addition of depth sensors, like time-of-flight (ToF) sensors, to new devices will help create more detailed depth maps to improve existing capabilities like occlusion, and unlock new capabilities such as dynamic occlusion: the ability to occlude behind moving objects.


We've only begun to scratch the surface of what's possible with the Depth API, and we want to see how you will innovate with this feature. If you are interested in trying the new Depth API, please fill out the collaborators form.


via Shahram Izadi, Director of Research and Engineering (link)