Since Google launched ARCore in 2018, it has been quietly improving its augmented reality (AR) platform. Now the technology giant has announced a new tool, the ARCore Depth API, which lets developers create more realistic AR objects.
How is the depth map created? As Google explains in Tuesday's announcement post, the API captures multiple images from different angles and compares them as you move your phone, estimating the distance from the camera to every pixel.
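The idea behind depth from motion can be illustrated with a toy calculation (this is not ARCore code; the focal length, baseline, and disparity values below are hypothetical). As the camera translates between frames, nearby surfaces shift more pixels than distant ones, and depth falls out of that shift:

```python
import numpy as np

# Assumed camera parameters (hypothetical values for illustration).
focal_px = 500.0    # focal length expressed in pixels
baseline_m = 0.05   # how far the phone moved between the two frames (metres)

# Per-pixel shift (disparity) measured between the two frames, hypothetical.
disparity_px = np.array([50.0, 25.0, 10.0, 5.0])

# Classic triangulation relation: depth = focal * baseline / disparity.
depth_m = focal_px * baseline_m / disparity_px
print(depth_m)  # → [0.5 1.  2.5 5. ] : bigger shift means a closer surface
```

Real systems must first match pixels between frames to measure that disparity, which is the hard part; the sketch only shows why camera motion yields per-pixel distance.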
Occlusion, the ability for digital objects to accurately appear in front of or behind real-world objects, begins shipping on Tuesday in Scene Viewer, the developer tool that powers AR content in Search, for an initial set of 200 million ARCore-enabled Android devices.
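Conceptually, occlusion is a per-pixel depth comparison. A minimal sketch (not ARCore's implementation; the depth values are hypothetical) shows the test: a virtual object's pixel is drawn only where it is nearer to the camera than the real surface behind it.

```python
import numpy as np

# Real-world depth map (metres) as a Depth API might report it, hypothetical 2x3 grid.
real_depth = np.array([[1.0, 1.0, 3.0],
                       [1.0, 3.0, 3.0]])

# A virtual object placed 2 m from the camera, covering the same pixels.
virtual_depth = np.full_like(real_depth, 2.0)

# Draw the virtual pixel only where it is closer than the real surface;
# elsewhere the real object occludes it.
draw_virtual = virtual_depth < real_depth
print(draw_virtual)
# → [[False False  True]
#    [False  True  True]]
```

In the left column the real wall at 1 m hides the virtual object; on the right, the real surface at 3 m sits behind it, so the virtual object shows through.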
Google also said it was working with Houzz, a home renovation and design company, to bring the Depth API to the "View my room in 3D" experience in the company's app. Through the ARCore Depth API, users can get a more realistic preview of products they are about to buy by viewing 3D models next to the existing furniture in a room.
By combining the Depth API with other capabilities, developers can create experiences in which virtual objects bounce and scatter accurately across real surfaces and textures, as well as new interactive gameplay mechanics grounded in the real world.
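How a depth map enables such physics can be sketched in one dimension (a toy model, not ARCore code; the surface depth, velocity, and restitution values are hypothetical). The real-world depth under the object's screen position acts as a collision surface:

```python
# A virtual ball moves away from the camera and "bounces" when its depth
# reaches the real surface depth reported for its pixel (hypothetical values).
surface_depth_m = 2.0        # assumed real-world depth at the ball's position
ball_depth_m = 0.5           # ball starts 0.5 m from the camera
velocity = 0.4               # metres per simulation step, moving away
restitution = 0.5            # assumed energy retained on each bounce

trace = []
for _ in range(6):
    ball_depth_m += velocity
    if ball_depth_m >= surface_depth_m:   # collision with the real surface
        ball_depth_m = surface_depth_m
        velocity = -velocity * restitution  # rebound toward the camera, slower
    trace.append(round(ball_depth_m, 2))

print(trace)  # → [0.9, 1.3, 1.7, 2.0, 1.8, 1.6]
```

A real engine would do this in 3D with surface normals derived from the depth map, but the core loop, comparing simulated positions against measured depth, is the same idea.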
Developers interested in trying out the new Depth API will need to fill out a form, and Google will choose whom to work with. It remains to be seen when these features will roll out more widely. The move builds on earlier work: at the start of the year the company introduced Environmental HDR, which brings real-world lighting to AR objects and scenes, enhancing immersion with more realistic reflections, shadows and lighting.