Tue. Jul 14th, 2020


Shenango Valley Makers

ARCore Depth API to Bring More Realism to Augmented Reality

Google this week released the ARCore Depth API, which uses depth-from-motion algorithms to generate a depth map from a single camera. The API has been available in beta since last year, but thanks to Google’s collaboration with select creators, it is now part of the broader ARCore release for Android and Unity. Google says the Depth API is compatible with hundreds of millions of Android devices around the world.
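Depth from motion is conceptually similar to stereo vision: as the camera moves, two frames separated by a small baseline see the same point from slightly different angles, and the point's shift between frames (its disparity) reveals its distance. The sketch below illustrates that basic triangulation only; it is not Google's actual algorithm, and all names and numbers are illustrative.

```python
# Illustrative depth-from-motion triangulation, NOT Google's actual
# ARCore algorithm. Treats two frames from a moving camera as a stereo
# pair: depth = focal_length * baseline / disparity.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (meters) of a point observed in two frames of a moving camera."""
    if disparity_px <= 0:
        raise ValueError("point must shift between frames to triangulate")
    return focal_px * baseline_m / disparity_px

# Example: 500 px focal length, camera moved 10 cm, point shifted 25 px
# between frames -> depth = 500 * 0.1 / 25 = 2.0 m
print(depth_from_disparity(500.0, 0.1, 25.0))  # 2.0
```

Note the inverse relationship: nearby points shift more between frames than distant ones, which is why depth quality degrades for far-away surfaces.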

The core function of the Depth API is occlusion: the ability of digital objects to appear behind or in front of real-world objects in a realistic way. The idea is to make it seem as though the digital objects are really there with you in your space, bolstering the realism of the app or experience.
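Occlusion reduces to a per-pixel depth test: a virtual object's pixel is drawn only where the virtual object is closer to the camera than the value in the real-world depth map. The sketch below shows that comparison in miniature; in practice AR renderers do it in a GPU shader, and the names here are illustrative rather than part of the ARCore API.

```python
# Simplified per-pixel occlusion test, illustrative only: real AR
# renderers perform this comparison in a shader on the GPU.

def occlusion_mask(virtual_depth, real_depth):
    """Return a mask of pixels where the virtual object should be drawn.

    virtual_depth / real_depth: 2D lists of depths in meters;
    None in virtual_depth means no virtual content at that pixel.
    A virtual pixel is visible only if it is closer than the real scene.
    """
    mask = []
    for v_row, r_row in zip(virtual_depth, real_depth):
        mask.append([v is not None and v < r
                     for v, r in zip(v_row, r_row)])
    return mask

# A virtual cube at 1.5 m, with a real wall at 1.0 m covering the left
# half of the frame and open space at 2.0 m on the right:
virtual = [[1.5, 1.5], [None, 1.5]]
real    = [[1.0, 2.0], [1.0, 2.0]]
print(occlusion_mask(virtual, real))  # [[False, True], [False, True]]
```

The left column is masked out because the real wall (1.0 m) sits in front of the virtual cube (1.5 m), which is exactly the "digital object behind a real one" effect described above.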

This new release goes beyond occlusion. Google says the ARCore Depth API “unlocks more ways to increase realism and enables new interaction types.” Capabilities built on improved depth mapping and perception include realistic physics, surface interactions, and environmental traversal, all of which are now available via the API.

Snapchat is one of the companies Google worked with to improve the API. Snapchat Lens Creators, for example, were able to improve the behavior of Snapchat’s lenses, and the team said the Depth API’s single integration point streamlined the development process. Similarly, Lines of Play, an Android experiment from the Google Creative Lab, uses depth information to showcase both occlusion and collisions. Beyond entertainment, Google also believes the new depth realism can lead to practical improvements in utilitarian apps, such as video calls. A viewer could annotate what they see in three-dimensional space, giving the person on the other end of the call a clearer idea of, say, how to carry out a repair.

It’s worth reiterating that while the ARCore Depth API works with a single camera, it is also compatible with devices that include a dedicated depth or time-of-flight (ToF) camera. Pairing the API with the data generated by a ToF camera will only serve to further enhance the realism and experience.
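Google doesn't detail how ToF readings are combined with motion-derived depth; one common pattern for merging two noisy estimates of the same quantity is a confidence-weighted blend. The sketch below is purely illustrative of that general idea and is not ARCore's actual fusion algorithm.

```python
# Purely illustrative confidence-weighted fusion of two depth estimates;
# NOT ARCore's actual algorithm for combining motion depth with ToF data.

def fuse_depth(motion_m: float, motion_conf: float,
               tof_m: float, tof_conf: float) -> float:
    """Blend motion-derived and ToF depth by normalized confidence weights."""
    total = motion_conf + tof_conf
    if total == 0:
        raise ValueError("at least one source must have nonzero confidence")
    return (motion_m * motion_conf + tof_m * tof_conf) / total

# ToF sensors are typically more reliable at short range, so they would
# be weighted more heavily there (weights here are made up):
print(round(fuse_depth(2.1, 0.3, 2.0, 0.7), 2))  # 2.03
```

The appeal of any such fusion is that the sources fail differently: depth from motion struggles on textureless surfaces, while ToF works there but has limited range, so a blend can outperform either alone.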

The API and samples are available today via GitHub, though Google says the associated SDK is still forthcoming.

Author: EricZeman