Google’s mobile augmented reality platform is currently available on 1.2 billion Android devices. At I/O 2022, ARCore is getting key improvements that make it run faster and more reliably on the Pixel 6 and in Google Maps Live View.
To improve the experience, Google has upgraded ARCore’s depth capabilities, which let virtual objects appear behind real ones and enable other, more realistic placements. The phone’s existing camera can create a depth map covering about the size of a large room, or up to 8 meters (26 feet).
The average time to initial depth is reduced by about 15%, while average depth map coverage is now 95% and greatly improved on surfaces with minimal texture, such as a blank white wall or a black TV. These are areas where classic depth estimation struggles due to the lack of visual features.
In fact, Google says depth coverage from this camera-based approach is “now comparable” to coverage from dedicated depth hardware. On Pixel phones, Google also speeds up depth and motion tracking using the phones’ camera hardware, with these ARCore optimizations currently applied to the Pixel 4 and Pixel 6.
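For developers, these improvements surface through ARCore’s existing Depth API. As a rough Kotlin sketch (not from the article; the enableDepth and readDepth helper names are purely illustrative), enabling depth and reading a frame’s depth image looks roughly like this:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enable the Depth API if the device supports it (single-camera
// depth-from-motion; no dedicated depth sensor required).
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

// Acquire the latest depth map for the current frame, e.g. to occlude
// virtual objects behind real ones. Pixel values encode distance in millimeters.
fun readDepth(frame: Frame) {
    try {
        frame.acquireDepthImage16Bits().use { depthImage ->
            // Pass depthImage (width x height, 16-bit values) to your renderer
            // or sample it for occlusion tests.
        }
    } catch (e: NotYetAvailableException) {
        // Depth data is not ready yet (e.g. during the first few frames).
    }
}
```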
Meanwhile, Google has a new long-range depth capability that works up to 20 meters (65 feet) outdoors, where occlusion is now available “even in direct sunlight.” This is rolling out in Google Maps Live View.
Google has also improved motion tracking in ARCore, making it 17% faster to find the first plane while reducing tracking resets by 15%. Machine learning is being leveraged to drive both of these improvements, with Live View in Google Maps set to take advantage.
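The faster time to first plane matters because most AR apps wait for an initial tracked plane before placing any content. A minimal sketch (Kotlin, with an illustrative helper name not taken from Google’s documentation) of how an app typically polls for that first plane each frame:

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Returns the first plane ARCore has detected and is actively tracking,
// or null if motion tracking has not found one yet.
fun firstTrackedPlane(session: Session): Plane? =
    session.getAllTrackables(Plane::class.java)
        .firstOrNull { it.trackingState == TrackingState.TRACKING }
```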
Another big ARCore improvement at I/O 2022 is the Geospatial API, which lets augmented reality apps “calculate the latitude, longitude, altitude, and address of your phone with greater accuracy than a GPS sensor can provide.” It is based in part on Google’s work on Cloud Anchors.
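As a hedged sketch of what this looks like in practice (assuming a session that already has the required API key and location permission set up; the helper names below are illustrative), an app enables geospatial mode on the ARCore session and then reads the camera’s geospatial pose from the Earth object:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Earth
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Turn on the Geospatial API for an existing ARCore session, if supported.
fun enableGeospatial(session: Session) {
    if (session.isGeospatialModeSupported(Config.GeospatialMode.ENABLED)) {
        val config = session.config
        config.geospatialMode = Config.GeospatialMode.ENABLED
        session.configure(config)
    }
}

// Read the device's latitude, longitude, and altitude once Earth tracking is available.
fun logGeospatialPose(session: Session) {
    val earth: Earth? = session.earth
    if (earth?.trackingState == TrackingState.TRACKING) {
        val pose = earth.cameraGeospatialPose
        println(
            "lat=${pose.latitude}, lng=${pose.longitude}, " +
                "alt=${pose.altitude}m, horizontalAccuracy=${pose.horizontalAccuracy}m"
        )
    }
}
```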