As the summer winds down, much of the Metaverse fervor of the past year has started to wane a bit, at least for the time being.
While we’re big believers in the Metaverse as a long-term trend, the technology seems destined to take a bit of a time-out in the “Trough of Disillusionment” (to use the Gartner Hype Cycle terminology).
At the same time, another related technology is starting to make some serious technological progress and appears to be making its way onto the “Slope of Enlightenment.”
2019-2021 - Partly Cloudy
Before the Metaverse was the talk of the XR town, back in early 2019, the immersive industry was buzzing about a different idea called “The AR Cloud” or “Mirrorworld.”
From a high level, the premise of the AR Cloud is that AR experiences will be anchored to real-world locations, where they can be both more useful and more entertaining than their predecessors.
Imagine an AR layer added to a shopping mall to create a scavenger hunt. Or a festival app that helps you find where your favorite artists are performing and even creates special effects for their performances. Location-based AR could be used to show a “view into the past” at a historical location; there are so many possibilities to unlock here!
Traditionally, AR experiences combine computer graphics with the real world using a couple of different methods: image targets or plane tracking.
Image targets are the most widely used method (they've been in broad use for over ten years) and are responsible for most demos one would encounter where a business card or product packaging comes to life. Essentially, an AR application looks for a known image (a product logo, etc.) and uses that to understand where to place digital content in the real world.
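To make this concrete, here is a minimal sketch of image-target AR using ARKit's image-tracking API. The "AR Resources" asset group and the overlay content are illustrative assumptions, not part of any specific app discussed above:

```swift
import UIKit
import ARKit
import SceneKit

// Sketch: an AR view that watches for known reference images and
// overlays digital content on top of them when detected.
class ImageTargetViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Load the known images (e.g. a product logo) from the asset catalog.
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) else { return }

        let configuration = ARImageTrackingConfiguration()
        configuration.trackingImages = referenceImages
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the reference images;
    // content is positioned relative to the detected image.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let plane = SCNPlane(
            width: imageAnchor.referenceImage.physicalSize.width,
            height: imageAnchor.referenceImage.physicalSize.height)
        let overlay = SCNNode(geometry: plane)
        overlay.eulerAngles.x = -.pi / 2  // lay flat on the detected image
        node.addChildNode(overlay)
    }
}
```

The key point is that the digital content's position is defined entirely relative to a known image, which is why these experiences break down once you want content tied to a place rather than a printed target.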
Plane tracking was introduced to the mass market through Apple’s ARKit and Google’s ARCore SDKs about five years ago. This technology allows for a new set of AR applications that don’t require a specific image or shape to anchor digital content to the real world but instead look for an open flat space like a floor or tabletop.
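Plane tracking swaps the known image for detected surfaces. A minimal ARKit sketch of the idea, assuming a SceneKit-backed AR view:

```swift
import ARKit

// Sketch: world tracking with horizontal plane detection enabled.
// No reference image is needed; ARKit looks for flat open surfaces.
let sceneView = ARSCNView()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]  // floors, tabletops
sceneView.session.run(configuration)

// Detected surfaces arrive as ARPlaneAnchor objects through the session
// delegate; content can then be placed anywhere on a detected plane
// rather than on one specific known image.
```

This is what powers the familiar "place a virtual couch on your floor" class of apps, but note that the plane is anonymous: the app knows it found *a* floor, not *which* floor, which is exactly the gap the AR Cloud aims to close.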
In 2019, this started to change in a meaningful way as new technologies from Microsoft, Google, and others were released that allowed AR experiences to be tied or “anchored” to specific physical locations. Microsoft released “Azure Spatial Anchors” at Mobile World Congress in 2019, and Google released its version, called “Google Cloud Anchors,” soon after.
While these technologies were a breakthrough in terms of the types of experiences they could enable, they never really reached mainstream adoption, and most AR apps continued to use image targets or plane tracking.
This lack of widespread success boils down to two main factors: the effort required by developers/creators and the experience for the end users.
On the development side, to create a cloud anchor, a location had to be carefully scanned ahead of time using a mobile phone.
In terms of the final experience for end-users, these earlier technologies were limited to mobile apps, which people generally don’t like to install, especially for quick experiences.
Almost four years later, we’re seeing serious signs of progress on both fronts, although no solution currently addresses both the user and developer issues.
2022 - Things Are Getting Cloudier
On the user side, in March 2022, Snap announced support for "Custom Landmarkers," which are effectively Snap's take on the spatial anchors of 2019, with one key difference: these can be delivered through a Snapchat Lens. So for the 300M or so daily active Snapchat users, this means no app download is necessary to have these experiences.
The scanning process is currently comparable to traditional spatial anchor systems (see our behind-the-scenes look at creating a landmarker for the Lincoln Memorial), but Snap has already teased support for full cities, which will be expanding over time.
On the developer side of things, both Apple and Google have announced support for city-wide or even worldwide AR experiences that don’t require advanced scanning by the creators. Apple’s ARGeoAnchor supports a growing list of cities, and Google’s ARCore Geospatial API builds on their massive amounts of Streetview data to enable experiences in over 87 countries. Both technologies still require end-users to download an app, unfortunately, but these features have begun making their way into commonly used transit and sporting apps.
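For a sense of how little setup these geospatial APIs require, here is a sketch using Apple's ARGeoAnchor. The coordinates below (roughly the Lincoln Memorial, echoing the landmarker example above) are illustrative:

```swift
import ARKit
import CoreLocation

// Sketch: anchoring AR content to a real-world latitude/longitude
// with ARGeoAnchor. No prior scanning by the developer is required,
// but geotracking is only available in supported cities.
ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return }

    let session = ARSession()
    session.run(ARGeoTrackingConfiguration())

    // Illustrative coordinates near the Lincoln Memorial.
    let coordinate = CLLocationCoordinate2D(latitude: 38.8893, longitude: -77.0502)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: geoAnchor)
    // Content attached to geoAnchor renders at that physical location.
}
```

Google's ARCore Geospatial API follows the same pattern on Android (an `Earth` object that creates anchors from latitude, longitude, and altitude), which is why both can skip the per-location scanning step: they localize against existing mapping data instead.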
Hot off the Press
Meanwhile, Niantic (the company behind Pokémon Go) has just released a product called "Lightship VPS for Web."
This product promises to dramatically increase the accessibility of these experiences for end-users by enabling AR Cloud-style applications in a web browser (no app required!). The solution allows specific locations to be scanned and also supports over 100,000 "VPS-activated" locations from Niantic's popular AR games.
So while there is still not a perfect, one-size-fits-all solution for creating location-based AR experiences, there has been a tremendous amount of progress over the last year, and the future looks cloudier than ever!