Snap Opens Up Ways to Add AR Layers to the Real World

As VR and AR keep accelerating, companies behind creative virtual-world apps like Horizon Worlds and VRChat are grappling with a fast-growing and sometimes hard-to-govern mix of user-made spaces. AR may be headed the same way, with shared virtual experiences overlaid onto places in the physical world. Snap's AR Landmarkers, which layer AR on top of 3D-scanned real-world locations, are now opening up for developers to build on their own, and Snap sees this AR layer as a key piece of its road to AR glasses.

Snap has already opened Custom Landmarkers to early access for some developers, many of them building local culture or entertainment AR experiences: an AR experience at Yu & Me Books in New York; a historical AR experience in San Francisco's Union Square; a Charlie Parker jazz AR experience at a statue in Kansas City; and a Paul Smith wall in LA connected to an AR song performance by Megan Thee Stallion and Dua Lipa.

The AR effects are created in Snap's own Lens Studio desktop software, not on a phone, and lidar-equipped iPhones and iPads are needed to 3D-scan local landmarks, according to Snap's director of computer vision engineering, Qi Pan, who spoke to CNET.

Snap's approach also shows the challenges ahead in handling privacy and the respectful use of physical spaces. The company's original Landmarkers triggered augmented effects at 30 famous locations when users held up a phone camera in Snapchat. The same idea applies to these new AR landmarks, but only after the experiences are approved through Snap's submission process. That curated path could help limit misuse and ensure AR experiences are authorized for the spaces where they're activated. The experiences are launched either by finding the AR Lens on a creator's profile or by scanning a physical QR code at the location the AR is tied to.

Snap's location-based AR could also work across multiple locations, setting up virtual art walks or theatrical experiences much as Niantic's Lightship AR platform does. And the local AR effects look like stepping-stones toward Snap's vision for wearable AR glasses, which currently exist only in a developer prototype.

"These use cases are probably interesting on mobile as well as glasses, and there will be a bunch of use cases which will only be interesting on glasses in the future," Pan said of Snap's future AR strategy. "But investing in these use cases on mobile that are also interesting on Spectacles in the future, we really learn: what the value is that people get out of it."

