
Over the past few years, Snapchat’s growing collection of Lenses has offered some of the best examples of smartphone-powered augmented reality, enabling users to effortlessly add facial modifications, environmental effects, and location-specific filters to their photos. Now parent company Snap is enabling creators to use self-provided machine learning models in Lenses, hoping the initiative will inspire partnerships between ML developers and creatives.

The latest key change is an update to Lens Studio, the free desktop development app used to create most of Snapchat’s AR filters. A new feature called SnapML — unrelated to IBM’s same-named training tool — will let developers import machine learning models to power Lenses, expanding the range of real-world objects and body parts Snapchat will be able to instantly identify. As an example of the technology, Lens Studio will include a new foot-tracking ML model developed by Wannaby, enabling developers to craft Lenses for feet.
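In practice, an imported model is wired into a Lens through Lens Studio’s MLComponent scripting interface. The following is a minimal JavaScript sketch of that flow, based on Snap’s public Lens Studio documentation; the exact property names, the empty input-placeholder array, and the frame-timing constants are assumptions that should be checked against the current docs:

    // @input Asset.MLAsset model  -- the imported ML model, bound in the Inspector (assumed asset type)
    var mlComponent = script.getSceneObject().createComponent('Component.MLComponent');
    mlComponent.model = script.model;                  // attach the creator-provided model
    mlComponent.onLoadingFinished = onLoadingFinished; // callback fired once the model is ready
    mlComponent.build([]);                             // build with default input/output placeholders

    function onLoadingFinished() {
        // Run inference once per frame so the model's outputs can drive Lens visuals
        mlComponent.runScheduled(true, MachineLearning.FrameTiming.Update, MachineLearning.FrameTiming.Update);
    }

The broader point of this design is that the model runs as just another component in the Lens scene graph, so its outputs can feed the same visual effects pipeline Lens creators already use.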

Beyond Wannaby, developers including CV2020, visual filter maker Prisma, and several unnamed Lens creators are also working on SnapML-based filters. Lens Studio has also added new hand gesture templates, as well as Face Landmarks and Face Expressions features that should improve facial tracking in specific situations. Additionally, the user-facing Snapchat app will expand its Scan feature with the ability to recognize 90% of all known plants and trees, nearly 400 breeds of dogs, packaged food labels, and Louis Vuitton’s logo, plus SoundHound integration that lets users find pertinent Lenses using only voice commands.

Snap is also previewing a new feature, Local Lenses, which will “soon” let users share persistent augmented reality content within neighborhoods. Local Lenses will use large-scale point clouds to recognize multiple buildings within an area and map entire city blocks — an expansion of the company’s prior Landmarking feature, and the same vision pursued by companies such as Immersal and Scape (now owned by Facebook). Unlike those rivals, which have focused primarily on mapping and marketing applications, Snap plans to let users change the look of neighborhoods with digital content.

While the technology behind the feature is fascinating, the way Snap is promoting it today feels somewhat awkward given current social unrest over the Black Lives Matter movement. The company says Snapchat users will be able to “decorate nearby buildings with colorful paint” that will be visible to friends. Though its sample video is far more like Nintendo’s Splatoon than Sega’s Jet Set Radio, using large splashes of color rather than written words, we’ll have to see whether Local Lenses are used solely for positive purposes or become steps toward AR graffiti similar to what’s currently appearing in real cities as protests continue.

This post by Jeremy Horwitz originally appeared on VentureBeat.


Source:

https://uploadvr.com/snap-lenses-ml/

