Over the past few years, Snapchat's growing collection of Lenses has offered some of the best examples of smartphone-powered augmented reality, letting users effortlessly add facial modifications, environmental effects, and location-specific filters to their photos. Now parent company Snap is enabling creators to bring their own machine learning models into Lenses, hoping the initiative will inspire partnerships between ML developers and creatives.
The key change is an update to Lens Studio, the free desktop development app used to create most of Snapchat's AR filters. A new feature called SnapML (unrelated to IBM's same-named training tool) lets developers import machine learning models to power Lenses, expanding the range of real-world objects and body parts Snapchat can instantly identify. As an example of the technology, Lens Studio will include a new foot-tracking ML model developed by Wannaby, enabling developers to craft Lenses for feet.
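For ML developers curious what feeding a custom model into a tool like this involves, the sketch below shows one common way to prepare a model for on-device import: exporting a small PyTorch network to ONNX, a portable interchange format widely used for mobile inference. The toy architecture, file names, and input size here are illustrative assumptions, not Snap's documented requirements.

```python
# Minimal sketch: export a toy PyTorch model to ONNX, the kind of portable
# format import pipelines like SnapML are built around. The model, tensor
# names, and 256x256 input size are hypothetical, chosen for illustration.
import torch
import torch.nn as nn


class TinySegmenter(nn.Module):
    """Toy fully-convolutional model mapping an RGB frame to a 1-channel mask."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)


model = TinySegmenter().eval()
# Mobile runtimes typically expect a fixed input shape, so trace with one.
dummy = torch.randn(1, 3, 256, 256)

torch.onnx.export(
    model,
    dummy,
    "tiny_segmenter.onnx",   # file a Lens creator would then import
    input_names=["frame"],   # named tensors make wiring up camera input clearer
    output_names=["mask"],
    opset_version=11,
)
```

The fixed input size and named input/output tensors matter in practice: real-time AR runtimes generally bind the camera feed to a specific tensor shape rather than accepting dynamic dimensions.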
Beyond Wannaby, developers including CV2020 and visual filter maker Prisma, as well as several unnamed Lens creators, are working on SnapML-based filters. Lens Studio has also added new hand gesture templates, plus Face Landmarks and Face Expressions features that should improve facial tracking in specific situations. Additionally, the user-facing Snapchat app will expand its Scan feature with the ability to recognize 90% of all known plants and trees, nearly 400 breeds of dogs, packaged food labels, and Louis Vuitton's logo, plus SoundHound integration that lets users find pertinent Lenses using only voice commands.
Snap is also previewing a new feature, Local Lenses, which will "soon" let users share persistent augmented reality content within neighborhoods. Local Lenses promises to build large-scale point clouds that recognize multiple buildings within an area, an expansion of the company's prior Landmarking feature, in order to map entire city blocks. That's the same vision pursued by companies such as Immersal and Scape (now owned by Facebook), but unlike those rivals, which have focused primarily on mapping and marketing applications, Snap plans to let users change the look of neighborhoods with digital content.
While the technology behind the feature is fascinating, the way Snap is promoting it feels somewhat awkward given the current social unrest surrounding the Black Lives Matter movement. The company says Snapchat users will be able to "decorate nearby buildings with colorful paint" that will be visible to friends. Its sample video looks far more like Nintendo's Splatoon than Sega's Jet Set Radio, using large splashes of color rather than written words, but we'll have to see whether Local Lenses are used solely for positive purposes or become a step toward AR graffiti similar to what's currently appearing in real cities as protests continue.
This post by Jeremy Horwitz originally appeared on VentureBeat.
The post Snap’s Lens Studio Now Supports Custom ML-powered Snapchat Lenses appeared first on UploadVR.