Snap Lens Studio

Hello World: Building Augmented Reality for Snapchat

Fun fact – 30,000+ lenses have been created by Snapchatters, leading to over a billion views of lens content.

Table of Contents
Lens Studio
Hello World! Lens Studio Live Demo
High-Quality Rendering with Allegorithmic
Chester Fetch with Klei Entertainment
Cuphead and Mugman with Studio MDHR

Travis Chen

  • Worked at Bad Robot, Neversoft, and Blizzard

Lens Studio

  • Snapchat has always opened to the camera (camera-first), which has positively affected engagement
  • Pair your phone with Lens Studio
  • A community forum on the site lets developers ask and answer questions
  • Lens Studio has been out for less than four months, and its lenses have already resulted in over a billion experiences
  • The tool has been used for a variety of things: hamburger photogrammetry, full-screen 2D experiences,
    and the lenses shared on r/snaplenses
  • Distributing your lenses is really easy

  • Within Snapchat you can discover a lens and see more lenses by the same creator, or you
    can pull up on the base of a story to find out which lens was used.

Lens Boost – all users see the Snapchat carousel; a Lens Boost gets your lens into this carousel.

Find which template best fits your creative intent

Templates

  • Static object
  • Animated object
  • Interactive templates (tap, approach, look at) – see the tap-event sketch after this list
  • Immersive (look around, window)
  • For 2D creators (cutout, picture frame, fullscreen, soundboard)
  • Interactive path (requires idle, walk, and arrival states) – coming soon
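
The interactive triggers above ultimately map to events you bind in Lens Studio's scripting API. Below is a minimal tap-to-toggle sketch, written as TypeScript-compatible JavaScript of the kind you would attach to a Script component; the `target` input name is my own invention, and `script`/`TapEvent` are provided by the Lens Studio runtime, so treat this as illustrative rather than authoritative.

```typescript
// Minimal Lens Studio tap script (illustrative sketch).
// `script` is injected by the Lens Studio runtime; the @input directive
// exposes a SceneObject picker in the Inspector panel.

// @input SceneObject target

// React to screen taps by toggling the chosen object's visibility.
script.createEvent("TapEvent").bind(function (eventData) {
    script.target.enabled = !script.target.enabled;
});
```

Other interaction types follow the same bind-a-callback pattern, though the specific events and helper scripts differ per template.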

Examples

  • Brian Garcia, Neon Book
  • Pinot, 2D textures, using the cutout template and then Character Animator to animate
  • DFace, DDog, imported into Lens Studio (from the camera reflections feature)
  • Jordan & Snapchat, the ‘88 static Jordan 3D model
  • Netflix & Snapchat, Stranger Things – turning on the TV, spelling your name out,
    awakening the Demogorgon

Hello World

Lens Studio is made up of panels:

  • Live Preview panel – shows what the lens will look like, including tracked content and interaction support
  • Objects panel – like the Unity scene view, it shows what is in the preview
  • Resources panel – holds all your resources and is where you import assets

Workflow

Start with the animated object template
Select an object in the resources panel and move it to the objects panel
Google blocks + mixamo + export free animations from Adobe and import your character animated from
mixamo
File import monitor and astronaut
Child the imported 3D model to the fox as a child and delete the fox
Add a shadow
Sprite
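
Once the Mixamo character is in the scene, its animation typically plays through an AnimationMixer component. The sketch below shows starting that animation from a script; the layer name "Layer0", the input name, and the exact start() signature are recalled from Lens Studio's scripting docs and may differ in current versions, so verify against the in-app API reference.

```typescript
// Sketch: loop the imported character's animation when the lens starts.
// Assumes the FBX import produced an AnimationMixer with a layer named
// "Layer0" – check the Objects and Inspector panels for the real names.

// @input Component.AnimationMixer animMixer

script.createEvent("TurnOnEvent").bind(function () {
    // start(layerName, startOffsetSeconds, cycles); -1 cycles loops forever.
    script.animMixer.start("Layer0", 0, -1);
});
```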

High-Quality Rendering

Substance Painter is an app for applying materials and painting textures, in 2D or 3D.
Any material you bring in can be applied to an object; standard materials apply uniformly, but there
are also smart materials, which apply intelligently to the geometry (the rust example).

The Layers tab is like the scene view – the place to drag and drop materials.

Alphas provide a cutout, and you can apply materials through the cutout.

Upon clicking export, you can export directly for Lens Studio.
Challenge: Rubber Ducky

Chester Fetch with Klei Entertainment

Games studio since 2005

Why is AR interesting for Klei?

  • AR is about bringing the virtual world out to the player.
  • Shareable
  • Limited bandwidth
  • Seems hard
  • Would require too much time from others at the Studio

Cuphead and Mugman with Studio MDHR

  • For Cuphead and Mugman, Studio MDHR wanted to build a boss battle in Snap
  • All of the assets used in the Cuphead lens were created directly from the game
  • The lens chains together five 2D animations

 

Questions


  • Within the Snap app, I noticed you can rent/create a lens “as a service.” How does this pertain to Lens Studio?
  • Looking forward to a day when you can use targets like people for further interactive and shareable content (as in the Mugman example), when will person/object recognition be available to developers and users of Snap?
  • What is the GitHub account for Snap?

Relationships Matter: Maximizing Retention in VR
Isabel Tewes
isabel@oculus.com

There are many ways to measure success, but coming from the mobile world (push-notification
strategy, habit-forming retention mini-games, funnel analysis, making a real difference when multi-million-user bases exist), Isabel focused on retention today.

Retention defined

When someone loves your app and comes back to it time and time again.

Make a great first impression

  • pinpoint your magic
  • get to that moment quickly
  • guide people through their first experience

Share your personality

  • create a tone and stay consistent
  • rethink your interactions
  • identify the pain points
  • design against them / take advantage of them

Create a lasting connection

  • make the right decisions early

First Contact – Bernie Yee

He focused on how VR can be really overwhelming and having someone acknowledge your actions can be really powerful.

The significance of the robot waving – the way the robot waves to you at the beginning of the experience draws on a universal sign. You know you’re supposed to wave back, and the personality of your wave then comes out as well.

Wave Finding – the robot helps guide your attention to where you should be going, guiding users through
the experience.

Nudge – Nudge your users patiently and with intent

Rick and Morty – Virtual Rick-ality

Establish a tone and be consistent

Against Gravity – Rec Room

Create a safe environment that people come back to
Minimizing trolling and harassment
“Whatever you are when your [organization] is small, remember you’ll only be a larger version of that.”

Making friends in Rec Room
Two people become friends in Rec Room by shaking hands.

High fiving in Rec Room

Upshot: Create your values early and stick to your values ruthlessly.

 

UX for Created Realities

Context: I’m listening to Joshua Walton at the #CreatingRealityHackathon at USC. His talk is called UX for Created Realities. I personally found the part on Microinteractions (Dan Saffer) to be interesting.

Brainstorming

  • A lot of this returns bad ideas, rather than ideas generated individually
  • The caveat is that with the right structure for the brainstorming session it can be more
    beneficial
  • Key part: there are no bad ideas, but also no good ideas. People want to be heard, so build
    on the ideas, go for quantity, respect everyone, and let everyone speak.

Tactics for brainstorming

  • Do your project in 1 hour
  • Work both top-down and bottom-up
  • Iterate without fear

Microinteractions (Dan Saffer) and Tips

  • Focus on dynamics that build on knowledge in the head (the Lab’s Longbow)
  • Think about sense ratios and focus (Superhot)
  • Sensing is a creative part of the design (some of the most innovative work creates a sense you
    didn’t know you needed)
  • Use sounds right away
  • Encourage people to look around
  • Consistent interactions are way more valuable than realistic interactions
  • Content is king, context is scale
  • As long as there’s language, we’ll have 2D
  • When you’re creating these new realities, be a gracious host – a lesson from the hospitality industry
  • Create a consistent space from which to explore

SVVR #49 Summary Notes

SVVR Meetup #49

SVVR Passport – A membership program

This is SVVR’s new shared co-working and demo space.

  • 24 hour access
  • Demo equipment and library
  • Digital benefits

SVVR VR Mixer 2018

  • March 21st, 2018

Lumus Optics

  • An Israeli company doing reflective waveguide optics whose mission is to be the
    leader in AR displays; they hold more than 60 patents
  • Highest performance for smallest form factor
  • What is waveguide tech?

This boasts

  • Wide FOV (40° – 55°)
  • Compact 1.7mm Form Factor
  • Life-like Image
  • True See-Through
  • Daylight Readable

Founded in 2000.
Partnered with Quanta Computer (which will produce the optics engine),
Flex (an OEM using the Lumus reference design), and Deep Optics (vergence-accommodation).

They debuted their new prototype at CES; it looks like equipment out of Dragon Ball.

Developers

Developers will be able to deploy Vuforia applications, or demos built with other AR libraries, to these displays.

siram@lumus-optical.com

High Fidelity – Philip Rosedale

  • A virtual world economy is probably going to need:
    • Identity
    • IP rights
    • Proof-of-purchase
    • Payment system

Full Decentralization?

Transaction expenses are high (syncing, gas fees), while VR transactions need to happen quickly.
Federated consensus – near-zero transaction fees, etc.

Bad Monetary Policy

  • Bitcoin isn’t viable as a usable currency, because its price keeps rising so much
    and its circulation is fixed
  • Increase circulation as people join
  • Second Life – created more money at roughly the same rate that people came online
    • Use a smart contract to create an exchange-rate scaling (a toy sketch of the idea follows)
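
The Second Life point suggests a simple supply rule: mint currency at roughly the rate people come online so the amount of money per user, and therefore the exchange rate, stays stable. The sketch below is my own toy illustration of that arithmetic, not High Fidelity's actual smart-contract logic; the constant and function names are made up.

```typescript
// Toy sketch of "increase circulation as people join": hold coins-per-user
// roughly constant so the exchange rate stays stable. Purely illustrative.

const COINS_PER_USER = 1000; // arbitrary target ratio

// Given the current supply and the current active-user count,
// return how many new coins to mint this period.
function coinsToMint(currentSupply: number, activeUsers: number): number {
  const targetSupply = activeUsers * COINS_PER_USER;
  return Math.max(0, targetSupply - currentSupply);
}

// Example: 10,000 users against a 9.5M-coin supply -> mint 500,000 to hold the ratio.
console.log(coinsToMint(9_500_000, 10_000)); // 500000
```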

High Fidelity Coin (HFC)

  • Stable
  • Blockchain cryptocurrency
  • Easy to use, initial grants for proof-of-identity, and multiple currency exchange

Philip, waxing about technology, said something to the effect of: with any big shift in technology,
what often proves compelling is doing things that are already done in the world, e.g. payment.

Philip also mentioned allowing duels over identity via smart contracts.

Cymatic Bruce

Currently at 6D.AI

He came back from Japan, where he saw many VR experiences, met a ton of the community, and discovered VTubing (which
can also be found through IMVU, etc.).

He learned a lot; PC VR isn’t big in Japan due to space constraints and culture.

PSVR location based VR situation

VR Zone Shinjuku – a world-class location-based experience. Tickets range from $10–15 per experience.
They take first-timers very seriously and, interestingly, take matters of safety really seriously.
Bruce really admired their standard of using motion platforms for everything, tying everything into an
IP, and executing well. He did a DBZ experience and learned how to shoot a
Kamehameha. Mario Kart was on a motion platform, etc.

How long is the experience?
How much of what you saw in Japan will likely have an audience in EN?
How is FOVE doing?
VR experiences dealing with food?


Reflection: Bigscreen VR

Before I continue, I’ll take a step back and define any VR application that brings people together in an environment as “social VR.” What makes Bigscreen interesting is how it handles the paradox of choice. In other social applications (VRChat, AltspaceVR, High Fidelity) there are no core activities beyond simply being together, which makes the choice of what to do very broad. With Bigscreen it’s a display extension or a place to watch movies with a cinema experience. Simple.

Recently I saw that they enticed Paramount (?) studios to do a premiere of Top Gun in VR. I thought that was a nice blend of social VR with a social construct we know and love, going to the movies, though I didn’t attend the screening. I’ve used Bigscreen recently and the environments are nice, with physically based shadows and lighting. Bigscreen’s reliance on virtual displays makes it well positioned to benefit from forthcoming improvements in display clarity.

One thing worth noting is that I haven’t been able to get audio to work for everyone in a living-room setting. When you put your own content on the big screen, audio seems to play only for you. This wasn’t the case in the movie theater, where a host had no problem playing Rogue One off of Netflix for all to enjoy with sound.

Finally, when Oculus Home released its Core 2.0 update, everyone on Rift gained the ability to see and use their desktop screen in VR. SteamVR also enables desktop viewing. Viveport? Although Oculus Home today doesn’t offer social the way Bigscreen does… this probably affects the uniqueness of Bigscreen’s total product and must be considered for continued fundraising.

Supermedium

Part of YC’s W18 cohort, Supermedium is a browser for curated WebVR experiences. It features work from the likes of Inigo Quilez, Ricardo Cabello, Marpi, and others, and the browser can be downloaded on Windows today.

I enjoyed Shadertoy’s audio visualizations in particular, though they didn’t support Touch controllers. An idiosyncrasy of the platform right now is that experiences usually model only HTC Vive controllers. This means that when you use Oculus Touch controllers, the API still receives your button presses and input, but the controller visually appears to be an HTC Vive wand.
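
That mismatch is mostly cosmetic because the browser-side input API reports controllers generically. Below is a minimal sketch of polling VR controller buttons with the standard Gamepad API (the mechanism WebVR-era content leaned on); the device only distinguishes itself through its id string, which is why Touch input can happily drive a Vive-shaped model.

```typescript
// Poll connected controllers via the Gamepad API and log button presses.
// Vive wands and Oculus Touch both surface the same buttons/axes arrays;
// only the id string differs.

function pollControllers(): void {
  const pads = navigator.getGamepads ? navigator.getGamepads() : [];
  for (const pad of pads) {
    if (!pad) continue; // unused slots are null
    pad.buttons.forEach((button, index) => {
      if (button.pressed) {
        console.log(`${pad.id}: button ${index} pressed (value ${button.value})`);
      }
    });
  }
}

// Gamepad state is polled, not event-driven, so sample it every frame.
function loop(): void {
  pollControllers();
  requestAnimationFrame(loop);
}
requestAnimationFrame(loop);
```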

Check out the Supermedium website here. The founders, Kevin Ngo and Diego Marcos, are experienced contributors to WebVR efforts, joined by technical artist Diego Goberna.

Reblog: Player – Game – Designer

The above work comes from Thomas Bedenk, whom I met at VRX London in 2016. See his page for sources (link at the bottom).

The model takes an interactive application, namely a game together with its production and consumption, as its substrate, and highlights how the components Player, Game, and Designer fit into the full picture.

Read the full version from the author’s website.

Reblog: Can gaming & VR help you with combatting traumatic experiences?

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community. The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

Trauma affects a great many people in a variety of ways: some suffer from deep-seated trauma such as Post-Traumatic Stress Disorder caused by war or abuse, and others suffer from anxiety and phobias caused by traumatic experiences such as an accident, a loss, or an attack.

Each needs its own unique and tailored regimen to lessen the effects and to aid individuals in regaining some normalcy in their lives. Often these customized treatments are very expensive and difficult to obtain.

In the world of ubiquitous technology and an ever-increasing speed in visual-based treatments, these personalized therapies are becoming more accessible to the average sufferer.

What I would like to do is take you through some of the beneficial effects that gaming and VR can have on those suffering from trauma, what these treatments sometimes look like and what the pitfalls can be when using them.
I am not a specialist in psychology or trauma treatment, but I feel that increasing awareness of what is out there is beneficial to everyone, and perhaps can help those suffering from trauma to take the first step in seeking help.

Games & VR as a positive mental activity

To date, a few studies have been done on the effectiveness of gaming and virtual reality gaming in therapeutic treatments, but due to the brief history of both, a lengthy study has yet to be completed. Still, one thing we can be sure of is the first-hand accounts of those who have experienced the benefit of these experiences.

A very basic exercise for those suffering from trauma is to engage in mindfulness or meditation exercises. Meditation guided through a VR system can have very positive effects on an individual’s disposition. Due to the immersive nature of VR, you can let yourself fall away into another world and detach yourself from the real world. It is as though you are “experiencing a virtual Zen garden” dedicated entirely to you.

This effect of letting go and identifying with an external locus is probably one of the most effective attributes of gaming and VR. It is the act of not focusing on yourself, on the memories and cues that cause the underlying trauma, but focusing on and engaging with another character, an avatar, on-screen who for all intents and purposes has led and now leads (through you) another life. This character has its own sense of agency to complete a quest or goal, totally independent from you.

The most effective way that games allow you to let go is to offer you a challenge that requires your entire focus. To enhance this, most games offer group challenges. These are two core drivers in improving positive emotions, personal empowerment, and social relatedness. For individuals who suffer from either PTSD or other deep traumas, being given a vehicle that allows easier connections with others helps them cope with their own traumas much better. It takes their mind off what is troubling them and, through repetition, can even lead to a lessening of symptoms.
Did you enjoy this article? Then read the full version from the author’s website.

For a more behind the scenes look at how this manifests in practice, check out this PBS Frontline documentary. Master Sgt. Robert Butler, a Marine combat cameraman, recounts his struggle with PTSD and how Virtual Iraq helped.

OC4 Talks: Designing for Feeling – Robin Hunicke

Notes from OC4 Designing for Feeling – Robin Hunicke

Philosophy of Exploration and Design

Robin opened with her concept of triple-E content (a play on AAA, disambiguated below) and extolled the value of figuring out where you want to go first.

  • Elegant, Expressive, and Emotional content (EEE)
  • She presented a 2×2 matrix with high impact, low cost as the quadrant where most content aims… the problem, she expressed, was that the matrix leaves out elegance as a focal point

Tips

  • Evolve concepts, tools, & solutions to reduce cost & improve impact
  • Evolve UX
  • Expressive – players speak

Process & the Broad Applicability of EEE

Axes in her slide graphic included rational, EEE, baroque, and scripted (e.g. The Sims, Black Ops).

  1. Test your concept like it isn’t yours
  2. Throw away ideas
  3. Find the feeling in your idea (lock in on it)
  4. This is your secret sauce
  5. Test the prototype like it isn’t yours
  6. The prototype is different from what is on paper
  7. The process is what helps
  8. Repeat

Luna

Uncertainty is surpassed only by the effort that needs to go into it

For Luna she took inspiration for the design from a paper world feel, influenced by origami, and during the process she packed her mind with fairytales

Not everyone needs to get into hands-on design influences, but she
thought that making origami, working with the concepts, and learning how the
tactile quality turned out were really informative.
I’ve definitely found with Project Futures: The Future of
Farming that it’s really key to draw influence from real-world knowledge and from folks who have built
the constructs or structures that lend the app’s world space its look and feel, namely
Infarm.

One important side note Robin dropped was that none of the characters in Luna have genders.

Other Random Notes

  • Mood boards
  • Luna started out as a PC and VR title from the beginning
  • The demo and vision existed before the actual prototype (i.e. the hands
    controlling the stars)
  • Tested prototype part 2 and threw it away
  • Music is integrated into the testing process with feeling at the center, namely, “what kind of feeling is it communicating?”

Timeline
4 year process for Luna – started out as a drawing in a book

  • They went through a massive phase where no VR was implemented; then in November 2016 it came to life in VR (7-person
    team)
  • By 2017 the pieces were starting to become cohesive and informed by the feeling

Failing forward was key; it takes a lot of work.

You have to lean into the idea of making interesting, different, challenging titles.

UPSHOT = diverse and inclusive teams, treating failure as okay, and the belief that you’re
going to get there. That combination leads to triple-E content and successful titles.

Developer Blog Post: ARKit #1

When developing AR applications for Apple phones there are two cameras we speak about. One is the physical camera on the back of the phone. The other is the virtual camera in your Unity scene, which in turn matches the position and orientation of the real-world camera.

A virtual camera in Unity has a setting called Clear Flags which determines which parts of the screen will be cleared. On your main virtual camera, setting this to “Depth Only” instructs the renderer to clear only the depth buffer rather than drawing a virtual background environment, allowing the seamless overlay of virtual objects on the physical camera feed, which remains the backdrop.
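
Unity and the ARKit plugin handle this wiring, but the shape of the per-frame update is easy to show outside of C#. The sketch below uses made-up Pose and VirtualCamera types purely to illustrate the idea described above: copy the tracked device pose onto the virtual camera each frame and leave the color buffer alone so the camera feed stays visible underneath. It is not the Unity ARKit plugin's actual API.

```typescript
// Illustrative types only – not a real AR SDK.

interface Pose {
  position: [number, number, number];            // meters, world space
  orientation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface VirtualCamera {
  position: [number, number, number];
  orientation: [number, number, number, number];
  clearColorBuffer: boolean; // false is the analogue of Unity's "Depth Only"
}

// Each frame: mirror the physical camera's pose and never clear the color
// buffer, so the device camera feed remains the backdrop and only virtual
// objects are drawn on top of it.
function updateVirtualCamera(camera: VirtualCamera, trackedPose: Pose): void {
  camera.position = trackedPose.position;
  camera.orientation = trackedPose.orientation;
  camera.clearColorBuffer = false; // clear depth only
}
```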

More to come on differences between hit testing and ray casting in the context of ARKit and a broader look at intersection testing approaches in the next post.