Reblog: The Mind-Expanding Ideas of Andy Clark

The idea of the extended mind, or extended cognition, is not part of common parlance; however, many of us have espoused this idea naturally since our youth. It’s the concept that we use external information, physical or digital, to extend our knowledge and thinking processes.

Today’s “born-digital” kids––the first generation to grow up with the Internet, born 1990 and later––store their thoughts, education, and self-dialogue in external notes saved to the cloud. [1]

“… [Andy Clark describes us as] cyborgs, in the most natural way. Without the stimulus of the world, an infant could not learn to hear or see, and a brain develops and rewires itself in response to its environment throughout its life.”

via: read the full version on the author’s website.

[1] McGonigal, Jane. Reality Is Broken, p. 127.

Rapid Worldbuilding with ProBuilder

ProBuilder is totally free and available through the Package Manager.
If you’re using Unity 2017 or 5.6 you’ll still get critical bug fixes; from 2018 onwards there’s fuller support. Using 2018.1.0b3 I hit a fairly severe crash bug, so update to b12 IMO.
Polybrush and ProGrids have to be downloaded separately from Unity.
To replace ProBuilder objects with polished geo you can play around with the Unity FBX Exporter.

Real Quick Prototyping Demo

  • Main ProBuilder window (Tools > ProBuilder > ProBuilder Window) – holds the core tools; ProBuilder is designed so you can mostly ignore it when you’re new to the tool
  • Face, Object, etc. modes – restrict your selection so you only touch that type of element
  • A good way to learn the tool is to go through the main ProBuilder window and check the shortcuts
  • It really helps to keep things as simple as possible from the get-go. Don’t add tons of polys
  • Shape selector will help you quickly make stuff
  • Connect edges and insert edge loop
  • Holding shift to grab multiple
  • ‘R’ will give you scale mode
  • Extrude is a fantastic way to add geometry
  • Grid Size – keep it at 1 (1 meter); this is important for avoiding mistakes when creating geo and for knowing your angles (see the snap-to-grid sketch after this list)
  • Use the ortho top down view to see if your geo fits your grid
  • Detach Face with the default setting splits off the selected geo while keeping it part of the same object
  • Detach Face with the other setting creates a new GameObject instead
  • The pivot point often needs to be rejigged (solutions: the “Center Pivot” object action, or set the pivot to a specific element)
  • Center the pivot and put it on the grid using ProGrids
  • Settings changes become the default
  • Use the Poly Shape tool to spin up a room + extrude quickly
  • Merge objects
  • think in terms of Quads
  • Try selecting two vertices and connecting them (Alt + E)
  • Select Hidden as a toggle is a great option, because you are clicking on a 2D projection of the scene, so by default you will select whatever is drawn closest to the camera!
  • Crafting a doorway can be done using extrude and grid-size changes; toggle the views (X, Y, and Z) to help with that
  • Hold V to vertex-snap geo; this will save you time later on
  • Alt+C will collapse verts (as in the ramp example where the speaker started with a cube)
  • Weld vs. Collapse – Weld merges verts within a specified distance (great for joining two hallways), while Collapse pushes all selected verts into a single point
  • Grow selection and smooth group
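
Not from the talk, but to make the 1 meter grid discipline concrete, here’s a minimal snap-to-grid helper in C# (ProGrids does the equivalent for you in the editor; the class and method names are placeholders of my own):

using UnityEngine;

public static class GridSnap
{
    // Snap a world position to the nearest grid point.
    // gridSize = 1f matches the 1 meter grid recommended above.
    public static Vector3 Snap(Vector3 position, float gridSize = 1f)
    {
        return new Vector3(
            Mathf.Round(position.x / gridSize) * gridSize,
            Mathf.Round(position.y / gridSize) * gridSize,
            Mathf.Round(position.z / gridSize) * gridSize);
    }
}

For example, GridSnap.Snap(new Vector3(0.9f, 2.2f, -1.4f)) returns (1, 2, -1).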

Polybrush Stuff

  • Add more detail with loops or subdivide (smart connect)
  • Polybrush will let you sculpt, smooth, texture blend, scatter objects, etc.
  • Modes like smoothing
  • TODO: explore something with prefabs
  • N-gons are bad because everything is made up of tris

Texturing stuff

Open up the UV editor
  • By default, everything is on Auto, which works with the in-scene handles toggle
  • When you’re prototyping this lets you avoid the fancier toolbar options

Question

  • Why is ProGrids helpful? Short answer: if you’re not super familiar with 3D modeling and creation software (e.g. Maya) you can create simple geo without leaving the Unity editor.
  • Why would you be obsessive about making sure your geo fits your 1 meter grid size? Short answer: this helps you avoid errors with geo creation such as horrid angles and hidden faces.
  • Can you talk a little bit about automation with ProBuilder? (A rough scripting sketch follows below.)
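
The answer here wasn’t captured, but roughly: ProBuilder ships a scripting API for generating and editing geometry from code, which is the hook for automation. A rough sketch follows, with the namespace and calls (ShapeGenerator.GenerateCube, ToMesh, Refresh) assumed from the ProBuilder 4.x API rather than taken from the talk, so verify them against the package docs for your version:

using UnityEngine;
using UnityEngine.ProBuilder;

public class ProBuilderAutomationSketch : MonoBehaviour
{
    void Start()
    {
        // Generate a 1x1x1 ProBuilder cube from script (API assumed, not verified for every version).
        ProBuilderMesh cube = ShapeGenerator.GenerateCube(PivotLocation.Center, Vector3.one);
        cube.transform.position = new Vector3(0f, 0.5f, 0f);

        // After changing mesh data from code, rebuild and refresh the underlying Unity mesh.
        cube.ToMesh();
        cube.Refresh();
    }
}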

Reblog: 3 ways to draw 3D lines in Unity3D

Just as I was thinking about an interesting demo to play with drawing functions in Unity3D, Mrdoob published his Harmony drawing tool made with HTML5/Canvas. It looks really cool, so I thought: how about doing this in 3D? I only had to figure out how to draw lines.

I did some research and below I present 3 different solutions. You can grab the source of the examples discussed below here.

Drawing lines with Line Renderer [demo]

When it comes to lines, the first thing you’ll bump into in the Unity3D API is the Line Renderer component. As the name suggests, it is used to draw lines so it seems the right tool for the job. Lines in this case are defined by 2 or more points (segments), a material and a width.

It has an important limitation: the line must be continuous. So if you need two lines, you need two renderers. The other problem is that the Line Renderer acts very strangely when new points are added dynamically. The width of the line does not seem to render correctly. It’s either buggy or just wasn’t designed for such use. Because of these limitations I had to create a separate Line Renderer for each tiny bit of line I’m drawing.

It was easy to implement, but not very fast since I end up spawning lots of GameObjects, each with a LineRenderer attached. It seems to be the only option if you don’t have Unity3D Pro, though.
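
For reference, here’s my own minimal sketch of that “one LineRenderer per segment” approach (not the article’s source, and it uses the current positionCount/SetPosition API rather than the older one the author would have had):

using UnityEngine;

public class SegmentDrawer : MonoBehaviour
{
    public Material lineMaterial; // assign any material in the Inspector
    public float lineWidth = 0.02f;

    // Spawns one GameObject with a LineRenderer for every segment drawn.
    public void AddSegment(Vector3 start, Vector3 end)
    {
        var go = new GameObject("Segment");
        var lr = go.AddComponent<LineRenderer>();
        lr.material = lineMaterial;
        lr.startWidth = lineWidth;
        lr.endWidth = lineWidth;
        lr.positionCount = 2;
        lr.SetPosition(0, start);
        lr.SetPosition(1, end);
    }
}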

Drawing lines as a mesh using Graphics [demo]

The Graphics class allows you to draw a mesh directly without the overhead of creating game objects and components to hold it. It runs much faster than Line Renderer, but you need to create the lines yourself. This is a bit more difficult, but it also gives you total control of the lines – their color, material, width and orientation.

Since meshes are composed of surfaces rather than lines or points, in 3D space a line is best rendered as a very thin quad. A quad is described with 4 vertices, and usually you’ll only have the start and end points and a width. Based on this data you can compute a line like this:

// Normal of the plane through start, end and the origin
Vector3 normal = Vector3.Cross(start, end);
// Perpendicular to the line within that plane: the "thin" side of the quad
Vector3 side = Vector3.Cross(normal, end-start);
side.Normalize();
// The four corners of the quad, half a line-width away on each side
Vector3 a = start + side * (lineWidth / 2);
Vector3 b = start + side * (lineWidth / -2);
Vector3 c = end + side * (lineWidth / 2);
Vector3 d = end + side * (lineWidth / -2);

First, you get the normal of the plane on which both the start and end vectors lie. This will be the plane on which the line-quad will be located. The cross product of the normal and of the difference between the end and start vectors gives you the side vector (the “thin” side of the quad). You need to normalize it to make it a unit vector. Finally, calculate all 4 points of the rectangle by adding the side vector, multiplied by half the width, to both the start and end points in both directions. In the source code all this happens in the MakeQuad and AddLine methods, so take a look in there.

It wasn’t easy to implement, but once I got it working it runs pretty fast.
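
Here is my own minimal sketch of the idea (the article’s actual MakeQuad and AddLine methods are in the linked source; this is just an approximation): build a quad mesh from the four points above and submit it with Graphics.DrawMesh every frame.

using UnityEngine;

public class LineMeshDrawer : MonoBehaviour
{
    public Material lineMaterial;
    public float lineWidth = 0.05f;
    Mesh quad;

    void Start()
    {
        // Example line; note Vector3.Cross(start, end) degenerates if the points are collinear with the origin.
        Vector3 start = new Vector3(1f, 0f, 0f);
        Vector3 end = new Vector3(0f, 1f, 0f);

        Vector3 normal = Vector3.Cross(start, end);
        Vector3 side = Vector3.Cross(normal, end - start).normalized;
        Vector3 a = start + side * (lineWidth / 2);
        Vector3 b = start + side * (lineWidth / -2);
        Vector3 c = end + side * (lineWidth / 2);
        Vector3 d = end + side * (lineWidth / -2);

        quad = new Mesh();
        quad.vertices = new[] { a, b, c, d };
        quad.triangles = new[] { 0, 1, 2, 2, 1, 3 }; // two triangles forming the thin quad
        quad.RecalculateNormals();
    }

    void Update()
    {
        // DrawMesh only queues the mesh for one frame, so call it every frame.
        Graphics.DrawMesh(quad, Matrix4x4.identity, lineMaterial, 0);
    }
}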

Direct drawing with GL [demo]

No speed is fast enough! Instead of leaving this topic and living happily with the Graphics solution, I kept searching for something even better. And I found the GL class. GL is used to “issue rendering commands similar to OpenGL’s immediate mode”. This sounds fast, doesn’t it? It is!

Being much easier to implement than the Graphics solution, it is a clear winner for me. The only drawback is that you don’t have much control over the appearance of the lines: you can’t set a width, and perspective does not apply (i.e. lines that are far away look exactly the same as those that are close to the camera).
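
A minimal sketch of what GL drawing looks like, again my own code rather than the article’s (built-in render pipeline, attached to a camera):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class GLLineDrawer : MonoBehaviour
{
    public Material lineMaterial; // e.g. a material using the built-in "Unlit/Color" shader

    // Runs after the camera on this GameObject has rendered (built-in render pipeline only).
    void OnPostRender()
    {
        if (lineMaterial == null) return;

        lineMaterial.SetPass(0);
        GL.Begin(GL.LINES);
        GL.Color(Color.white);
        // Each pair of GL.Vertex calls is one line segment.
        GL.Vertex(new Vector3(0f, 0f, 0f));
        GL.Vertex(new Vector3(1f, 1f, 0f));
        GL.End();
    }
}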

Conclusion

For massive and dynamic line drawing, LineRenderer is not the best solution, but it is the only one available in the free version of Unity. It can surely be useful for drawing limited amounts of static lines, and this is probably what it was made for. If you do have Unity3D Pro, the solution with Graphics is reasonable and very flexible, but if it is performance you’re after, choose GL.

via: Did you enjoy this article? Then read the full version from the author’s website.

Games as Medicine | FDA Clearance Methods

 


Noah Falstein, @nfalstein
President, The Inspiracy
Neurogaming Consultant

Technically, software and games are cleared, not approved, by the FDA.

By background, Noah:

  • Has attended 31 GDCs
  • Has been working in games since 1980 (started in entertainment and arcade games with LucasArts)
  • Gradually shifted over and has consulted for 17 years on a wide variety of games
  • Started getting interested in medical games in 1991 (e.g. East3)
  • Went to Google and later left due to the platform-level perspective one had to take at Google
  • A game designer, not a doctor, but he voraciously learns about science and medical topics

Table of Contents:

  • Context of games for health
  • New factor of FDA clearance
  • Deeper dive
  • Advantages and disadvantages of clearance

Why are games and health an interesting thing?

Three reasons why games for health are growing quickly and are poised to be a very important thing

  • It’s about helping people (e.g. Dr. Sam Rodriguez’s work; Google “Rodriguez pain VR”)
  • It’s challenging, exciting, and more diverse than standard games (games need to be fun, but if they’re not having the desired effect, for example restoring motion after a stroke, then you encounter an interesting challenge). The people in the medical field tend to be more diverse than those in the gaming space.
  • It’s a huge market (FDA clearance = big market)

So what’s the catch?

Mis-steps along the way

  • Brain training (e.g. the Nintendo Game Boy had popular Japanese games claiming brain training)
  • Wii Fit (+U) (i.e. the balance board)
  • The Lumosity fine (claims were made that were unsubstantiated by research)

Upshot: a lack of research and good studies underpinning the claims

Some bright spots

  • Re-Mission from HopeLab (they targeted adherence by showing the consequences of not having enough chemotherapy in the body)

FDA clearance is a gold standard

  • Because it provides a stamp of quality and trustworthiness
  • The burden is on the people who make products to go through a regimen of tests that are science-driven
  • Noah strongly recommends game devs link up with a university
  • Working on SaMD – Software as a Medical Device
  • The biggest single world market (the US) drives others
  • Necessary for a prescription and helps with insurance reimbursement
  • But it’s very expensive and time-consuming


FDA definition of a serious disease
[missing]

MindMaze Pro

  • FDA clearance May 2017
  • Stroke Rehabilitation
  • Early in-hospital acute care while plasticity is high

Pear Therapeutics

  • Positions its product as a “prescription digital therapeutic”


Akili Interactive Labs

  • Treats pediatric ADHD
  • Late-stage trial results (Dec. 2017) were very positive, with side effects of headache and frustration, which is much better than alternatives like Ritalin
  • Seeking De Novo clearance
  • Adam Gazzaley – this began as aging-adult research with NeuroRacer, a multi-year study published in Nature

The Future – Good, Bad, Ugly, Sublime

  • Each successful FDA clearance helps
  • But they will still require big $ and years to develop
  • You have to create a company, rigorously study the game, stall production (because changing your game would invalidate the study results), and then release it
  • Pharma is a powerful but daunting partner

Questions

  • Can FDA certification for games then reveal that some games are essentially street drugs?

 

Snap Lens Studio

Hello World: Building Augmented Reality for Snapchat

Fun fact – 30,000+ lenses have been created by Snapchatters, leading to over a billion views of lens content

Table of Contents

  • Lens Studio
  • Hello World! Lens Studio Live Demo
  • High-Quality Rendering with Allegorithmic
  • Chester Fetch with Klei Entertainment
  • Cuphead and Mugman with Studio MDHR

Travis Chen

  • Worked at Bad Robot, Neversoft, and Blizzard

Lens Studio

  • Snapchat has always opened up to the camera, which has positively affected their engagement
  • Pair your phone with Lens Studio
  • A community forum on the site exists where devs q & a
  • Lens Studio has been out for less than 4 months as of today, and the lenses have resulted in over a billion experiences
  • The tool has been used for a variety of things: hamburger photogrammetry, full-screen 2D experiences, r/snaplenses
  • Distributing your lenses is really easy

  • Within Snap, you can discover a lens and see more lenses by the same creator, or you can pull up on the base of a story to find out which lens was used.

Lens Boost – all users see the Snapchat carousel; a Lens Boost gets your lens into that carousel

Find which template best fits your creative intent

Templates

  • Static object
  • Animated object
  • Interactive templates (tap, approach, look at)
  • Immersive (look around, window)
  • For 2D creators (cutout, picture frame, fullscreen, soundboard)
  • Interactive path (idle, walk, and arrival states necessary) coming soon

Examples

  • Brian Garcia, Neon Book
  • Pinot: 2D textures, the cutout template, then Character Animator to animate
  • DFace, DDog, imported into Lens Studio (from the camera reflections feature)
  • Jordan & Snapchat, the ’88 static Jordan 3D model
  • Netflix & Snapchat, Stranger Things – turning on the TV, spelling your name out, awakening the Demogorgon

Hello World

Lens Studio is made up of panels:

  • Live Preview, to see what the lens will be like; it includes tracked content and interaction support
  • Objects panel, like the Unity scene view; it shows you what is in the preview
  • Resources panel, all your resources and where you’d import stuff

Workflow

  • Start with the Animated Object template
  • Select an object in the Resources panel and move it to the Objects panel
  • Google Blocks + Mixamo: export free animations from Adobe’s Mixamo and import your animated character
  • File-import the monitor and astronaut
  • Parent the imported 3D model to the fox, then delete the fox
  • Add a shadow
  • Sprite

High-Quality Rendering

Substance Painter is an app to apply materials or paint textures for 2D or 3D.
Any material you bring in can be applied to an object. Materials apply uniformly, but there are
also smart materials, which apply intelligently to the geometry (e.g. the rust example).

The Layers tab is like the scene view: the place to drag and drop

Alphas provide a cutout; you can apply materials to the cutout

Upon clicking export, there’s a Lens Studio export preset.
Challenge: Rubber Ducky

Chester Fetch with Klei Entertainment

Games studio since 2005

Why is AR interesting for Klei?

  • AR is about bringing the virtual world out to the player.
  • Shareable
  • Limited bandwidth
  • Seems hard
  • Would require too much time from others at the Studio

Cuphead and Mugman with Studio MDHR

  • With Cuphead and Mugman, Studio MDHR wanted to build a boss battle as a Snap lens
  • All of the lenses used for Cuphead were built from assets taken directly from the game
  • Chains together 5 2D animations

 

Questions


  • Within the Snap app, I noticed you can rent/create a lens “as a service” – how does this pertain to Lens Studio?
  • A question I had: looking forward to a day when you can use targets like people for further interactable and shareable content, like the Mugman examples shown, when will person/object recognition be available to developers and users of Snap?
  • What is the github account for Snap?

Relationships Matter: Maximizing Retention in VR

 

Isabel Tewes
isabel@oculus.com

There are many ways to measure success, but coming from the mobile world (push notification
strategy, habit-forming retention mini-games, funnel analysis, making a real difference when multi-million-user bases exist), Isabel talked about retention today.

Retention defined

When someone loves your app and comes back to it time and time again.

Make a great first impression

  • pinpoint your magic
  • get to that moment quickly
  • guide people through their first experience

Share your personality

  • create a tone and stay consistent
  • rethink your interactions
  • identify the pain points
  • design against them / take advantage of them

Create a lasting connection

  • make the right decisions early

First Contact – Bernie Yee

He focused on how VR can be really overwhelming and how having someone acknowledge your actions can be really powerful.

The significance of the robot waving – the way the robot waves to you at the beginning of the experience draws on a universal sign. You know you’re supposed to wave back. The personality of your wave then comes out as well.

Wayfinding – helped guide users through the experience; the robot guides your
attention to where you should be going.

Nudge – Nudge your users patiently and with intent

Rick and Morty – Virtual Rick-ality

Establish a tone and be consistent

Against Gravity – Rec Room

Create a safe environment that people come back to
Minimizing trolling and harassment
“Whatever you are when your [organization] is small remember you’ll only be a larger version of that”

Making friends in Rec Room
Two people become friends in Rec Room by shaking hands.

High fiving in Rec Room

Upshot: Create your values early and stick to your values ruthlessly.

 

UX for Created Realities

Context: I’m listening to Joshua Walton at the #CreatingRealityHackathon at USC. His talk is called UX for Created Realities. I personally found the part on Microinteractions (Dan Saffer) to be interesting.

Brainstorming

  • A lot of group brainstorming returns shit ideas – worse than individual ideation
  • The caveat is that if you have the right structure for the brainstorming session it can be more
    beneficial
  • Key part: there are no bad ideas but also no good ideas. People want to be heard, so build
    on the ideas, go for quantity, respect everyone, and let everyone speak.

Tactics for brainstorming

  • Do your project in 1 hour
  • Work both top-down and bottom-up
  • Iterate without fear

Microinteractions (Dan Saffer) and Tips

  • Focus on dynamics that build on knowledge in the head (the Lab’s Longbow)
  • Think about sense ratios and focus (Superhot)
  • Sensing is a creative part of the design (some of the most innovative work creates a sense you
    didn’t know you needed)
  • Use sounds right away
  • Encourage people to look around
  • Consistent interactions are way more valuable than realistic interactions
  • content is king, context is scale
  • as long as there’s language we’ll have 2D
  • when you’re creating these new realities be a gracious host – learned from the hospitality industry
  • create consistent space from which to explore

SVVR #49 Summary Notes

SVVR Meetup #49

SVVR Passport – A membership program

This is SVVR’s new shared co-working space for demonstration

  • 24 hour access
  • Demo equipment and library
  • Digital benefits

SVVR VR Mixer 2018

  • March 21st, 2018

Lumus Optics

  • Israeli company doing reflective wave-guide optics whose mission is to be your
    leader in AR displays––more than 60 patents
  • Highest performance for smallest form factor
  • What is wave guide tech?

This boasts

  • Wide FOV (40˚ – 55˚)
  • Compact 1.7mm Form Factor
  • Life-like Image
  • True See-Through
  • Daylight Readable

Founded in 2000
Partnered with Quanta Computer (going to produce the optics engine),
Flex (OEM using Lumus reference), and Deepoptics (vergence accommodation)

They have debuted their new prototype at CES––looks like Dragon Ball equipment.

Developers

Will be able to deploy Vuforia applications or demos using other AR libraries via this.

siram@lumus-optical.com

High Fidelity – Philip Rosedale

  • Probably gonna need
    • Identity
    • IP rights
    • Proof-of-purchase
    • Payment system

Full Decentralization?

Transaction expenses are high (syncing, VR transactions need to happen quickly,
must pay gas)
Federated Consensus – near-zero transaction fees, etc.

Bad Monetary Policy

  • Bitcoin isn’t viable as a usable currency, because its price is going up so much and its
    circulation is fixed
  • Increase circulation as people join
  • Second Life – issued more money at roughly the same rate that people came online
    • Use a smart contract to create exchange-rate scaling

High Fidelity Coin (HFC)

  • Stable
  • Blockchain cryptocurrency
  • Easy to use, initial grants for proof-of-identity, and multiple currency exchange

Philip, waxing about tech, said something to the effect of “with any big shift in technology,
what often comes across as compelling is things that are currently done in the world, e.g. payment”

Philip also mentioned smart contracts allowing duels to occur for identity

Cymatic-Bruce

Currently at 6D.AI

Came back from Japan
Saw so many VR experiences, met a ton of the community, discovered VTubing (which
can also be found through IMVU), etc.

Learned a lot; PC VR isn’t a thing in Japan due to space and culture

PSVR location based VR situation

VR Zone Shinjuku – a world-class location-based experience. Tickets range from $10–15 per experience.
They take first-timers very seriously and, interestingly, take matters of safety really seriously.
Bruce really admired the standard: motion platforms for everything, everything tied into an IP, and
great execution. Looks like he did a DBZ experience and learned how to shoot a
Kamehameha. Mario Kart – on a motion platform, etc.

How long is the experience?
How much of what you saw in Japan will likely have an audience in EN?
How is FOVE doing?
VR experiences dealing with food?


Reflection: Bigscreen VR

Before I continue, I’ll take a step back and define any VR application that brings people together in an environment as “social VR”. What makes Bigscreen interesting is the paradox of choice. In other social applications (VRChat, AltspaceVR, High Fidelity) there are no core activities beyond being together, which makes the choice of what to do very broad. With Bigscreen it’s a display extension or a place to watch movies with a cinema experience. Simple.
..
..
Recently I saw that they enticed Paramount (?) studios to do a premiere of Top Gun in VR. I thought that was a nice blend of social VR and a social construct we know and love, going to the movies––though I didn’t attend the screening. I’ve used Bigscreen recently and the environments are nice, with physically based shadows and lighting. Bigscreen’s reliance on virtual displays makes it well-positioned to benefit from forthcoming improvements in display clarity.
One thing worth noting is that I haven’t been able to get audio to work for all people in a living-room setting. When putting your content on the big screen, audio seems to play only for you. This wasn’t the case in the movie theater, where a host had no problem playing Rogue One off of Netflix for all to enjoy with sound.
..
..
Finally, when Oculus Home released its Core 2.0 update, everyone on Rift gained the ability to see and use their desktop screen in VR. SteamVR also enables desktop viewing. Viveport? Although today Oculus Home doesn’t offer social the way Bigscreen does… this probably affects the uniqueness of their total product and must be considered for continued fundraising.