The Future of Augmented Reality Glasses

Meta

  • Five years ago, Meta announced plans to create AR glasses that blend the digital and physical worlds.
  • Orion, the latest AR glasses from Meta, aims to enhance presence, connectivity, and empowerment.
  • AR glasses offer unrestricted digital experiences with large holographic displays.
  • Contextual AI integration allows for proactive addressing of user needs.
  • Orion stands out for its lightweight design, indoor/outdoor versatility, and emphasis on interpersonal interactions.
  • Orion represents a significant advancement in AR glasses technology, combining wearability with advanced features.
  • The groundbreaking AR display in Orion offers immersive experiences and a wide field of view.
  • Orion’s unique design maintains a glasses-like appearance, allowing users to see others’ expressions.
  • Orion’s capabilities include smart assistant integration, hands-free communication, and immersive social experiences.
  • While not yet available to consumers, Orion serves as a polished product prototype for future AR glasses development.

OP: Introducing Orion, [Meta’s] First True Augmented Reality Glasses

Snap

Sophia Dominguez, Director of AR Platform at Snap, discussed Snap Spectacles and the company’s AR initiatives at Snap Lens Fest (original post here).

  • Snap Spectacles can be connected to a battery pack for extended use beyond the standard 45 minutes, with a focus on B2B interactions that directly engage consumers.
  • There is a push for Snap to collaborate with various businesses, such as those in location-based entertainment or museums, to expand the Snap Spectacles ecosystem.
  • Sophia Dominguez has been involved in AR for over a decade, starting with Google Glass, and now oversees developers and partners creating lenses on Snapchat.
  • Snap’s approach to AR emphasizes personal self-expression as a catalyst for AR lenses, transitioning to world-facing AR lenses like those in Snap Spectacles.
  • Snap’s long-term vision is to make AR ubiquitous and profitable for developers, aiming to integrate digital objects seamlessly into the real world.
  • Snap’s focus on consumer-level AR use cases includes self-expression as a core feature, offering a variety of options for users to engage with AR content.
  • Snap’s AR platform also caters to enterprise and B2B applications, collaborating with stadiums, museums, and other businesses for unique AR experiences beyond consumer-facing lenses.
  • Snap’s technology, like the Snapchat camera, is designed for venues to integrate into large screens or jumbotrons, focusing on consumers’ desire for virality and joy rather than just enterprise solutions.
  • The company aims to increase ubiquity by making lenses fun and approachable, partnering with entities like the Louvre to explore augmented reality possibilities in a consumer-friendly manner.
  • HTC Vive has delved into location-based entertainment more than Meta; Snap is prioritizing connected experiences, ensuring fast connectivity and optimizing for various use cases like museum activations.
  • Snap collaborates closely with developers, offering grants and support without strings attached to foster innovation in the augmented reality space, aiming to be the most developer-friendly platform globally.
  • Snap’s Spectacles have evolved over the years, from simple camera glasses to AR display developer kits, with the latest fifth generation focusing on wearability, developer excitement, and paving the way for consumer adoption.
  • The company has revamped Lens Studio to encompass mobile and Spectacles lenses, emphasizing ease of use and spatial experiences, aiming to create a seamless ecosystem for developers across different platforms.
  • Snap values feedback and collaboration with developers, striving to provide pathways for monetization and support for creators building on both mobile and Spectacles platforms.
  • Snap’s Spectacles offer a unique immersive experience, leveraging standalone capabilities and spatial interactions, aiming to enable emergent social dynamics and experiences not possible on other devices.
  • Developers are considering the length of time users spend on devices like Zoom calls or workouts, with a focus on creating a seamless experience for users on the go.
  • The new SnapOS manages a dual processing architecture for Spectacles, with Lens Studio being the primary pipeline for developers to create content for the device.
  • Snap is actively listening to developer feedback and working on enabling WebXR on Spectacles to support a variety of use cases and experiences.
  • The operating system for Spectacles includes features like connected lenses, hands and voice-based UI, and social elements out of the box to facilitate easier development.
  • The ultimate potential of spatial computing is envisioned as a way to break free from the limitations of screens, allowing for more natural interactions and connections in the real world.
  • Snap aims to empower developers to explore the possibilities of augmented reality and spatial computing, emphasizing ease of use and continuous improvement based on user feedback.

Traversal of Immersive Environments | HoloTile Floor from Disney

If you’re new to The Latent Element, I write about future market development narratives or things of interest to me, hence the name “latent” element. These views are not representative of any company nor do they contain privileged info.

More details and contact info are in the about section.

Post Details

Read time:

3–4 minutes

Table of Contents:

  1. The Challenge
  2. Early Solutions
  3. Disney Research HoloTile Floor
  4. Closing Thoughts
  5. Sources


How can mixed reality drive more engagement in movement and fitness?

Fitness is one of the most robust categories under discussion across augmented reality and virtual reality devices. What level of movement merits the moniker “fitness”? And on what timeline will we see sweeping adoption of fitness via spatial computing (the term now widely known thanks to Apple’s Vision Pro announcement, which folds VR / AR / MR under a single umbrella)?

I’m seeing new unlocks particularly as it relates to the comfort of the device, spatial awareness afforded due to camera passthrough, and greater respect for ergonomic polish among developers.

The video seen here is a clip taken November 8th, 2023, showing a first-person view of a Quest 3 experience that allows for gestures, hand tracking, and movement to be used as input to an increasing number of games.

The title is built by YUR, the app name is YUR World.

My Experience Working Out At-Home During the Global COVID-19 Outbreak

At first glance, this post might sound pedantic; for comprehensive info on the coronavirus, visit the WHO Q&As or the CDC. This post concerns immunological fitness, how the virus spreads, and my personal method of using virtual reality as an additional form of exercise:

The disease can spread from person to person through small droplets from the nose or mouth which are spread when a person with COVID-19 coughs or exhales. These droplets land on objects and surfaces around the person. Other people then catch COVID-19 by touching these objects or surfaces, then touching their eyes, nose or mouth


source: https://www.who.int/news-room/q-a-detail/q-a-coronaviruses

For the above reason, gyms and other typically crowded workout facilities are out. However, exercise is still a key part of staying healthy, more on this further down. I’ve been using an at-home workout strategy using virtual reality for over two weeks and I’d like to share why this is working for me.

TLDR

If you own a VR headset, some titles that could be used for cardio are:

- Beat Saber
- Box VR
- OhShape
- Thrill of the Fight
- Synth Riders
- Creed: Rise to Glory
- Until You Fall

Active titles that can be modified to be more of a workout:

- Rec Room
- RacketNX
- Pistol Whip
- Lone Echo
- Superhot VR

For general standing activity to afford you some low intensity movement:

- Racket Fury
- Sports Scramble
- VRChat

Virtual reality is a little-known option for fitness, but we now know at my company YUR that thousands of people use VR games daily to work out in a fun and efficient way. The big difference is that while wearing a VR headset you are completely immersed in the role of a player in a game. It’s worth noting that this trend toward immersive fitness is also visible with Peloton, Les Mills, and other fitness names.

YUR monthly view

My month so far has been characterized by daily workouts between 250 and 750 kcal, as you can see (every day except March 4th). I tend to use games such as Box VR or Beat Saber, and the cool part about YUR is that any game can be played and tracked, which allows for constant novelty the moment you grow bored with your current exercise regime. This doubles as a benefit if you are feeling cooped up at home.

… with YUR, any game can be played and tracked, which allows for constant novelty the moment you grow bored with your current exercise regime

I would characterize the kind of workouts I do in VR as plyometric and explosive in nature, similar to a HIIT workout. However, this is up to your personal preference.

As a perennial gym-goer, I have to point out what VR workouts are not providing me and others: the hypertrophic and strength benefits of lifting weights, cycling, rowing, calisthenics, and running.

So how does staying immunologically fit factor into this, as well as COVID-19? By working out at home (as long as I am the only one using my VR headset), I’m not posing a risk to others while still participating in a community.

To be immunologically fit, you need to be physically fit. “White blood cells can be quite sedentary,” says Akbar. “Exercise mobilises them by increasing your blood flow, so they can do their surveillance jobs and seek and destroy in other parts of the body.” The NHS says adults should be physically active in some way every day, and do at least 150 minutes a week of moderate aerobic activity (hiking, gardening, cycling) or 75 minutes of vigorous activity (running, swimming fast, an aerobics class).

source: https://www.theguardian.com/lifeandstyle/2020/mar/08/how-to-boost-your-immune-system-to-avoid-colds-and-coronavirus

So basically, in the middle of my day, between 1 pm and 6 pm, I throw my Oculus Quest on and work out for maybe half an hour or so. I hope this has been insightful, and if you have a VR headset, perhaps it can factor into your virus response.

This post initially appeared on my LinkedIn.

Oculus Connect 6 Takeaways

Ahead of Oculus Connect 6 (OC6), I attended the Oculus Launchpad and Start dinner tonight. I saw a ton of vibrant communication and hopes for the next few days. A good number of developers were internationally based, from places such as Canada and New Zealand. I noticed a pattern of developers holding full-time jobs all the while pursuing publishing an app to the Oculus Store.

RealityKit Motion Capture and Apple’s future iPhone including a time-of-flight camera

Apple analyst Ming-Chi Kuo claims in his latest report that two of the 2020 iPhones will feature a rear time-of-flight (ToF) 3D depth sensor for better augmented reality features and portrait shots, via MacRumors.

“It’s not the first we’ve heard of Apple considering a ToF camera for its 2020 phones, either. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone have existed since 2017. Other companies have beaten Apple to the punch here, with several phones on the market already featuring ToF cameras. But given the prevalence of Apple’s hardware and the impact it tends to have on the industry, it’s worth taking a look at what this camera technology is and how it works.

What is a ToF sensor, and how does it work?

Time-of-flight is a catch-all term for a type of technology that measures the time it takes for something (be it a laser, light, liquid, or gas particle) to travel a certain distance.

In the case of camera sensors, specifically, an infrared laser array is used to send out a laser pulse, which bounces off the objects in front of it and reflects back to the sensor. By calculating how long it takes that laser to travel to the object and back, you can calculate how far it is from the sensor (since the speed of light in a given medium is a constant). And by knowing how far all of the different objects in a room are, you can calculate a detailed 3D map of the room and all of the objects in it.

The technology is typically used in cameras for things like drones and self-driving cars (to prevent them from crashing into stuff), but recently, we’ve started seeing it pop up in phones as well.”
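The round-trip arithmetic described in the quote can be sketched in a few lines of Python. This is my own simplified illustration, not how a real ToF sensor pipeline is implemented (real sensors work with phase shifts or picosecond-scale timing across a whole pixel array), and the function name is made up for the example:

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object given the laser pulse's round-trip time.

    The pulse travels out to the object and back to the sensor, so the
    one-way distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 6.67 nanoseconds came from
# an object about one meter away.
distance_m = tof_distance(6.671e-9)
```

The timescales involved are the hard part in practice: resolving centimeter-level depth differences means distinguishing round-trip times tens of picoseconds apart, which is why these sensors need specialized hardware rather than an ordinary clock.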

The current state of ARKit 3 and an observation


ARKit 3 has an ever-increasing scope, and of particular interest to me are those AR features which under the hood rely upon machine learning, namely Motion Capture.

Today, ARKit 3 uses raycasting as well as ML-based plane detection when an app using ARKit 3 is first opened, in order to place the floor, for example.

Check the video below. In it, I’m standing in front of my phone which is propped up on a table.

In this video, I’m using motion capture via an iPhone XR. My phone is sitting on a surface (namely the table) that it has determined is the floor plane; as a result, you’ll notice that our avatar, once populated into the scene, has an incorrect notion of where the ground is.


The hope is that new ToF sensor technology will allow for a robust and complete understanding of the layout of objects in the room, such that, in the same context, the device can tell it is sitting on a table and that the floor is not that plane but a lower one further away in the real-world scene.
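One way to picture the fix: given a richer depth map, the device could consider all detected horizontal planes and choose the lowest one as the floor, rather than the plane it happens to be resting on. A toy sketch in Python — this is my own illustration of the heuristic, not ARKit’s actual API, and the heights are hypothetical values in meters relative to the device:

```python
def pick_floor_plane(plane_heights: list[float]) -> float:
    """Pick the floor from candidate horizontal planes.

    Each value is a plane's height relative to the device (negative
    means below it). A phone sitting on a table sees the tabletop
    just beneath itself, so the floor should be the *lowest* plane,
    not the nearest one.
    """
    if not plane_heights:
        raise ValueError("no horizontal planes detected")
    return min(plane_heights)

# Tabletop just under the phone at -0.02 m; actual floor at -0.75 m.
floor_height = pick_floor_plane([-0.02, -0.75])
```

With only the single plane under the phone detected (as in the video above), this heuristic has nothing better to choose — which is exactly where a ToF depth map, seeing the whole room, would help.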

 

Source:
The Verge, “Apple’s future iPhone might add a time-of-flight camera — here’s what it could do”

AR Industrial Applications: Defense Engineering

What is this? I chatted with Evan, an Operations Modeling and Simulation Engineer at Northrop Grumman, about engineering use cases for the HoloLens.

His opening remarks: It’s often a struggle integrating new technology into large-scale manufacturers due to adherence to strict methods and processes. Finding/molding problems into good use cases for a given new technology can be challenging. It’s much easier to start with the problem and find/mold a good solution than the other way around. The challenge is helping engineers and operations leadership understand what modern solutions exist.

 

---

Evan’s Take: In the context of engineering, showing the HoloLens’s capabilities in relation to the stages of the DoD acquisition lifecycle for a product might be a high-value strategy.

Technology Maturation and Risk Reduction (TMRR): How do we design a product that fulfills mission requirements? This can take the form of:

  • visualizing the designs, making sure they’re feasible (e.g. are wires getting pinched?), and uncovering early the design flaws that would otherwise surface as defects during manufacturing. Making sure the design is producible (DFM, Design for Manufacturability).
  • communicating to the customer: In that stage of the lifecycle it’s important to be able to communicate your designs to the customer to demonstrate technical maturity.
    • inspecting the product: a part can be called out by name (“this part of the product is called XYZ”) and then shown in an exploded view.

 

Engineering and Manufacturing Development (EMD): At this stage the customer (NG) cares about “how are we going to build it?”

  • tooling design: visualizing the product sitting in the tools or workstands that will be used in production
  • visualizing the ergonomics people are going to have to deal with, for example whether the clearances are sufficient to screw in a screw
  • visualizing the factory flow: the customer (NG’s customer) would also be interested in seeing the proposed factory flow to build confidence. It’s becoming more common to see this as a line item in contracts (Contract Data Requirements List, or CDRL)

Subsequent steps in Production & Deployment are:

  • Low rate initial production (LRIP)
  • Full rate production (FRP)

 

  • Who the customer is: Mechanics on the factory floor using HoloLens for work instructions. Evan saw a lot of interest at Raytheon and NG in virtual work instructions overlaid onto the hardware (Google Glass, Light Guide Systems, etc.). In a more mature program that’s in production, the mechanic or electrician on the factory floor would be the end user. Today, they look away from the product to a computer where work instructions are pulled up. Their instructions might be several feet away from the work; hopefully they’ve interpreted the instructions well so they don’t cause a defect. Operators work from memory, or don’t follow work instructions at all if it’s too cumbersome to do so. DCMA (the customer’s oversight) issues corrective action requests (CARs) to the contractor when operators don’t appear to be following work instructions (i.e. the page they’re on doesn’t match the step they’re currently working on, or worse, they don’t have the instructions pulled up). Getting too many of these is really bad. So AR is really useful when it overlays instructions on the product as it’s built. Care should be given to the manufacturing engineer’s workflow for creating and approving work instructions, work instruction revisions, etc. Long-term, consideration probably needs to be given to integration with the manufacturing execution system (MES) and possibly many other systems (ERP, PLM, etc.).

The HoloLens tech is seemingly a ways away from that: seamlessly identifying the hardware regardless of physical position/orientation, as well as making it easy for manufacturing engineers to author compliant work instructions.

Another consideration, for any of the above use cases in the defense industry, is wireless connectivity. Most facilities will not accommodate devices that transmit or receive signals over any form of wireless. And for the last use case, tethering a mechanic to a wired AR device is inhibiting.

 

Games as Medicine | FDA Clearance Methods

 


Noah Falstein, @nfalstein
President, The Inspiracy
Neurogaming Consultant

Technically software and games are cleared and not approved by the FDA.

By background, Noah:

  • Has attended 31 GDCs
  • Been working in games since 1980 (started in entertainment and arcade games, later with Lucasfilm Games)
  • Gradually shifted over and consulted for 17 years on a wide variety of games
  • Started getting interested in medical games in 1991 (i.e. East3)
  • Went to Google and later left due to the platform-level perspective one had to have there
  • A game designer, not a doctor, but he voraciously learns about science and medical topics

Table of Contents:

  • Context of games for health
  • New factor of FDA clearance
  • Deeper dive
  • Advantages and disadvantages of clearance

Why are games and health an interesting thing?

Three reasons why games for health are growing quickly and are poised to be very important:

  • It’s about helping people (e.g. Dr. Sam Rodriguez’s work; Google “Rodriguez pain VR”)
  • It’s challenging, exciting, and more diverse than standard games (i.e. games need to be fun, but if they’re not having the desired effect, for example restoring motion after a stroke, then you encounter an interesting challenge). The people in the medical field tend to be more diverse than those in the gaming space.
  • It’s a huge market (FDA clearance = big market)

So what’s the catch?

Mis-steps along the way

  • Brain training (i.e. the Nintendo Game Boy had popular Japanese games claiming brain-training benefits)
  • Wii Fit (+U) (i.e. the balance board)
  • Lumosity fine (the FTC fined Lumosity over claims that were unsubstantiated by research)

upshot: lack of research and good studies underpinning claims

Some bright spots

  • Re-Mission from HopeLab (it targeted adherence by showing the consequences of not having enough chemotherapy in the body)

FDA clearance is a gold standard

  • Because it provides a stamp of good, trustable, etc.
  • The burden is on the people who make products to go through a regimen of tests that are science-driven
  • Noah strongly recommends Game Devs to link up with a university
  • Working on SaMD – Software as a Med Device
  • Biggest single world market drives others
  • Necessary for a prescription and helps with insurance reimbursement
  • but it’s very expensive and time-consuming


FDA definition of a serious disease
[missing]

MindMaze MindMotion Pro

  • FDA clearance May 2017
  • Stroke Rehabilitation
  • Early in-hospital acute care while plasticity is high

Pear Therapeutics

  • Positions its product as a “prescription digital therapeutic”


Akili Interactive Labs

  • Treats pediatric ADHD
  • Late-stage trial results (Dec. 2017) were very positive, with side effects of headache and frustration, which is much better than alternatives like Ritalin
  • Seeking De Novo clearance
  • Adam Gazzaley – his work began as aging-adult research with NeuroRacer, a multi-year study published in Nature

The Future – Good, Bad, Ugly, Sublime

  • Each successful FDA clearance helps
  • But they still will require big $, years to dev
  • you have to create a company, rigorously study the product, stall production (because changing your game would invalidate the study results), and then release it
  • Pharma is a powerful but daunting partner

Questions

  • Can FDA certification for games then reveal that some games are essentially street drugs?