RealityKit Motion Capture and Apple’s future iPhone with a time-of-flight camera

Apple analyst Ming-Chi Kuo claims in his latest report that two of the 2020 iPhones will feature a rear time-of-flight (ToF) 3D depth sensor for better augmented reality features and portrait shots, via MacRumors.

“It’s not the first time we’ve heard of Apple considering a ToF camera for its 2020 phones, either. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone have existed since 2017. Other companies have beaten Apple to the punch here, with several phones on the market already featuring ToF cameras. But given the prevalence of Apple’s hardware and the impact it tends to have on the industry, it’s worth taking a look at what this camera technology is and how it works.

What is a ToF sensor, and how does it work?

Time-of-flight is a catch-all term for a type of technology that measures the time it takes for something (be it a laser, light, liquid, or gas particle) to travel a certain distance.

In the case of camera sensors, specifically, an infrared laser array is used to send out a laser pulse, which bounces off the objects in front of it and reflects back to the sensor. By calculating how long it takes that laser to travel to the object and back, you can calculate how far it is from the sensor (since the speed of light in a given medium is a constant). And by knowing how far all of the different objects in a room are, you can calculate a detailed 3D map of the room and all of the objects in it.
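The round-trip arithmetic described above can be sketched in a few lines of Python (a toy illustration of the principle, not anything resembling Apple's implementation; the function name and example timing are mine):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the total path the light covered.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after ~6.67 nanoseconds bounced off something
# roughly one meter away.
print(distance_from_round_trip(6.67e-9))  # ≈ 1.0 meters
```

Doing this for every pixel of an infrared sensor yields a per-pixel depth map, which is the "detailed 3D map of the room" the article describes.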

The technology is typically used in cameras for things like drones and self-driving cars (to prevent them from crashing into stuff), but recently, we’ve started seeing it pop up in phones as well.”

The current state of ARKit 3 and an observation


ARKit 3 has an ever-increasing scope, and of particular interest to me are those AR features that rely on machine learning under the hood, namely Motion Capture.

Today, ARKit 3 uses raycasting as well as ML-based plane detection at launch (when the app using ARKit 3 is initially opened) in order to place the floor, for example.

Check the video below. In it, I’m standing in front of my phone which is propped up on a table.

In this video, I’m using motion capture via an iPhone XR. My phone is sitting on a surface (namely the table) that it has determined is the floor plane, and as a result, you’ll notice that our avatar, once populated into the scene, has an incorrect notion of where the ground is.


The hope is that new ToF sensor technology will allow for a robust and complete understanding of the layout of the room, its objects, and the floor, such that in the same scenario the device can tell that it is sitting on a table, and that the floor is not that plane but a lower one farther away in the real-world scene before it.
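To picture the improvement concretely: with a full depth map, the device could see every horizontal plane in view (the tabletop and the true floor) and choose the lowest as the ground, rather than assuming the plane it rests on is the floor. Here is a hypothetical sketch of that heuristic; the `HorizontalPlane` type and `pick_floor` function are illustrative inventions, not ARKit API:

```python
from dataclasses import dataclass

@dataclass
class HorizontalPlane:
    name: str
    height_m: float  # height relative to the device, in meters

def pick_floor(planes: list[HorizontalPlane]) -> HorizontalPlane:
    """Choose the lowest detected horizontal plane as the floor."""
    return min(planes, key=lambda p: p.height_m)

# The device sits on a table: the tabletop plane is at the device's
# own height (0.0), while the true floor is ~0.75 m below it.
planes = [HorizontalPlane("tabletop", 0.0), HorizontalPlane("floor", -0.75)]
print(pick_floor(planes).name)  # floor
```

With only camera-based plane detection, the tabletop may be the only plane the device ever finds, which is exactly the failure mode in the video above.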

 

Source:
The Verge, “Apple’s future iPhone might add a time-of-flight camera — here’s what it could do”

Reblog: Suffering-oriented programming

Suffering-oriented programming can be summarized like so: don’t build technology unless you feel the pain of not having it. It applies to the big, architectural decisions as well as the smaller everyday programming decisions. Suffering-oriented programming greatly reduces risk by ensuring that you’re always working on something important, and it ensures that you are well-versed in a problem space before attempting a large investment.

[Nathan Marz has] a mantra for suffering-oriented programming: “First make it possible. Then make it beautiful. Then make it fast.”

via Read the full version from the author’s website.

Reblog: Adventure at the 5th Oculus Connect Conference


The following is a write-up from a friend, Kathryn Hicks, on the Danse blog. The link to the original is at the bottom.

Last week I attended the 5th Oculus Connect Conference held at the San Jose McEnery Convention Center. This two-day conference, held annually in the fall, showcases the new virtual reality technology from Oculus. It was my second time attending, and it felt even better than the last one.

During the keynote address, Zuckerberg announced a wireless headset that doesn’t need a cell phone or an external computer. The Quest, a standalone headset with six degrees of freedom and Touch controllers, is a potential game-changer for the VR industry. If you are familiar with the Rift and the Oculus Go, the Quest is a marriage of the two. The Quest is scheduled to come out this spring at $399, and a lot of the Rift titles will be available on it. While I unfortunately was not able to try it, the feedback I heard from others was positive. The tetherless aspect of the headset creates a more immersive experience and doesn’t feel confining. While the graphics capabilities of the headset are not as high as the Rift’s, they are good enough and don’t hinder the experience. Plus, the optics as well as the sound have improved from the Oculus Go. On the downside, the Quest is reportedly top-heavy and a denser headset than the Go, which I already find more substantial than the lightweight Rift. Since the Quest’s four inside-out cameras all face forward, if you move the controllers behind you, you could potentially lose tracking. Hopefully, they will make these adjustments before it launches in the spring and add tracking on the strap. I can see much potential for the Quest in eSports, education, business, medicine, engineering, set design; the list goes on. The possibilities are endless, and at this price point it could substantially increase the number of VR users, considering that the Quest will cost about the same as most gaming consoles without the need for a television or home setup.

Walking around the conference was lovely; I felt like a kid in a candy store seeing people putting their full bodies into the Quest. The well-orchestrated design layouts and themes of the different experiences were terrific. It was a pleasure hearing eSports commentary and cheers as competitors went head to head playing Echo Arena and Onward. Seeing the VR community connect, share laughs, smile, and have a good time warmed my heart. I enjoyed watching people play the Dead & Buried Quest experience in a large arena and seeing their digital avatars battle each other on screen. I can see more VR arenas being built specifically for the Quest, kind of like skate parks or soccer parks, but with a sports-stadium vibe.

While I was at the conference, I tried a few experiences like The Void – Star Wars: Secrets of the Empire, a full-sensory VR experience. You are an undercover Rebel fighter disguised as a Stormtrooper, and as a user you get to fully interact with your teammates and feel and smell the environment around you. It was a fantastic experience, and I would encourage others to try it at one of the nine locations.

Another experience I tried was Wolves in the Walls, a VR adaptation of Neil Gaiman’s book created by the company Fable. The audience explores parts of Lucy’s house to try and find hidden wolves in the walls. It was a more intimate experience, and Lucy’s performance felt pretty lifelike. The environments and character designs were beautifully portrayed. Overall, it was an enjoyable VR experience.

I also played a multiplayer combat experience called Conjure Strike by The Strike Team. It’s an engaging multiplayer experience in which you can play as different rock-like characters with different classes, such as Elementalist, Mage Hunter, Earth Warden, and more. The multiplayer session I played was similar to a capture-the-flag game: one player has to push a box toward the other side while the opposing player tries to stop them. It was a fun experience, similar to Overwatch but in VR. The multiplayer mechanics were excellent, but some of the controls felt foreign to me. Overall, it’s an engaging game that seems like it would be popular amongst most VR users.

While I didn’t get to play as many demos as I would have liked, I enjoyed the ones I experienced, especially The Void. It was the most immersive experience I tried; the few things I would change are updating the headset and enhancing the outside temperature and wind effects.

I’m looking forward to more development put toward the Quest, and I’m optimistic about the future of VR. As a team member at The Danse, I am excited to work on projects utilizing immersive technology such as virtual and augmented reality, and to work in an industry that is ever-changing and improving. It’s nice coming back to the Oculus Connect Conference and seeing the community excited about the future of VR.

Acceleration and Motion Sickness in the Context of Virtual Reality (VR)

As I traveled around the world with the HTC Vive and Oculus Rift, universally first-timers would be fascinated, but a bit woozy after trying VR. What contributes to this? One possibility is the vergence-accommodation issue with current displays. However, the subject of this post is locomotion and the anatomical reasoning behind the discomfort arising from poorly designed VR.

With VR you typically occupy a larger virtual space than that of your immediate physical surroundings.

So, to help you traverse, locomotion was designed: in other words, a way of sending you from point A to point B in the virtual space. Here’s what this looks like:


Caption: This guy is switching his virtual location by pointing a laser from the tip of his controller to move around.

Movement with changing velocity through a virtual environment can contribute to this overall feeling of being in a daze.

That’s why most creators smooth transitions and avoid this kind of motion, using techniques like blink teleportation or constant-velocity movement (as in Land’s End). Notice how the movement seems steady and controlled below?

[GIF: steady, constant-velocity movement in Land’s End]

Acceleration and Velocity

“‘Acceleration’ is, put simply, any kind of change of speed measured over time, generally [written] as m·s^-2 (meters per second, per second) if it’s linear, or rad·s^-2 (the same, but with an angle) if it’s around an axis. Any type of continuous change in the speed of an object will induce a non-zero acceleration.”
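As a quick numeric illustration of that definition (my own example, not from the quoted article):

```python
def acceleration(delta_speed: float, delta_time: float) -> float:
    """Average acceleration: change in speed over elapsed time.

    Works for linear speed (m/s -> m/s^2) and for angular
    speed (rad/s -> rad/s^2) alike.
    """
    return delta_speed / delta_time

# A virtual camera that goes from 0 to 3 m/s in half a second
# subjects the viewer's eyes to 6 m/s^2 of visual acceleration...
print(acceleration(3.0, 0.5))  # 6.0 (m/s^2)

# ...while a constant-velocity glide, as in Land's End, has none.
print(acceleration(0.0, 0.5))  # 0.0 (m/s^2)
```

This is the quantity the vestibular system (described next) expects to feel but doesn't, which is why the change in speed, not the speed itself, is what matters for comfort.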

The Human Vestibular System

When you change speed, your vestibular system should register an acceleration. The vestibular system is part of your inner ear. It’s basically the thing that tells your brain if your head is up or down, and permits you to stand and walk without falling all the time!

Internal ear diagram showing the semicircular canals where the acceleration forces are sensed.

Fluid moving in your semicircular canals is measured, and the information is communicated to your brain by the cranial nerves. You can think of this as similar to how an accelerometer and a gyroscope work.

“[This] acceleration not only includes linear acceleration (from translation in 3D space) but also rotation, which induces angular acceleration; empirically, that seems to be the worst kind when it comes to VR sickness…”

Now that you have this grounding in our anatomical system for perceiving acceleration, the upshot is that viewers in VR often experience movement visually but not via these semicircular canals. It’s this incongruence that drives VR sickness with current systems.

Some keywords to explore more if you’re interested in the papers available are: Vection, Galvanic Vestibular Stimulation (GVS), and Self-motion.

via Read more on the ways developers reduce discomfort from the author’s website.

Reblog: The Mind-Expanding Ideas of Andy Clark

The idea of the extended mind or extended cognition is not part of common parlance; however, many of us have espoused this idea naturally since our youth. It’s the concept that we use external (physical or digital) information to extend our knowledge and thinking processes.

Today’s “born-digital” kids––the first generation to grow up with the Internet, born 1990 and later––store their thoughts, education, and self-dialogue in external notes saved to the cloud. [1]

“… [Andy Clark describes us as] cyborgs, in the most natural way. Without the stimulus of the world, an infant could not learn to hear or see, and a brain develops and rewires itself in response to its environment throughout its life.”

via Read the full version from the author’s website.

[1] Jane McGonigal, Reality Is Broken, p. 127.

Games as Medicine | FDA Clearance Methods

 


Noah Falstein, @nfalstein
President, The Inspiracy
Neurogaming Consultant

Technically, software and games are cleared, not approved, by the FDA.

By background, Noah:

  • Has attended 31 GDCs
  • Been working in games since 1980 (started in entertainment and arcade games with Lucasfilm Games)
  • Gradually shifted over and consulted for 17 years on a wide variety of games
  • Started getting interested in medical games in 1991 (i.e. East3)
  • Went to Google and left due to platform perspective one had to have at Google
  • Game designer not a doctor, but voraciously learns about science and medical topics

Table of Contents:

  • Context of games for health
  • New factor of FDA clearance
  • Deeper dive
  • Advantages and disadvantages of clearance

Why are games and health an interesting thing?

Three reasons why games for health are growing quickly and are poised to be a very important thing

  • It’s about helping people (e.g., Dr. Sam Rodriguez’s work; Google “Rodriguez pain VR”)
  • It’s challenging, exciting, and more diverse than standard games (i.e. games need to be fun, but if they’re not having the desired effect, for example restoring motion after a stroke, then you encounter an interesting challenge). The people in the medical field tend to be more diverse than those in the gaming space.
  • It’s a huge market (FDA clearance = a big market)

So what’s the catch?

Missteps along the way

  • Brain Training (i.e. Nintendo Gameboy had popular Japanese games claiming brain training)
  • Wii Fit (+U) (i.e. the balance board)
  • Lumosity’s FTC fine (claims made were unsubstantiated by research)

Upshot: a lack of research and good studies underpinning the claims

Some bright spots

  • Re-Mission from HopeLab (targeted adherence by showing the consequences of not having enough chemotherapy in the body)

FDA clearance is a gold standard

  • Because it provides a stamp of good, trustable, etc.
  • The burden is on the people who make products to go through a regimen of tests that are science-driven
  • Noah strongly recommends Game Devs to link up with a university
  • Working on SaMD – Software as a Medical Device
  • Biggest single world market drives others
  • Necessary for a prescription and helps with insurance reimbursement
  • but it’s very expensive and time-consuming


FDA definition of a serious disease
[missing]

MindMaze MindMotion Pro

  • FDA clearance May 2017
  • Stroke Rehabilitation
  • Early in-hospital acute care while plasticity high

Pear Therapeutics

  • Positions its product as a “prescription digital therapeutic”


Akili Interactive Labs

  • Treats pediatric ADHD
  • Late-stage trial results (Dec. 2017) were very positive, with side effects of headache and frustration, which is much better than alternatives like Ritalin
  • Seeking De Novo clearance
  • Adam Gazzaley – began as aging-adult research with NeuroRacer, a multi-year study published in Nature

The Future – Good, Bad, Ugly, Sublime

  • Each successful FDA clearance helps
  • But they still will require big $, years to dev
  • You have to create a company, rigorously study the game, stall production (because changing your game
    would invalidate the study results), and then release it
  • Pharma is a powerful but daunting partner

Questions

  • Can FDA certification for games then reveal that some games are essentially street drugs?

 

SVVR #49 Summary Notes

SVVR Meetup #49

SVVR Passport – A membership program

This is SVVR’s new shared co-working space for demonstrations

  • 24 hour access
  • Demo equipment and library
  • Digital benefits

SVVR VR Mixer 2018

  • March 21st, 2018

Lumus Optics

  • Israeli company doing reflective waveguide optics whose mission is to be the
    leader in AR displays––more than 60 patents
  • Highest performance for the smallest form factor
  • What is waveguide tech?

The technology boasts

  • Wide FOV (40˚ – 55˚)
  • Compact 1.7mm Form Factor
  • Life-like Image
  • True See-Through
  • Daylight Readable

Founded in 2000.
Partnered with Quanta Computer (which will produce the optics engine),
Flex (an OEM using the Lumus reference design), and DeepOptics (vergence accommodation).

They debuted their new prototype at CES––it looks like Dragon Ball equipment.

Developers

They will be able to deploy Vuforia applications, or demos using other AR libraries, on this hardware.

siram@lumus-optical.com

High Fidelity – Philip Rosedale

  • Probably gonna need
    • Identity
    • IP rights
    • Proof-of-purchase
    • Payment system

Full Decentralization?

Transaction expenses are high (syncing; VR transactions need to happen quickly;
you must pay gas)
Federated Consensus – near-zero transaction fees, etc.

Bad Monetary Policy

  • Bitcoin isn’t viable as a usable currency, because its price is rising so much
    and its circulation is fixed.
  • Increase circulation as people join
  • Second Life – minted more money at roughly the same rate that people came online
    • Use a smart contract to create an exchange rate scaling
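The issuance idea in the notes above (grow circulation as people join, so per-user purchasing power stays roughly stable) can be sketched as a toy policy. The numbers and function names here are illustrative inventions, not High Fidelity's actual contract logic:

```python
def target_supply(num_users: int, coins_per_user: float = 100.0) -> float:
    """Grow the currency supply in proportion to the user base,
    so per-user purchasing power stays roughly stable."""
    return num_users * coins_per_user

def coins_to_mint(current_supply: float, num_users: int) -> float:
    """Coins to issue (never burn) to reach the target supply."""
    return max(0.0, target_supply(num_users) - current_supply)

# 1,500 users with a 100,000-coin supply calls for minting 50,000 more.
print(coins_to_mint(100_000.0, 1_500))  # 50000.0
```

Contrast this with Bitcoin's fixed circulation: a supply that tracks adoption avoids the deflationary pressure the notes call "bad monetary policy."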

High Fidelity Coin (HFC)

  • Stable
  • Blockchain cryptocurrency
  • Easy to use, initial grants for proof-of-identity, and multiple currency exchange

Philip, waxing about tech, said something to the effect of: “with any big shift in technology,
what often proves compelling is doing things that are already done in the world today, e.g. payments.”

Philip also mentioned smart contracts allowing duels to occur over identity.

Cymatic-Bruce

Currently at 6D.AI

Came back from Japan. Saw so many VR experiences, met a ton of the community, and
discovered VTubing, which can also be found through IMVU, etc.

Learned a lot; PC VR isn’t a thing in Japan due to space and culture.

PSVR location based VR situation

VR Zone Shinjuku––a world-class location-based experience. Tickets range from $10–15 per experience.
They take first-timers very seriously and, interestingly, take matters of safety really seriously.
Bruce really admired their standard of using motion platforms for everything, tying everything into an
IP, and executing it well. It looks like he did a DBZ experience and learned how to shoot a
Kamehameha. Mario Kart––on a motion platform, etc.

How long is the experience?
How much of what you saw in Japan will likely have an audience in EN?
How is FOVE doing?
VR experiences dealing with food?


Reblog: Player – Game – Designer

The above work comes from Thomas Bedenk, whom I met at VRX London in 2016. See the end of his page for sources (link found at the bottom).

This model takes an interactive application, namely a game, together with its production and consumption, and highlights how the components Player, Game, and Designer fit into the full picture.

Read the full version from the author’s website.