Useful Resources for AI

Newsletters/blogs:
– TLDR AI (https://tldr.tech/ai) – Andrew Tan
– Ben’s Bites (https://lnkd.in/gNY8Dmme)
– The Information *paid subscription required (https://lnkd.in/gbkaFbvf)
– Last week in AI (https://lastweekin.ai/)
– Eric Newcomer (https://www.newcomer.co/)

Podcasts:
– No Priors with Sarah Guo + Elad Gil (https://lnkd.in/g7Wmr6XT)
– All-in podcast – not AI specific but they talk a lot about it (https://lnkd.in/gH35UeUy)
– Lex Fridman (https://lnkd.in/gjw7zsWX)

Online courses:
– DeepLearning.AI by Andrew Ng (https://lnkd.in/gWcn5UTK)

Institutional VC writing: 
– Sequoia (https://lnkd.in/g-cKpn8Y)
– A16z (https://lnkd.in/g6JxqwZA)
– Lightspeed (https://lnkd.in/gczzdEcd)
– Bessemer (https://www.bvp.com/ai)
– Radical Ventures (https://lnkd.in/guCe5Mnt); Rob Toews (https://lnkd.in/ggH8HfT8) and Ryan Shannon (https://lnkd.in/gRrBzePx)
– Madrona (https://lnkd.in/gy5D8yNG)

Industry Conferences:
– Databricks Data + AI Summit (https://lnkd.in/gF5QyXYv)
– Snowflake (https://lnkd.in/gavqzw65)
– Salesforce Dreamforce (https://lnkd.in/gJk4r58N)

Academic Conferences:
– NeurIPS (https://neurips.cc/)
– CVPR (https://cvpr.thecvf.com/)
– ICML (https://icml.cc/)
– ICLR (https://iclr.cc/)

Books:
– Genius Makers, by Cade Metz (https://lnkd.in/gr_78MB9)
– A Brief History of Intelligence, by Max Bennett (https://lnkd.in/g2uCrPzS)
– The Worlds I See, by Fei-Fei Li (https://lnkd.in/gY8Qsvis)
– Chip War, by Chris Miller (https://lnkd.in/g6ZAZSCG)

The original author of this post was Kelvin Mu on LinkedIn.

Unification of Meta Platforms: Exploring Account Center’s Role

2–3 minutes
  1. Context: Meta’s diverse set of platforms has a singular center for account management, called Account Center.
  2. Why does this matter?: Areas I mean to explore further
  3. Taxonomy of Account Center
  4. Conclusion

Meta offers a unified account management center for Meta Horizon, Instagram, and Facebook.

Meta Account Center

Context:

Meta’s diverse set of platforms has a singular center for account management, called Account Center.

A unified account and identity system is crucial. We all increasingly sync data from one application to another.

However, I have questions about how identity, security, single sign-on, and seamless connected experiences will be handled in the future. These questions will be explored across multiple posts.

Account Center currently covers accounts on Instagram, Facebook (Big Blue), and Meta Horizon. Given the metaverse Meta is building towards, it is in the company's interest to set a high bar for how the unification of platforms will work.

Why does this matter?: Areas I mean to explore further

1. Account Center works across platforms like IG, Horizon, and Facebook. But why aren't other Meta-owned platforms (i.e. WhatsApp) included yet? When, and in what form, will Meta expand this into a standard we can all tap into?

2. Potential challenges and benefits of identity management in the metaverse and where turn-key solutions can exist.

3. Real-life scenarios and/or case studies to illustrate the impact of the Account Center on user experience.

Taxonomy of Account Center

The Account Center delivers its services through several key components:

Profiles: Centralized management of user profiles across different platforms.

Connected Experiences: Seamless integration and data synchronization between applications.

Password & Security: Enhanced security measures and password management.

Personal Details: Management of personal information.

Your Information and Permissions: Control over data sharing and permissions.

Ad Preferences: Customization of ad settings and preferences.

Meta Pay: Unified payment system across Meta’s platforms.

Meta Verified: Verification service for enhanced account credibility.

Accounts: Overall management of linked accounts.

Click here for the Meta Accounts Center Taxonomy (Rolling Updates)

Conclusion

Meta’s Account Center and related solutions are a significant step towards unifying the user experience. Much as Zuckerberg’s post evangelizing open-source models set a tone for the industry, Meta can set a high standard for account management in the emerging metaverse, expanding Account Center’s scope and raising the bar for other identity and unification efforts.

These future posts will delve into the specifics of these areas. They will provide a comprehensive analysis of the Account Center’s role and potential in the evolving digital landscape.

Please let me know if there are specific details or topics you would like to explore further.

Reblog: To Understand Heart Disease, You Need To Understand This.

Heart disease does not kill people. Heart attacks do.

Appreciating this distinction is critical to understanding heart disease.

Heart disease is the presence of plaque or atherosclerosis in the coronary arteries.

(Heart disease can, of course, refer to many other heart-related conditions, but in general the terms heart disease, coronary artery disease, and atherosclerosis are used interchangeably.)

A heart attack occurs when plaque in the artery ruptures, forming a clot that blocks blood flow down the artery, causing the downstream heart muscle to die.

via Pocket. Read the full version on the author’s website.

Traversal of Immersive Environments | HoloTile Floor from Disney

If you’re new to The Latent Element, I write about future market development narratives or things of interest to me, hence the name “latent” element. These views are not representative of any company nor do they contain privileged info.

More details and contact info are in the about section.

Post Details

Read time:

3–4 minutes

Table of Contents:

  1. The Challenge
  2. Early Solutions
  3. Disney Research HoloTile Floor
  4. Closing Thoughts
  5. Sources

Subscribe to get access

Read more of this content when you subscribe today.

Emergent abilities in LLMs

Are Emergent Abilities of Large Language Models a Mirage?
Authored by Rylan Schaeffer, Brando Miranda, and Sanmi Koyejo
Computer Science, Stanford University

https://arxiv.org/pdf/2304.15004.pdf

This work challenges the notion of emergent abilities in large language models, suggesting that these abilities are not inherent to the model’s scale but rather a result of the choice of metrics used in research. Emergent abilities are defined as new capabilities that appear abruptly and unpredictably as the model scales up. The authors propose that when a specific task and model family are analyzed with fixed model outputs, the appearance of emergent abilities is influenced by the type of metric chosen: nonlinear or discontinuous metrics tend to show emergent abilities, whereas linear or continuous metrics show smooth, predictable changes in performance.

To support this hypothesis, the authors present a simple mathematical model and conduct three types of analyses:

  1. Examining the effect of metric choice on the InstructGPT/GPT-3 family in tasks where emergent abilities were previously claimed.
  2. Performing a meta-analysis on the BIG-Bench project to test predictions about metric choices in relation to emergent abilities.
  3. Demonstrating how metric selection can create the illusion of emergent abilities in various vision tasks across different deep networks.

Their findings suggest that what has been perceived as emergent abilities could be an artifact of certain metrics or insufficient statistical analysis, implying that these abilities might not be a fundamental aspect of scaling AI models.
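The paper's core argument can be illustrated with a toy model (the numbers below are my own illustrative choices, not the authors'): if per-token accuracy improves smoothly and predictably with scale, a nonlinear metric such as exact match over a long sequence will still appear to jump abruptly, producing an "emergent" curve from entirely smooth underlying behavior.

```python
import math

def per_token_accuracy(n_params, c=1e3, alpha=0.45):
    """Hypothetical smooth power-law: per-token accuracy rises gradually with scale."""
    return max(0.0, 1.0 - c * n_params ** -alpha)

def exact_match(n_params, seq_len=20):
    """Nonlinear metric: the answer counts only if all seq_len tokens are correct."""
    return per_token_accuracy(n_params) ** seq_len

# The linear-ish metric changes smoothly; the nonlinear one looks like a sudden jump.
for n in [1e7, 1e8, 1e9, 1e10, 1e11]:
    print(f"{n:.0e} params  per-token={per_token_accuracy(n):.3f}  exact-match={exact_match(n):.3f}")
```

Run it and the per-token column climbs steadily while the exact-match column sits near zero before rising sharply in the last few rows, which is exactly the mirage the authors describe.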

Emergent abilities of large language models are created by the researcher’s chosen metrics, not unpredictable changes in model behavior with scale.

The term “emergent abilities of LLMs” was recently and crisply defined as “abilities that are not present in smaller-scale models but are present in large-scale models; thus they cannot be predicted by simply extrapolating the performance improvements on smaller-scale models”. Such emergent abilities were first discovered in the GPT-3 family. Subsequent work emphasized the discovery, writing that “[although model] performance is predictable at a general level, performance on a specific task can sometimes emerge quite unpredictably and abruptly at scale”.


Refuge

With violent conflicts all over the world, in Palestine, Gaza, Ukraine, Myanmar, Chad, and many other places, the word refuge, or sanctuary, was at the center of Jack Kornfield’s Monday Dharma talk tonight.


What is your refuge? … What helps you forgive? … Take refuge in trust.

Jack Kornfield


Jack shared that the wisest of us look at the roots of suffering and the causes of conflict (for example, he mentioned cyclical, complex trauma). When we haven’t unearthed the axioms of a situation, uncertainty and curiosity are valuable guiding traits. One way to hold this idea: take a ‘position’, not a ‘side’.

The Dharma talk then naturally turned to the crucial role of community in today’s tumultuous times, especially in fostering peace and support: somewhere you can feel safe.

In a grounded sense, this takes much less than you might think. A community can be just two people. I’ve seen that sending a simple text message saying “I’m sending you a big hug” to one of my really traumatized and grief-stricken Palestinian friends went a long way. In a sense, perhaps a litmus test of a community is a place where you are seen and heard by another person compassionately.

One poignant observation that was shared, particularly in the context of the conflict in Palestine, spotlighted the hurdle one faction faces in empathizing with the other, especially when their minds are clouded with their own distress. I think this is of critical importance for us to understand. When suffering, we all need to recognize that unconsciously we are likely to be less present to the experience of others.

Join a Monday Night Dharma Talk & Meditation with Jack Kornfield [Click here].

“Spirit Rock’s Monday Night Dharma Talk and Meditation program is open to all and meets every Monday night from 7:15 – 9:15 p.m. Jack Kornfield began this weekly practice and gathering over 36 years ago to introduce the practices of awareness and compassion.” 

Spirit Rock

“Don’t let the behavior of others destroy your inner peace.” – Dalai Lama

This entire post is scoped to the primary definition of refuge, a physical space. However, there are also virtual and psychological kinds of refuge. The concept is helpful even to those without physical conflict, perhaps those who are burned out, lonely, or struggling otherwise.

RealityKit Motion Capture and Apple’s future iPhone including a time-of-flight camera

Apple analyst Ming-Chi Kuo claims in his latest report that two of the 2020 iPhones will feature a rear time-of-flight (ToF) 3D depth sensor for better augmented reality features and portrait shots, via MacRumors.

“It’s not the first we’ve heard of Apple considering a ToF camera for its 2020 phones, either. Bloomberg reported a similar rumor back in January, and reports of a 3D camera system for the iPhone have existed since 2017. Other companies have beaten Apple to the punch here, with several phones on the market already featuring ToF cameras. But given the prevalence of Apple’s hardware and the impact it tends to have on the industry, it’s worth taking a look at what this camera technology is and how it works.

What is a ToF sensor, and how does it work?

Time-of-flight is a catch-all term for a type of technology that measures the time it takes for something (be it a laser, light, liquid, or gas particle) to travel a certain distance.

In the case of camera sensors, specifically, an infrared laser array is used to send out a laser pulse, which bounces off the objects in front of it and reflects back to the sensor. By calculating how long it takes that laser to travel to the object and back, you can calculate how far it is from the sensor (since the speed of light in a given medium is a constant). And by knowing how far all of the different objects in a room are, you can calculate a detailed 3D map of the room and all of the objects in it.

The technology is typically used in cameras for things like drones and self-driving cars (to prevent them from crashing into stuff), but recently, we’ve started seeing it pop up in phones as well.”
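The distance calculation the quote describes is simple enough to sketch. This is a minimal illustration of the principle, not Apple's implementation:

```python
# Minimal sketch of the time-of-flight distance calculation described above.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in a vacuum

def tof_distance(round_trip_seconds):
    """Distance to an object from a laser pulse's measured round-trip time.

    Divide by 2 because the pulse travels to the object and back.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds implies an object about 1 m away.
print(tof_distance(6.67e-9))
```

A real sensor runs this per pixel across an infrared laser array, which is how it builds the 3D map of the room.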

The current state of ARKit 3 and an observation


ARKit 3 has an ever-increasing scope, and of particular interest to me are those AR features which under the hood rely upon machine learning, namely Motion Capture.

Today, ARKit 3 uses raycasting as well as ML-based plane detection when an app using ARKit 3 is first opened, in order to place the floor, for example.

Check the video below. In it, I’m standing in front of my phone which is propped up on a table.

In this video, I’m using motion capture on an iPhone XR. The phone is sitting on a surface (the table) that it has determined is the floor plane; as a result, you’ll notice that our avatar, once placed into the scene, has an incorrect notion of where the ground is.


The hope is that new ToF sensor technology will allow for a robust and complete understanding of the layout of objects in the room, such that, in the same situation, the device can tell it is sitting on a table and that the floor is not that plane but the one further away in the real-world scene before it.

 

Source:
The Verge, “Apple’s future iPhone might add a time-of-flight camera — here’s what it could do”

Reblog: Suffering-oriented programming

Suffering-oriented programming can be summarized like so: don’t build technology unless you feel the pain of not having it. It applies to the big, architectural decisions as well as the smaller everyday programming decisions. Suffering-oriented programming greatly reduces risk by ensuring that you’re always working on something important, and it ensures that you are well-versed in a problem space before attempting a large investment.

[Nathan Marz has] a mantra for suffering-oriented programming: “First make it possible. Then make it beautiful. Then make it fast.”

Read the full version on the author’s website.

Reblog: Adventure at the 5th Oculus Connect Conference


The following is a write-up from a friend, Kathryn Hicks, on the Danse blog. The link to the original is at the bottom.

Last week I attended the 5th Oculus Connect Conference held at the San Jose McEnery Convention Center. This two-day conference is held annually during the fall, which showcases the new virtual reality technology from Oculus. It was my second time attending, and it felt even better than the last one.

During the keynote address, Zuckerberg announced a wireless headset that doesn’t need a cell phone or an external computer: the Quest, a standalone headset with six degrees of freedom and touch controllers, and a potential game-changer for the VR industry. If you are familiar with the Rift and the Oculus Go, the Quest is a marriage of the two. It is scheduled to come out this spring at $399, and a lot of Rift titles will be available on it. While I was unfortunately not able to try it, the feedback I heard from others was positive. The tetherless aspect of the headset creates a more immersive experience and doesn’t feel confining. While its graphics capabilities are not as high as the Rift’s, they are good enough and don’t hinder the experience. Plus, the optics as well as the sound have improved over the Oculus Go. On the downside, the Quest is reportedly top-heavy and denser than the Go, and I already find the Go more substantial than the lightweight Rift. Since the Quest’s four inside-out cameras face forward, if you move the controllers behind you, you could potentially lose tracking. Hopefully they will make these adjustments before it launches in the spring, perhaps by adding tracking on the strap. I can see much potential for the Quest in eSports, education, business, medicine, engineering, set design; the list goes on. The possibilities are endless, and at this price point it could substantially increase the number of VR users, considering that the Quest will cost about the same as most gaming consoles without needing a television or home setup.

Walking around the conference was lovely, I felt like a kid in a candy store seeing people putting their full body into the Quest. The well-orchestrated design layouts and theme of the different experiences were terrific. It was a pleasure hearing eSports commentary and cheers as competitors go head to head playing Echo Arena and Onward. Seeing the VR community connect, share laughs, smile, and have a good time, warmed my heart. I enjoyed watching people play the Dead & Buried Quest experience in a large arena and seeing their digital avatars battle each other on screen. I can see more VR arenas being built specifically for the Quest, kind of like skate parks, or soccer parks, but with a sports stadium vibe.

While I was at the conference, I tried a few experiences like The Void – Star Wars Secrets of the Empire, which is a full sensory VR experience. You are an undercover Rebel fighter disguised as a Stormtrooper, as a user you get to interact with your teammates fully, feel, and smell the environment around you. It was a fantastic experience, and I would encourage others to try it at one of the nine locations.

Another experience I tried was the Wolves in the Walls a VR adaptation of Neil Gaiman’s book and created by the company Fable. The audience explores parts of Lucy’s house to try and find hidden wolves in the walls. It was a more intimate experience, and Lucy’s performance felt pretty lifelike. The environments and character designs were beautifully portrayed. Overall it was an enjoyable VR experience.

I also played a multiplayer combat experience called Conjure Strike by The Strike Team. It’s an engaging multiplayer experience in which you can play as different rock-like characters with different classes, like an Elementalist, Mage Hunter, Earth Warden, and more. The session I played was similar to capture the flag: one team has to push a box toward the other side while the opposing team tries to stop them. It was a fun experience, similar to Overwatch but in VR. The multiplayer mechanics were excellent, but some of the controls felt foreign to me. Overall it’s an engaging game that seems like it would be popular amongst most VR users.

While I didn’t get to play as many demos as I would have liked, I enjoyed the ones I experienced, especially The Void. It was the most immersive experience I tried; the few things I would change are updating the headset and enhancing the outside temperature and wind effects.

I’m looking forward to more development put towards the Quest, and I’m optimistic about the future of VR. As a team member at The Danse, I am excited to work on projects utilizing immersive technology such as virtual and augmented reality, and to work in an industry that is ever-changing and improving. It’s nice coming back to the Oculus Connect Conference and seeing the community excited about the future of VR.

Acceleration and Motion Sickness in the Context of Virtual Reality (VR)

As I traveled around the world with the HTC Vive and Oculus Rift, universally first-timers would be fascinated, but a bit woozy after trying VR. What contributes to this? One possibility is the vergence-accommodation issue with current displays. However, the subject of this post is locomotion and the anatomical reasoning behind the discomfort arising from poorly designed VR.

With VR you typically occupy a larger virtual space than that of your immediate physical surroundings.

So, to help you traverse, locomotion, in other words a way of sending you from point A to point B in the virtual space, was designed. Here’s what this looks like:


Caption: This user is switching his virtual location by pointing a laser from the tip of his controller to move around.

Movement with changing velocity through a virtual environment can contribute to this overall feeling of being in a daze.

That’s why most creators smooth transitions and avoid this kind of motion (e.g. blink teleport, or the constant-velocity movement in Land’s End). Notice how the movement seems steady and controlled below?


Acceleration and Velocity

‘Acceleration’ is, put simply, any kind of change of speed measured over time, generally written as m·s⁻² (meters per second, per second) if it’s linear, or rad·s⁻² (the same, but around an axis) if it’s rotational. Any continuous change in the speed of an object will induce a non-zero acceleration.
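As a rough sketch of why this definition matters for comfort (the frame rate and speeds below are my own illustrative numbers, not from the quoted source): sampling a virtual camera's speed each frame lets you estimate the acceleration the vestibular system would expect to feel, and comfortable locomotion schemes keep that estimate near zero.

```python
def accelerations(speeds, dt):
    """Finite-difference estimate of acceleration (m/s^2) between consecutive frames."""
    return [(b - a) / dt for a, b in zip(speeds, speeds[1:])]

dt = 1 / 90  # a typical 90 Hz VR frame time

smooth = [3.0, 3.0, 3.0, 3.0, 3.0]    # constant-velocity glide (Land's End style)
ramping = [0.0, 1.0, 2.0, 3.0, 3.0]   # speeding up over a few frames

print(accelerations(smooth, dt))   # all zeros: no visual/vestibular conflict expected
print(accelerations(ramping, dt))  # ~90 m/s^2 spikes: likely to induce discomfort
```

Blink teleportation sidesteps the problem entirely by never presenting intermediate frames, so there is no visual acceleration to conflict with the stationary inner ear.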

The Human Vestibular System

When you change speed, your vestibular system should register an acceleration. The vestibular system is part of your inner ear. It’s basically the thing that tells your brain if your head is up or down, and permit[s] you to [stand] and walk without falling all the time!

Inner ear diagram showing the semicircular canals, where acceleration forces are sensed.

Fluid moving in your semicircular canals is measured and the information is communicated to your brain by the cranial nerves. You can think of this as [similar to how] an accelerometer and a gyroscope works.

[This] acceleration includes not only linear acceleration (from translation in 3D space) but also rotational acceleration, which induces angular acceleration, and empirically it seems to be the worst kind in the matter of VR sickness.

Now that you have this grounding in our anatomical system for perceiving acceleration, the upshot is that viewers in VR will often experience movement visually but not via these semicircular canals. It’s this incongruence that drives VR sickness with current systems.

Some keywords to explore more if you’re interested in the papers available are: Vection, Galvanic Vestibular Stimulation (GVS), and Self-motion.

Read more on the ways developers reduce discomfort on the author’s website.