
© 2026 Mike Laufbahn

Meta Ray-Ban Display glasses with Neural Band

AI for Meta Ray-Ban Display

About

First consumer smart glasses with an integrated display and multimodal AI. Cameras, microphones, speakers, and compute in an all-day wearable form factor.

Role

Design Lead, AI Experiences. Collaborated on the interaction model (invoke, listen, think, respond) and owned the visual response patterns across surfaces, including mobile.

Context

Meta Reality Labs, Consumer AI & Wearables. Shipped September 2025. Key constraints: all-day wearability, additive display (black = transparent), progressive disclosure on a minimal viewport.

Meta Ray-Ban Display is the first pair of smart glasses with a full-color, high-resolution display powered by Meta AI. The display is there when you want it and gone when you don't — designed to keep you tuned in to the world, not distracted from it.

01

The Design Challenge

How does AI show up on your face? A heads-up display isn't a phone. You can't read paragraphs in peripheral vision. Every element has to earn its pixels. We needed an interaction model built for glanceability, not density.

02

Invoke, Listen, Think, Respond

A four-stage lifecycle structures every AI interaction. Invoke is always deliberate (voice or gesture via the Neural Band) so the system never interrupts unprompted. Listen confirms you're being heard. Think communicates processing without inducing anxiety. Respond delivers the answer, optimized for a single glance.
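The four-stage lifecycle can be sketched as a small state machine. This is an illustrative model only — the stage names come from the text above, but the event names and transition table are my assumptions, not the shipped implementation:

```python
from enum import Enum, auto

class Stage(Enum):
    IDLE = auto()     # display dormant; system never acts unprompted
    LISTEN = auto()   # confirms you're being heard
    THINK = auto()    # communicates processing without inducing anxiety
    RESPOND = auto()  # answer optimized for a single glance

# Allowed transitions. Note the only way out of IDLE is a deliberate
# user invoke (voice or Neural Band gesture) — the system cannot
# start a turn on its own. Event names are hypothetical.
TRANSITIONS = {
    Stage.IDLE:    {"invoke": Stage.LISTEN},
    Stage.LISTEN:  {"utterance_end": Stage.THINK, "cancel": Stage.IDLE},
    Stage.THINK:   {"answer_ready": Stage.RESPOND, "cancel": Stage.IDLE},
    Stage.RESPOND: {"dismiss": Stage.IDLE, "timeout": Stage.IDLE},
}

def step(stage: Stage, event: str) -> Stage:
    """Advance the lifecycle; unknown events leave the stage unchanged."""
    return TRANSITIONS[stage].get(event, stage)
```

Encoding the model this way makes the core guarantee checkable: no event other than `invoke` can move the system out of `IDLE`.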

03

Constraints as Design Material

Three physical constraints shaped every decision. All-day wear required the glasses to be light and socially unobtrusive. Additive rendering means black pixels are transparent, so you see through them; this inverts traditional dark-mode thinking entirely. The small viewport forced progressive disclosure: glance for the headline, attend for detail, hand off to your phone for depth.
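The dark-mode inversion can be made concrete with a toy legibility check. On an additive display a pixel is visible only if it emits light, so legibility depends on the pixel's own luminance rather than contrast against a background. The luminance weights are the standard sRGB coefficients; the threshold value is purely illustrative, not a shipped number:

```python
def relative_luminance(r: float, g: float, b: float) -> float:
    """Linear approximation of sRGB relative luminance, channels in 0..1."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def is_visible(rgb: tuple, min_luminance: float = 0.4) -> bool:
    """On an additive display, black (0, 0, 0) emits nothing and reads as
    transparent. Only sufficiently bright pixels register at all.
    The 0.4 threshold is an illustrative assumption."""
    return relative_luminance(*rgb) >= min_luminance
```

Dark-mode habits break immediately: a "dark gray panel" that looks tasteful on a phone simply disappears, while white text floats legibly over the world.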

Close-up of Meta Orion AR glasses prototype showing waveguide optics

04

In-Situ Studies & Testing

Lab prototypes only tell you so much. We ran in-situ studies with people wearing prototypes in real environments: walking city streets, cooking, navigating transit. Context changes everything. A notification received while walking feels different from one received while sitting. Bright sunlight washes out colors. These insights directly shaped contrast ratios, type sizes, and timing curves.

05

Three Tiers of Attention

A progressive disclosure model across three tiers of attention. Glance: the headline, a caller name, a one-line AI response. Attend: full messages, step-by-step instructions, navigation turns. Handoff: the glasses pass context seamlessly to your phone. The display adds value without demanding it.
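One way to think about the tier model is as a router that sends each piece of content to the cheapest attention tier that can carry it. The tier names come from the text above; the line-count thresholds and the `needs_input` flag are illustrative assumptions:

```python
from enum import Enum

class Tier(Enum):
    GLANCE = "glance"    # headline, caller name, one-line AI response
    ATTEND = "attend"    # full messages, step-by-step instructions, nav turns
    HANDOFF = "handoff"  # pass context to the phone for depth

def tier_for(content_lines: int, needs_input: bool) -> Tier:
    """Route content to the cheapest attention tier that can carry it.
    Thresholds are illustrative, not shipped values."""
    if needs_input or content_lines > 6:
        return Tier.HANDOFF   # too deep for the glasses; hand off
    if content_lines > 1:
        return Tier.ATTEND    # worth a sustained look
    return Tier.GLANCE        # one line, one glance
```

The design principle is the default direction of the cascade: everything starts at glance, and content escalates only when it genuinely needs more attention.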

06

See, Hear, Gesture

The glasses capture what you see and hear for contextual AI. Point at a landmark and ask what it is. Look at a foreign-language menu and get a translation. The Neural Band enables gesture navigation (scrolling, selecting, dismissing) without speaking or reaching for your phone.
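Gesture navigation reduces to a small mapping from recognized gestures to UI actions. The gesture names below are hypothetical placeholders, not Meta's documented gesture set; the sketch only shows the routing shape:

```python
from typing import Optional

# Hypothetical gesture names — illustrative only, not the shipped set.
GESTURE_ACTIONS = {
    "pinch": "select",
    "thumb_swipe_up": "scroll_up",
    "thumb_swipe_down": "scroll_down",
    "palm_close": "dismiss",
}

def handle_gesture(gesture: str) -> Optional[str]:
    """Map a recognized EMG gesture to a UI action; ignore unknowns
    rather than guessing, since a false positive on the wrist is worse
    than a missed input."""
    return GESTURE_ACTIONS.get(gesture)
```

The interesting design choice is the fallback: unrecognized gestures do nothing, which keeps the band's error mode invisible instead of disruptive.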

Meta Neural Band EMG wristband hand gestures
