Meta Ray-Ban Display glasses with Neural Band

AI for Meta Ray-Ban Display

About

Meta's first consumer smart glasses with an integrated display and multimodal AI. Combines cameras, microphones, speakers, and compute in a stylish all-day wearable.

Role

Design Lead, AI Experiences. Defined how AI manifests on display glasses, including the full invoke → listen → think → respond lifecycle. Owned the interaction model and visual language.

Context

Consumer AI & Wearables at Meta Reality Labs. Shipped September 2025. Constraints: all-day wear, additive display rendering, progressive disclosure on a minimal viewport.

Meta Ray-Ban Display is the first pair of smart glasses with a full-color, high-resolution display powered by Meta AI. The display is there when you want it and gone when you don't — designed to keep you tuned in to the world, not distracted from it.

01

The Design Challenge

How does AI show up on your face? A heads-up display isn't a phone — you can't read paragraphs in your peripheral vision. Every piece of information has to earn its pixels. We needed a new interaction model built for glanceability, not density.

02

Invoke, Listen, Think, Respond

We designed a four-stage AI lifecycle. Invoke is a deliberate trigger — voice or gesture via the Neural Band — so the system never interrupts unprompted. Listen provides visual confirmation that you're being heard. Think communicates processing without anxiety. Respond delivers the answer, optimized for a quick glance.
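
A minimal sketch of how this lifecycle could be modeled as a state machine. The four state names come from the design above; the event types and transition logic are illustrative assumptions, not the shipped implementation.

```typescript
// Hypothetical model of the four-stage lifecycle described above.
// State names come from the design; types and transitions are illustrative.
type AssistantState = "idle" | "invoke" | "listen" | "think" | "respond";

type AssistantEvent =
  | { kind: "trigger"; source: "voice" | "neuralBandGesture" } // deliberate invoke only
  | { kind: "speechCaptured" }
  | { kind: "answerReady"; headline: string }
  | { kind: "dismissed" };

// The only path out of idle is an explicit trigger, so the
// "never interrupts unprompted" rule is enforced structurally.
function next(state: AssistantState, event: AssistantEvent): AssistantState {
  switch (state) {
    case "idle":
      return event.kind === "trigger" ? "invoke" : "idle";
    case "invoke":
      return "listen"; // show visual confirmation that you're being heard
    case "listen":
      return event.kind === "speechCaptured" ? "think" : "listen";
    case "think":
      return event.kind === "answerReady" ? "respond" : "think";
    case "respond":
      return event.kind === "dismissed" ? "idle" : "respond";
  }
}
```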

03

Constraints as Design Material

Three physical constraints shaped every decision. All-day wear meant the glasses had to be light and socially acceptable. Additive rendering meant black pixels read as transparent (you see through them), so we designed bright elements on transparency, inverting traditional dark-mode thinking. The small viewport forced progressive disclosure: glance for the headline, attend for detail, hand off to your phone for the full picture.
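
A small illustration of what additive rendering implies in practice: since black pixels emit no light and read as transparent, UI colors must stay above a luminance floor. The helper names and threshold below are hypothetical, not the shipped values.

```typescript
// Illustrative palette guard for an additive display, where black (0,0,0)
// renders as fully transparent. The threshold here is a hypothetical value.
interface RGB { r: number; g: number; b: number } // components in [0, 255]

// Relative luminance approximation (Rec. 709 coefficients).
function luminance({ r, g, b }: RGB): number {
  return (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255;
}

// On an additive display, darker pixels emit less light and fade into
// the world behind the lens, so UI colors must stay above a floor.
function isGlanceable(color: RGB, minLuminance = 0.4): boolean {
  return luminance(color) >= minLuminance;
}

isGlanceable({ r: 0, g: 0, b: 0 });       // false: pure black is invisible
isGlanceable({ r: 255, g: 255, b: 255 }); // true: white reads clearly
```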

Meta Ray-Ban Display product shot

04

In-Situ Studies & Testing

Lab prototypes only tell you so much. We ran extensive in-situ studies, with people wearing prototypes in real environments: walking city streets, cooking dinner, navigating transit. These studies revealed how context changes everything. Reading a notification while walking feels different from reading it while seated. Bright sunlight washes out certain colors. These insights drove our contrast ratios, type sizes, and timing curves.

05

Three Tiers of Attention

We designed a progressive disclosure system. A quick glance gives you the headline: a caller name, a one-line AI response. Sustained attention reveals more: full messages, step-by-step instructions, navigation turns. And if you need the full picture, the glasses seamlessly hand off to your phone. This glance → attend → handoff model ensures the display adds value without demanding attention.
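
One way to express the three tiers is as a single content model rendered at increasing levels of detail. The field names and handoff mechanism below are assumptions for illustration.

```typescript
// Hypothetical content model for the glance → attend → handoff tiers.
type AttentionTier = "glance" | "attend" | "handoff";

interface TieredContent {
  headline: string;        // glance: a caller name, a one-line answer
  detail?: string[];       // attend: full message, step-by-step instructions
  phoneDeepLink?: string;  // handoff: open the full picture on the phone
}

function render(content: TieredContent, tier: AttentionTier): string[] {
  switch (tier) {
    case "glance":
      return [content.headline];
    case "attend":
      return [content.headline, ...(content.detail ?? [])];
    case "handoff":
      // The glasses surface only a pointer; the phone carries the density.
      return [content.headline, content.phoneDeepLink ?? "Continue on phone"];
  }
}
```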

06

See, Hear, Gesture

The glasses capture what you see and hear, enabling truly contextual AI. Point at a landmark and ask what it is. Look at a menu in a foreign language and get a translation. With the Neural Band, you navigate AI responses through subtle hand gestures (scrolling, selecting, dismissing) without speaking or reaching for your phone.
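
A sketch of how gesture-driven navigation could be wired up. The gesture names and their mappings are illustrative assumptions; the actual Neural Band gesture set is not specified here.

```typescript
// Illustrative mapping from Neural Band gestures to response navigation.
type Gesture = "swipe" | "pinch" | "doublePinch";

type NavAction = "scroll" | "select" | "dismiss";

const gestureMap: Record<Gesture, NavAction> = {
  swipe: "scroll",        // move through a multi-part response
  pinch: "select",        // confirm the highlighted option
  doublePinch: "dismiss", // clear the display without speaking
};

function handleGesture(g: Gesture): NavAction {
  return gestureMap[g];
}
```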

Ray-Ban Wayfarer form factor