Glass 2
Research edition
Open-source visual intelligence platform. Shipping Q1 2026.
Monochrome Display
Green dot-matrix waveguide projection. Crisp and legible at a glance. No clutter.
Open Hardware
Published schematics. Documented optics. Forkable designs. Inspect it, modify it, manufacture it.
Privacy-First
Local processing prioritized. You control your data. Transparent architecture. No corporate harvesting.
Multi-Model AI
Run any model, local or cloud. OpenRouter integration. No vendor lock-in. Your intelligence stack, your choice.
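Because OpenRouter exposes an OpenAI-compatible chat endpoint, swapping models is just a string change. The Glass 2 SDK is not yet published, so the sketch below only shows the OpenRouter side, using the standard library; the model name and API key are placeholders:

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat request for any OpenRouter-hosted model."""
    payload = {
        "model": model,  # e.g. "meta-llama/llama-3-8b-instruct" -- placeholder
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Pointing the same payload at a local OpenAI-compatible server instead of OpenRouter only changes the URL, which is what "no vendor lock-in" means in practice.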
Sub-50g Frame
All-day wearability. Comfortable enough to forget you're wearing a research platform.
Audio + Visual
Glass 1's audio intelligence plus visual overlays. Text, icons, status, navigation in your peripheral vision.
For Developers
Build AR overlays that label tools on a workbench. Ship navigation agents. Create real-time translation interfaces. Test on open hardware you can inspect and modify.
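An overlay like the workbench example reduces to a set of text labels anchored at display coordinates. The SDK is not yet released, so the following is a hypothetical sketch of that data shape, with invented names (`Label`, `Overlay`, `to_frame`), not the actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Label:
    """A single text callout anchored at display coordinates (hypothetical)."""
    text: str
    x: int  # column on the dot-matrix display
    y: int  # row on the dot-matrix display

@dataclass
class Overlay:
    """A named set of labels, e.g. tools detected on a workbench (hypothetical)."""
    name: str
    labels: list = field(default_factory=list)

    def add_label(self, text: str, x: int, y: int) -> None:
        self.labels.append(Label(text, x, y))

    def to_frame(self) -> dict:
        """Serialize to the kind of frame a display renderer might consume."""
        return {
            "overlay": self.name,
            "labels": [{"text": l.text, "pos": [l.x, l.y]} for l in self.labels],
        }
```

A monochrome dot-matrix display has no color channel to model, which keeps the frame format this small.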
For Researchers
Run controlled experiments on human-AI interaction. Study how visual cues affect attention, task performance, memory. Access the full stack: optics, firmware, models, data pipelines.
For Open-Source Contributors
Fork the platform. Contribute drivers, UX experiments, agents. Share under open licenses. Shape the roadmap.
For the Privacy-Conscious
Local processing prioritized. Transparent data flows. User-controlled storage. Your data stays yours.

Roadmap

Now
SDK development
Privacy architecture
Early partner outreach
Q4 2025
Developer beta units
Public SDK release
Sample application library
Research cohort recruitment
Q1 2026
Public shipping begins
Community registry launch
First hackathon
Research publications
