According to multiple media outlets, including MacRumors, Apple is accelerating development of its AI smart glasses, codenamed "N50," aiming to challenge Meta's Ray-Ban smart glasses. The device will deeply integrate Apple Intelligence and rely on hand gestures as its core interaction model. On the hardware side, the glasses are said to carry two built-in cameras: a high-resolution lens for photos and video, and a low-resolution wide-angle lens dedicated to recognizing hand gestures and feeding real-time visual input to Siri.

To keep the design slim and lightweight, Apple has reportedly omitted 3D cameras, LiDAR sensors, and an integrated display from the first generation. The screenless design significantly reduces power consumption, allowing a small battery to support all-day wear. On the technology side, Apple is adapting the mature gesture-navigation system from the Vision Pro and combining it with several recent patents related to "spatial shopping" and "visual guidance" to build a closed-loop interaction system that relies on AI vision rather than touch controls.
Analysts note that the glasses will serve as an AI companion to the iPhone, using intelligent routing to offload perception tasks to the phone for backend processing. Tim Cook has made the project a strategic priority, viewing it as a crucial step in advancing "visual intelligence." Apple is currently testing lightweight frame materials such as acetate, and the product is expected to launch as early as the 2026 Christmas holiday season, though it could slip to 2027. As AI wearables enter a period of explosive growth, Apple's move marks a further expansion of its ecosystem from screens to spatial perception.
