NextFin News - Samsung Electronics is preparing to challenge the dominance of Meta and Apple in the burgeoning wearables market with a new pair of AI-powered smart glasses featuring a camera positioned at "eye level." In an interview with CNBC following the conclusion of Mobile World Congress (MWC) 2026 in Barcelona, Jay Kim, Executive Vice President of Samsung’s mobile business, detailed a vision for a device that prioritizes utility and information density over traditional screen-based interaction. The move marks Samsung’s most significant foray into head-worn hardware since its early experiments with mobile VR, signaling a strategic pivot toward "ambient computing" where the smartphone acts as the engine for a hands-free experience.
The centerpiece of the new device is its integrated camera system, which Kim noted is specifically designed to capture and interpret the world from the user’s perspective. By placing the lens at eye level, Samsung aims to bridge the gap between human perception and machine intelligence. This hardware choice is not merely for photography; the camera is the primary sensor for a suite of AI services that Kim says will provide users with "a lot of information" in real time. This "back and forth" interaction between the glasses and the user’s Galaxy device points to a symbiotic relationship in which the glasses serve as the eyes and the phone provides the processing muscle.
Samsung’s approach appears to be a calculated middle ground between the high-end, display-heavy Apple Vision Pro and the more minimalist, camera-centric Ray-Ban Meta glasses. Notably, Kim declined to confirm whether the glasses would feature a built-in display. He argued that if a user requires a high-resolution screen, they already possess one in their pocket or on their wrist. This suggests Samsung may be leaning toward a lighter, more socially acceptable form factor that relies on audio cues or perhaps a very discreet heads-up display, rather than the bulky "goggles" that have hindered mass-market adoption of augmented reality to date.
The competitive landscape for this device is fierce. Meta has already proven there is a market for camera-equipped frames that look like standard eyewear, while Google and Qualcomm have been working closely with Samsung on the underlying software and chipset architecture for this specific category. By emphasizing the "eye-level" perspective, Samsung is betting that the value of the device lies in its ability to see what the user sees, allowing AI to offer contextual help—such as identifying objects, translating text, or providing navigation—without the friction of holding up a smartphone.
Financially, the stakes are high for the South Korean giant. As the global smartphone market matures and growth plateaus, hardware manufacturers are searching for a "second screen" or a replacement device that can drive ecosystem loyalty. Samsung’s integration of these glasses into the broader Galaxy AI ecosystem is a clear attempt to lock users into its hardware stack. If the glasses become the primary interface for AI interactions, the smartphone recedes into the role of a background processor, fundamentally shifting how consumers interact with digital services.
The success of Samsung’s vision will ultimately depend on privacy and battery life, two hurdles that have tripped up previous attempts at smart eyewear. A camera at eye level raises immediate social concerns, an issue Samsung will need to address through visible recording indicators or strict data-handling policies. However, by focusing on utility and the "information" aspect of AI rather than just entertainment or social media, Samsung is positioning its smart glasses as a professional and personal productivity tool. The era of the smartphone as the sole window into the digital world is ending, and Samsung is betting that the next window will be worn on the face.
