Samsung is planning a Galaxy Unpacked event for July, and the company plans to introduce new foldable smartphones and AI "Galaxy Glasses," according to Seoul Economic Daily.


Samsung's event will take place on July 22, so it will debut its new Galaxy Z Fold8 and Z Flip8 foldable smartphones just weeks before Apple introduces its first foldable iPhone, and it will beat Apple to AI glasses.

Apple has been racing to develop its own smart glasses to compete with the Meta Ray-Ban AI glasses, but rumors suggest Apple won't launch the glasses until 2027. There is a chance Apple will preview the glasses in 2026, but there's no certainty yet.

Samsung is working with eyewear company Gentle Monster for its AI glasses, and the wearable will run Google's Android XR operating system with Gemini integration. The glasses will feature a high-definition camera, speakers, and a microphone, similar to the Meta Ray-Bans, and there will be no built-in display. AI integration will be a main selling point, with Gemini able to use video captured by the wearer to answer queries. Samsung will link the glasses to Galaxy smartphones and its SmartThings home appliance ecosystem.

The glasses that Samsung is working on sound similar to everything rumored for Apple's own AI glasses. Apple's glasses will rely on Siri, and will include cameras to feed visual information to the AI. Speakers and microphones will be included, but no display is expected for the first version.

Samsung is also planning a Fold Wide, a foldable smartphone with dimensions similar to those Apple plans to use for its foldable iPhone. Samsung's foldables to date have been taller than they are wide, but Apple is planning a wider, iPad-like 4:5 aspect ratio.

After Samsung's event, Apple will unveil its next smartphones at its traditional September event. Dates are not known at this time.

Article Link: Samsung Set to Beat Apple to AI Smart Glasses With July Launch
 
Am I the only one who thinks these things are not the future? Who wants to walk around with a camera attached to their face all the time? Who wants another battery-powered device to babysit, and to wear on their face? Who wants to software-update their glasses? And what about people who need glasses? They can't take them off in inappropriate situations.
 
The Gentle Monster collab could get some buzz, but this is the Samsung pattern again: introduce a product that's first to market but second in impact because Apple's will be better. Samsung does this with such regularity that I have to suspect there's someone at the intersection of their Apple supply division and their own product teams who isn't honoring the NDAs they signed.
 
I'd love to have an iPhone Pro level camera to take paragliding videos without needing to attach a camera to my gear. I can see some potential uses for AI that knows what I'm seeing, but that still needs a use case before I'm willing to buy into that as a feature I need.

On the other hand, poor camera quality would be useless to me. And they need to solve the privacy issue around always-on cameras. People can sneakily record with their phones, but you can clearly see a phone being held up. These glasses are less obvious, so there needs to be a way to ensure everyone knows when they're recording, and to keep them from starting up in places where they shouldn't (e.g., bathrooms).

I'm thinking Apple has figured some of this out. I don't see any of this in this Samsung device ("HD" camera is not enough in 2026, privacy concerns all over, etc.). So if Apple solves those issues and makes a real case for AI assistants via visual interpretation of the world... they'll walk all over this product.
 
Am I the only one who thinks these things are not the future? Who wants to walk around with a camera attached to their face all the time? Who wants another battery-powered device to babysit, and to wear on their face? Who wants to software-update their glasses? And what about people who need glasses? They can't take them off in inappropriate situations.
This trend reminds me of the 3D TV push from the 2010s; however, Apple has always been known for finding new use cases and making them work in a way that feels natural and complements other devices and services... genuinely interested to see what they come up with.
 
Apple should create a new new Siri. From scratch. They should call this new new AI "Steve," integrate all of Steve Jobs' knowledge into it, and extrapolate it to stay relevant for the future. And each "Steve" will be different: it will live on device and grow from the user's interactions... a little bit like a Tamagotchi or a Pokémon. Each one will evolve its own character... a really, really super personal AI. Nothing will be sent to the cloud; the "Steve"s will just "communicate" with each other when their respective iPhones are close and share their knowledge. Nothing sent outside the Apple device, nothing in the cloud... just a worldwide iPhone mesh of "Steve"s.

Hey Steve, turn on the lights.
Done! Amazing, lights just work! ... One more thing... I see you used a stylus to work on your iPad... yuck, who needs a stylus? So I put it in the trash.
 
Societally we’re doomed with all this smart glasses crap. The mobile internet revolution with the mass adoption of smartphones was a major mistake especially as social media, algorithmic feed bubbles, and short-form video have hijacked people’s attention and distorted reality.

Now wait for that to be made much worse with always-on manipulated live feed straight to the eyeballs along with always-on cameras enabled from every walking moron wearing these. Eventually normalizing an increasingly dystopic privacy hell upon all of us and further eroding the social fabric.
 
Am I the only one who thinks these things are not the future? Who wants to walk around with a camera attached to their face all the time? Who wants another battery-powered device to babysit, and to wear on their face? Who wants to software-update their glasses? And what about people who need glasses? They can't take them off in inappropriate situations.
The point of products like this is to avoid an obvious camera!

The Vision Pro was an experimental product that's kinda obviously a camera (and is large) because the way it generates augmented reality is to
- capture the world with a camera
- generate augmentation to the world with the GPU
- fuse the two together in a display buffer
- show those on a miniscreen in front of your eyes.
So you are not seeing the "real world", you are seeing a screen that displays a camera capture of the real world.
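The passthrough steps above can be sketched as a tiny compositing loop. This is a minimal NumPy illustration of the idea, not Vision Pro's actual pipeline; all names, shapes, and values are invented:

```python
import numpy as np

# Illustrative video-passthrough compositing: the wearer only ever sees
# the display buffer, never the world directly. Everything here is a toy.
H, W = 4, 4  # a tiny "frame" for illustration

camera_frame = np.full((H, W, 3), 0.5)  # camera capture of the real world
overlay_rgb = np.zeros((H, W, 3))
overlay_a = np.zeros((H, W, 1))
overlay_rgb[1, 1] = [1.0, 0.0, 0.0]     # one red augmented pixel
overlay_a[1, 1] = 1.0                   # fully opaque at that pixel

# Fuse the two in a display buffer: standard "over" alpha compositing.
display_buffer = overlay_a * overlay_rgb + (1.0 - overlay_a) * camera_frame
```

Where the overlay is opaque, the buffer shows the augmentation; everywhere else it shows the camera's copy of the world, which is the whole point: you never see the world itself.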

The alternative, the dream everyone has been working towards is something like
- *flat lens* camera captures a rough (non-image) approximation to the real world
- GPU generates augmentation based on the (non-image) data from that camera
- augmentation data is "projected" onto your eye without a screen needed. It's fused by your eye onto the real world that you are seeing anyway.
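By contrast, the projected alternative never builds a fused frame: the device emits only the augmentation light, and the "fusion" happens additively in the eye. A toy sketch of that difference (illustrative values only, not any shipping product):

```python
import numpy as np

# Projected-AR sketch: no camera image and no display buffer anywhere.
# The real world reaches the eye directly through the lens; the device
# only adds overlay light on top. All values are invented for illustration.
H, W = 4, 4

real_world = np.full((H, W, 3), 0.5)   # light the eye receives anyway
overlay = np.zeros((H, W, 3))
overlay[2, 2] = [0.0, 0.4, 0.0]        # a dim green augmentation

# Fusion is additive and happens in the eye, not in a frame buffer.
perceived = np.clip(real_world + overlay, 0.0, 1.0)
```

Note the asymmetry with the passthrough sketch: here the "background" term is the world itself, so wherever the overlay is black the device contributes nothing at all.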

This relies on (at least!)
- flat lens technology and tech to extract information from the signal. This is a very recently solved problem
- a way to "pipe" and "project" the augmentation signal either onto your eye directly, or onto a glass lens you are looking through. There have been many many suggestions for how to do this, many of them discarded over the years.

Note that you can do a simpler, dumber version of this by leaving out the flat lens part. Now all you are doing is projecting information (but not *augmented reality*) onto the eye or glass. This still has to solve the second problem above, but not the first. The Meta glasses are like this:
You get a display but not a display fused with the outside world, and (as I understand it) IO requires you to wear the neural band (which detects hand movement) rather than having a camera detect them the way Vision Pro does.
Almost certainly Samsung will ship the same sort of thing.

My guess is that Apple wants to do more, in particular they want augmented reality, having sold the Vision Pro on that basis. Using flat lenses (with no imaging) allows them to achieve that goal while not looking like there's a camera present, being thinner/smaller, and being unable to take photos [if that's considered an important goal].
As for the display there is this very recent patent
https://patents.google.com/patent/US20260086292A1
which I assume indicates that they have at least solved the problem in the lab, and have moved to the question of "how do we actually make this hardware?"
 
The Gentle Monster collab could get some buzz, but this is the Samsung pattern again: introduce a product that's first to market but second in impact because Apple's will be better. Samsung does this with such regularity that I have to suspect there's someone at the intersection of their Apple supply division and their own product teams who isn't honoring the NDAs they signed.
Why is everyone assuming this is Samsung being first to market?
Regardless of horse race nonsense, this is Samsung two years after Meta released this sort of thing.

(Which is not the same thing as saying Samsung saw Meta and decided to copy it!
These products are so complex [regardless of whether you think the Samsung or Meta versions suck compared to what Apple will ship] that you can't just go from seeing a demo to shipping something working in two years. It's more like everyone knows this form factor is part of the future and has been working on it for a while, all shipping suboptimal products, like HoloLens or even Vision Pro, because shipping is the only way you learn enough to get to the next step.)
 
This trend reminds me of the 3D TV push from the 2010s; however, Apple has always been known for finding new use cases and making them work in a way that feels natural and complements other devices and services... genuinely interested to see what they come up with.
They missed with the Vision Pro, that's for sure. We'll see what they come up with for their glasses. With Ternus at the helm, he might go, "meh... we're not going to bother with this segment." He didn't want the AVP in the first place.
 
That's fine, go for it. Build the technology and the factories for five years and then license it all to Apple, exactly like the foldables.
 
How do all these tech companies feel when they announce a product and they always get compared to what Apple will do?

They feel just fine, and probably hopeful. If Apple is doing it, there is a greater chance there is a market for it. They know that there are people who will buy their product specifically because it is not Apple: Apple haters and Android/Google lovers are a built-in segment. Riskier for a company is to be the only one out there. The executive who ushers in a first-of-its-kind product that flops has his butt hanging out, but if it is a me-too effort, he has cover.
 
AI integration will be a main selling point
They may market that as the main selling point but they'll keep on ignoring reality. Most reviews of all the Meta and similar products say the main selling points are

  1. They are decent glasses for vision correction or sun protection.
  2. They are surprisingly convenient headphones.
  3. They are a discreet, hands-free camera.
 
What about people who need glasses. They can’t take them off in inappropriate situations.
My thoughts too. Since I wear real glasses, I'm not sure how places that are public yet privacy-sensitive will "enforce" this.
*Basic examples: financial institutions, non-public-facing workplaces, gym locker rooms, restrooms, etc.

With respect, in my opinion if you wear these or any similar glasses, you’re a creep.
Totally agree
 