- Whether that information is presented "context free" as you say, or through AR, it's the same principle: putting information on the lens for someone to see.

- I think a camera built in is pretty much a guaranteed component.

Ars Technica, quoting Prosser, says "There was no camera on the device due to privacy concerns."
I agree this makes little sense -- hence my statement.
 
"The glasses will start at $499 with the option for prescription lenses at an extra cost"

If that turns out to be true I might be tempted.

Assuming it doesn't add too much weight or reduce visibility over my normal prescription lenses.
Though I'd never get V.1 of any new Apple device.
 
I agree that the Apple Watch is a relatively unnecessary accessory.

It is convenient for many and brilliant in some aspects (like health), but make no mistake, Apple Glasses will change EVERYTHING if Apple delivers.

Imagine:
- NOT having to look over at your dashboard-mounted phone when driving, or even better, getting non-obstructive walking directions while in motion.

- Getting notifications for emails, texts, or calls in your peripheral vision, and being able to respond or swipe them away with a hand gesture or a light head shake.

- Being on a train watching Netflix on your glasses, and it pauses and moves out of your field of vision the moment you stand up.

- Having the device call 911 when it detects a life-threatening sudden stop while registering that your eyes are close to the glass.

- Being able to see a new paint color on the walls, or how furniture fits in a room, without having to hold a device in front of you, and with greater realism.

- Not needing a computer monitor at all when stationary (for light tasks, at least).

I could go on, but this is the FUTURE. IDK if all or ANY of the above can/will happen, but I'm genuinely curious to see where Apple goes with this.

These are some great examples and I think at least some if not all of these are exactly where Apple will take this. Plus no doubt some other examples we haven’t even thought of yet.

Your computer monitor one is one I’ve thought about a lot. Right now I can have a large and/or multi-monitor setup at home/office, but when I’m out and about I have to settle for the smaller laptop display - or the iPhone/iPad/watch display, depending on what I’m doing. These glasses could change all that: I could set the glasses up to emulate (AR) any monitor arrangement I like, anywhere, for any of my devices - a 32-inch 6K display for my watch? No problem. All these devices we have are varying levels of computing power and different screen sizes. The glasses can give me any virtual screen size(s) I want for any of my devices. The only limitation would be the resolution of the glasses themselves.

Perhaps even that’s limiting it too much anyway - imposing today’s paradigms on the new tech. Instead, these glasses could potentially redefine how we do everything we do on computers today, in ways we can’t even imagine yet.

For now, these plus some AirPods Pro, say, will certainly redefine how we see, and hear (ie. receive) a computer’s output - what currently the monitor and speakers do. For further revolutionizing computing, we need to revolutionize how we input into the computer. For decades it’s been keyboard and mouse. More recently it’s touch and speech, with gestures on the rise. What’s next for input?

The ultimate is when we figure out how to send our thoughts directly to these things. That’s a way off yet, of course - and even when we figure that out, there’ll still be work to do to make sure it’s safe and always accurate. In the meantime...

Well, if these can track our eye movements and focus (tech exists that can do that, but can it fit into something like a pair of glasses?), then that, either alone or combined with gestures or touch (e.g. look at something on my virtual monitor to emulate the mouse movement, and tap finger and thumb together by my side to do what a mouse click does - or something like that), could potentially replace the mouse for a computer and touch screen input for phone and tablet. But we still need a more convenient way to get text into a computer. Speech cuts it only to a point; it’s pretty hard to dictate code (e.g. a web developer inputting HTML, CSS, JavaScript, etc.). So some kind of text input is still required, until perhaps we get to a point where we’re redefining programming languages to be more compatible with dictated language.
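To make the gaze-plus-pinch idea concrete, here’s a toy sketch of how such an input scheme might map onto pointer events. Everything here is hypothetical - the class and function names, the thresholds, and the idea of getting normalized gaze coordinates and a thumb-to-finger distance from the hardware are all assumptions for illustration, not any real Apple API:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized [0, 1] horizontal gaze point on the virtual screen
    y: float  # normalized [0, 1] vertical gaze point

def gaze_to_pixels(sample: GazeSample, width: int, height: int) -> tuple:
    """Map a normalized gaze point to pixel coordinates on a virtual monitor,
    emulating mouse-pointer movement."""
    return round(sample.x * (width - 1)), round(sample.y * (height - 1))

class PinchClickDetector:
    """Turn a stream of thumb-to-finger distances into click events.

    A 'click' fires when the distance drops below the press threshold and
    later rises back above the release threshold. Using two thresholds
    (hysteresis) stops sensor jitter near a single cutoff from producing
    spurious clicks.
    """
    def __init__(self, press_mm: float = 10.0, release_mm: float = 20.0):
        self.press_mm = press_mm
        self.release_mm = release_mm
        self.pressed = False

    def update(self, distance_mm: float) -> bool:
        if not self.pressed and distance_mm < self.press_mm:
            self.pressed = True       # fingers came together: press begins
        elif self.pressed and distance_mm > self.release_mm:
            self.pressed = False
            return True               # fingers separated: click completes
        return False
```

So a frame loop would point the cursor wherever `gaze_to_pixels` says you’re looking, and fire a click whenever `PinchClickDetector.update` returns `True` for the latest hand-tracking sample. The hard parts in practice (gaze calibration, filtering, intent disambiguation) are exactly what a sketch like this glosses over.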

Needless to say, these glasses are the start of some (more) huge changes. We’ll look back in 10 years at how they’ve changed the world in ways similar to how we look back at what the iPhone has done.

Exciting stuff if you ask me. 😊
 
You bring up Google Glass, an absolute failure... the tech wasn't ready.
Just like Apple had a touch-compatible Mac before the iPhone, some things just weren't ready for market.
I'm sure Apple has had advanced AR tech for years, but why risk a Google Glass when you can refine the technology and wait for the perfect time? With LiDAR and AR now popping up everywhere, Apple has hit the gold rush, having refined a product and its tech for years for the perfect time when it'll be most effective in the market - not a crappy, bulky piece of plastic with a camera and a small glass square that looks like a kid's toy.

I don’t think it’s about risk. It’s about taking the time to get it right.

As we know, Apple didn’t invent digital music players, or smartphones, or contactless payments, or most of the other types of products they’re in. But they did these things better than everyone else by taking their time to get it right.

They’ve almost always come “late” to these markets, but I don’t think it’s about waiting for the market to be ready. I think they know the market long before then. I think it’s about taking the time to refine it and get it right while the others are in a rush to be first, so they release half-baked crap that doesn’t work half as well as Apple’s “late” entry when it comes.

I’m pretty sure that’s what’s happening here with these glasses. When they come everyone will say Google did it first, but Google’s failed and Apple’s will succeed. And it’s not because Apple is waiting or testing the market. It’s because Google’s, while innovative, is crap compared to what Apple’s will be when it comes - and that’s because they’re taking years to refine it.

Companies like Google approach it from “hey, we’ve figured out this cool tech, now let’s see what we can do with it - and quick, before anyone else does!” Apple approaches it from the user experience first and then figures out the tech to deliver that experience. That’s their big difference and why they take two years or more longer than everyone else to enter (and then redefine) the market.

I could be wrong but I’m pretty sure that’s it.
Wearable displays for the masses are coming whether people like it or not. It's just a matter of time until someone does it right, and it looks like Apple has the chops for it. Hopefully it's the beginning of a new era like the original iPhone was, when Apple redefined what was possible and changed the way we interact with computers.

Yep. Exactly.
Basically, look at every AR app that is already out on the iPhone, and now, instead of holding an iPad up to your face to see AR, imagine it being transferred to your frontal vision via the glasses. So you’ll be able to see how furniture will look in your room. You’ll have instant translation of foreign-language text on signs and in books. Virtual workout partners from exercise apps. Pokémon games. Etc.

This is spot on. More than anything that’s what these glasses are about. They’re not about notifications, messages, etc.

It’s all about AR. Augmenting reality. It’s a layer of information over real life around us - everything that AR does now, and more - without having to hold the phone or tablet up in front of our faces.
 
... I think that's where we're going and if we are, it will change everything. But it's not just Apple that will go there. The entire industry will, and soon smartphones will be this weird phase humanity went through...

Nicely put. I agree.

And as wild as it sounds now, I’ll bet that eventually it’ll move from glasses to contact lenses, and then to some direct interface into our eyes/brains. Sure maybe that’s 50 years away, and a lot of work to be done to make sure it’s safe, etc. but I bet that’s where we’re going long term.

It’s exciting stuff and I’m really looking forward to it.
 
I think you’re unable to differentiate between innovation and just having a successful product if you’re listing all those as such... which is typical if you’re a brand fan.

Yes, the iPhone was innovative for sure. It changed the way we interacted with our mobile devices... and perhaps the iPad (reaching a bit, but I’ll accept)... that’s about it...

The others were just good products in well-established fields, and to call them innovative is just silly.

I think we could get into a long and silly argument about semantics here. I think when someone does something significantly different for the first time, that counts as pretty innovative to me. I think there are degrees of innovation. Something doesn’t have to be revolutionary to be innovative.

I’m happy to agree the Watch wasn’t innovative to anything like that scale and was just a good iteration.

But the iPod...? It wasn’t just a better MP3 player. It was pretty revolutionary if you ask me. Certainly innovative in that it was the first with an enormous amount of space for the time (because of the hard drive), combined with a revolutionary user interface (scrolling menus and click wheel).

That market was pretty niche and very fragmented before the iPod. It was the first that was actually easy to use and had a decent quantity of songs on it, and because of that, everyone wanted one, even at five times the price of the competition. It changed the world as much as the iPhone did.

All of that counts as innovation in my mind at least. If you disagree then we have different understandings of the meaning of the word. That’s ok.
 
Nicely put. I agree.

And as wild as it sounds now, I’ll bet that eventually it’ll move from glasses to contact lenses, and then to some direct interface into our eyes/brains. Sure maybe that’s 50 years away, and a lot of work to be done to make sure it’s safe, etc. but I bet that’s where we’re going long term.

It’s exciting stuff and I’m really looking forward to it.

Possibly. The problem is, people get weirded out by things. I can't touch my eyeball and wouldn't wear contact lenses myself. It would freak me out to not be able to take them out quick enough if I needed to. Especially if stuff started going crazy and my system was crashing or something. Or it ended up being hacked and people started throwing up all kinds of images I didn't want to see.

I'm also sure the same is true of a brain interface. Some people would call it the mark of the beast and wouldn't touch it. People would start spreading rumors about how it's trying to change your thoughts or messing with your brain chemistry. Even if not true it would turn a lot of people off from trying.

A pair of glasses is something you can easily take on or off. Even if they aren't always ideal. But I could see them evolving over time. Becoming lighter, thinner and nearly invisible. Probably being powered off body heat at some point.
 
This will be liberating when done right, even if it takes several iterations. No longer will people need to be hunched over their computing devices for a big part of the day. Can you imagine doing a lot of computer tasks while exercising, walking, fishing, etc.? Watching driving directions without taking your eyes off the road?
 
This will be liberating when done right, even if it takes several iterations. No longer will people need to be hunched over their computing devices for a big part of the day. Can you imagine doing a lot of computer tasks while exercising, walking, fishing, etc.? Watching driving directions without taking your eyes off the road?

No, navigation is about the only thing I would ever want while doing something else. And everyone is certainly going to need their eyes on the road to avoid all the people walking obliviously into traffic while a YouTube video plays in their glasses.
 
I’m guessing these will be like the Apple Watch as far as frames are concerned? For example: aluminum for the base versions, and then stainless steel and/or ceramic for the upscale models?

Whatever Apple decides to do, I’m in. They just keep putting out hit after hit each time they enter a market and I don’t see them stopping anytime soon.
 
This will be liberating when done right, even if it takes several iterations. No longer will people need to be hunched over their computing devices for a big part of the day. Can you imagine doing a lot of computer tasks while exercising, walking, fishing, etc.? Watching driving directions without taking your eyes off the road?
All this with hand gestures? Good luck.
 


Front Page Tech host and leaker Jon Prosser today shared several alleged details about Apple's rumored augmented reality glasses, including an "Apple Glass" marketing name, $499 starting price, prescription lens option, and more.

  • The marketing name will be "Apple Glass"
  • The glasses will start at $499 with the option for prescription lenses at an extra cost
  • There will be displays in both lenses that can be interacted with using gestures
  • The glasses will rely on a paired iPhone, similar to the original Apple Watch
  • An early prototype featured LiDAR and wireless charging
  • Apple originally planned to unveil the glasses as a "One More Thing" surprise at its iPhone event in the fall, but restrictions on in-person gatherings could push back the announcement to a March 2021 event
  • Apple is targeting a late 2021 or early 2022 release
Apple is also rumored to be working on a more traditional AR/VR headset that resembles Facebook's Oculus Quest, with previous reports suggesting that the headset will be released prior to the glasses. Earlier this year, a leaked build of iOS 14 revealed a new app codenamed "Gobi" that Apple appears to be using to test new augmented reality experiences.

Prosser also claimed that this year's iPhone event could be held in October, rather than September as usual, due to the global health crisis. Multiple analysts including Ming-Chi Kuo and Jeff Pu have indicated that the highest-end 6.7-inch iPhone 12 Pro Max might not be available to order until October due to supply chain disruptions.



Article Link: 'Apple Glass' Rumored to Start at $499, Support Prescription Lenses, and More

Should have been called eyeSpecs.... or iSpecs
 