I used to carry around my PDA to check my class schedule and take notes, and people thought I was a nerd (I WAS a nerd). But then the iPhone came out and everyone carried one. I expect the same thing to happen here.
 
THIS is the one feature that makes this type of tech interesting for me. The fact that it can track what you are looking at, and then give you information on that object is actually useful. I still think the form factor and power requirements are limiting, but 5 years or a decade down the road, this could be useful indeed.
Really? 5 years from now it will be useful to look at a tree and have the Vision Pro say, "That's a Larch"?

Really?

Really?

A Larch.

A Larch.

"Hey Siri, what's that?"

"That's a desk, you idiot."

"Thanks, Siri."

"I'm not paid enough to help people get information on random objects."

You do realize how banal a feature that is for $3,499? I mean, for ****'s sake, my iPhone can do that.
 
So everyone is expecting all this technology to fit into the weight and size of regular glasses, when over the last 30 years cellphone and laptop batteries have shrunk only slightly? Is anyone really going to buy this to wear outside while walking down the street? I really doubt it. I think it's a great showcase of technology, but I doubt it will take off for the next 20 years, or possibly much, much longer than that.
 
This is where Apple’s focus on perf/watt is causing a major shakeup in the industry. Future generations of Apple’s embedded systems are going to run at 5 W or less with astonishing performance for that power budget.
 
Nope.

The people who thought:
"A solution in search of a problem"
"NO"
"Useless product"


Still think:
"A solution in search of a problem"
"NO"
"Useless product"


The people who thought:
"CoolAF"
"WOW"


Still think:
"CoolAF"
"WOW"


So, no. Nothing has changed.
I've never understood why people say "It doesn't solve any problems."
Sometimes tech can just be cool AF. That can be enough.

However, in the case of AR, it does solve problems. Example: are you the kind of person who stares at your phone while your family is in the same room with you? Now you don’t have to. You can stay engaged and still keep whatever feed you want in your field of vision. Still not great, but far superior to staring at your phone and ignoring the people around you.

That’s one of many issues this could solve.
 


The Apple Vision Pro headset's visionOS operating system includes a feature called "Visual Search," which appears to be similar to the Visual Lookup feature on the iPhone and iPad.


With Visual Search, users can point the Vision Pro headset at an item to get information about it, detect and interact with text in the world around them, copy printed text from the real world and paste it into apps, translate text between 17 languages, and more.

Real-world text that includes contact information, web addresses, unit conversions, and similar information can be acted upon in visionOS. For example, if a printed handout includes a website link, you can scan the link with the Vision Pro to open the site in a Safari window. Or, if a recipe calls for grams and you need ounces, the headset can convert for you.
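The unit-conversion behavior described above can be sketched outside visionOS too. Here is a minimal, hypothetical Python illustration (not Apple's implementation; the function names are invented for the example) that detects gram quantities in scanned recipe text and converts them to ounces:

```python
import re

GRAMS_PER_OUNCE = 28.349523125

def detect_gram_amounts(text):
    """Find quantities like '250 g' or '30 grams' in scanned text."""
    pattern = re.compile(r"(\d+(?:\.\d+)?)\s*(?:g|grams?)\b", re.IGNORECASE)
    return [float(m.group(1)) for m in pattern.finditer(text)]

def grams_to_ounces(grams):
    """Convert a gram amount to ounces, rounded for display."""
    return round(grams / GRAMS_PER_OUNCE, 2)

recipe_line = "Add 250 g flour and 30 grams sugar."
conversions = {g: grams_to_ounces(g) for g in detect_gram_amounts(recipe_line)}
```

A real data detector would of course handle far more units and locales; the point is just that the headset pairs text recognition with small "actionable text" rules like this.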

Real-time text translation will also be useful for traveling and other instances when you might want to quickly translate what you're seeing in the real world. The Apple Vision Pro headset will be able to automatically detect text and documents, similar to how the iPhone can detect text in photos and allow it to be interacted with.

The Visual Search function was found in visionOS by Steve Moser. visionOS can be accessed through the latest Xcode beta at the moment, as Apple released the first version of the software earlier today.

Article Link: Apple Vision Pro 'Visual Search' Feature Can Identify Items, Copy Printed Text, Translate and More

iOS' Translate app is SEVERELY lacking in:
language support,
the ability to run audio translation as a background process,
and a UI that is actually intuitive in practice!
 
I read a few pages today in the new Make Something Wonderful book by Jobs, and I got to the section about the first Apple Store in 2001. He talks about the Apple products, but then mentions that they also had third-party cameras, camcorders, digital organizers, and MP3 players… and naturally my mind went to how my iPhone does all of that (and tons more), a relatively short time after the Apple Store first opened...

And I wonder: 5-10 years from now, what will this thing look like? How slim will it get? Will it replace the glasses I wear now, weigh just a few ounces, and look like normal glasses? Seems crazy to even think that...

But considering that the first-gen iPod I listened to led to my thin and powerful iPhone, and the iMac G4 on my home desk is archaic compared to my 11" super-thin iPad Pro → I look forward to what the future holds for this tech and the innovation that comes along with it.
Your comment and the stage this technology is in right now reminds me very much of a favorite quote from the show Halt and Catch Fire, which was an excellent show if anyone hasn't seen it yet. The main character, Joe, is talking about the early PCs and says "Computers aren't the thing. They're the thing that gets us to the thing". And it certainly feels like that's where we are at with v1 of the Vision Pro.
 
I also thought of something later, after I made my first comment… I haven't had time to look for the email/message, but I remember a few years ago (maybe 3 or so) talking with a guy who works at a large Bible software company. It has some basic features, but think scholars/professors/pastors with an intellectual bent: complex Greek searches in the Bible, thousands of books and commentaries, etc. We were going back and forth about how cool it would be to have smart glasses that work with a preacher's speaking. For example, during the week you study and type your manuscript. Then you have a few options: you can 1) use no notes, which few people should try, since the gift of speaking for a half hour while knowing what you're talking about is rare, 2) keep your notes in front of you and try not to read line by line and bore people, 3) do a combination of both, 4) have a screen in front of you that the sound booth controls, or 5) have a screen in front of you that you control.

All of those are methods people use… but what if smart glasses and AI (or ML) would allow me to 1) study all week, 2) write my sermon, 3) load it into an app, 4) have that app connect to my smart glasses, 5) have the mic in the glasses detect when I begin speaking, 6) see my words appear on the lens like a mini teleprompter as I speak, letting me look at my manuscript and the audience at the same time, and 7) when I go off course, tell a story, or chase a rabbit trail, have the app detect the departure, pause the manuscript in my lens, and, when I start reading my words again, pick back up and resume scrolling?
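That follow-along behavior can be sketched as a simple word-matching loop. This is a hypothetical Python illustration, not any shipping product: `Teleprompter`, `hear`, and the `lookahead` parameter are all invented for the example, and a real app would use on-device speech recognition plus fuzzier matching.

```python
def normalize(word):
    """Lowercase and strip punctuation so 'Grace,' matches 'grace'."""
    return word.strip(".,;:!?\"'").lower()

class Teleprompter:
    """Track position in a manuscript as recognized spoken words arrive."""

    def __init__(self, manuscript, lookahead=8):
        self.words = [normalize(w) for w in manuscript.split()]
        self.pos = 0                # index of the next manuscript word expected
        self.lookahead = lookahead  # how far ahead to search for a match

    def hear(self, spoken_word):
        """Advance if the word appears near the current position; else pause."""
        w = normalize(spoken_word)
        window = self.words[self.pos:self.pos + self.lookahead]
        if w in window:
            self.pos += window.index(w) + 1  # scroll to just past the match
            return "scrolling"
        return "paused"  # off-script: hold the manuscript in place
```

Off-script words (the story, the rabbit trail) return "paused" and leave the manuscript position alone; as soon as the speaker reads manuscript words again, the position advances and scrolling resumes.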

For a calling (and profession) that means studying enough each week to deliver multiple 30-45 minute talks to the same group of people, year after year, without repeating yourself (much), and without being distracted by looking down at your notes too much or by what's happening in the audience (interruptions, people walking around, sleeping... yes, lol), smart glasses that could do that would be amazing.

And while that sounds like a dream… when I got my first Mac in 2003, I had to drive an hour away just to get a $100 card that goes under the battery so I could try out this new thing called wireless internet → now I can talk to my watch and it talks back to me → the scenario above is likely closer to us now than the time gap between my first AirPort Extreme card and my Apple Watch.
 
The device / technology looks amazing. I can't wait to try them, and I can't wait to buy the first iteration of the 'cheaper' version. ($2500 ?)

A side note / observation from someone with a long retail background...

The theft situation with these is going to be insane. Robberies of Amazon Prime, FedEx, and UPS trucks the day before and the day of launch are going to be unlike anything seen before. One truck with a hundred of these is more valuable than anything we've seen in this internet era, and a million times easier to steal than a Brinks money collection truck.

And store break-ins and grab-and-runs are gonna be so common. Yes, I realize they won't just keep piles of them behind the counter, but people are going to get mugged / hurt / killed on the way to their cars after buying them. No, of course not everyone, but it will be more than just one or two isolated incidents.

Sure wish people were not like this...
So, this has happened to you? What products?
 