And yet we really aren't much further along this front than 10 years ago, at least not in the revolutionary way that would be required for the glasses/contact lenses being discussed in AVP threads.

We aren't even remotely close to the miniaturization needed there.

Things have plateaued quite a bit in the last decade on this front.
Completely disagree.
Today you could run Crysis on an iPhone; 10 years ago you needed a giant desktop computer to do that. That kind of increase in compute is useful for all kinds of things, notably AR and on-device AI in thin and light devices.

Contact lenses, yeah, I have no idea; I said that already in my first post. Glasses, though, are very possible.
 
Processor miniaturization is close to reaching its limit, and the 18-month tick-tock of Moore's Law hasn't been happening for several years now. There are limits on how small some things can be made. Even ignoring processors, you still need lenses for cameras, and those can't get much smaller than they already are. That's just one of the barriers. The argument that largely self-contained glasses frames are possible with this tech is simply wrong.
This is categorically untrue. There are no laws of physics saying this can't be done. There are simply lots of unknowns still.
 
You’re talking about a phenomenon that has only really existed over the last 50 years or so. The idea that this will continue indefinitely into the future is misguided.
The idea that it won't continue for at least the next 10 years is misguided.

5 years ago people were saying this; then suddenly Apple silicon (and to some extent AMD Zen, both designed by Keller) proved the entire industry wrong, and now, lo and behold, AMD and Intel are finding major new improvements each year.
 
This is it as far as the glasses fantasy goes.

There's some research showing that slightly different compositions of the anode/cathode for Li-ion-based tech can improve energy density. It works, in theory! Commercialization has been extremely challenging. Here's the latest, and it's economically viable at the scale of storage for planes, buses, etc.


One of the companies in that story is Amprius. I toured Amprius in 2015 or '16, I can't remember which year it was, and we met some of their lead researchers. In 2018 it became public that they had reached commercialization for the first time. In 2022 they shipped something. The horizon for being suitable for consumer electronics, given stability, longevity, and cost, is long, and we may never cross it at all.
The problem with this conclusion is that even with batteries at a standstill, improving silicon can make glasses happen. And if there are major battery material improvements like the research in that article, it opens a whole new chapter of possibilities.

You also realize that 10 years ago people said the same thing about EV batteries and that electric cars would never work? Yet here we are, with Tesla and others achieving better battery density and lower cost in EVs each year, with a roadmap of continued improvement. It doesn't have to be radical or fantasy to improve, just steady progress.
 
Completely disagree.
Today you could run Crysis on an iPhone; 10 years ago you needed a giant desktop computer to do that. That kind of increase in compute is useful for all kinds of things, notably AR and on-device AI in thin and light devices.

Contact lenses, yeah, I have no idea; I said that already in my first post. Glasses, though, are very possible.

The increase in computing power is meaningless unless that compute is used for something substantial where it significantly boosts my experience somehow. I use my iPhone 14 Pro Max the same way I used my iPhone 6, except it's marginally faster (mostly thanks to 5G), yet theoretically the compute ability of the 14/15 is at the level of laptops from just a couple of years ago. It's like 10x faster than the iPhone 6. Why am I not getting laptop levels of productivity? Is it because of something other than raw compute? What has Apple added over the last decade (the iPhone 6 came out in 2014) to make my iPhone experience 10x better/faster than the iPhone 6? I don't see it. Enlighten me, because maybe I'm taking something massive for granted or I'm forgetful.

It's impressive that I can game in 4K now with my 4090, but my gaming experience is not massively different from when I was using a 980 Ti.

EDIT:
Oh, I almost forgot. The increase in compute power is not really comparable to perfect AR glasses, because the technological baseline for that product requires technology we don't have yet. Why do you think AR glasses are possible any time soon? I'm thinking it's another 15-20 years away minimum.
 
Miniaturization is the most predictable outcome of computers and has been so for the last 50 years.
The difference being that we are now getting close to chip structures being just a couple of silicon atoms across. Obviously we can't build a silicon chip with structures smaller than a single atom, and we have yet to see whether that size is even viable. Unless something radically different like optical computing comes along, we'll have a problem continuing the exponential increase in transistor budgets. The problems with silicon are already showing, with memory and other parts not scaling down the way core logic still does. The problem with optical computers, of course, is that they are "just around the corner", together with room-temperature superconductors and commercial fusion power.
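To put a rough, illustrative number on how close to the atomic scale this is (a back-of-envelope sketch: the ~5 nm fin width and ~1 nm gate-oxide thickness below are ballpark assumptions, not specific process figures, and the Si-Si bond length is about 0.235 nm):

```python
# Roughly how many atomic spacings fit across today's smallest transistor features.
# Feature sizes here are ballpark assumptions for illustration, not process specs.

SI_SI_BOND_NM = 0.235  # approximate silicon-silicon bond length in nanometres

features_nm = {
    "fin width (~5 nm)": 5.0,
    "gate oxide (~1 nm)": 1.0,
}

for name, size in features_nm.items():
    spacings = size / SI_SI_BOND_NM
    print(f"{name}: about {spacings:.0f} atomic spacings across")
# -> fin width: ~21 atomic spacings; gate oxide: ~4 atomic spacings
```

In other words, the thinnest layers are already only a handful of atoms thick, which is where the "couple of atoms" framing comes from.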
 
The increase in computing power is meaningless unless that compute is used for something substantial where it significantly boosts my experience somehow. I use my iPhone 14 Pro Max the same way I used my iPhone 6, except it's marginally faster (mostly thanks to 5G), yet theoretically the compute ability of the 14/15 is at the level of laptops from just a couple of years ago. It's like 10x faster than the iPhone 6. Why am I not getting laptop levels of productivity? Is it because of something other than raw compute? What has Apple added over the last decade (the iPhone 6 came out in 2014) to make my iPhone experience 10x better/faster than the iPhone 6? I don't see it. Enlighten me, because maybe I'm taking something massive for granted or I'm forgetful.

It's impressive that I can game in 4K now with my 4090, but my gaming experience is not massively different from when I was using a 980 Ti.
Right, so that's a completely different discussion. An interesting one, to be sure, but I'm just talking about what could be possible in the future, and future devices and/or experiences based on improving technology, not the quality or utility of the experiences themselves.
 
Right, so that's a completely different discussion. An interesting one, to be sure, but I'm just talking about what could be possible in the future, and future devices and/or experiences based on improving technology, not the quality or utility of the experiences themselves.

I'll grant you that, but don't you agree that impressive generational improvements in chips are not an indication we're going to conquer the physics problems of building a perfect AR glasses product that doesn't look like Xiaomi's Reality glasses or Google Glass? The problem is that the optics and display tech don't seem to be there yet. Some labs will present experiments that make it look like we're making progress, but nothing practical so far. Meta says 15 years until "AR glasses" and Snap says 10 years, so we can safely double those estimates. Other experts (read: not CEOs hyping up their investors) are saying 20-30 years minimum.

I think it's a while off unfortunately.
 
The difference being that we are now getting close to chip structures being just a couple of silicon atoms across. Obviously we can't build a silicon chip with structures smaller than a single atom, and we have yet to see whether that size is even viable. Unless something radically different like optical computing comes along, we'll have a problem continuing the exponential increase in transistor budgets. The problems with silicon are already showing, with memory and other parts not scaling down the way core logic still does. The problem with optical computers, of course, is that they are "just around the corner", together with room-temperature superconductors and commercial fusion power.
Sure, but we haven't hit the wall yet, not even close, and there are still things to be discovered, like materials other than silicon for transistors. And true, some technologies never seem to make it.

I guess my point is if engineers at Apple, AMD, Nvidia, and Intel had the attitudes of people on this forum, we would all still be using Pentium 1 chips on our beige desktops because anything else would be "not possible".

Conversely, listen to the man who actually built most of this stuff and his thoughts on the future:
 
I'll grant you that, but don't you agree that impressive generational improvements in chips are not an indication we're going to conquer the physics problems of building a perfect AR glasses product that doesn't look like Xiaomi's Reality glasses or Google Glass? The problem is that the optics and display tech don't seem to be there yet. Some labs will present experiments that make it look like we're making progress, but nothing practical so far. Meta says 15 years until "AR glasses" and Snap says 10 years, so we can safely double those estimates. Other experts (read: not CEOs hyping up their investors) are saying 20-30 years minimum.

I think it's a while off unfortunately.
That's fair; we shall see on the optics. I'm just saying the silicon/compute power will be ready.
 
OK, so let's have a conversation.

Two sticking points for me:
First, what are the "immovable limits of physics" you suggest would prevent lightweight glasses from being developed? I surmise that I disagree with your assumptions here, but let's hear them.
Second, by saying "people like you" you seem to assume I am quite ignorant about what's required, and the implied connotation is rather condescending.

Also, what do you do for a living, and what is your education, that backs up these claims? What is your expertise and/or your sources of expertise? I'm happy to learn from someone who truly is an expert in this area. (This is what I meant about whose opinion matters more, etc.)

Let me share a bit about myself, before you continue to assume things about me.

As it turns out, I have a computer engineering degree, work as a software engineer, and know people on the Vision Pro team. As a hobby I follow silicon and hardware design and optimization, since I find it interesting and it reminds me of what I originally studied at university. Now, I haven't worked on the headset teams directly, so certainly there are things I don't know, but I feel that I have a high-level grasp of what's required.

I am not making 5-10 year predictions blindly or with wishful thinking analogous to time travel, as you suggest.

I am thinking about the progress of silicon. If Vision Pro is the baseline, Apple needs M2-level performance within the size and power constraints of an Apple Watch to make a compelling glasses-style device. If they can get to AirPods form factor, even better.

Let's look at history: Apple basically got M1 performance into the iPhone form factor with the A17 in 3 years. Furthermore, in certain ways it actually outperforms an M2 (newer GPU, etc.), so perhaps the A17 is all Apple needs for glasses. What's more, with the S9 chip they have the A15 architecture from the iPhone 13 Pro in the Watch, and this was done in two years. Now, it's just the E cores, so performance is maybe closer to an A12 or A13 (iPhone XS or iPhone 11), so in about 5 years they have iPhone-level performance in a Watch.

And this continues going forward as well: TSMC (and everyone else) shows no indication of slowing down process-node shrinks, and with upcoming material changes, performance keeps drastically improving; essentially, Moore's Law is not dead. If you don't believe me, there's a good talk from Jim Keller (world-renowned CPU architect) explaining this in more detail.

Therefore, it seems reasonable that Apple can achieve M2-level (or at least A17-level) performance within the thermal and power constraints of a Watch-sized device in 5 years' time, and perhaps in even smaller sizes within 10 years. This is primarily how I came to my prediction.
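As a rough back-of-envelope (purely illustrative; the lag figures are my own reading of the history above, not anything official), the timeline works out like this:

```python
# Back-of-envelope extrapolation of when Watch-class silicon could reach a given
# M-series chip's performance, assuming the historical lags described above hold.
# All figures are rough, illustrative assumptions, not official data.

M_TO_IPHONE_LAG = 3      # years: M1 (2020) -> comparable A17 performance in iPhone (2023)
IPHONE_TO_WATCH_LAG = 5  # years: roughly A12/A13-class performance appearing in the S9 (2023)

def watch_reaches(m_chip_year: int) -> int:
    """Estimated year a Watch-sized device matches the given M-chip's performance."""
    return m_chip_year + M_TO_IPHONE_LAG + IPHONE_TO_WATCH_LAG

# M2 shipped in 2022, so a Watch-constrained device matching it lands around 2030,
# i.e. inside the 5-10 year window above.
print(watch_reaches(2022))  # -> 2030
```

Obviously, if those lags stretch out as nodes get harder, the date slips, which is why I'd frame it as a 5-10 year window rather than a hard date.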

In addition to that, with glasses you need transparent OLED screens or a kind of micro projector; both of these exist and continue to improve. I'm not totally sure about the time frame for these technologies, however, so if you have insights there I'm open to hearing them.

Also, while you do still need cameras and lidar to track objects, you don't need constant video passthrough for the user, so there's no need for the R1 chip, which should help with thermal and power issues as well. It may also be the case that portable glasses one wears outside won't have all the features a larger headset has (same as iPhone vs. Mac), but this does not prevent the device from existing or being compelling.

So, what about this "literally cannot happen"?

Thanks for a well-thought-out discussion of where this could be going based on historical trends. You make me optimistic for the future. I'd say "what about power density and battery requirements", but I watched an interview with the founder of Oculus where he applauded Apple for putting the battery separate on the AVP… His point was that Apple could have made a head mount in the back for it for this particular use, but they were projecting forward to a time of glasses and wanted to prepare us now for separate batteries.
 
The idea that it won't continue for at least the next 10 years is misguided.

5 years ago people were saying this; then suddenly Apple silicon (and to some extent AMD Zen, both designed by Keller) proved the entire industry wrong, and now, lo and behold, AMD and Intel are finding major new improvements each year.

Apple’s silicon did not reverse the trend of slowing Moore’s Law. That’s an absurd thing to claim.
 