Screw the “touchbar”. It’s a stupid gimmick that just needlessly increases cost. It provides zero value. It’s absurd.

In your opinion. I've said this before, but the Touch Bar is useful if you're doing things like video editing in Final Cut Pro X and so on.
 
Screw the “touchbar”. It’s a stupid gimmick that just needlessly increases cost. It provides zero value. It’s absurd.

It may be, but I think there is a reason for this. Apple ensures it's only on the top tier, yet the best-selling laptops are the cheapest, so for app developers to feel the need to actually make apps usable with it, it needs to be in the hands of the masses. If adoption were faster and at a higher rate, it might not be as cr*p as it is and there would be an incentive to improve it.

That's what is ridiculous. There is potential; it's just a first gen and the implementation has been poor thus far.

No wonder the adoption has been so poor. If it's a feature they are serious about, it should be implemented across all products, not just two top-tier models over £1,700.

The main reason to have the TB is Touch ID, but Face ID would make that moot and we could all be happy with the normal keyboard.
In your opinion. I've said this before, but the Touch Bar is useful if you're doing things like video editing in Final Cut Pro X and so on.

And yet they have left it out of the desktop on the newest keyboards...

In all honesty, how many people edit video 100% of the time on a laptop? The screens are too small, even at 15". That makes the feature void, IMO; most will have a desktop setup they can dock the laptop into, and then the Touch Bar is out of reach and pointless. The fact they haven't pushed it across the product line just boggles the mind, IMO.

If it's going to be a feature... put it on everything or nothing.
 

There may be issues putting it on an external keyboard. A lot of people do edit all their footage on a laptop; the 15" is a pretty big screen, to be fair, and more than enough to edit footage on. The 5K iMacs are often used as a home or office editing suite, but most people who edit on a laptop like the fact that it's portable and they can work on the commute, in a meeting, on holiday and so on.
 

Yeah, I'm not convinced.

It's convenient, but anyone who does this sort of stuff for a living, or more than once in a while, will dock to a bigger display for any serious work.
 

Have you ever been to London? People on the train are often using laptops to do their work, including editing and so on. One of my friends uses his MacBook Pro to edit, and he does it for a living. Some people prefer using a laptop rather than a desktop. I'm not saying that people don't use desktops, it's just that a lot of people prefer laptops; Apple's own sales even show this, with the MacBook Pro selling more than the iMacs.
 

I also use my machines to make a living... I'm a commuter and know for a fact you can't use a 15" on anything but a bay four-seat section on any train in the UK. It won't physically fit on a fold-out double seat, which means it has to be on your knee... and the lid has to rest against the seat, which is uncomfortable for any length of time. That means you're better off with a 13", and again that's too small for anything but work on the go. If you have to use a 13", there is no doubt that at home it's docked in a setup, or you have a separate desktop.

You didn't read my comment.

I'm not saying people don't use laptops, but when doing serious work they will be docked, so the Touch Bar is either out of reach (unless you're a contortionist) or the lid is closed. That makes the TB a moot point unless you're on the go, or editing on the sofa, and I can't see anyone doing more than a small casual edit there because it's not productive.

And the fact they haven't come out with a TB keyboard... who cares if it's wireless? If it needs the power, make it a wired full-size keyboard. If it's a serious feature it should be on all the products, and it's not, which makes it a gimmick so Apple could charge more for specific products, AKA the MacBook Pro. You can't buy the full-fat CPUs in the non-TB MacBook Pro; they are all lower-powered chips, so the only way to get the full-speed CPU is to buy the TB model.

It's hardly a fluid experience either, because you can't use the TB on a desktop or when the laptop is docked, so switching between the two means you have to do the same operation in a different way. Not exactly ideal, is it?

Not sure if you have heard of price differentiation? Basically it means having a reason to charge X amount for X product; certain people are willing to pay X over another product. The TB is a perfect example of how they tried to split the MacBook line-up by charging an extra £300-500 for models with the TB over the previous model. In 2017 they reduced the price and added a second non-TB model, because it obviously didn't work as they planned and the adoption rate wasn't as high.

It will be interesting to see how Apple handles it, whether it gets better or they drop it entirely.

Let's be fair, the main reason for most to get the Touch Bar is Touch ID. With Face ID, again, the technology is moot. I'm sure Face ID won't use as much power either...

Either way the TB was a mistake, and in that respect they should either go all in or all out.
 

Is that in 1st class? I've seen people using the tables on the train with 15" laptops; that's what at least one of my friends does when travelling. The 15" screen may not be big enough for you, but for a lot of people it is. Even at home, most people don't need to dock their MacBook Pro. Granted, I'm not talking about the 12" MacBook here, as it's very small, but the 15" doesn't need to be docked.

I don't own a Touch Bar MacBook Pro, and I'm not really in the market for one; when I do upgrade from my current (and now old) 2011 MacBook Pro, it's more than likely going to be a 12" MacBook. But going off what a friend says, he is more than happy with the Touch Bar on the 15" version. He says it helps save time when cutting clips and so on, which to him is a great thing, especially in the editing industry where time is money.

Yes, they could put the Touch Bar on an external keyboard, and who's to say they are not working on that now? In fact, there was a rumour/leak that they were doing just that. I think the rumour was from last year, but maybe they have hit upon some issues. I don't think the Touch Bar is the failure that people around this forum think it is.
 
Yeah, I'm not convinced.

It's convenient, but anyone who does this sort of stuff for a living, or more than once in a while, will dock to a bigger display for any serious work.
I don't know if you're in the industry or not, and my profession is about as far removed from the industry as you can imagine, but I nonetheless have been able to observe some film shoots in person, on location. How? Over the years there have been a couple of TV episodes and a few TV commercials shot in my house.** These were lower budget TV shows and mid to high budget TV commercials.

I can't claim I fully understand all of what they are doing with their MacBook Pros, but it seems they were reviewing footage and sometimes doing some quick editing on location, plus lots of other stuff (mixing, etc?). I can also say they may spend hours on those machines over the course of the shooting day, and there usually were no full-on hardware docking stations involved. I do recall seeing an external monitor occasionally, but more often than not there was no external monitor. They did make liberal use of external drives though.

**Where I live there is a province-wide database of shooting locations, and homeowners can submit a description and pictures of their own personal homes to that database for free. Location scouts, producers, directors, etc. can check out the database to look for locations suitable for their needs, and pay the owners directly for use of the houses/condos. The types of houses used range from run down bungalows to high end mansions and everything in-between, obviously depending upon what the shoots need. Because of this database, I get phone calls from location scouts a few times a year, and occasionally those calls result in an actual shoot at my house.
 
Have to kind of feel bad for Intel, apparently the issue is they overstretched on trying to increase the density of 10nm - they have gone for 2.7x the transistor density of 14nm - which would give them an edge right down to where die shrinks are no longer possible at ~5nm. In trying to actually bring something really impressive to the table they have shot themselves in the foot...
 
I don't know if you're in the industry or not, and my profession is about as far removed from the industry as you can imagine, but I nonetheless have been able to observe some film shoots in person, on location. How? Over the years there have been a couple of TV episodes and a few TV commercials shot in my house.** These were lower budget TV shows and mid to high budget TV commercials.

I can't claim I fully understand all of what they are doing with their MacBook Pros, but it seems they were reviewing footage and sometimes doing some quick editing on location, plus lots of other stuff (mixing, etc?). I can also say they may spend hours on those machines over the course of the shooting day, and there usually were no full-on hardware docking stations involved. I do recall seeing an external monitor occasionally, but more often than not there was no external monitor. They did make liberal use of external drives though.

**Where I live there is a province-wide database of shooting locations, and homeowners can submit a description and pictures of their own personal homes to that database for free. Location scouts, producers, directors, etc. can check out the database to look for locations suitable for their needs, and pay the owners directly for use of the houses/condos. The types of houses used range from run down bungalows to high end mansions and everything in-between, obviously depending upon what the shoots need. Because of this database, I get phone calls from location scouts a few times a year, and occasionally those calls result in an actual shoot at my house.

Yeah, we're talking about different things.

Capture on location can be done in many different ways. Most of the time, cinema cameras will have their own SSDs, with monitors that a creative director uses to view the capture. Cuts don't happen on set; footage is sent back to a studio to be edited, especially on something like you're describing above. Laptops can be used to tether and review footage, but even so, 15" is too small for a director to review on.

I mean, come on, it's hard to see if you have hit critical focus on a still camera on a 15" display unless you zoom in to 100%, let alone for videography.

Even so, you're talking about high-end work. Take the average YouTuber... go out, shoot your footage, and say you're getting the train back as described above... yeah, you can import and capture footage, transcode depending on what you're using, line up some footage and cut, but it's far more likely they head back to a studio with a full desk setup to do the heavy lifting. It's just far easier.

Most plug into an external display or "dock", using the machine as the best of both worlds instead of having a separate desktop. It's far easier to edit on a big screen than on a laptop; even 15" wasn't that big back in the day. My first MacBook Pro was a 17", and I still hooked that up to a 30" ACD before I bought a Mac Pro.

Although you say you're not in the industry, you use two iMacs, right? One to drive another in Target Display Mode? You have a machine that does a specific job... You can use a MacBook Pro to do everything; that doesn't mean it's the best at every job.

Anyway back to my argument about the touch bar.

In desktop mode the Touch Bar is pointless, because if you're attached to a screen you're likely using a keyboard and mouse, and the laptop will most probably be closed. It's a disjointed editing experience, being able to cut like that on the MacBook Pro and not on any other Mac, and having to use two separate workflows to do the same thing. Which is why it's weird: the iMac Pro would have been the perfect time to launch a new smart keyboard... they launched the full-length wireless keyboard, but there was nothing fresh there. The Touch Bar has been around nearly three years; why hasn't it been adopted anywhere else?

Anyway, I don't know many people that rely on a laptop as a main machine; it's more of a convenience to have a high-powered portable machine to get work done on the road. The MacBooks are powerful, but they pale in comparison to a desktop for any kind of production work. Almost everyone I work with has both a desktop and a laptop. The agency I work for provides both; it's just standard.
 
Yeah, we're talking about different things.

Capture on location can be done in many different ways. Most of the time, cinema cameras will have their own SSDs, with monitors that a creative director uses to view the capture. Cuts don't happen on set; footage is sent back to a studio to be edited, especially on something like you're describing above. Laptops can be used to tether and review footage, but even so, 15" is too small for a director to review on.

I mean, come on, it's hard to see if you have hit critical focus on a still camera on a 15" display unless you zoom in to 100%, let alone for videography.
Fair enough.

Even so, you're talking about high-end work. Take the average YouTuber... go out, shoot your footage, and say you're getting the train back as described above... yeah, you can import and capture footage, transcode depending on what you're using, line up some footage and cut, but it's far more likely they head back to a studio with a full desk setup to do the heavy lifting. It's just far easier.
Well, if you're talking about the low end stuff: My relative does low end documentary editing and Vimeo stuff on her MacBook Pro. No external monitor.

She's still in school though, so not yet a "pro" and on a budget; that may change if she gets hired by a production company after school is done.

Although you say you're not in the industry, you use two iMacs, right? One to drive another in Target Display Mode? You have a machine that does a specific job... You can use a MacBook Pro to do everything; that doesn't mean it's the best at every job.
Yep, I have a dual-iMac 27" dedicated setup. One iMac is a 2.5K 2010 iMac used in target display mode, driven by a 5K iMac. My laptop is a 12" MacBook. I no longer use a MacBook Pro.

BTW, I too have no use for the Touch Bar.
 
It's not really an argument over what to use, because you can do pretty much anything on a Mac these days, although the experience will differ hugely, and pros will buy what they need, not what will merely do.

It's more about why certain technology is reserved for certain products. Touch ID took years to come to the Mac, and in the same year it was superseded by Face ID... which is far more usable for a laptop.

Seems like the Touch Bar was designed around Touch ID, as they were obviously struggling to find a good way of implementing it. A home button would have looked ridiculous on a laptop, as they're twice the size of the old-gen keyboard buttons, and the new power button took its place where the F keys live. I assume that's where the idea came from.

Obviously price differentiation comes into play, but all the tech they have could dramatically improve the Mac space; it just seems like it's not a priority, and it's all disjointed, with different products having these features. The difference is the Touch Bar doesn't have that reach... It looks super cool, but in reality it has limited reach in apps because it was only sold on the high-end models, when the low end sells far, far more. With the ripple it sent through the industry, it would have made more sense to try it out first on a lower-end product that has far more reach, like the MacBook or the MacBook Air... and bring it to the higher-end products later, once the first gen is out of the way.

It's just really confusing what the strategy is. The current gen of laptops is probably the weakest in terms of reliability, continuity and selection; the whole range is full of compromise with little performance benefit. That never used to be the case.

WWDC is a developer conference, but I think this year there is a lot to do... I really hope it's not another disappointing keynote.
 
Have to kind of feel bad for Intel, apparently the issue is they overstretched on trying to increase the density of 10nm - they have gone for 2.7x the transistor density of 14nm - which would give them an edge right down to where die shrinks are no longer possible at ~5nm. In trying to actually bring something really impressive to the table they have shot themselves in the foot...

Where did you read about a 2.7x die shrink?

As far as I was aware (and I try to really keep up with the world of lithography and processor technology as an electronic engineer) they were going for a ~30% or ~1.4x shrink.

The reason they are struggling is that their DUV process is at its absolute limits, thus requiring more masks for the same processes, which results in costlier CPUs with worse yields that take longer to produce.

Until EUV goes mainstream I don't think we will see much improvement and that likely won't be until 2020 at the earliest.
You need to actually look at things like transistor density, the physical dimensions of the transistors and the space between them, leakage, etc. if you want to really compare. Intel's 14nm vs TSMC 7nm vs GloFo 12nm doesn't tell you much; these are just marketing terms these days.

Also, Intel is really struggling: their foundries, which cost silly money with each shrink, almost never run at capacity because they only produce Intel chips, and the x86 market isn't growing much outside of server chips. TSMC will fab for anyone willing to pay, meaning they can have more profitable and productive foundries. Going forward, Intel is between a rock and a hard place. My bet is Intel also starts fabbing chips for other customers.

Anyway, TL;DR: 2.7x is wrong by quite a margin.
 
2.7X is what is out there in the press:

https://www.tomshardware.com/news/intel-cpu-10nm-earnings-amd,36967.html

Krzanich explained that the company "bit off a little too much on this thing" by increasing 10nm density 2.7X over the 14nm node. By comparison, Intel increased density by only 2.4X when it moved to 14nm. Although the difference may be small, Krzanich pointed out that the industry average for density improvements is only 1.5-2X per node transition. Because of the production difficulties with 10nm, Intel has revised its density target back to 2.4X for the transition to the 7nm node. Intel will also lean more on heterogeneous architectures with its EMIB technology (which we covered here).

---

Judging by those comments, it's definitely more than 1.4X, and more than even 2X.
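As a back-of-envelope aside, part of the 1.4x vs 2.7x disagreement may just be linear shrink vs density multiplier, which are different numbers for the same node transition. The sketch below assumes density scales with the inverse square of linear feature size, a rough first-order approximation that ignores cell-library and layout changes:

```python
import math

def density_from_linear(shrink):
    """Density multiplier implied by a linear feature-size shrink.

    shrink: new feature size / old feature size (e.g. 0.7 for a "30% shrink").
    Assumes density scales with the inverse square of linear size.
    """
    return 1.0 / shrink ** 2

def linear_from_density(density):
    """Linear shrink implied by a density multiplier (same assumption)."""
    return 1.0 / math.sqrt(density)

# A "~30%" linear shrink (0.7x) is roughly a 2x density gain:
print(round(density_from_linear(0.7), 2))   # ~2.04
# Intel's claimed 2.7x density gain implies roughly a 0.61x linear shrink:
print(round(linear_from_density(2.7), 2))   # ~0.61
# The revised 2.4x target implies roughly 0.65x linear:
print(round(linear_from_density(2.4), 2))   # ~0.65
```

So a "~1.4x shrink" and a "2.7x density increase" aren't directly comparable figures; the quoted 2.7x is a density multiplier, well above the 1.5-2x per-node industry average Krzanich mentions.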
 
https://www.extremetech.com/computi...0nm-process-wants-change-define-process-nodes

https://techreport.com/review/33579/intel-outlines-its-struggles-with-10-nm-chip-production

https://www.tomshardware.com/news/intel-cpu-10nm-earnings-amd,36967.html

A lot of sites reporting it, it sounds like it’s come from an interview with Brian Krzanich as it’s cited alongside his quote they “bit off more than they could chew”.
 


Thanks for that. So it's a disaster-disaster, and their only option is BS and pushing substitutes.

Lovely.
 
There is a very good chance that Apple’s A12/A12X in 2018 will be on a smaller process (TSMC “7 nm” which is roughly like Intel’s 10 nm) than Intel’s Y parts in 2018 (14 nm).

That would make for a good time for Apple to introduce an A12X MacBook.
 

Wildly predicting an A12 MacBook?

Best Buy was having a fire sale yesterday - wondering if that's a harbinger of some kind of MB announcement at WWDC.
 

I’m really hoping so as I would love to buy a 12” MacBook after WWDC or a 13” version if announced.
 
Thanks for that. So, it's a disaster- disaster. So their only option is BS and pushing substitutes.

Lovely.
It's a shame they couldn't make it work even for this year; I was really hoping for an Ice Lake-based MBP. It probably could have had near-MBA-level battery life, judging by previous die shrinks like Broadwell. If Apple really is gearing up to switch over to A-series chips from 2020, then I expect this probably played a part. I think I'd be OK with an A-series MBP, as I don't do anything that would break compatibility, though a lot of others aren't going to be happy with it.
 
Are there any dates for this? WWDC likely?
I'd say there's a decent chance of seeing A11X-touting iPad Pros at WWDC, and the A12 probably with the new iPhones in Q3. Going by recent patterns, the A11X probably won't be too far off A12 scores, and the graphics capability will be out of this world (for an iGPU) if it's a significant step up over the A10X...
 
Yup. Plus, what makes those A12 numbers believable is that they are "only" 20-25% higher than A11 scores. The A11X would be great in a MacBook too, especially since it is also rumoured to be 7 nm.
 
No 14” MacBook rumors? It’d be good to see a return of that form factor.

Also, what kind of chips would we likely see in an updated 2018 MacBook? Any major processor or graphics improvements? I understand they use a 4.5 W TDP thermal envelope, which is super low-power.
 