There’s something skunky about the MS Office suite. I’ve always had problems with it. Then you download the open-source clones of MS Office and they’re snappy. Have you tried LibreOffice?

I prefer using Apple's iWork suite. But sometimes I'm forced to use MS Office because of client supplied files.
 
https://www.notebookcheck.net/Radeon-RX-580-vs-GeForce-GTX-1070-Max-Q_8108_8008.247598.0.html
No idea where you get that impression from. And that’s a desktop PCIe RX 580. The Blackmagic eGPU likely has a downclocked, soldered 580. Take a further 15-20% performance hit from TB3 and it’ll be solidly slower.
Oddly, I was getting that from the same site but a different page: https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1070-Max-Q-GPU-Benchmarks-and-Specs.224732.0.html
I think you're right.
There’s something skunky about the MS Office suite. I’ve always had problems with it. Then you download the open-source clones of MS Office and they’re snappy. Have you tried LibreOffice?
Maybe cause it's asking every machine on the local subnet if they have a copy running with the same license :D
I'm half kidding. It used to do that with '08, would be surprised if they still did it.
 
Thanks for highlighting just one issue with dongles. An HDMI port might not be needed by everyone, but when you do need it, HDMI that works is extremely important. It just amazes me that Apple apparently doesn't have executives or engineers who travel; if they did, they would never have permitted the HDMI deletion.

How about this one... just back from London, where neither my Anker nor my Apple USB-C dongle would work with the HDMI monitor... screwing around for over 30 minutes with reboots, etc., to no avail... ended up showing stuff on an HP Envy (oh, and the Dell XPSs and Lenovos in the room had no issue).

It was embarrassing, and if Timmy is going to have us live in dongle hell with no recourse, then get those dongles implemented solidly (and if you say "well, you can't for all those different models," then don't send us to dongle hell in the first place)... but I guess "it just works" is as dead as disco.
 
Nope. We’re still talking about normal, everyday economics. $1 in 1990 could buy as much as $1.95 now. So if Apple was selling a laptop for the same sticker price in 1990, their new laptops are half as expensive.
Which is also why average US wages have been plummeting since the 1950s if you account for inflation.
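For anyone who wants to sanity-check the quoted claim, here's a rough Python sketch using the ~1.95x multiplier quoted above (the $2,499 sticker price is just an illustrative number, not an actual 1990 model):

# Taking the poster's ~1.95x CPI multiplier (1990 -> now) at face value.
cpi_multiplier = 1.95          # $1 in 1990 buys roughly what $1.95 buys now
sticker_price = 2499           # same nominal sticker price then and now (illustrative)

real_price_in_1990_dollars = sticker_price / cpi_multiplier
print(f"Same sticker price expressed in 1990 dollars: ${real_price_in_1990_dollars:,.0f}")
print(f"Real cost relative to 1990: {1 / cpi_multiplier:.0%}")
# -> about 51%, i.e. roughly half as expensive in real terms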
 
That makes no sense, you would still need to carry the mouse, keyboard and monitor... they have those already, they are called 'laptops'.
Yes, when you wanted to use your powerful Mini on the train, you would use a super thin and small laptop to be the keyboard and monitor, hopefully connected wirelessly. Together they wouldn’t weigh more than a 15” but be much more powerful and comfortable. At home or office you could plug into big monitor and favorite keyboard, and also still use the little laptop on the couch to control it.
 
Nope. We’re still talking about normal, everyday economics. $1 in 1990 could buy as much as $1.95 now. So if Apple was selling a laptop for the same sticker price in 1990, their new laptops are half as expensive.

It ain’t 1990 anymore, homie. And inflation has been negligible since 2016, so that excuse isn’t going to fly.

There was a MASSIVE price hike to the MacBook Pro line with the introduction of the gimmick touch bar. This isn’t disputable.
 
Well, I said the 1060 is almost 3x the performance of the 560, not the 1050 Ti. If the 560 is 70% of a 1050 Ti, then the 1050 Ti is 43% faster. And the 1060 itself is 80% faster than a 1050 Ti. So that’s like 150% faster, or 2.5x the 560. The 560X is only a rebadge with a slight clock boost, and Apple underclocks the chips.
Nvidia is so ridiculously ahead in terms of performance per watt it’s not even funny. If you look at the power draw, the 1050 Ti only draws on average 10% more under load but performs 30%-90% better.
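Working the quoted chain of ratios through (a quick Python sketch, taking the 70% and 80% figures above at face value rather than as measured numbers):

# Relative performance chain quoted above (assumed ratios, not benchmarks).
perf_560_vs_1050ti = 0.70                 # 560 at ~70% of a 1050 Ti
perf_1060_vs_1050ti = 1.80                # 1060 ~80% faster than a 1050 Ti

ratio_1050ti_vs_560 = 1 / perf_560_vs_1050ti                      # ~1.43x, i.e. ~43% faster
ratio_1060_vs_560 = ratio_1050ti_vs_560 * perf_1060_vs_1050ti     # ~2.57x

print(f"1050 Ti vs 560: {ratio_1050ti_vs_560:.2f}x")
print(f"1060 vs 560:    {ratio_1060_vs_560:.2f}x")                # roughly the '2.5x' claimed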

Looks like you didn't bother to check the benchmark posted of 560 vs 1050ti....
https://www.notebookcheck.net/Radeon-RX-580-vs-GeForce-GTX-1070-Max-Q_8108_8008.247598.0.html
No idea where you get that impression from. And that’s a desktop PCIe RX 580. The Blackmagic eGPU likely has a downclocked, soldered 580. Take a further 15-20% performance hit from TB3 and it’ll be solidly slower.

Why should it be downclocked? And why are you assuming that a 1070 can run at full speed in an ultrabook? Prove it! I've never seen one running without throttling; even the Dell throttles with the 1050 Ti.
Oddly, I was getting that from the same site but a different page: https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1070-Max-Q-GPU-Benchmarks-and-Specs.224732.0.html
I think you're right.
Maybe cause it's asking every machine on the local subnet if they have a copy running with the same license :D
I'm half kidding. It used to do that with '08, would be surprised if they still did it.

Well, in some games and apps the 580 is ahead, and in some the 1070 is, according to your first link.
It ain’t 1990 anymore, homie. And inflation has been negligible since 2016, so that excuse isn’t going to fly.

There was a MASSIVE price hike to the MacBook Pro line with the introduction of the gimmick touch bar. This isn’t disputable.

2.32% per year on average in the US for the past 10 years, according to Statista.com.
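Compounded over a decade, that rate is anything but negligible (a quick sketch using the 2.32% figure above):

# Cumulative effect of ~2.32% average annual US inflation over 10 years.
annual_rate = 0.0232
years = 10

cumulative_increase = (1 + annual_rate) ** years - 1
print(f"Cumulative price increase over {years} years: {cumulative_increase:.1%}")
# -> roughly 26%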
 
Make or save more money if you want it. Life is about trade offs.

You missed the whole point. I was talking about value. I could buy 10 MacBooks if I wanted, which has nothing to do with being able to afford one (or 10). I think Apple prices are too high and only going up. Life is also about making good decisions.

I take my laptop on the road with me everyday. I have dropped a few, had a few overheat in the car, and had one stolen. I like to use Apple products but I feel spending 2k or 3k for something that might get damaged or worse is not a good idea. Life is about trade offs for sure, but to assume I am complaining because I don't have the money to buy one is a little off base. I am complaining because I could use a lower cost option that makes more sense to take on the road. Unfortunately Apple has all but abandoned their lower cost products.
 
It ain’t 1990 anymore, homie. And inflation has been negligible since 2016, so that excuse isn’t going to fly.

There was a MASSIVE price hike to the MacBook Pro line with the introduction of the gimmick touch bar. This isn’t disputable.
Inflation is definitely not negligible. If anything, I started seeing price increases everywhere specifically after 2016.

Edit: But also, I think computers should be getting cheaper, not more expensive. A low-end processor today is as fast as an older high-end processor. The problem is that software is getting heavier every day.
 
Someone posted a 7 lb neon monster the other day, with dual 1080s, that requires two (2!) 330W power supplies plugged into it to be used for more than two hours, and complained that the new MBP had worse graphical performance than it. Where do you even start with that?

Must have been a regular MacRumors forum poster.

Some of the comments in this thread are shameful, if this machine isn't for you then stop clogging up & junking up threads with tired, old, unfunny jokes and memes and go buy a plastic Dell.

Physics and math aren't magic. The Man isn't trying to hold you back. You want "better GPUs" in MacBook Pros, you'll pay dearly in battery life, heat, weight, noise and nearly everything else. For something that is not needed most of the time. Hence eGPU. I bet the full price of a new maxed out 15" that at least 90% of the commenters here whining about the GPU wouldn't even come close to using the full potential of what that machine already comes with in anything other than benchmark racing.

The Touch Bar isn't going anywhere. SD card slots aren't coming back. Neither is USB-A. Or DVI ports. Or HDMI. Or ethernet. It's well past time to move past that. The same complaints happen every time a new standard is released. And all that happens is that all the complainers look very silly a few years down the road. And the rest of us just move on and buy new cables and enjoy the better user experience. I wish they had found a way to keep MagSafe, but that's really the only complaint I have with the new design.

What's next, holding out for an ExpressCard slot?
btw when people say "nobody cares about thin and light," they're lying.

I for one am constantly amazed at how thin and light my macbook is compared to my laptops even 5 years ago.

I quite literally bought an iPad Pro as one half of a 2012 non-retina 15" MacBook Pro replacement (other half is a classic Mac Pro), as I found it was just easier to carry it around...not because I couldn't possibly handle the back-breaking weight of a 5-6 pound laptop, but it was just less of a hassle, and if the iPad slides off the couch onto the carpeted floor, nothing happens. The MacBook almost certainly will be damaged in some way.

But now the 15" is so thin and light, it's not much heavier or bigger in footprint, or even thicker, than a 13" iPad Pro with the keyboard cover. So, given that I'm sort of tiring of using the iPad as a primary machine when I'm away from my desk...I think it'll get replaced by one of those.

The comparison between a current machine and my old 2012 is almost laughable, and to think, those were "thin and light" back in the day, and people back then were whining about Apple "prioritizing form over function" - always makes me smile.
Inflation? Are we in Venezuela now?

Inflation exists in literally every country in the world, it's about 3% a year on average in the US.
 
:rolleyes:

As I own one (a 2018 2.6/16/512/560x), it says more about you that you think I'm only looking for reasons to bash them. It's a solid update in some respects - notably the CPU - as I've noted. That does not mean it's immune to criticism, and my criticisms of the 2016/17/18 form factor remain, including the frippery that is the touchbar.
Every comment you have posted is negative. Why did you waste your money then?
 
it's 2018 and people are still complaining about the graphics chip in a macbook pro. when will these people quit?
The answer is NEVER. Whatever transpired between Apple and NVIDIA will never be known, but after the 8600M GT debacle and the GT 650M video issues, both of which caused recalls, it should be quite apparent to all concerned that Apple has absolutely ZERO interest in partnering with NVIDIA ever again. Anyone thinking otherwise underestimates the bad blood between them.
 
So, Intel has one of the biggest year over year performance gains since 2011 in their 8th gen processor lineup, and the MacBook Pro which uses Intel's new 8th gen processors has the same performance gains?

Wow didn't see that one coming.

They cheated by adding 50% more cores. We are comparing hexa-core with quad-core in the 15-inch, and quad-core with dual-core in the 13-inch.

Per-core-wise, it's not that interesting.

What Intel is really making progress on, and is now actually cashing in on, is thermal reduction and power efficiency thanks to improved lithography, not so much chip architecture.
 
I don't think people are saying "a single grain of dust will cause your keyboard to fail"

There really are people saying exactly that, but to be fair, most probably mean it in a figurative way and some probably haven't thought through the implications of what they're saying.

we hear stories of people having gotten their keyboard repaired/exchanged and seeing problems again only a few months later. You'd need to be extremely unlucky to get hit by the same rare manufacturing flaw multiple times.

Unlucky, sure, but is it really uncommon to see people getting multiple replacements? It shouldn't be surprising at all. Credible-sounding estimates of the failure rate that I've seen passed around are between 5 and 12%. If it's anywhere around those numbers, there will be scores of people who are unlucky enough to get a bad copy multiple times. In some of those cases you probably do need to blame the user, but excluding those, the laws of probability will still produce plenty of people who get unlucky numerous times in a row. People have won the lottery multiple times. A number of people have even won multiple times on the same day!
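To put rough numbers on that (a sketch with assumed figures: the 10% failure rate and 1,000,000 units are purely illustrative, and it treats each keyboard failure as independent):

# Expected number of owners hit more than once, assuming independent failures.
failure_rate = 0.10          # illustrative, within the 5-12% estimates quoted above
units_sold = 1_000_000       # illustrative population size

two_in_a_row = units_sold * failure_rate ** 2
three_in_a_row = units_sold * failure_rate ** 3

print(f"Owners expected to see 2 failed keyboards in a row: ~{two_in_a_row:,.0f}")
print(f"Owners expected to see 3 failed keyboards in a row: ~{three_in_a_row:,.0f}")
# -> ~10,000 and ~1,000 respectively, so repeat victims are expected rather than shocking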

We also have heard stories from people with 2015 MBs, 2016 MBPs, and 2017 MBPs. If there was a manufacturing flaw, it should have been fixed (or at least significantly improved) after more than two years of production.

It's probably BOTH. There's probably a flaw in the design, but the flaw is only fully exposed when there's a slippage in the manufacturing process. If it were a design flaw alone, we'd expect to see the failure rate trend toward 100% in year 1 and toward 0% after they adjust the design.

I have expensive weather sealed prime lenses for my dSLR that have dust in them. If you can't keep dust out of a weather sealed lens, you're not keeping it out of a keyboard. Dust is a fact of life.

But in the end I'm convinced the underlying reason is a susceptibility to dust-ingress problems that causes the keyboard to fail for too many people. And 'too many people' could be 0.1% of MBP buyers. But I'd be willing to bet that the milder form (i.e., one fixed with compressed air) does affect more than 0.1% of all MBP buyers.

I do believe that it's rational to conclude that dust has something to do with it. I just don't think it's as simple as it looks. Dust is probably not the only culprit and may not even be the main culprit. It's merely the most noticeable one.

The only people who have the data and experience to know for sure are at Apple, and they're not going to tell us anytime soon.
 
They cheated by adding 50% more cores. We are comparing hexa-core with quad-core in the 15-inch, and quad-core with dual-core in the 13-inch.

Per-core-wise, it's not that interesting.

What Intel is really making progress on, and is now actually cashing in on, is thermal reduction and power efficiency thanks to improved lithography, not so much chip architecture.
How is this not interesting? For a long time now, workloads have had to use multiprocessing effectively to see gains, and it's rare that anything able to take advantage of 4 cores won't be able to take advantage of 6. The most common exception I can think of is having exactly 4 single-threaded programs each taking 100% of a core.
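A quick Amdahl's-law sketch of why that tends to hold (the 90% parallel fraction is just an assumed example, not a measured workload):

# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p on n cores.
def speedup(parallel_fraction: float, cores: int) -> float:
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.90
print(f"Speedup on 4 cores: {speedup(p, 4):.2f}x")   # ~3.08x
print(f"Speedup on 6 cores: {speedup(p, 6):.2f}x")   # ~4.00x
# Anything that scales to 4 cores keeps gaining at 6, just with diminishing returns.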
it's 2018 and people are still complaining about the graphics chip in a macbook pro. when will these people quit?
When Apple puts a good graphics chip in, or eGPUs make it unnecessary. eGPUs aren't ready yet: too many gotchas for average users who need graphics power, too many limitations even for enthusiasts. And Apple has married themselves to AMD while also abandoning OpenCL/GL in favor of Metal. Plenty of people find Apple's hardware not viable as a result. I'm just glad I'm not one of them.

People will also quit complaining if they give up and switch to Windows. But for a while everyone had hope that somehow things would work out, and there's still some hope.
 
You need to get out of the house more...

MSI Gaming Laptop with nVidia 1060

We have some similar models at work as show laptops... they are about twice as thick as the 1st-gen 15" Touch Bar MBPs.

Sure, they have a bunch of gimmicky-looking crap, but I have been in the live production business a long time, and these MSIs are by far the best ****ing Windows laptops I have ever worked with. And until 4K starts becoming affordable, native 1080p displays are actually better than some weird-ass Retina display resolution.

This could easily be done if Jony wasn't so hell-bent on making them paper thin.

I am going to add that the MSIs we bought were less than $1,500: 16GB RAM, an SSD plus one conventional drive, and a 4GB GPU.

3.175 cm thick; that's thicker than an Intel NUC. I don't get out of the house much, you're right.
 
I don't recall that. That sure would be nice though. That kind of battery life in Macs and iOS devices would be amazing.

Here you go:
https://www.macrumors.com/2015/08/24/hydrogen-fuel-cell-phone/

My mistake, I thought Apple had already acquired them... still, it’s strange that we haven’t heard anything more after 3 years on this technology...
We’ve been “on the verge” of a battery breakthrough for a decade. None of those is coming to consumer devices anytime soon. Lithium ion batteries were first prototyped many years before they made their way into our devices.

https://www.macrumors.com/2015/08/24/hydrogen-fuel-cell-phone/
 
Here you go:
https://www.macrumors.com/2015/08/24/hydrogen-fuel-cell-phone/

My mistake, I thought Apple had already acquired them... still, it’s strange that we haven’t heard anything more after 3 years on this technology...

Oh yeah, I dimly remember this. There’s a convenience and availability issue with hydrogen in electronic devices, I think. Electricity is conveniently available all around us to charge batteries, but how do we charge hydrogen cells? Carry around bigger hydrogen cells? How do we charge those? You can’t charge them in your (non-hydrogen) car, and there’s no hydrogen piped to your home.

It seems like one of those things that would make a neat side detail in a classic (40s-60s) sci-fi short: an alternate timeline where batteries and electricity to the home lost out and hydrogen became the norm. If it had been proposed and won out back then, it would seem as normal as plugging into the power unit in your car or the outlet at home. Maybe I’ve just been reading too much old sci-fi lately :oops:
 
The answer is NEVER. Whatever transpired between Apple and NVIDIA will never be known, but after the 8600M GT debacle and the GT 650M video issues, both of which caused recalls, it should be quite apparent to all concerned that Apple has absolutely ZERO interest in partnering with NVIDIA ever again. Anyone thinking otherwise underestimates the bad blood between them.
Because the HD 6000 series had no issues? The D500 and D700s didn’t fail? The m295x didn’t burn through the screen?
 
That’s probably why they’re not called MacBook Game.

No, but they're called MacBook Pro, and the GPU in them is entry-level at best.

The performance improvement on Geekbench is not a surprise. It's the first bump in core count for the mobile form factor that Intel has made since about that time.

Every other vendor has been shipping Coffee Lake for months now.
Because the HD 6000 series had no issues? The D500 and D700s didn’t fail? The m295x didn’t burn through the screen?

It's more to do with Apple trying to kill CUDA on macOS, which is not cross-platform across hardware vendors.

Apple don't want to be tied to a single vendor, ever. Right now they're in bed with Intel for CPUs and AMD for GPUs, but given the open standards they are promoting, they could change vendors overnight if required. OpenCL/Vulkan/Metal alternatives can also be used on their own GPUs, whereas CUDA cannot.

Continuing to have CUDA applications on the Mac ties them to NVIDIA, so they don't want to encourage CUDA in any way by shipping NVIDIA hardware.
 
No, but they're called MacBook Pro, and the GPU in them is entry-level at best.

The performance improvement on Geekbench is not a surprise. It's the first bump in core count for the mobile form factor that Intel has made since about that time.

Every other vendor has been shipping Coffee Lake for months now.

It's more to do with Apple trying to kill CUDA on macOS, which is not cross-platform across hardware vendors.

Apple don't want to be tied to a single vendor, ever. Right now they're in bed with Intel for CPUs and AMD for GPUs, but given the open standards they are promoting, they could change vendors overnight if required. OpenCL/Vulkan/Metal alternatives can also be used on their own GPUs, whereas CUDA cannot.

Continuing to have CUDA applications on the Mac ties them to NVIDIA, so they don't want to encourage CUDA in any way by shipping NVIDIA hardware.
Funny, by relying completely on Intel Quick Sync in FCPX, they're completely tied to Intel CPUs. I don't think they care about that.
 
Funny, by relying completely on Intel Quick Sync in FCPX, they're completely tied to Intel CPUs. I don't think they care about that.

It works without Quick Sync, as the Ryzen hackintoshes out there can attest. Sure, it's slower, but they could replace that code if required, as I suspect it's provided to the application via an OS library.

CUDA is a third party library that Apple has no control over.
 