Sorry, but it will also run faster on an older OS. Every computer company builds new computers every year with better everything, and the older ones get price reductions. Don't start bragging that the other companies don't make as much money as Apple does; at least their customers get what they pay for. It's all same old, same old at Apple. I wonder how long this will last until even the ones with Apple glasses on recognize that their beloved company is really taking it too far. Wake up!

As long as the Mac is at least barely capable of creating software for iOS, they really don't care and if you want a Mac, you really have to love it for one to be worth it at this point.
 
Hm, 15" Retina MacBook Pro was released prior to the upcoming OSX version (ML) being available...

In fairness, yeah, I forgot that precedent. I bought the 2012 on Lion when it came out and it was buggy as **** until ML. I think they learnt their lesson.
 
As long as the Mac is at least barely capable of creating software for iOS, they really don't care and if you want a Mac, you really have to love it for one to be worth it at this point.

I think this is where I start to disagree. I'm not a software dev, but I'm not sure why iOS dev couldn't be done on an iOS device down the road. So, I'm not sure that's a valid reason to have to keep Macs around.

But, if things were better on the other side of the fence (i.e.: Windows/Unix) some of us with the skills to make the jump already would have. I'm very much concerned about the rapid decline of OS X and Mac, but in my experience, we're not at that tipping point yet, unless you really, really need something on the Win side (specific hardware or software). It's no fun over there either! (It just used to be mostly fun on the Mac.)
 
Sorry, but it will also run faster on an older OS. Every computer company builds new computers every year with better everything, and the older ones get price reductions. Don't start bragging that the other companies don't make as much money as Apple does; at least their customers get what they pay for. It's all same old, same old at Apple. I wonder how long this will last until even the ones with Apple glasses on recognize that their beloved company is really taking it too far. Wake up!

Nope. 2012 rMBPs defo didn't run faster on the older OS. And actually, this particular model had issues with animations for things like Expose until El Cap and the switch to Metal. My jaw dropped when I first did the trackpad swipe and it was ultra smooth.
 
The rest of the email is missing. He said "Stay tuned. We're working on getting a really good deal on 5400RPM hard drives."
 
Ok, I'm not sure where you're trying to go with this. I've worked in IS/IT for over 25 years now, part of that in a Fortune 100. I spent a good bit of that time working with an industrial design firm, where I did all their CAD and 3D rendering. So, I know a thing or two about it from both perspectives.


then you should definitely realize all of your work happens prior to the point of pushing the render button.. and that CAD applications are single threaded..
this is true of just about every single other application out there.. all of the user's work is happening on a few cores at best and the only time an application is utilizing many cores is when the user is no longer working..
there are no examples of a computer firing all cores while a user is sitting there doing something with the process..


Or, I could run a render job and assign that to several of the cores while the rest are available to the CAD app for some modeling, or I could be working on setting up the next scene while I wait for the render results.
yes, you can do that.
but if your goal is to have a render finish asap then it's a ridiculous idea to do it this way anymore.



Ok, you caught me. I haven't done any rendering for a couple of years now as I've been focused on other things and my animation software is incompatible with El Cap. That's pretty easily resolvable should a job come along, or my focus change.

But, I do follow the industry, and do some CAD from time to time. I'm pretty sure the 3D industry hasn't changed that much. If your software can't use multiple cores and RAM, I can recommend some other apps for you.
great, i would really like for you to recommend to me some CAD/3D software which will utilize multiple cores.. i think if you do that, you might just realize there aren't any.. or that all the software listed as being able to use multiple cores are speaking about a few very specific functions such as the rendering portion of the software.


Absolutely. I've been using Renderama on multiple computers since before there were even multi-core machines. It will use AS MANY cores and nearly as much RAM (and as many computers) as I can feed it, and yes, IT WILL get the job done more quickly if I do. Fortunately, there are now cloud services like Amazon Cloud computing where I could rent whatever computing capacity I'd like, if I wanted to.

if you're aware of cloud rendering services then you should know how much cheaper and faster they are than purchasing some 24-core personal computer and using it like a render farm.. doing that is about the biggest rip off in all of computing these days.. you'll get small increases in render speed for relatively enormous amounts of money.

personally, i used to use RebusFarm for rendering but now am using Autodesk's cloud since it's so well integrated into the software.. if i'm not under time restraints then i'll still render locally as in the past.

And, absolutely, I can continue working on other projects/software while using local cores and RAM for rendering. Again, if you can't, I'd recommend checking some other software out.
like i said, i'm looking forward to seeing your recommendations.. i only ask that you don't go to Autodesk's site, see a checkbox stating (whichever) software package is multithreaded, then come back here and say "see, AutoCad will make use of all available cores".. speak of what you know to be true based on use cases instead of some blanket statement or advertising by whatever company.


Well, then you realize the importance of cores and RAM, or you need some IT/software advice. :)
you're giving bad advice.. this isn't the 90s anymore.. sorry.
 
McDonald's cashier: "It's coming."
Tim Cook: "Stay tuned."
aww Tim, why the double spaces after the period? why Tim why!?
Let's be charitable and assume he dictated this to his secretary, who put the double spaces in between sentences.
 
 
Because he’s not a Millennial. That’s how adults were taught to type.
At 45, I learned to type on a typewriter in my teens. All the typing tutorials back in the 80s were for typewriters, which only offered a monospace Courier font. On computers with proportional fonts, double spaces after periods look awful. Even paper books and magazines use single spaces after periods.

And GIF is hard-G.
 
then you should definitely realize all of your work happens prior to the point of pushing the render button.. and that CAD applications are single threaded..
this is true of just about every single other application out there.. all of the user's work is happening on a few cores at best and the only time an application is utilizing many cores is when the user is no longer working..
there are no examples of a computer firing all cores while a user is sitting there doing something with the process..


Sorry, this is just technically inaccurate. We could argue about the level to which things are multi-threaded. No, they don't magically, evenly distribute across the cores... but I never argued they did.

I'm not sure what your issue is. You act like I'm arguing against faster cores. I'm not. I'm just telling you that the reality is more cores and less advancement in single-core performance. It's not a matter of whether I like that or not, it's reality.


yes, you can do that.
but if your goal is to have a render finish asap then it's a ridiculous idea to do it this way anymore.

No, because with lots of cores, having those rendering jobs run doesn't much impact the other things I'm doing, nor the render job. Why is it ridiculous to use unused computing power, especially for things like preview renders as you work, etc.? It's inefficient to send those off to a remote render-farm.

great, i would really like for you to recommend to me some CAD/3D software which will utilize multiple cores.. i think if you do that, you might just realize there aren't any.. or that all the software listed as being able to use multiple cores are speaking about a few very specific functions such as the rendering portion of the software.

Electric Image does, and I'm pretty sure Cinema 4D does. ViaCAD 3D/Shark FX does. Of course it's limited to certain functions. Again, I'm not saying every process is evenly distributed across cores. That would be great, but that doesn't mean multi-threading is useless either.

if you're aware of cloud rendering services then you should know how much cheaper and faster they are than purchasing some 24 core personal computer and using like a render farm.. doing that is about the biggest rip off in all of computing these days.. you'll get small increases in render speed for relatively enormous amounts of money.

personally, i used to use RebusFarm for rendering but now am using Autodesk's cloud since it's so well integrated into the software.. if i'm not under time restraints then i'll still render locally as in the past.

Well, they aren't small increases. Renderama uses nearly 100% of whatever hardware I give it, whether that's in the local machine, within the local network, or cloud. The overhead is mostly initial file transfer and return of the results, and some stitching for the master. But, if you're just rendering a quick preview frame, etc. then that's normally done locally... which means all those local cores help quite a bit.

But yes, I'd not build up a shelf of machines anymore, when I can just rent some time on an array of cloud-based computer instances.
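
To make the "nearly 100%" claim above concrete, here's a rough sketch of why per-frame distribution scales so well. This isn't Renderama's actual API (the render_frame function and frame counts are made up), but the pattern is the same idea: hand each worker a whole frame, then collect and stitch the results, so the only serial cost is the transfer/stitch overhead I mentioned.

[CODE]
# Rough sketch of per-frame distribution (hypothetical, not Renderama's API).
# Each frame is independent, so workers stay busy and scaling is nearly linear.
from multiprocessing import Pool, cpu_count

def render_frame(frame):
    # Stand-in for a CPU-heavy ray trace of one frame.
    total = 0
    for i in range(200_000):
        total += (frame * i) % 7
    return frame, total

if __name__ == "__main__":
    frames = range(1, 241)                      # a 10-second shot at 24 fps
    with Pool(processes=cpu_count()) as pool:   # local cores here; remote boxes in the real tool
        results = pool.map(render_frame, frames)
    finished = [f for f, _ in sorted(results)]  # the "stitching" step
    print(f"rendered {len(finished)} frames on {cpu_count()} cores")
[/CODE]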

like i said, i'm looking forward to seeing your recommendations.. i only ask that you don't go to Autodesk's site, see a checkbox stating (whichever) software package is multithreaded, then come back here and say "see, AutoCad will make use of all available cores".. speak of what you know to be true based on use cases instead of some blanket statement or advertising by whatever company.

And, I wish you'd stop putting words in my mouth. I think I've been pretty clear about what I'm claiming. You're constructing a straw man there... but I'm used to spotting those.
 
“I love the Mac and we are very committed to it. Stay tuned.

Tim”

Why does this sound just like Drumpf? Same specifics and details in the proclamation.


Oh Tim!
 
I am actually quite surprised that people haven't taken up my challenge to explain why they need new machines, other than they don't like paying for older technology, even when it is just as capable.
I think it's less about need than want—a reason to get excited about a new purchase. I got a first-gen 12" MacBook and got disillusioned with the keyboard pretty early on. For nearly a year I've been thinking about switching to a 13" MBP, but the promise of a new model was just over the horizon, waiting to be announced at the next Apple event. If Apple doesn't announce another event by next month, I'll just pick up a used late-model MBP, which solves the keyboard problem. The only reason I'm holding back now is the additional likelihood of a jet black version (there's still time for Apple to release a 10th Anniversary Black MacBook!).

There's no reason to release an iPhone every year on technical grounds. Let's face it, technology is fashion.
 
I'm not sure what your issue is. You act like I'm arguing against faster cores. I'm not. I'm just telling you that the reality is more cores and less advancement in single-core performance. It's not a matter of whether I like that or not, it's reality.
more cores and faster cores are two separate things but i feel you're saying "well, if we can't get faster cores then we might as well get more cores so things will more-or-less even out that way"

..which is simply a myth and it's propagated by many people around here.. and often with the tag of "pro's needs" added to it as a means to strengthen the myth (for whatever reason).

No, because with lots of cores, having those rendering jobs run doesn't much impact the other things I'm doing, nor the render job. Why is it ridiculous to use unused computing power, especially for things like preview renders as you work, etc.? It's inefficient to send those off to a remote render-farm.
most renderers have openGL previews these days which is more-or-less giving you instant feedback.. the rendering software i use is openCL and if i'm wanting ray-traced previews, i can do smaller resolution and again achieve nearly real-time feedback..
for finals, i'll either let a set cook overnight or if i need them right now, i'll send them to the cloud.. via cloud, i can get an image in under 1 or 2 minutes which, last decade, would have needed up to 24 hours to complete on my mac pro.

what i meant by 'ridiculous' was the notion that a 'pro' should or would spend $15,000 on a 24core machine in order to get a render back in 4 hours as opposed to 12 on an 8core machine...

that's a complete ripoff when compared to spending $1 to get the image back in 1 minute from a 64,000 core supercomputer via their 4 or 6core personal computer.
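
to sanity-check my own made-up numbers there (back-of-envelope only, not real pricing):

[CODE]
# back-of-envelope check on the cloud claim (made-up figures)
core_hours = 8 * 12                          # 12 hours on an 8-core box = 96 core-hours
ideal_seconds = core_hours * 3600 / 64_000   # spread over a 64,000-core farm
print(f"{ideal_seconds:.0f} s of wall time, ignoring overhead")
# ~5 s ideal, so "a minute or two" once you add upload/queueing overhead is the right ballpark
[/CODE]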



Electric Image does, and I'm pretty sure Cinema 4D does. ViaCAD 3D/Shark FX does. Of course it's limited to certain functions.
right.. for rendering o_O

Again, I'm not saying every process is evenly distributed across cores. That would be great, but that doesn't mean multi-threading is useless either.
it's not useless at all.. but sub-$5000 setups utilizing 4-6 cores.. 8-cores tops..
is where the benefits are happening.. anything over that is a complete waste of money for nearly every single operation out there from a professional on a personal computer.

'bring on the cores' is bad info.. if your application can scale to 112 cores, don't recommend 12 cores vs 8.. recommend 4 or 6 faster cores (as in- you should be using the fastest single core speeds available instead of a bunch of slow ones that will sit idle 95% of the time).. if an application can scale to 112 cores then it's highly likely there are that many cores available for you to use via cloud at a cost far (FAR!) cheaper than it would cost to purchase.. and certainly far cheaper than you'll be spending to get the minuscule addition of 4 more cores to your personal computer with negligible speed increases.


Well, they aren't small increases. Renderama uses nearly 100% of whatever hardware I give it, whether that's in the local machine, within the local network, or cloud. The overhead is mostly initial file transfer and return of the results, and some stitching for the master. But, if you're just rendering a quick preview frame, etc. then that's normally done locally... which means all those local cores help quite a bit.

idk, spending thousands of dollars in order to get a render back in 20 minutes instead of 40 isn't what i'd call 'quite a bit'..

i think if you analyzed the entirety of a given architecture (for example) project, you'd see how it makes almost zero sense to invest in mega core machines..

you're going to spend weeks designing/modeling/etc. with the computer.. then spend a whole bunch of extra money in order to complete a render package in 1/2 day instead of 1 day ???

so much more work goes into a project other than rendering.. when you look at the project as a whole, i think you'll find that doubling the speed of a render is of nearly inconsequential difference and the project speed has maybe increased by a fraction of a percentage.
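
to put rough numbers on that (made-up but typical figures, and being generous to the big machine):

[CODE]
# made-up example: weeks of design/modeling plus one final render package
design_days = 15.0     # three weeks of modeling, materials, layout, revisions
render_slow = 1.0      # render package on an 8-core box
render_fast = 0.5      # same package on a 24-core box (2x the render speed)

total_slow = design_days + render_slow
total_fast = design_days + render_fast
saving = (total_slow - total_fast) / total_slow * 100
print(f"{total_slow} days vs {total_fast} days -> project is {saving:.1f}% faster overall")
# doubling render speed buys roughly 3% on the whole project, for thousands of dollars extra
[/CODE]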
 
I didn't know he had been "knighted".
Old news.
it appears their reasoning behind removing the headphone jack isn't because they want headphones to have lightning plugs instead.

it's because they'd like to continue the push towards wireless connectivity.

(see W1 chip for more clues alluding to this)

in other words, i think we're more likely to see these new chips inside future macs than a lightning port.
I doubt wireless was the reason, otherwise they would've shipped the iPhone without Lightning headphones and left them as a supplemental purchase.

They dropped the single-tasking audio jack because the Lightning port already does the job, and at some point in the design cycle, having the redundant function has too great an opportunity cost. Whether it's adding a speaker and Taptic engine this year, or dropping the chins next year, the analog jack becomes wasted space. There's no consumer demand for dropping the jack, but technically it's in the consumer's long term interest.
 
Ok... So given the choices today (MacBook Pro-worthy Kaby Lake CPUs won't be available until next year), in a MacBook Pro you'd rather have Intel's just-announced m-series low-power Kaby Lake CPU and suffer poorer performance over a recently released, higher-power-dissipation Skylake CPU worthy of putting in an MBP?

Seems like you are more into labels (Kaby Lake sounds better than Skylake?) rather than actual performance.

The ignorance is strong on this one.

First, who's talking about a Core m processor?

Second, it seems you missed the quotes around low powered. As in "low powered".

Third, the Dell XPS is (apparently) being released with a Core i7 7500U, which is an improvement over Skylake performance-wise in both CPU and iGPU. Compared to a 4th-gen Core i7, which is what is being sold now in the 15" rMBP line, the 7500U is not only an increase in CPU performance but especially in power efficiency and GPU performance. Since everything these days depends on both CPU and GPU processing (from your browser to macOS), I'd say that yes, I'm very interested in performance.
 
The iMac supports 64 GB of memory; how much do you need? Likewise, TB supports an unlimited amount of storage.

The iMac only supports 64GB of memory using third-party components. Not a big deal anyway. Thunderbolt will require the purchase of a TB chassis together with the machine just to continue with the storage we currently have. Again, not a big deal, but something that has to be done.
 
... but i feel you're saying "well, if we can't get faster cores then we might as well get more cores so things will more-or-less even out that way"

No, I'm saying since we can't get to faster cores, then they may as well give us more cores for our money/advancement. And, for sure, we'll take any speed increases they can figure out too. I don't want them to just keep selling the same 2-4 core chip for the next decade.

In the end, it's about how much computing power comes in the package (look at GPU advancements!), and then finding ways to take advantage of it. For most users, current speeds and cores are overkill. And, yes, certain functions need as much single-core speed as we can get.

It doesn't necessarily even out, depending on the software and use, but the power is there to be tapped into at least. And, afaik, we're running into physics limitations right now. In that case, give us more cores and we'll find ways to use them.
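
The "doesn't necessarily even out" part is basically Amdahl's law: if only part of the workload can run in parallel, extra cores hit a ceiling fast. A quick sketch, with illustrative parallel fractions rather than measured ones:

[CODE]
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the work that can run in parallel, n = number of cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.99):   # illustrative, not measured
    row = ", ".join(f"{n} cores -> {speedup(p, n):.1f}x" for n in (4, 8, 12, 24))
    print(f"p = {p:.2f}: {row}")
# A 50%-parallel app never quite hits 2x no matter how many cores you throw at it;
# a 99%-parallel job (like frame rendering) keeps scaling, which is the whole debate here.
[/CODE]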

... and often with the tag of "pro's needs" added to it as a means to strengthen the myth (for whatever reason).

It's because there are certain types of computing jobs, and certain types of software that tend to be able to utilize that extra power. It's a myth in that it isn't automatic and across the board. But, if I have a 4-core, maybe I can have a rendering or Folding@home going in the background, and still effectively use xyz app without noticing much impact. But, if I have a 12-core, I could run Folding@home, a rendering engine, be encoding some video, be compiling some software, and still use whatever app I need to be using while that's all happening. That could happen on the 4-core too, I suppose, but it's less likely to be as smooth and all the background processes won't finish as quickly.

Sure, most of the time many of those cores are going to waste (in terms of pure productivity), but I can always find things to keep them busy (ex: Folding@home).
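
Here's a minimal sketch of that "keep the spare cores busy" scenario, with crunch_chunk standing in for whatever render/encode/folding job is running; the pool is sized to leave a couple of cores free so the foreground app stays responsive. Nothing here is any particular app's real API.

[CODE]
# Sketch: run a CPU-bound batch job on (cores - 2) workers so the machine
# stays responsive for interactive work. crunch_chunk is a hypothetical
# stand-in for a render tile, encode segment, Folding@home work unit, etc.
import os
from concurrent.futures import ProcessPoolExecutor

def crunch_chunk(chunk_id):
    total = 0
    for i in range(3_000_000):   # placeholder number-crunching
        total += (chunk_id * i) % 11
    return chunk_id, total

if __name__ == "__main__":
    spare = max(1, (os.cpu_count() or 4) - 2)   # leave ~2 cores for the foreground app
    with ProcessPoolExecutor(max_workers=spare) as pool:
        futures = [pool.submit(crunch_chunk, c) for c in range(32)]
        # In real use, the interactive work (modeling, editing, browsing) happens
        # here while the pool grinds away; this demo just waits for the results.
        done = [f.result() for f in futures]
    print(f"finished {len(done)} chunks on {spare} background workers")
[/CODE]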


most renderers have openGL previews these days which is more-or-less giving you instant feedback.. the rendering software i use is openCL and if i'm wanting ray-traced previews...

Oh sure, a lot has been off-loaded to the GPUs now (which, as noted above, are highly multi-core). But, say you're importing a model into the CAD program, that could run on a core so you could keep working w/o slowdown. Or, maybe the calculations of some plugin in the 3D program might run on a core, not slowing down aspects of the main app. Sometimes, things which appear to you as being single are actually handled by routines which are broken down across multiple cores. It just depends.

Plus, as OSs and development languages/tools advance, more aspects are becoming multi-threaded at a lower level.
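
A toy version of that "import on another core" idea (parse_model and the file name are made up, not any real CAD package's API): the import gets submitted to a worker process, and the main loop keeps servicing the user until the result comes back.

[CODE]
# Toy version of "run the model import on another core while the user keeps working".
# parse_model is a hypothetical stand-in, not any real CAD API.
import time
from concurrent.futures import ProcessPoolExecutor

def parse_model(path):
    time.sleep(2)   # pretend this is a slow import/tessellation step
    return f"mesh data from {path}"

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(parse_model, "bracket_v3.step")   # hypothetical file
        while not pending.done():
            print("main loop still responsive...")   # UI keeps handling input/redraws
            time.sleep(0.5)
        print("import finished:", pending.result())
[/CODE]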


what i meant by 'ridiculous' was the notion that a 'pro' should or would spend $15,000 on a 24core machine in order to get a render back in 4 hours as opposed to 12 on an 8core machine... that's a complete ripoff when compared to spending $1 to get the image back in 1 minute from a 64,000 core supercomputer via their 4 or 6core personal computer.

No argument there, though I'd be a bit surprised if that kind of cloud computing was that cheap. But, the general principle, yes. But, if I'm spending a certain amount on a computer, I'd rather have a 12-core than a 4-core if the speed isn't going to be that much different anyway.

right.. for rendering o_O

More than rendering, but rendering is a good example, because the programmers took the time to make that work. They can do that to other aspects of the program as well. The rendering aspect just works particularly well for me, as it was designed as a stand-alone, scalable app. So, not only does it scale, but it does so incredibly efficiently. Like I said, I can use nearly 100% of any hardware available. It isn't like the 2-core 100%, 4-core 80%, 8-core 60%, etc.

it's not useless at all.. but sub-$5000 setups utilizing 4-6 cores.. 8-cores tops..
is where the benefits are happening.. anything over that is a complete waste of money for nearly every single operation out there from a professional on a personal computer.

I agree in general... for many apps and many people. But, if they give us 12 or 22 or whatever cores in that same sub-$5000 box, or even if I have to pay an extra $1000 to get that, then I'd take it. And, that seems to be where Intel says they are going, unless I've misunderstood.

'bring on the cores' is bad info.. if your application can scale to 112 cores, don't recommend 12 cores vs 8.. recommend 4 or 6 faster cores (as in- you should be using the fastest single core speeds available instead of a bunch of slow ones that will sit idle 95% of the time).. if an application can scale to 112 cores then it's highly likely there are that many cores available for you to use via cloud at a cost far (FAR!) cheaper than it would cost to purchase.. and certainly far cheaper than you'll be spending to get the minuscule addition of 4 more cores to your personal computer with negligible speed increases.

But, the problem is the faster ones are capping out. So, say you could have 12-cores at 2.9 GHz or 4-cores at 3.0 GHz. I'd take the 12 any day. But, that would depend. If you're an average computer user, or a gamer, maybe go with the 4-cores as you'll gain a bit of speed and not use the extra cores. I'll use them.

That's more like what we're currently facing. It's not like the choices are 4-cores @ 3GHz vs 12-cores @ 1GHz.

idk, spending thousands of dollars in order to get a render back in 20 minutes instead of 40 isn't what i'd call 'quite a bit'..


That just depends on what your priorities are, though I agree that the cloud is more cost-effective for some applications. Say you're running some physics analysis that doesn't scale to the cloud; then you'll pay to cut times in half.


you're going to spend weeks designing/modeling/etc. with the computer.. then spend a whole bunch of extra money in order to complete a render package in 1/2 day instead of 1 day ???

But, what if I can decrease my processing time from 1-day to a 1/2 day, in the background, while I'm still working on the next project? Again, that would have to be cost-analyzed for each situation. Not everyone (or every app) has cloud-computing capability.


They dropped the single-tasking audio jack because the Lightning port already does the job, and at some point in the design cycle, having the redundant function has too great an opportunity cost. Whether it's adding a speaker and Taptic engine this year, or dropping the chins next year, the analog jack becomes wasted space. There's no consumer demand for dropping the jack, but technically it's in the consumer's long term interest.

It's a matter of priorities... it needs a 3.5mm jack more than it needs 'stereo' speakers. They could drop that speaker too if they really needed the space. But, they figured that gimmick is more sexy than keeping the jack. And, it's hardly in the long-term interest of the consumer unless the future is Lightning audio (which it is most certainly NOT!).

the Dell XPS is (apparently) being released with a Core i7 7500U, which is an improvement over Skylake performance-wise in both CPU and iGPU.

Yea, it's probably more about the iGPU, bus, and other components than advancements to the CPU itself. For example, whatever chip they put in there, I want TB3/USB-C 3.1, etc. The exact details of the CPU cores aren't as important to me. But, sure, I'll always take more performance at lower power. :)
 
Hey Tim.
Nvidia graphics cards.
VR.
That would be an 'upgrade' to Macs, not some cosmetic makeover.






Apple CEO Tim Cook has responded


An update to the Mac lineup at some point is inevitable, and the bigger focus is now on when that will happen. The latest word is that Apple will release new MacBook Pro and MacBook Air models with USB-C ports as early as October, while updated iMac models with an option for new AMD graphics chips are also in the works.

That report reiterated rumors that the new MacBook Pro will be thinner and include an OLED-based touch bar along the top of the flatter keyboard, which will present functions that dynamically fit the current task or application, as well as integrate Touch ID to enable users to quickly log in using their fingerprint.

Our own Mac Buyer's Guide shows that it has been 479 days since the MacBook Pro was last updated, while the MacBook Air has not been refreshed in 550 days. Similarly, the Mac mini has not been refreshed in 694 days, while the Mac Pro is five days shy of 1000 days since its last update. iMacs stand at 332 days.

Article Link: Tim Cook Says Apple is 'Very Committed' to the Mac and to 'Stay Tuned'
 
The iMac only supports 64GB of memory using third-party components. Not a big deal anyway. Thunderbolt will require the purchase of a TB chassis together with the machine just to continue with the storage we currently have. Again, not a big deal, but something that has to be done.
You'd be silly not to use 3rd-party RAM; it has always been that way. There are many great external TB chassis on the market now for very reasonable prices. I'm partial to the OWC Thunderbay 4 mini.
 
I am actually quite surprised that people haven't taken up my challenge to explain why they need new machines, other than they don't like paying for older technology, even when it is just as capable.

Well, one reason would be if you hit the CPU limit in your existing machine.
I can do that with my 2013 rMBP using Cubase [or Logic] when running multiple VST instruments and effects.
There are workarounds like freezing tracks, rendering VST tracks to audio, but these are all compromises to get around the fact that the limit of the CPU has been reached. And often the workarounds completely destroy the creative process.

So yeah. That's why many need new computers: because real-time software like Cubase and Logic is limited by the CPU.
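
The underlying constraint is easy to put numbers on (illustrative figures, nothing from Cubase or Logic internals): every plugin chain has to finish inside one audio buffer's deadline on every single callback, and a serial chain can't be split across cores, so single-core speed is the wall you hit.

[CODE]
# Rough real-time audio budget (illustrative numbers, not Cubase/Logic internals).
sample_rate = 44_100    # Hz
buffer_size = 256       # samples per callback
deadline_ms = buffer_size / sample_rate * 1000
print(f"each callback must finish within {deadline_ms:.2f} ms")   # ~5.8 ms

# If one synth voice costs ~0.05 ms of DSP per buffer on a single core,
# a serial chain on that core can only afford roughly:
per_voice_ms = 0.05
print(f"~{int(deadline_ms / per_voice_ms)} voices before that core is maxed out")
[/CODE]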
 
If you fear change, buy what is available now to be on the safe side. My MacBook Pro arrived last week and I love it. Same goes for the iPhone. Buy the iPhone 6s and don't look back. Get the **** now while it is still available.
 