Make that statement once again when working with PSD files in excess of 1 GB, dozens of layers in CMYK, etc.

I find it ridiculous that you make such a vague and general statement about professionals still on Power Macs and PowerBooks. How would you know how many professionals are still on a 6+ year old platform, applications included?

It's a problem quite a few people have around here. Most people think that because the computer they use for their daily tasks runs perfectly for what they do, it's therefore perfect for everyone. All this excess power must be useless, because it's not useful to them.

It's like saying: I could use Photoshop just to tweak colors in all my vacation photos. That's a task that'd hardly put a strain on an MBA. But you with your 1GB+ .psd files? That's just dumb. Why? Because I know for a fact that my MBA does everything I need it to.
 
Working in software development, I have noticed that more and more people - developers, testers, managers, just about everybody - are switching to Macs for the computer they use at home, _and_ for the computer they get or try to get for work. Well above 50% by now, I would say.

And really, the people that I work with seem to have better things to do with their lives than overclocking computers. Overclocking isn't for geeks, it is for mugs.

I used to be very interested in things like overclocking; now I'm just interested in a computer that works out of the box, so that I can do my work.
 
I can't imagine Apple would go with AMD anytime soon, since Apple has adopted and promotes Thunderbolt.

The Thunderbolt controller chip is not built into either Intel's support chipset or the CPU package. It is an independent chip that hooks to the PCI-e bus (an industry standard that AMD implements) and the DisplayPort bus (another industry standard that AMD implements).

There is nothing Intel specific about that at all.

What do you people think happens in the peripheral devices that don't have any x86 chip in them at all (e.g., an external TB disk, a video capture device, etc.)? An x86 chip is not necessary, so Intel's CPU isn't necessary.
 
I'm thinking I just got it backwards. Could have sworn it was Intel though. Been a few years since I watched it.

I assure you it was AMD that burned up. Both CPUs at the time had throttling capabilities; the problem with AMD's throttling was that it sampled the temperature too infrequently. That's good enough for something like a fan failure or a clogged heat sink, but not nearly enough to compensate for the rapid rise in temperature caused by a heatsink falling completely off. It's a moot point though, since that was many years ago and AMD has addressed the issue.
 
It's a problem quite a few people have around here. Most people think that because the computer they use for their daily tasks runs perfectly for what they do, it's therefore perfect for everyone. All this excess power must be useless, because it's not useful to them.

It's like saying: I could use Photoshop just to tweak colors in all my vacation photos. That's a task that'd hardly put a strain on an MBA. But you with your 1GB+ .psd files? That's just dumb. Why? Because I know for a fact that my MBA does everything I need it to.

Agreed.

Which is why I bought myself a netbook for a few months a while back. I only needed it for email and transferring files, so it was fine. That doesn't make it 'good enough' for others.

The other thing is, even if a computer can get the job done at a slow pace: if a client knew how long it would take to execute a project, and option A could be delivered in 3 days on a faster machine while option B would take 7 days on a 'Power Mac or iBook', why would the client settle for the slower turnaround, even if it were cheaper? Time is money. That said, I'm also fairly sure that such a dramatic difference in speed (G4/G5 vs. Core i5/i7) would put more pressure on the user to upgrade to the newer architecture, not just for speed's sake, but for compatibility reasons as well:

If I were still on a G4 or G5 system with an old version of Adobe Creative Suite, I don't think receiving documents made in CS5 and up would be much fun to open... CS5 doesn't run on PPC, and therefore compatibility between CS4 and below and CS5 and above would be limited. For example, FLA documents saved in CS5.5 won't open at all in CS5.

That right there says: time for an upgrade.
 
To me, any Mac with Intel graphics is useless. Intel graphics have always sucked hard.

The first company to build an SoC with a nice combo of CPU and GPU performance wins.
Imagine a very small board built around an SoC - that would be Apple's dream for the MacBook Air. If they could incorporate everything (not counting RAM and SSD) into one chip, they would save a lot of space.

I'm betting that Macs will become like iPods: what you buy is all you will ever get (no hardware updates).
 
I think AMD would be a perfect fit for the MBA. I don't get people complaining about the CPU performance relative to Intel; the CPU performance is more than enough. Serious question: what would be faster with Intel? Processors are not the bottleneck for almost any computer activity. Even if you do want to bring up processing-intensive tasks to pit Intel against AMD:

Maya rendering -> better AMD GPU, so for GPU rendering AMD prolly wins.

Photoshop -> GPU acceleration, AMD prolly wins.

bitcoin mining -> AMD for the win.

gaming -> AMD for the win.

ripping DVDs -> Intel for the win.

I know, but Apple can't switch cuz there are SO many people using their MBA to rip DVDs, y'know, with no optical drive and all.
 
Except no one cares about bitcoin mining except those who are still buying into that farce... Serious waste of CPU/GPU cycles. If I were Apple I'd put a weak GPU in there just to spite/discourage it.

Photoshop still relies very heavily on the CPU, as does Maya.

Many games today, and for the last 3 years, have been more CPU-dependent than GPU-dependent. You can always turn settings down to get playable frame rates, but if you're CPU-limited there isn't anything you can do to lessen the load.

Then there is efficiency and heat. I haven't looked at the very latest AMD offerings, but if history (even recent history at that) is any indication, AMD's offerings have been hotter and more power-hungry than Intel's. In something like the MBA, this is the primary concern over anything else. Even bitcoin mining.
 
Except no one cares about bitcoin mining except those who are still buying into that farce... Serious waste of CPU/GPU cycles. If I were Apple I'd put a weak GPU in there just to spite/discourage it.

Photoshop still relies very heavily on the CPU, as does Maya.

Many games today, and for the last 3 years, have been more CPU-dependent than GPU-dependent. You can always turn settings down to get playable frame rates, but if you're CPU-limited there isn't anything you can do to lessen the load.

Then there is efficiency and heat. I haven't looked at the very latest AMD offerings, but if history (even recent history at that) is any indication, AMD's offerings have been hotter and more power-hungry than Intel's. In something like the MBA, this is the primary concern over anything else. Even bitcoin mining.

So if you ran Apple you'd put a weak GPU in every computer to discourage bitcoin mining? I'm sorry, but that's incredibly dumb. Also, you act like I'm talking about no CPU + GPU vs. an Intel CPU, when in reality (based on next-gen estimates) it's AMD's 25% worse CPU + 30+% better graphics vs. Intel's 25% better CPU + their 30+% WORSE GPU.

Taking the low GPU estimate and treating CPU and GPU performance equally, AMD's processors win by AT LEAST 5%.
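To put rough numbers on that (purely an illustration: it takes the percentages quoted above at face value, treats them as relative to Intel, and weights CPU and GPU equally, which real workloads obviously won't):

```python
# Back-of-the-envelope combined performance index using the rough relative
# figures quoted above (forum estimates, not benchmark results).
intel_cpu, intel_gpu = 1.00, 1.00   # Intel as the baseline
amd_cpu, amd_gpu = 0.75, 1.30       # "25% worse CPU, 30% better GPU"

def combined(cpu, gpu, cpu_weight=0.5):
    """Weighted arithmetic mean of CPU and GPU scores (50/50 by default)."""
    return cpu_weight * cpu + (1 - cpu_weight) * gpu

print(combined(amd_cpu, amd_gpu))      # 1.025 -> ~2.5% ahead of Intel's 1.0
print(combined(intel_cpu, intel_gpu))  # 1.0

# Reading the percentages the other way around ("Intel is 25% better / 30%
# worse" relative to AMD) gives AMD 1/1.25 = 0.80 on CPU and 1/0.70 ~= 1.43
# on GPU versus Intel, i.e. roughly 11% ahead overall.
print(combined(1 / 1.25, 1 / 0.70))    # ~1.11
```

So depending on how you read the percentages, the equal-weight gap lands somewhere between roughly 2.5% and 11%, which is about where that "at least 5%" figure sits.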

I'd take a slightly slower CPU and a heck of a lot better GPU, especially when many heavy-duty tasks can be partially offloaded to the GPU.

edit: and not to mention games.

edit2: I'm still trying to find exact numbers. It looks like AMD's mainstream notebook Trinity chips are around 65 W and Intel's are 35, 45, and 55 W. However, it looks like both Intel's and AMD's ULV chips come in at 17 W. I'll re-edit or make a new post when I find more concrete numbers.
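For a sense of why both camps converge on ~17 W in this class of machine, here's a crude runtime sketch - the ~50 Wh battery capacity is just an assumed Air-class figure, and TDP is a ceiling on sustained package power rather than typical draw, so treat the numbers as illustrative only:

```python
# Full-load runtime ~= battery watt-hours / package TDP.
# battery_wh is an assumption (roughly a 13" Air-class battery); real-world
# runtimes depend on the whole system, not just the CPU/GPU package.
battery_wh = 50
for tdp_w in (17, 35, 65):
    print(f"{tdp_w:>2} W TDP -> ~{battery_wh / tdp_w:.1f} h at sustained full load")
```

A 65 W part simply isn't in the running for an Air-sized chassis and battery, which is why the 17 W ULV parts are the only fair comparison here.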
 
Agreed 100%. A MacBook Air will definitely run most software, but it'll be a pain in the a$$. A few years ago I bought myself a small 10" netbook for a 2-month vacation in Europe. I really only needed it for email, TV shows when bored, and storing photographs from my dSLR. It served its purpose perfectly. I had Photoshop installed on it, as well as Lightroom, but doing any kind of serious work on it wasn't feasible.

Now, given that the Atom processor inside that laptop sucked for Photoshop purposes (I'm a graphic designer), it really boggles my mind how I used to work on a dual-processor Power Mac G4 @ 867 MHz with less than a gigabyte of RAM.... for 7 years!

I remember those. I didn't own one, but I worked at a few. You really had to dial down your settings and set up a scratch disk to keep it from being an annoying experience. We can get more power than that from a Mac mini today, but the problem size hasn't remained static. Sometimes updating a computer changes the way you work. As for RAM, I think attitudes will shift as applications become more optimized for 64-bit. For a long time the application itself was a soft bottleneck, and there wasn't a huge drive for higher-density RAM in laptops and desktops for a couple of years. The demand was more about mobile phones on the low end and possibly servers on the high end, but I haven't kept up with that so I'm not sure (others on here know way more than I do about that).


Just out of curiosity, what do you edit that eats up that large an amount of resources? I use PS mostly to make textures. The highest resolution I usually go for is 4096x4096. I've edited PS documents that size with 30+ layers in them. Multiple image layers with alphas, various adjustment layers, all that neat stuff. In all the years I've been using PS, I don't think I've ever seen it peg higher than 2GB before.

I'm not calling you out or anything. I'm honestly curious here. Unless you're editing 20MP RAW photos with 60+ layers, I can't imagine what could push you beyond the 4GB mark.

edit: I thought about it for a second, and pretty much answered my own question. If you're doing pro photography or advertisement work, you're gonna be dealing with tons of lossless quality RAW images, vector graphics, and who knows what else. Texture work in comparison is considerably less strenuous on a computer. I'm usually dealing with much smaller .jpgs and .tga files.

Heh... 30MP seems like the low end these days. It's a matter of size and settings. If you need to compile a lot of 32-bit files at sizes of 6k and up, it takes a lot of RAM. The thing is, given 64-bit application builds and cheap RAM, you can use RAM for much of what used to be allocated to scratch disks. If you look back a few years, making this stuff tolerable generally meant 8 bpc was your only option: if you encountered banding in smooth areas, you blended it with noise and made sure one channel wasn't blocked up and causing the issue. Dedicated, or sometimes RAIDed, scratch disks were common for dealing with large images, and you had to watch settings like thumbnails, history cache settings, and everything else with large files. The G3 and on really started to make the price tags of the older graphics workstations feel redundant, but at that time you still needed to adjust your work to what the hardware would support.
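As a rough sanity check on how quickly those documents balloon, here's a back-of-the-envelope size for an uncompressed layered image (pixels x channels x bytes per channel x layers). It ignores Photoshop's own overhead, history states, caches, and compression, so treat it as an illustration rather than an exact figure:

```python
def layered_doc_gb(width_px, height_px, channels=4, bits_per_channel=8, layers=30):
    """Rough uncompressed footprint of a layered image, in GB.

    Ignores application overhead, history states, caches, and compression,
    so real-world memory use will differ.
    """
    bytes_per_pixel = channels * bits_per_channel // 8
    return width_px * height_px * bytes_per_pixel * layers / 1024**3

# 4096x4096 texture, 8 bpc, 30 layers: ~1.9 GB, which lines up with
# "never seen it peg higher than 2GB" above.
print(round(layered_doc_gb(4096, 4096, bits_per_channel=8), 1))

# A 6k, 32-bit comp with 30 layers: ~17 GB before Photoshop even starts
# adding its own overhead - hence the appetite for lots of RAM.
print(round(layered_doc_gb(6144, 6144, bits_per_channel=32), 1))
```

The default of 4 channels (RGB plus alpha) is just a convenient assumption; adjustment layers, masks, and smart objects all change the real numbers.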

Today you can get away with almost anything. I was just saying that a lot of RAM gives you quite a bit of freedom with your settings regardless of bit depth or how many layers (32-bit isn't entirely uncommon if you're comping renders and photography). I think the next step would be a functional linear workflow for digital camera files, with the typical gamma correction curve and standard profile applied for viewing only until the end. This would open up a lot of editability with floating-point math, but I don't know if they'd have to reconsider rasterization and channel interpolation methods given the nature of digital cameras and RGBG Bayer sensor arrays. Basically, pixels not only have gaps between them, but each one represents only one color, with the other two interpolated. The doubled green channel is there to maintain ideal perceived acutance.

This seems to be turning into a post on why I wish Photoshop was more like Nuke :p. Anyway, it's quite liberating not having to close out programs or keep history settings low just to avoid lag. I'm not sure about his testing methods, but diglloyd showed gains on many of his tests up to around 32GB of RAM using an image around 10k or so. It's large, but it's not that uncommon.
 
So if you ran Apple you'd put a weak GPU in every computer to discourage bitcoin mining? I'm sorry, but that's incredibly dumb. Also, you act like I'm talking about no CPU + GPU vs. an Intel CPU, when in reality (based on next-gen estimates) it's AMD's 25% worse CPU + 30+% better graphics vs. Intel's 25% better CPU + their 30+% WORSE GPU.

Taking the low GPU estimate and treating CPU and GPU performance equally, AMD's processors win by AT LEAST 5%.

I'd take a slightly slower CPU and a heck of a lot better GPU, especially when many heavy-duty tasks can be partially offloaded to the GPU.

edit: and not to mention games.

edit2: I'm still trying to find exact numbers. It looks like AMD's mainstream notebook Trinity chips are around 65 W and Intel's are 35, 45, and 55 W. However, it looks like both Intel's and AMD's ULV chips come in at 17 W. I'll re-edit or make a new post when I find more concrete numbers.

I'll take the more powerful CPU any day. Out of everything you've mentioned, bitcoin mining is the only one that would clearly benefit from the more powerful GPU, which is a completely useless metric: not only is the very concept dumb, but even a more powerful (yet still very weak) mobile GPU is going to do very little in terms of mining, so it's useless on two fronts.

Gaming will depend on the game and visuals, not to mention there are services like OnLive which completely negate the need for a powerful GPU.

Everything else you mentioned would easily benefit more from the CPU.

So you've come up with a whole one thing that would clearly benefit, which is not only useless but would barely benefit at that.

Sure, if I could have it all I'd choose an Intel CPU with an AMD IGP, but given the options currently available, Apple easily made the right choice.

You think AMD somehow magically got their TDP down to 17 W to match Intel while their desktop offerings use significantly more power than Intel's? No, they did it at the expense of the CPU, and a significant expense at that. Have you even used an AMD APU-based machine? I doubt you have, so you should try one first. It performs like an Atom-based netbook. The simplest tasks spike the processor up to 100%, meaning you'll be hitting that 17-watt TDP early and often. When Intel is dissipating 17 watts, it's doing a heck of a lot more work.
 
I'm not a graphic designer, but I thought Photoshop could leverage a GPU, and that the GPU performance increase might offset the CPU decrease. Likewise, I think a graphics card is beneficial to Maya. Again, I'm not a graphic artist.

But why are we arguing fast vs. faster for a glorified netbook? I'm sure an MBA is not some graphic artist's main computer.
 
I'm sure someone somewhere noticed the "2011" version here.
I'm curious what they are going to do for the 2010 model, and even for the ol' Mac Lisa?
 
But why are we arguing fast vs. faster for a glorified netbook? I'm sure an MBA is not some graphic artist's main computer.

Netbooks would be selling better if they were anywhere near as fast as a 2011 MacBook Air. The Sandy Bridge i5 and i7 chips used in MacBook Airs are many times faster than the good old Atom.
 
I'm not a graphic designer, but I thought Photoshop could leverage a GPU, and that the GPU performance increase might offset the CPU decrease. Likewise, I think a graphics card is beneficial to Maya. Again, I'm not a graphic artist.

But why are we arguing fast vs. faster for a glorified netbook? I'm sure an MBA is not some graphic artist's main computer.

Yes, newer versions of Photoshop are GPU-accelerated; that does not mean the CPU is out of the equation and AMD will perform better. It just means it will perform a little less crappy than it would have otherwise. And we are arguing it because that's the topic of this thread (using AMD), and comparing it to what the machine is actually using is only logical.
 
Not AMD please!

Just look at where this source came from - SemiAccurate - that site is known for being anti-Intel - total FUD.

Personally, if Apple goes AMD, this MacBook Air I am typing on will be the last Apple product I ever buy - including phones and tablets.

----------

The Thunderbolt controller chip is not built into either Intel's support chipset or the CPU package. It is an independent chip that hooks to the PCI-e bus (an industry standard that AMD implements) and the DisplayPort bus (another industry standard that AMD implements).

There is nothing Intel specific about that at all.

What do you people think happens in the peripheral devices that don't have any x86 chip in them at all (e.g., an external TB disk, a video capture device, etc.)? An x86 chip is not necessary, so Intel's CPU isn't necessary.


AMD would have to license Thunderbolt technology anyway, and with the financial problems AMD is having, this would be a BAD decision for Apple - that is why I believe it is just FUD.

----------

Although I'd be surprised if that ever happened (at least any time soon), it's a fantastic idea if it were to be implemented properly!

Have a basic processor for all your day-to-day tasks (internet, email, etc.) to maximize battery life, plus a more powerful "real" processor to use when you need some heavy lifting... how cool would that be!

NO NO NO NO.... the one thing that is worse than AMD is ARM. Both are a total disaster.

----------

Anyone wanna bet that the next Air will not come with Ivy Bridge?


Not being on Ivy Bridge is about as likely as being on ARM - which is to say, never.

----------

All AMD could provide for Apple if it were ever bought is GPUs. Apple buying AMD would effectively make Intel the only x86 vendor in the world, with no competition whatsoever.

There are actually rumors that Apple is switching back to Nvidia GPUs - if so, I am buying a new one.

----------

I'll wager that Apple creates their own ARM-based custom processor for the MBA before they start using AMD processors. It's either Intel or ARM. AMD doesn't cut it from a performance (compared to Intel) or power efficiency (compared to ARM) standpoint.

Forget an ARM CPU - Apple has already stated that is NOT an option in the Air. That would be the end of Apple's computer line. Definitely - it would be the end for me.

----------

He's actually right. The AMD processor doesn't have a heat sensor, so it'll keep chugging along for a few seconds before it burns up. Intel processors have heat sensors that will shut the system down before any damage can be done by overheating.

So if Apple ever puts an AMD processor into their products, I'll buy the last generation of product that has ARM or Intel. Apple will receive nothing from me if they decide to cheap out with AMD.

If Apple decides to make their own processors based on ARM, I'll still support them even if they're underpowered. I'd rather have an underpowered computer than one that heats up like crazy and burns itself out. Although, if Apple decides to market the MBA as a multi-use tool that can also cook your eggs and pancakes for breakfast, an AMD processor would definitely be suitable for that.

I agree with the first part of this - but you lost me on the last part. ARM is worse than AMD - maybe fine for phones and tablets, and with Intel's Medfield even that could change.
 
Just look at where this source came from - SemiAccurate - that site is known for being anti-Intel - total FUD.

Personally, if Apple goes AMD, this MacBook Air I am typing on will be the last Apple product I ever buy - including phones and tablets.

Sounds to me like you're more anti-AMD than SA could ever dream of being anti-Intel. Did AMD rape your sister or something? What does Apple using AMD in an MBA have to do with their phones or tablets?
 
So let me see if I have this clear. Apple apparently wanted an integrated CPU and GPU for the MacBook Air (so they could sell a bunch of these things to the consumer), yet they don't want to support integrated graphics from just a few years ago in their newest version of OS X? Can you say SCREW THE CONSUMER? Yes, I knew you could. :rolleyes:
 
Sounds to me like you're more anti-AMD than SA could ever dream of being anti-Intel. Did AMD rape your sister or something? What does Apple using AMD in an MBA have to do with their phones or tablets?

Actually, it's probably closer to AMD raping Intel. As a software developer myself, including 7 years of Intel x86 assembly, I understand what Intel created with its CPUs. AMD was created because companies wanted a second source of CPUs - later AMD decided that was not good enough, wanted part of Intel's business, and started their own CPU clones.

What I don't understand is how people can be so against both Apple and Intel - but blindly like AMD and Android. If you look closer, you can see why both Apple and Intel originally developed their products.
 
I think AMD would be a perfect fit for the MBA. I don't get people complaining about the CPU performance relative to Intel; the CPU performance is more than enough. Serious question: what would be faster with Intel? Processors are not the bottleneck for almost any computer activity. Even if you do want to bring up processing-intensive tasks to pit Intel against AMD:

Maya rendering -> better AMD GPU, so for GPU rendering AMD prolly wins.

Photoshop -> GPU acceleration, AMD prolly wins.

bitcoin mining -> AMD for the win.

gaming -> AMD for the win.

ripping DVDs -> Intel for the win.

I know, but Apple can't switch cuz there are SO many people using their MBA to rip DVDs, y'know, with no optical drive and all.


Even though I disagree with your reasons, I respected your opinions until the last sentence. That is obviously from someone who does not like Apple - because of the comment about ripping DVDs.

As for the other parts:

Maya rendering - I don't use Maya, but I do use LightWave and Vue, and for those platforms Intel is clearly the professional choice. I actually purchased a dual Xeon for this purpose. Nvidia GPUs are the graphics choice.

Photoshop - I have been using Photoshop since 3.0 and Intel has always been on top. Photoshop actually does not use much of the GPU - later versions have additions which actually came from NewTek, makers of LightWave.

bitcoin - never used it - I know that the benchmarks are heavily biased toward AMD.

Gaming - personally I always use Intel and Nvidia, and a lot of games have those brands on them also - so I believe others do too.

DVD ripping - well, we know that is CPU-bound.

In general, for me the perfect combination is an Intel CPU with an Nvidia GPU.
 
Actually, it's probably closer to AMD raping Intel. As a software developer myself, including 7 years of Intel x86 assembly, I understand what Intel created with its CPUs. AMD was created because companies wanted a second source of CPUs - later AMD decided that was not good enough, wanted part of Intel's business, and started their own CPU clones.

What I don't understand is how people can be so against both Apple and Intel - but blindly like AMD and Android. If you look closer, you can see why both Apple and Intel originally developed their products.

I fail to see how anything you said relates to banning the use of things like phones and tablets if AMD processors are in certain laptops.

I also fail to see how AMD is raping Intel. Because Intel wasn't allowed to be a monopoly? I'd like to know why you feel less innovation and higher prices are going to benefit you as a consumer. That's exactly what would happen if Intel were the only supplier of x86 CPUs.

Again, I don't see anyone here being against Apple and Intel anywhere near as much as you are against AMD, and for no good reason it appears.

To each their own, though. I for one am thankful the industry didn't work out the way you feel it should have. Otherwise I'd be paying a heck of a lot more for a heck of a lot less.
 
AMD would have to license Thunderbolt technology anyway, and with the financial problems AMD is having, this would be a BAD decision for Apple - that is why I believe it is just FUD.


What you are spreading is FUD.

It is a discrete chip. For the immediate future, AMD doesn't have to license anything. For the moment there are no "per port" licensing fees for Thunderbolt similar to the early fees for FireWire.

Downstream, there might be a question of whether AMD would have to weave the technology into its supporting chipset or processor package as a future step toward a System-on-a-Chip (SoC) or two-chipset solution.

The fact is, Intel isn't going to have an SoC or two-chipset solution for Thunderbolt for a long time, so it is moot that AMD doesn't have one. Apple simply buys the discrete chips from Intel (if Intel remains the sole controller supplier) and weaves them into the system design.

The other issue right now is that many in the industry are skittish about whether TB is an attempt at a submarine patent-troll move by Intel and/or Apple to snare them into a crappy licensing agreement. Thunderbolt is supposed to be freely licensed, but with Apple suing or threatening to sue just about everyone in the industry, it shouldn't be surprising that most are taking a "go slow" approach to adopting it. It is not an open standard governed by a committee/group; it is much more of a dictatorial standard. Sure, Intel is being a benevolent dictator now, but is that just letting the fish bite down deeply on the hook? It doesn't help that Apple played that "no fees..... gotcha, changed our mind... there are fees" game with FireWire. People in the industry have long memories.

One reason Intel backed off and let NEC take point on getting a USB 3.0 controller to market is the same problem that Thunderbolt is facing. There are no 3rd-party Thunderbolt controllers because few want to jump into the ring with the 800 lb gorilla. If Intel wants to, it can easily and unilaterally wipe out any multi-million-dollar investment in trying to get a 3rd-party option off the ground.

If Intel chronically delivers "too few" TB controllers to market over an extended period of time, perhaps some 3rd parties will jump in. Until then, though, it isn't going to be surprising if no one does. However, that is no impediment at all to coupling an AMD chipset that exports standard PCI-e lanes to the PCI-e inputs on an Intel controller. It is solely a matter of the system designer choosing the correct parts and assembling them.

Apple using AMD or not hinges far more on AMD being able to:

1. deliver the right mix of performance

2. deliver enough parts at the right price, on time.


The integrated graphics on AMD's options are better, but the x86 cores lag behind a bit. That's largely because AMD allocated larger transistor budgets to GPUs than to x86 cores. Intel made the opposite call (more limited GPU budgets than x86 core budgets). Limited GPUs can be augmented with discrete graphics; limited x86 cores aren't so easy to augment.

Likewise, GlobalFoundries has stumbled a bit since being spun out. AMD is several months late in delivering for design cycles, so they have lost several design wins.


The dubious part of these rumors is whether AMD had credible parts (hitting the performance, price, and low-power requirements) that were ready in time for production runs. Apple probably did give them every chance to win.... AMD just missed the window of opportunity. Thunderbolt played no role in that.
 