20% difference in performance is a big deal! Apple will then forever be associated with underperforming machines, even if it is cheaper. This is a bad thing!
Apple is already associated with sub-par performing machines.
The approximate distance from Cupertino, California, to Sunnyvale, California, is 4 miles.
Let's not fool ourselves. The only motivation behind such a move would be higher profit margins for Apple.
Apple's prices are ridiculous already. My 3-year-old PC cost less when I bought it than my June 2009 MBP, and it's still running faster than the MBP.
If Apple continues to increase their profit margins, I'm moving from genuine Apple hardware to OSx86. There are many other computers that are on par with Apple's (both in specs and in build quality), for far less money.
Steve Jobs sometimes gives me the idea that he has completely forgotten how computers work.
Yet again, the Apple-bashers know more about computers than Jobs and Apple. I really can't understand why Apple just doesn't fire all of its engineers and come here to ask all the experts. Sheesh.
I don't have an MBA, so I need someone to help me understand. Apple has a huge amount of money in cash reserves, so why doesn't Apple just buy AMD and have total control over the development of its processors as it would be favorable to Apple's future products? This would ensure the continuation of the plan to develop multi-core processors to handle graphics without a wasteful standalone graphics processor. I know that for most businesses it's a mistake to get carried away with vertical integration, but for a company like Apple it would make more sense.
Great point. That's such a critical issue in today's world. Getting your horse and buggy from Cupertino to Sunnyvale is certainly the determining factor in who you choose to do business with.
Your post indicates a very, very clear lack of understanding of Apple's business.
First, the ULTIMATE goal is higher profits for Apple, but Apple almost never does that by choosing cheaper components. They choose high end components and build high end systems. AMD chips will be perceived as second best and would not be used unless there was a clear performance advantage to using them. The quote above ("80% of the performance for 60% of the price") would never fit Apple's plan. So they save $100 on a $2,500 system and lose 20% performance - while giving the customer the impression they're using cut-rate parts? Not a chance.
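As a rough sketch of the arithmetic in that post (the $2,500 system price and $100 savings are the post's own illustrative numbers, not real pricing data), the tradeoff actually makes the price-per-performance ratio worse, not better:

```python
# Back-of-the-envelope math for the tradeoff described above:
# save roughly $100 on a $2,500 system while losing 20% performance.
system_price = 2500.0   # illustrative system price from the post
cpu_savings = 100.0     # illustrative CPU price difference from the post
perf_loss = 0.20        # "80% of the performance" -> 20% slower

new_price = system_price - cpu_savings    # $2,400
price_drop = cpu_savings / system_price   # 0.04 -> only a 4% discount

# Price per unit of performance (lower is better value):
baseline_ppp = system_price / 1.0             # $2,500 per perf unit
amd_ppp = new_price / (1.0 - perf_loss)       # $3,000 per perf unit

print(f"discount: {price_drop:.0%}, performance lost: {perf_loss:.0%}")
print(f"$/perf: {baseline_ppp:.0f} -> {amd_ppp:.0f}")
```

In other words, a 4% price cut bought with a 20% performance hit raises the cost per unit of performance from $2,500 to $3,000, which is the post's point.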
As for 'on par... for less money', that's rarely the case. Yes, there are crappy PCs out there for less, but when you find similar quality, Apple is almost always competitive - and with far, far better reliability and customer satisfaction.
This whole "PCs don't last more than 2-3 years" line is complete and utter horse crap.
I have a Dell laptop that I bought in 2003 with a Pentium M, and it runs Windows 7 perfectly. It's actually quite fast and even runs programs like Photoshop CS5 without a hitch.
My brother has a slightly newer Dell XPS M1210 laptop from 2005 with a Core 2 Duo and it runs incredibly well. He regularly uses it for video editing and graphics rendering.
If your PC from 5 years ago isn't running well, it's because you don't know what you're doing.
Please read through this guide and follow that, and your five year old PC will run just as fast as a modern day PC.
http://boards.ign.com/teh_vestibule/b5296/191015261/
I have to agree with you. My wife's Dell desktop is 5 years old, my XPS desktop is 2.5, and I have two Inspiron laptops: one going on 11 years old (a little noisy, but still a great email and internet computer) and the other 7 years old, still running a dual-boot XP/Vista setup.
I am waiting for the day I find an excuse to buy a new machine.
I'm not trying to endorse Dell. I actually think their build quality isn't as good as higher-end PCs like Sony Vaios.
But I think it just goes to show that PCs easily last 6-8 years if you know what you are doing (and I'm sure Macs do too).
Here's how... http://boards.ign.com/teh_vestibule/b5296/191015261/
P.S.: I accidentally knocked one of the aforementioned laptops off the balcony of my 2nd-floor apartment about a year ago (it was old, so I wasn't very careful with it). All it got was a crack in the exterior plastic; everything else still runs perfectly, and even the LCD is still fine. Like I said, I later installed Windows 7 on it and continue to use it as a backup in my bedroom. I don't know what these people who claim PCs don't last are doing to their computers, but I doubt it involves a two-story drop!
If Apple did buy AMD, then AMD chips would probably end up in Macs ONLY. That leaves Intel in the PC market... alone. While I love Intel and hate AMD (IMO they are cheap, low-cost, underperforming CPUs that generate a lot of heat without the best performance), some people DON'T like Intel and favor AMD. Intel is already close to monopolizing the PC industry, and AMD being only in Macs would make that worse. Besides, Apple wouldn't want its subsidiary making chips for its competitors.
It is interesting to think about. A combo of ATI/AMD could probably work for the Macintosh line of Apple computers, but Intel is just too good to let go of: the Core family of CPUs is practically the best thing going right now and has been for quite some time. Not to say AMD can't ever match them, but I haven't seen anything from them that makes me think, "Wow, that's innovative."
Correct me if I'm wrong but isn't Intel the one developing Light Peak?
Yes, I can see Apple's troubles with Intel forcing its integrated GPUs alongside its iX CPUs, how that's making things difficult with nVIDIA, and of course the overwhelming demand coupled with low supply that could delay new product launches. But Intel is still the way to go.