Ack. I wouldn't mind the Intel Inside sticker if it'll save me at least $10 and I can peel it off without leaving marks.

The Intel inside stickers aren't that bad. I collect Intel bunnymen. :D

Here's to the Crazy Ones
 
how about

~Shard~ said:
I don't even know how they could make that thing any smaller, more minimalist, or simpler than its current incarnation.
How about getting rid of that big, awkward blank area underneath the screen, and putting some useful pixels there?

I don't know why anyone would buy an all-in-one, instead of getting a sleeker, more elegant monitor and an SFF system that could be hidden on or behind the desk. It also seems like such a loss to have to throw away the monitor to upgrade the CPU, or vice versa - an all-in-one is not cost-effective.

A Yonah MiniMac plus a Cinema Display makes so much more sense than the oddly proportioned iMac. Maybe we'll never see an Intel iMac - Apple will finally realize that the iMac is too much of a compromise for much of their base.

An all-in-one makes as much sense as a floppy drive - it's an antique solution to a problem that no longer exists.
 
jhu said:
1) the darwin kernel is relatively slow
2) most people will choose based on price. It's debatable whether the Mac is a better value. For some it is; for me, the price of being different isn't worth it.

The kernel isn't slow; it's just that applications running on top of a micro-kernel architecture switch in and out of user mode a lot more often than applications running on a monolithic kernel. So if you test something like running IPC calls over and over, you'll see slower system performance.

What does that mean? Only that BSD chose security over speed. Not a bad thing in my opinion.
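To make the "switching in and out of user mode" point concrete, here is a minimal sketch of the kind of microbenchmark that exposes the cost of a user/kernel round trip. It is only an illustration (the iteration count is arbitrary, and getpid() just stands in for a cheap system call; some C libraries cache it, so treat the numbers as a ballpark):

```c
/* Minimal sketch: time a pile of cheap system calls to get a feel for
 * the user -> kernel -> user transition cost being discussed.
 */
#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

int main(void)
{
    const long iterations = 1000000;   /* arbitrary */
    struct timeval start, end;
    double elapsed_us;
    long i;

    gettimeofday(&start, NULL);
    for (i = 0; i < iterations; i++)
        (void)getpid();                /* one user/kernel round trip (unless cached by libc) */
    gettimeofday(&end, NULL);

    elapsed_us = (end.tv_sec - start.tv_sec) * 1e6
               + (end.tv_usec - start.tv_usec);
    printf("%.0f us total, %.3f us per call\n",
           elapsed_us, elapsed_us / iterations);
    return 0;
}
```

Run the same loop on Darwin and on a monolithic kernel like Linux and the per-call figure roughly tracks how expensive that transition is on each; it says nothing about how fast real applications feel.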
 
AidenShaw said:
How about getting rid of that big, awkward blank area underneath the screen, and putting some useful pixels there?

Yep, that would be a good start I guess! ;)

AidenShaw said:
I don't know why anyone would buy an all-in-one, instead of getting a sleeker, more elegant monitor and an SFF system that could be hidden on or behind the desk. It also seems like such a loss to have to throw away the monitor to upgrade the CPU, or vice versa - an all-in-one is not cost-effective.

...

An all-in-one makes as much sense as a floppy drive - it's an antique solution to a problem that no longer exists.

I agree. For me, a PowerMac would have been overkill, which is why I bought my iMac, but I know what you mean - the monitor will be useless once I decide to upgrade, or at least useless in the sense that I won't be able to use it with my next Mac.

And one thing I do miss from my PC days is having that security/capacity/potential/option to upgrade any little piece of my machine at any time that I want, and replace any problematic components without requiring a whole new system if something fails. Not so with my iMac.

Here's hoping I have some better alternatives available to me in a couple years when I plan on buying a new Intel Mac. :cool:
 
AidenShaw said:
How about getting rid of that big, awkward blank area underneath the screen, and putting some useful pixels there?
I like that space there. It is the perfect size for post-its. :)

I've seen artwork framed the same way: the border on the bottom is larger than the border on the top or sides.

If Apple filled the entire iMac with a screen then it would no longer be widescreen. :(
 
dernhelm said:
The kernel isn't slow; it's just that applications running on top of a micro-kernel architecture switch in and out of user mode a lot more often than applications running on a monolithic kernel. So if you test something like running IPC calls over and over, you'll see slower system performance.

What does that mean? Only that BSD chose security over speed. Not a bad thing in my opinion.

I'd say that makes it slow. If you performed the same tests comparing FreeBSD, OpenBSD, and NetBSD against Darwin, Darwin would still come out last.
 
Custom Apple/Intel CPU minus x86 won't happen

Norse Son said:
... Now, about the second part, of Apple & Intel "cooking up" a custom cpu; I made a comment in the thread just yesterday about that idea - I had read the AppleInsider article. However, I've seen the topic mentioned more than once in the past... Here's my take & timeline:
...
• Apple phases out Classic - it's not present in Leopard (they license MacLink Plus Deluxe technology to open "legacy files").
... [late 2007-MWSF '08]...
• Rumors start to circulate of an ultra-top-secret project... taking place at Intel's labs in Oregon... Apple has buried Classic, and now, working secretly with Intel, they are building a quad-core cpu on a 45nm process that ditches all the "legacy patches & scaffolding" that the x86 architecture has accrued over the years to maintain support for "paleolithic" DOS applications...
Yeah, I'm quoting myself, but I found a flaw ("Just one, Aintstein!?!") in my logic after going out for a drink.

If... IF... Apple and Intel were even thinking of doing this, it would have changed their words & actions all the way back to this year's WWDC. That's because Apple would have hesitated to say that, "while they won't prevent people from loading Windows on the Intel Macs, they will neither support nor sponsor it..." (paraphrased). Now, that implies that Apple knows some people will do it, but, more importantly, it helps Apple sell more "dual-boot" systems to corporate and educational customers.

Therefore, if they were going to have Intel build them a custom x86 cpu that "ditched" all the legacy crap dating from DOS through all the putrid incarnations of Windows... Well, they just eliminated all those customers from the last sentence of the previous paragraph...

Also, it would piss off all of Intel's other OEMs, such as Dell, HP, Lenovo, etc., if they saw how "screaming fast" a streamlined x86 was (minus the "86" parts), but they couldn't have it, because Apple refused to license them Mac OS X... And Intel could not afford to take 2-4 fab lines "out of the loop" to build custom CPUs just for Apple's mobile, consumer, desktop and server lines of Macs - even if Apple had 10% of the market, it still represents only about 8-11 million chips... out of a possible...? ...

Now, don't get me wrong. That's a hell of a chunk of change... But when a new fab can cost upwards of a couple billion, Intel cannot afford to alienate all its other customers... Or chase them to AMD... Or see Microsoft buddy up to AMD, while "dissing" Intel... Get the picture: it's Apple/Intel versus Microsoft/AMD/Dell/HP/Lenovo/Sony/Gateway/Toshiba/Acer/Adinfinitum/Adnauseum...

On the other hand, if Apple "failed miserably" to keep MacOS X off of non-Apple Intel hardware, and they "reluctantly" decided to license it to OEMs meeting certain/select criteria... Then I could see Apple, working with Intel, and for the benefit of its licensing "partners" developing a cpu that eliminated the legacy bottlenecks. Of course that would likely mean those "evolutionary" CPUs could not run Windows or apps written for it, except under some exotic form of emulation...

Seriously doubt it's gonna happen...

Nice dream, though...
 
Norse Son said:
Therefore, if they were going to have Intel build them a custom x86 cpu that "ditched" all the legacy crap dating from DOS through all the putrid incarnations of Windows... Well, they just eliminated all those customers from the last sentence of the previous paragraph...

Yes, they tried it. It was called "PowerPC" with Open Firmware. Also, Intel probably learned their lesson with Itanium.
 
meh

Meh, I don't care whether Intel designs it or someone else, as long as the machine runs great.
But please, no BIOS, and please let me still have Target Disk Mode.

I really want the transition to feel seamless. Sure, I'll know there's an Intel chip in there, but I shouldn't need to notice; in all other respects I hope it will still look, feel, and act like a Mac, with all the things we now know, enjoy, and expect from a Mac.
(but a new and improved spiffy Mac, of course :) )
 
jhu said:
I'd say that makes it slow. If you performed the same tests comparing FreeBSD, OpenBSD, and NetBSD against Darwin, Darwin would still come out last.

Sure, and I could find other tests where a micro-kernel architecture shines and out-performs a monolithic kernel. The tests AnandTech performed happened to be tests where the monolithic kernel performs quite well.

Again, the speed of the kernel is not in question here; the speed of the application is. In a microkernel architecture, the kernel can often fit in RAM entirely. Apps that happen to make use of that fact and do not switch out to user mode too frequently can actually run faster on a micro-kernel architecture.

But that is not the point I was trying to make. When Microsoft developed NT, and when Linus Torvalds developed Linux, they both made the same decision - to NOT have a micro-kernel, but instead have a monolithic kernel architecture. In Windows, they went to the extreme, and some of the most ridiculous things imaginable run in ring 0. Linux isn't quite as bad, but the bottom line is that in both systems, poorly written apps are doing a lot more in ring 0 than they would running on a micro-kernel architecture box. This means that any poorly (or maliciously) written application could cause your system to entirely crash.

In a micro-kernel architecture, this is a far more difficult thing to do, which is why it's much harder for an app to bring down OS X completely than it is to bring down Linux or Windows. The price for that robustness is a little speed reduction, but that's a trade-off I'd take 9 times out of 10.
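For reference, the IPC-heavy tests being talked about look roughly like the sketch below: two processes bouncing a single byte back and forth through pipes as fast as the kernel will let them. The round count is arbitrary and this isn't taken from any published benchmark; it's just the shape of workload where a monolithic kernel's cheaper message path tends to show up:

```c
/* Sketch of an IPC "ping-pong" microbenchmark: parent and child pass
 * one byte back and forth through a pair of pipes and time the round trips.
 */
#include <stdio.h>
#include <sys/time.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    const int rounds = 100000;          /* arbitrary */
    int to_child[2], to_parent[2];
    struct timeval start, end;
    char byte = 'x';
    pid_t pid;
    double us;
    int i;

    if (pipe(to_child) < 0 || pipe(to_parent) < 0) {
        perror("pipe");
        return 1;
    }

    pid = fork();
    if (pid == 0) {                     /* child: echo every byte back */
        for (i = 0; i < rounds; i++) {
            if (read(to_child[0], &byte, 1) != 1) _exit(1);
            if (write(to_parent[1], &byte, 1) != 1) _exit(1);
        }
        _exit(0);
    }

    gettimeofday(&start, NULL);
    for (i = 0; i < rounds; i++) {      /* parent: send, then wait for the echo */
        write(to_child[1], &byte, 1);
        read(to_parent[0], &byte, 1);
    }
    gettimeofday(&end, NULL);
    waitpid(pid, NULL, 0);

    us = (end.tv_sec - start.tv_sec) * 1e6 + (end.tv_usec - start.tv_usec);
    printf("%.1f us per round trip\n", us / rounds);
    return 0;
}
```

Each round trip is four kernel entries (two writes, two reads) plus the context switches between the two processes, which is exactly the path a microkernel makes more expensive and a monolithic kernel keeps cheap.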
 
generik said:
I doubt it; Apple will probably want to save that $0.10 per PowerMac.

While Apple is famous for selling new computers just for a marginal increase in processor speed, it would actually be CHEAPER to use Intel's common, commodity, socketed processors. Heck, even the majority of Intel-based NOTEBOOKs use socketed processors now. (I just replaced the 1.4GHz Celeron in my $350 Staples-post-Thanksgiving-special with a top-of-the-line 2.26GHz Pentium M. No more difficult than replacing the hard drive in my PowerBook. Yes, not for the true newbie, but not impossible, either.)

The move to Intel will either force Apple to go back to easily upgraded processors, or they'll have to pay extra to solder the processors in place.
 
mdavey said:
Taking them in order, okay the G5s are hot, too. I have three network appliances

A network appliance is not a computer.

one with a G3 and two with ARM processors.

Both are low-powered compared to typical PC CPUs. Hell, ARM is designed for embedded purposes! There are plenty of PC CPUs that can manage without fans.

Noisy. Apple has traditionally been very successful in this area. Let's hope they can be again.

MDD-PowerMacs, anyone?

Power-hungry. Apple has been very successful in this area. Most PCs have a 350W or 400W PSU. One of my network appliances has a 25W PSU; my Mac mini has a 70W PSU. Of course, one has to measure the actual consumption for a true comparison.

You are comparing apples (no pun intended) and oranges. A network appliance is still not a computer. The Mac mini definitely is, but it's hard to compare it to a full tower PC. You might want to compare it to some Mini-ITX machine, for example. How about comparing that PC to a PowerMac? As it happens, the PowerMac ships with either a 450W or 600W power supply. That's a lot bigger than your average PC ships with!

Big. Okay, big has its place sometimes but the PC manufacturers seem to just chuck parts into mini towers because that is what they have always done. I am surprised that there hasn't been an explosion in small form-factor computers so far.

The PowerMac is big as well. The mini is small (duh!), but there are small PCs out there as well. I find it rather strange that you compare the smallest Mac possible to the average PC tower, and then proclaim that "PCs are too big!".

ClimbingTheLog said:
This has happened to Linux users as well. They buy a Powerbook knowing it will run LinuxPPC and get the box, then turn it on to figure out how to wipe the disk, and wind up deciding not to install LinuxPPC.

Sure it has. Naturally there are a few people out there who bought the laptop with the intention of wiping the HD but didn't do it for some reason. But what usually happens is that they boot the machine, poke around in OS X, and think "Cute OS. Now where did I put those Linux install CDs....".

I'm not really sure what you are trying to say here. There are quite a few people who buy Apple hardware just to run a non-Apple OS on it.

Photorun said:
I'm really not sure why you're even here SiliconAddict, are you a troll with very little time on your hands (clearly by the amount of vacuous posts), a hater, both, an unenlightened PC user (clearly) or what your case is.

Again with your "You seem to like PCs. Why are you on MacRumors then?!?!?" routine? Can't someone like PCs (for one reason or another) and still be interested in Macs and Apple as well? Should these forums be reserved for gung-ho Mac fanatics alone? If you don't hate PCs with a passion, you have no place here? These forums are reserved for praising Apple, the Mac, and OS X, and comments disputing the superiority of Apple and their products are strictly forbidden?

Exactly WHY are you here?

That's the EXACT same thing you told me when I dared to say that "You know, XP is a pretty stable OS" :D.

Your shtick is really old; go find a "Microsoft Windows/I (Heart) Dell" forum and leave us Mac users to babble about our own inane stuff. Save your proselytizing for those who may give a flying crap about the droll, off-topic, misinformed and disingenuous high-and-mighty pointless stuff you drivel on and on about.

Now now, take a chill pill. You seem to take this Mac fanaticism a bit too seriously. You spend quite a bit of time and effort disparaging PCs, but the moment someone says something good about them, you start shouting "Lies! Disinformation!". Why are those comments lies and disinformation, whereas your constant disparaging of PCs is not? Seriously?

Lord Kythe said:
Oh, you don't like "big" words? Breaks your "safe" semantic boundaries? Perhaps you'd prefer something like "shallow" PC user? Or is it plain wrong to dare make the statement that Macs are better computers? Are we cultists for believing one computer is better than another? I'm sorry, but I firmly believe Macs are better computers, and I think it is a reasonable statement; if you put a computer running Windows up against one running Mac OS X, it's not even a contest.

The only things PCs are better at than Macs are gaming and getting viruses, worms and spyware. And since the Xbox arrived, I don't even know what PCs are still around for anymore; oh yeah! Making money by investing billions in advertising.

I'm sorry, but your comment DOES sound like something a blind fanatic would say.

That said, I only use W2K to play games. My main OS is Linux. My main machine is one of those dreaded tower "peecees", and I haven't seen any viruses or spyware on my machine.

And FYI: consoles absolutely, positively suck for certain types of games.

dernhelm said:
What does that mean? Only that BSD chose security over speed. Not a bad thing in my opinion.

OS X doesn't use the BSD kernel; it uses the BSD userland. The kernel is something else entirely. And the kernel is the thing that is slow. And I haven't seen those security issues on Linux (which uses a monolithic kernel).

dernhelm said:
Sure, and I could find other tests where a micro-kernel architecture shines and out-performs a monolithic kernel.

Such as? Typically, microkernels carry an overhead that doesn't exist in monolithic kernels.

Again, the speed of the kernel is not in question here; the speed of the application is.

And those applications rely on the kernel. Didn't Anandtech test real-life apps in their benchmark?

In a microkernel architecture, the kernel can often fit in RAM entirely.

The kernel on my Linux system is about 800KB in size. I have 1GB of RAM. Whether that kernel can fit into RAM is left as an exercise for the reader.

Linux isn't quite as bad, but the bottom line is that in both systems, poorly written apps are doing a lot more in ring 0 than they would running on a micro-kernel architecture box. This means that any poorly (or maliciously) written application could cause your system to entirely crash.

I haven't seen that happen. And apps in Linux are in userspace.

In a micro-kernel architecture, this is a far more difficult thing to do, which is why it's much harder for an app to bring down OS X completely than it is to bring down Linux or Windows.

I have had the GUI freeze on my OS X, requiring me to do a hard reset. I have had GUI problems in Linux as well, but the underlying system stayed up & running, and I didn't have to do a hard reset. What you are basically saying is the exact same thing Linux users have been using against Windows.

The price for that robustness is a little speed reduction, but that's a trade-off I'd take 9 times out of 10.

So, you are basically saying that OS X is robust, whereas Linux is not? I'm sorry, but I just have to disagree with you there.
 
mercury26 said:
I want to know if AU and VST plugins will work through Rosetta. If not, then it makes Logic pretty unusable for most audio professionals.

Apple's developer web page basically says: all plugins must be written for the same processor architecture as the application. A native Intel application cannot run PowerPC plugins. With Universal binaries (both Intel and PowerPC versions available), you have the choice to run them through Rosetta if you need PowerPC plugins. Plugins can also be Universal binaries; such a plugin can run on PowerPC, it can run within a native Intel application, and it can run on an Intel system within a Rosetta application.

So if there are, say, four plugins that you need all the time, and two have Universal versions but the other two do not, then you need to run the application under Rosetta even though you could otherwise run it natively.
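If you're wondering about a specific plugin, the quick way to check is `lipo -info` on its executable in Terminal. The sketch below does roughly the same thing by hand on OS X, peeking at the plugin's Mach-O header to see whether it is PowerPC-only, Intel-only, or Universal (fat). The plugin path is only an example, not a real product:

```c
/* Sketch (OS X only): report which architectures a plugin executable contains,
 * i.e. whether it is a thin (single-architecture) or Universal (fat) binary.
 * 64-bit and other exotic variants are ignored for brevity.
 */
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>        /* ntohl: fat headers are big-endian on disk */
#include <mach-o/fat.h>
#include <mach-o/loader.h>
#include <mach/machine.h>

static const char *arch_name(cpu_type_t t)
{
    if (t == CPU_TYPE_POWERPC) return "ppc";
    if (t == CPU_TYPE_I386)    return "i386";
    return "other";
}

int main(int argc, char **argv)
{
    /* Hypothetical path; pass the real plugin executable as an argument. */
    const char *path = argc > 1 ? argv[1]
        : "/Library/Audio/Plug-Ins/Components/Example.component/Contents/MacOS/Example";
    FILE *f = fopen(path, "rb");
    uint32_t magic;

    if (!f) { perror(path); return 1; }
    if (fread(&magic, sizeof magic, 1, f) != 1) { fclose(f); return 1; }

    if (ntohl(magic) == FAT_MAGIC) {                   /* Universal binary */
        struct fat_header fh;
        struct fat_arch fa;
        uint32_t i, n;

        rewind(f);
        fread(&fh, sizeof fh, 1, f);
        n = ntohl(fh.nfat_arch);
        printf("Universal binary with %u architectures:", n);
        for (i = 0; i < n && fread(&fa, sizeof fa, 1, f) == 1; i++)
            printf(" %s", arch_name((cpu_type_t)ntohl(fa.cputype)));
        printf("\n");
    } else if (magic == MH_MAGIC || magic == MH_CIGAM) {
        printf("Thin binary (single architecture only)\n");
    } else {
        printf("Doesn't look like a Mach-O file\n");
    }
    fclose(f);
    return 0;
}
```

A Universal plugin will report both ppc and i386, which is exactly the case where you get to choose between running the host natively or under Rosetta.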
 
Evangelion said:
OS X doesn't use the BSD kernel; it uses the BSD userland. The kernel is something else entirely. And the kernel is the thing that is slow. And I haven't seen those security issues on Linux (which uses a monolithic kernel).

Agreed. Security was the wrong word. I meant to say robustness.
 
Evangelion said:
And those applications rely on the kernel. Didn't Anandtech test real-life apps in their benchmark?

I think so, but apps that relied heavily on IPC and the like.


The kernel on my Linux system is about 800KB in size. I have 1GB of RAM. Whether that kernel can fit into RAM is left as an exercise for the reader.
You got me there. Of course I didn't mean RAM, I meant the cache on the CPU. Micro-kernels can often live in L1 or L2 cache. I guess I was a bit tired when I wrote my last post.

I haven't seen that happen. And apps in Linux are in userspace.

I made a living writing Linux code where I was mucking about in the internals, and I can tell you it does happen. Not with frightening regularity or anything, but it does. Userland apps almost never bring it down, but device drivers and other low-level bits often can. I've never actually crashed a BSD-based system, so I can't tell you what that would take.

I have had the GUI freeze on my OS X, requiring me to do a hard reset. I have had GUI problems in Linux as well, but the underlying system stayed up & running, and I didn't have to do a hard reset. What you are basically saying is the exact same thing Linux users have been using against Windows.

That's not really what I meant to say. Linux is orders of magnitude more robust than Windows. Microkernels are only slightly more robust than monolithic kernels. Glitchy low-level stuff is generally less problematic for your whole system on a micro-kernel architecture than it would be on Linux. But I suppose that glitchy low-level stuff is always bad for a user anyway, so that argument is pretty lame. I'm coming at this from a developer's standpoint, and BSD is more forgiving of "not yet ready for prime time" low-level code than Linux is. That's all I'm really saying.

So, you are basically saying that OS X is robust, whereas Linux is not? I'm sorry, but I just have to disagree with you there.

I never said that Linux wasn't robust. Never meant to imply that at all. I suppose you are correct: in the grand scheme of things, for end users there would be little or no noticeable robustness difference between Linux and BSD. Developers might notice a difference, though.

I've done very little development on OS X, so I can't really speak for Darwin. As for the GUI deadlock you mentioned above, I'd be very interested in what was going on that a GUI could deadlock your system. I haven't had that happen yet (lucky, I guess), but I use mine more for fun (iLife) than work. I have had the Linux desktop crash on me before (same with Solaris), and it more or less makes that workstation unusable, but I could still telnet in from elsewhere.
 
~Shard~ said:
And one thing I do miss from my PC days is having that security/capacity/potential/option to upgrade any little piece of my machine at any time that I want, and replace any problematic components without requiring a whole new system if something fails. Not so with my iMac.

That is also one thing I miss from my PC days, with one exception. When I look back to my PC days, I really only upgraded my disks and memory, as do most people I know.

I had the same dang P3 450 for 4 years!
 
AidenShaw said:
How about getting rid of that big, awkward blank area underneath the screen, and putting some useful pixels there?

I don't know why anyone would buy an all-in-one, instead of getting a sleeker, more elegant monitor and an SFF system that could be hidden on or behind the desk. It also seems like such a loss to have to throw away the monitor to upgrade the CPU, or vice versa - an all-in-one is not cost-effective.

A Yonah MiniMac plus a Cinema Display makes so much more sense than the oddly proportioned iMac. Maybe we'll never see an Intel iMac - Apple will finally realize that the iMac is too much of a compromise for much of their base.

An all-in-one makes as much sense as a floppy drive - it's an antique solution to a problem that no longer exists.

There are always two sides to everything. I personally prefer the all-in-one design. Rather than mess around inside with upgrades etc., I'd prefer to just go out and buy a new system and sell the old one on eBay, or give it away to a friend or relative. The only exception being upgrading the RAM.

So no, not everyone wants a tower under their desk, or a mini on their desktop for that matter. The no-fuss, no-clutter design of the iMac is perfect in my eyes. With a wireless keyboard and mouse, and an AirPort Extreme for printing and WiFi access, you can pretty much get away with a system that has just one cable attached to it (the power cord).

Jason
 
Most unfortunate

I would rather Apple take their time and roll out the new Intel PowerMac or its equivalent rather than getting a system board from Intel. Apple had better not disappoint with the next PowerMac.
 
EricNau said:
If Apple filled the entire iMac with a screen then it would no longer be widescreen. :(
Right, it would be BIGSCREEN !! :D

(Your DVD image would be exactly the same size when you're watching a movie, but the rest of the time you'll have more room on the screen....)
 

Attachments: iMac4x3.jpg, imac.jpg
I couldn't care less what will be inside these new Macs.

As long as they're stable and have nice casing designs, who cares? Really...
 
AidenShaw said:
How about getting rid of that big, awkward blank area underneath the screen, and putting some useful pixels there?

I don't know why anyone would buy an all-in-one, instead of getting a sleeker, more elegant monitor and an SFF system that could be hidden on or behind the desk. It also seems like such a loss to have to throw away the monitor to upgrade the CPU, or vice versa - an all-in-one is not cost-effective.

A Yonah MiniMac plus a Cinema Display makes so much more sense than the oddly proportioned iMac. Maybe we'll never see an Intel iMac - Apple will finally realize that the iMac is too much of a compromise for much of their base.

An all-in-one makes as much sense as a floppy drive - it's an antique solution to a problem that no longer exists.
I really love having an all-in-one computer. Call me not a true computer person, but I love just taking a computer out of the box, putting it on the desk, plugging it in, and running a quick setup assistant to get my Mac up and running. And then I just let it run from there. Easy. I understand that it isn't cost-effective to buy an all-in-one... but to me it's worth it to have my small little computer all in one beautiful, sleek piece in front of me. It's hard to explain, but I have always thought that all-in-one computers are one of the most beautiful things.... :eek:
 
j_maddison said:
There are always two sides to everything. I personally prefer the all-in-one design. Rather than mess around inside with upgrades etc., I'd prefer to just go out and buy a new system and sell the old one on eBay, or give it away to a friend or relative. The only exception being upgrading the RAM.

So no, not everyone wants a tower under their desk, or a mini on their desktop for that matter. The no-fuss, no-clutter design of the iMac is perfect in my eyes. With a wireless keyboard and mouse, and an AirPort Extreme for printing and WiFi access, you can pretty much get away with a system that has just one cable attached to it (the power cord).

Jason
Thank you! Couldn't have said it better myself! (In fact, I tried and failed miserably...) Although I have to admit I didn't go for the wireless mouse and keyboard, I love the sleekness and oneness of my new iMac. I'll consider going wireless if/when they come out with a wireless Mighty Mouse. :D
 
Lord Kythe said:
LOL, that's one of the funniest things I've heard. You actually think that "95% of the computer buying public have (Windows-based) PCs" means that 95% of them don't want a Mac? They just don't know about it. I worked in several retail environments (internet, educational and retail stores). Man, about 25 customers asked me if their iPod would work on a Mac in the Nov.-Dec. period alone. I'm serious.

I know many who want Mac OS X, but they don't like what Apple's hardware has to offer.
 