Just wondering how many folks had a look at this over on 9-5 Mac

http://9to5mac.com/mac-mini-midi-micro-nano

Looked, but compared to what's available in other mini-towers - it doesn't make sense. (Not that "making sense" has ever been an important factor in Apple's lineup.)

9to5 thinks that a mini with a PCIe slot is worth $1K.

Let's review what $800 buys us on the other side:

[attached image: spec sheet for an $800 PC]
 
Wow, thanks MacRumors for informing me that this new computer will be made out of matter :O
 
Hurry Up!!!

I feel like Bill the steam shovel... Seriously, if they are going to release this, they should just hurry up and do it. Maybe they'll release a low-end one with the Atom and a more "mid" machine with the normal processors, better graphics, etc. Either way, just get on and do it :apple: because I think most people are wondering how long it has been since it was updated. Oh! Only 529 days... Maybe at 1000 days they'll update it? I mean, that's only 3 years! :eek:
 
They're virtually identical. The memory interface is different, other than that it's the same as a 7900GT.

Go read Sony's formerly official specs and then the specs for the 7000 series. Both are available at Wikipedia, but no longer available on the PlayStation site.

That it's pretty much a 7900 class part, yes.

Don't twist my words.

If the PS3's GPU was "7900 class" then it would be able to run GTA4 at a resolution higher than 640p. The same way the Xbox360 and actual 7900 series cards can ;)
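To put rough numbers on that resolution gap (the render resolutions here are the commonly reported figures for GTA4, treated as assumptions rather than official specs):

```python
# Pixel-count comparison behind the "640p" point. Assumed figures:
# GTA4 reportedly rendered at 1152x640 on PS3 and 1280x720 on Xbox 360.
ps3_pixels = 1152 * 640     # 737,280 pixels
x360_pixels = 1280 * 720    # 921,600 pixels
print(ps3_pixels, x360_pixels, round(ps3_pixels / x360_pixels, 2))
# prints "737280 921600 0.8" -- the PS3 pushes ~80% as many pixels
```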

And yet amazingly IGN's review says it looks better. Also remember that the Playstation 3 doesn't have a scaler built in.

Look at the screenshots on the link I posted.

Also, go read everyone else's opinion. Especially those who have played both games.

Sony never claimed that, and the specs don't back it up. It's basically a 7900GT.

Since you like to ignore the links I posted before, I'll post again: http://en.wikipedia.org/wiki/GeForce_7_Series http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'

Yup, looks more like a cross between the 7600 and 7800. It even says "Based on NV47 Chip (Nvidia GeForce 7600 Architecture)"

Oh and go find Sony's actual video of the first PS3 demonstration. They say there that it was between a 7800 and 7900.

http://www.theinquirer.net/inquirer...gpu-slightly-less-powerful-than--geforce-7800 Your argument is finished ;)

That's why I said "at best". The 360 and PS3 weren't a match for the best PCs when they launched in '05 and '06 respectively, but they were at least competitive. That's why I said the original Xbox was probably the most powerful console relative to when it launched.

But in reality, it wasn't.

It was a Celeron. It was the exact same chip Intel sold as a 733MHz Celeron, except that they allowed it to run with the (by today's standards virtually identical) FSB speed of the equivalent Pentium 3. But yes, Celerons have always been stripped-down versions of Intel's current chip design.

Again, the Celeron WAS a Pentium 3. It just had half the cache running at a faster speed, and SOMETIMES the FSB speed was different. http://en.wikipedia.org/wiki/Xbox#Technical_specifications At that time it was not a "stripped down version", it WAS a Pentium 3.

No it didn't. The Coppermine Pentium 3 and its Celeron equivalent had on-die L2 cache that ran at the same clock speed as the rest of the chip.

I had Coppermine Celerons at that time. http://www.cpu-world.com/info/Intel/Celeron-vs-Pentium.html (scroll down a bit). Hey, look at that, we're both wrong: "32 KB L1 cache. 128 KB on-die L2 'Advanced Transfer Cache'". And from http://en.wikipedia.org/wiki/Xbox#Technical_specifications: "Custom Intel Coppermine-based processor". Sorry, NOT a Celeron.

As I said, it was months before something really clear-cut beat the Xbox's GPU. Granted, even the GeForce 2 I ran at the time was similar, and even better in some respects, but the Xbox's GPU was pretty clearly a high-end GPU when it launched, for several months. And at least as far as I can recall, that's the last time that happened.

And again, the Xbox GPU was just an enhanced GeForce3 that was outperformed by the GeForce3 Ti 500.

They weren't THAT much better though, which is my point. That's the closest a console's ever come.

What? The original Athlons running at the same clockspeed as Pentium 3s used to walk all over them. An Athlon Thunderbird running at 1.4GHz would literally be more than twice as fast as the Xbox CPU.

The GeForce Ti 500, while not "that much better" than the Xbox GPU was still better enough to push it over the edge.

What "real world performance" backs that up?

Games. Look at the links in the post you are replying to, which you seem to have ignored.

And so does the 7900. All that means is that these chips exceed the Direct X spec in some way. The Geforce 3/4 thing in the Xbox 1 exceeded Direct X 8, and to my knowledge, every GPU ever made exceeds the current Direct X spec in some way. They're never exactly built to the spec.

The GeForce 3 in the Xbox didn't exceed specs. It just had an extra pixel shader unit.

And, again, the PS3's RSX is based on the 7600 but weaker than the 7800 according to Nvidia. Here are the links again: http://www.theinquirer.net/inquirer...gpu-slightly-less-powerful-than--geforce-7800

http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer' It clearly says "Based on NV47 Chip (Nvidia GeForce 7600 Architecture)"

No, Oblivion and Bioshock did NOT have an extra year of development on the PS3. They had LESS development time on the PS3. Development was not done concurrently, which is what would have to be the case for your claim of an extra year's time.

I like how you skipped over the whole GTA4 on both platforms link I posted, while still trying to stick to the "it looks better on the PS3" argument earlier in your post, despite the fact that my link clearly showed that GTA4 on the Xbox360 looked considerably better.

And yes, Oblivion and Bioshock did receive an extra year of development time. First of all, they used a multi-platform engine. Second of all, the game's core features were already finished a year or more before the PS3 version's release. The character models, world, EVERYTHING was already done. Porting to the PS3 was a matter of translating some code and, in the case of Oblivion, enhancing textures. You'd be a fool to believe they were developed from the ground up.

No, it does not. The only claims of better color on the 360 I've ever heard were back when the PS3 was fairly new, and critics had misconfigured systems.

How convenient.

And yet of the several critics I've heard, not one has claimed that. Quite clearly the opposite.

Really? On my LED-backlit screen that Xbox VGA stream has considerably better color definition and sharpness. No PlayStation Vaseline.

No it wasn't, and no it doesn't. At the time I thought Gears was more visually interesting, but not better looking, and it's second gen software.

Gears of War was a first generation game. It was the first game for the Xbox360 by Epic and it was the first game to use the Unreal Engine 3. Therefore it is a first generation game in the sense that it is their first game and it was the first game to use that engine.

And yes, Gears of War looks better than Resistance. It even looks better than Resistance when running on my aluminum MacBook.

Yikes, okay, now it's pretty clear what you are, which explains all these random backwards claims. NO ONE claims Halo 3 looks particularly good. It's fine, but no one claims it looks as good as Gears or Resistance 1, and here you are claiming it looks better than Resistance 2.

Hello Fanboi!

You just automatically disqualified yourself from having any sort of credibility.

Halo 3's multiplayer isn't impressive, but show me a PS3 game with a single-player campaign that has equally impressive lighting. Oh that's right, you can't.

Please show me where Resistance 1 or 2 looks better than a 2004-era PC game as well.

And yet, mysteriously, critics (like IGN) who have had time with it say it's the best looking console game to date. Hmm.

They also gave the Legend of Zelda: Ocarina of Time a perfect 10 back in the day. Their credibility is where?

Fanboi! Fanboi! Geez, you're hilarious. Um, yeah, aside from seeing it with my own eyes, and every critic I've read raving about it, it just looks "average". Actually, critics (like the Giant Bomb guys) tend to say things like it's the most flawless graphical execution they've ever seen, with everything, every surface, just perfect. Now, personally I thought the virtual acting was better done in Mass Effect, but it's still one of the best-looking games in that regard too. "Average" is clearly not an apt description for it, regardless of whether you like the esoteric stories in those games.

You further discredit yourself and your entire argument by resorting to "fanboi" remarks.

http://www.gametrailers.com/game/1743.html Show me the high-resolution textures, please.

http://www.gamespot.com/ps3/adventu...convert&om_clk=gsimage&tag=images;header;more Here too, please.

http://image.com.com/gamespot/images/2008/164/926596_20080613_screen008.jpg I particularly like that shot. Reminds me of SOCOM 3.

http://image.com.com/gamespot/images/2008/164/926596_20080613_screen011.jpg Looks like SOCOM 3 meets Half-Life 2 on low settings.

http://image.com.com/gamespot/images/2008/141/926596_20080521_screen012.jpg FF12-quality character models?

He's exposed himself as a Fanboi. His religious feelings explain why he's making these claims. It's too bad, as he's right about the hardware in terms of PCs being better. Unfortunately his religious convictions make him see 360 games as looking better than PS3 games regardless of the reality.

That's funny, because I've provided actual screenshots and videos in both of my posts. All you can do is quote a reviewer that lost its credibility more than a decade ago. And on top of that, you drop to personal attacks.
 
yet completely OT and pointless.

Agreed.

Now, I think the topic was how we were going to set the Apple campus in flames for not updating the Mini for 18 months and counting.

If that fails, we go into an Apple Store and take hostages.

Wow, what if that really happened? I mean, what if a bunch of people with guns held people hostage and said, "we will release everybody when Apple announces an update to the Mac Mini." Would Apple cave in? Or would they say they don't respond to the demands of terrorists and such?
 
Apple does not want the Mac Mini to cannibalize iMac and MacBook sales (seeing as it will be a lower-priced machine), so they are going to neuter it.

Magnus, great to hear from you. So how are things in Cupertino? How's the new job up there at Apple? Glad to see they skipped any confidentiality agreements, so you can tell us all the 'facts' about what Apple is and is not going to do.

I mean you must work there since you talk like you know exactly what is going on.

You know that this stupid Atom rumor isn't just a rumor, that the important detail is that so far no one has claimed they actually ordered these parts rather than just taking them for a test drive?

Since you are so in the know, what's the scoop on the Apple TV? I hear they are going to ditch the iTunes-only approach and make it compatible with Netflix, Hulu, etc. as well. Any comment, oh insider wise one?
 
Also, go read everyone else's opinion. Especially those who have played both games.

No thanks. I think I'll actually take the word of critics, and my own eyes, over those of raving zealots.

Since you like to ignore the links I posted before, I'll post again:

I'm already completely familiar with them.

Yup, looks more like a cross between the 7600 and 7800.

No it doesn't. It looks like a modified 7900. The memory interface is different, and there's been a claim it only has the same number of ROPs as the 360, but that's not substantiated in your link. Regardless, it's basically a 7900GT.


That gives no actual information, and The Inquirer isn't a valid source, particularly when it comes to Nvidia. In this case, though, they don't even claim to be a valid source for anything.

Again, the Celeron WAS a Pentium 3. It just had half the cache running at a faster speed

The cache was *NOT* running at a faster speed. It did NOT run at a different clock speed than the Pentium 3 it's based on. The half cache *IS* what makes it a Celeron. Obviously it's a Pentium 3 variant, but the point is it's not even as powerful as a Pentium 3 733.

and SOMETIMES the FSB speed was different. http://en.wikipedia.org/wiki/Xbox#Technical_specifications At that time it was not a "stripped down version" it WAS A Pentium 3.

It was Pentium 3 DERIVED. Its cache makes it a Celeron. It's the exact same hardware you'd get if you walked into a store and bought a 733MHz Celeron (aside from them allowing the FSB to run 33% faster, which doesn't really have anything to do with the hardware).
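For what it's worth, the "33% faster" figure is just the FSB ratio (assumed figures: retail Coppermine Celerons shipped with a 100 MHz FSB, while the Xbox part was allowed the Pentium 3's 133 MHz bus):

```python
# FSB ratio sketch. Bus speeds are the commonly cited figures for
# retail Coppermine Celerons (100 MHz) vs. the Xbox CPU (133 MHz).
celeron_fsb_mhz = 100
xbox_fsb_mhz = 133
speedup = (xbox_fsb_mhz - celeron_fsb_mhz) / celeron_fsb_mhz
print(f"{speedup:.0%}")  # prints "33%"
```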

I had Coppermine Celerons at that time. http://www.cpu-world.com/info/Intel/Celeron-vs-Pentium.html (scroll down a bit). Hey, look at that, we're both wrong: "32 KB L1 cache. 128 KB on-die L2 'Advanced Transfer Cache'". And from http://en.wikipedia.org/wiki/Xbox#Technical_specifications: "Custom Intel Coppermine-based processor". Sorry, NOT a Celeron.

Huh? You just posted specs for that Celeron that are identical to the Xbox's CPU, and then claim I'm wrong? Well, of course you did...

And again, the Xbox GPU was just an enhanced GeForce3 that was outperformed by the GeForce3 Ti 500.

What does this have to do with my argument? Ditto for your CPU stuff. I've said exactly what my point with that was multiple times, so I shan't repeat it again.

The GeForce 3 in the Xbox didn't exceed specs. It just had an extra pixel shader unit.

Yes, it did exceed DirectX 8. Just as the GeForce 4 did. ALL GPUs do. My point is that your claim that the 360's GPU is special because it does is meaningless, given that so does the 7900, and so do the GeForce 4 and the Xbox 1's GPU. But then I already made my point...

And here you go again with your Inquirer links. Geez.

I like how you skipped over the whole GTA4 on both platforms link I posted

Because I'd already addressed it. Fanbois say one thing. Critics say another. I don't personally care, but I'll take the word of IGN and other critics I've read over people with religious feelings.

And yes, Oblivion and Bioshock did receive an extra year of development time. First of all, they used a multi-platform engine. Second of all, the game's core features were already finished a year or more before the PS3 version's release. The character models, world, EVERYTHING was already done.

That still doesn't make it an "extra year" of development time.

You'd be a fool to believe they were developed from the ground up.

Obviously not completely, but it was a substantial rewrite to run better than the 360 version. Just slapping the same code on results in something that runs worse.

Gears of War was a first generation game. It was the first game for the Xbox360 by Epic and it was the first game to use the Unreal Engine 3.

That's not how these things are defined, though as you say I suppose you could use the phrase "in a sense". But then in that sense Halo 3 is also first gen, etc., etc., etc.

And yes, Gears of War looks better than Resistance. It even looks better than Resistance when running on my aluminum MacBook.

No it doesn't, any more than any of these other claims you've made. I've little doubt it does *TO YOU*, but your faith makes it so.

Halo 3's multiplayer isn't impressive, but show me a PS3 game with a single-player campaign that has equally impressive lighting. Oh that's right, you can't.

I've never, ever, even among zealots, heard the claims you're making about Halo 3. Never have I heard that it can compare graphically to the games you're comparing it to. You're really out on a limb, even by the standards of your faith.

Please show me where Resistance 1 or 2 looks better than a 2004-era PC game as well.

2004 era PC games were where the current gen really started, so I could argue the same thing about virtually any console game. I think there's an argument to be made though that Gears and Resistance look better than Far Cry and Half Life 2 and Doom 3, though it's not a night and day thing.

They also gave the Legend of Zelda: Ocarina of Time a perfect 10 back in the day. Their credibility is where?

Yikes. Oh the faithful. Here in reality land Ocarina of Time was a significant work.

That's funny, because I've provided actual screenshots and videos in both of my posts. All you can do is quote a reviewer that lost its credibility more than a decade ago. And on top of that, you drop to personal attacks.

I've seen many of these games first hand, read reviews, seen shots of them running, and they all conflict with your faith.

You don't see your fervor. Sure, I could be nice to you, and I was when I just thought you were confused. But it's pointless to argue with the faithful. Your convictions prevent you from changing your mind. You ceased having any credibility when you started making claims like you did about Halo 3, and now again with Zelda: OOT. Those are out there even by zealot standards. As I said, I should have seen it earlier when you were making wild claims about the PS3's GPU, etc., but I wasn't expecting your kind here, and just assumed you didn't know.
 
Magnus, great to hear from you. So how are things in Cupertino? How's the new job up there at Apple? Glad to see they skipped any confidentiality agreements, so you can tell us all the 'facts' about what Apple is and is not going to do.

Well, I find it funny that when the fanboys threw this argument in my face after I complained about the lack of a mini-tower or low-end tower, it was A-OK, but when I point out the same thing against a better Mac Mini, knowing full well the stupid decisions Apple makes (e.g. selling one with a COMBO drive in 2009 for $599, cough cough), well, it seems like fair game to me.

Apple isn't going to sell what is essentially an iMac without a monitor for $599. That would be pretty much admitting their iMacs are WAY overpriced (the fact that they ARE overpriced is beside the point; this is Apple we're talking about; they charge what they want to charge). It's pretty clear that's why it hasn't gotten an update until this point.

Now, it's also obvious that only a fool pays $799 for something in the computer world that sold for $799 two years earlier. Two years is a long time in computing. So while it would be STUPID for Apple to sell a Mini with an Atom, I wouldn't put it past them by a long shot. Apple often does stupid things. If the thing can't do 1080p, it shouldn't even be in the next-generation AppleTV.

Since you are so in the know, what's the scoop on the Apple TV? I hear they are going to ditch the iTunes-only approach and make it compatible with Netflix, Hulu, etc. as well. Any comment, oh insider wise one?

Not going to happen. Apple doesn't compete with Apple. They want to sell you things. If they put Hulu on AppleTV, it defeats the entire point of AppleTV, which is a front for their store. Sure, it's also useful to stream music around the house, but they want you to stream music you bought at the iTunes store around the house. That's why they make it so difficult to convert video to AppleTV-acceptable formats, or more to the point, why AppleTV doesn't support any other standards when hacks show it obviously COULD.

Apple doesn't like competition and will do everything in its power to KILL competition. The whole Psystar thing should have made that obvious by now. And no, it doesn't take a spy to figure out the obvious like the above. Apple has demonstrated the same patterns over and over. No Flash or competing browser platforms on iPhone/Touch, no Hulu... heck, they don't even want to give you copy and paste for fear you might use it to somehow circumvent something in the iPhone that will cost them money. Notice how you are not allowed to use the iPhone as a flash drive like the other iPods. There is ZERO technical reason why it cannot be used that way. But that would make it easy for you to get files onto the phone and thus circumvent their money-making decisions.

If Apple could get away with it, they would make a single app store for the Mac lines as well. It's pretty *DARN* lucrative to shave 30% off the backs of every programmer out there! That's the scam they're being allowed to get away with for the iPhone/Touch, with no other software outlets. In the free market that would be called a monopoly, but it's lowly Apple, not Microsoft, so it's OK to take 30% of the profits for just hosting a file with no other alternatives. :rolleyes:

Yeah, I work in Cupertino, because everything I said is SO top secret.
 
They're selling enough of them, otherwise they would've killed off the Mini a while ago.

I am sure people bought them when they were last updated but the thing is close to two years old. I don't think they are flying off the shelves now.
 
Agreed.

Now, I think the topic was how we were going to set the Apple campus in flames for not updating the Mini for 18 months and counting.

If that fails, we go into an Apple Store and take hostages.

Wow, what if that really happened? I mean, what if a bunch of people with guns held people hostage and said, "we will release everybody when Apple announces an update to the Mac Mini." Would Apple cave in? Or would they say they don't respond to the demands of terrorists and such?

Apple doesn't negotiate with their customers. It's unlikely they'll take orders from rude, fickle, bad-tempered know-it-alls no matter how clever they are. Oh, wait a minute, they already do...
 
They're selling enough of them, otherwise they would've killed off the Mini a while ago.

I agree with you 100%. I think everyone has lost sight of what the Mini was supposed to be. The current model is fine for kids' rooms, checking email, and doing light surfing, which is all it was ever designed for. I think the bigger problem is that so many of you are trying to make it a primary production machine, which it was never designed for.
 
I agree with you 100%. I think everyone has lost sight of what the Mini was supposed to be. The current model is fine for kids' rooms, checking email, and doing light surfing, which is all it was ever designed for. I think the bigger problem is that so many of you are trying to make it a primary production machine, which it was never designed for.

And the winner is..........! :D
 
Because Apple has never designed...

I agree with you 100%. I think everyone has lost sight of what the Mini was supposed to be. The current model is fine for kids' rooms, checking email, and doing light surfing, which is all it was ever designed for. I think the bigger problem is that so many of you are trying to make it a primary production machine, which it was never designed for.

...because Apple hasn't produced an affordable desktop mini-tower, which is an even bigger problem.
 
I'm sitting on the fence waiting for the next Mac desktops. I need something with way more horsepower than the MacBook I am using now to hold me over (2.16 GHz C2D, 2GB RAM). I need something that can take a boatload of RAM and has quad-core processing power, along with good graphics.

A Mac Pro would fit me perfectly, but I don't have $3000 for a computer. The only choices Apple offers me now are desktops with outdated laptop hardware, and a desktop with high-end workstation hardware (from a year ago).

I hate to say it, but if Apple doesn't get out a good desktop machine within the next month, I am likely going back to Windows, where I can have a 3 GHz quad-core system, 8GB RAM, a 1GB 4870 video card, and 2x640GB hard drives for around $1100. For a hair under $1600, I can have a Core i7 system with 12GB DDR3, 2x640GB hard drives, a 1GB 4870 video card, case, mobo, DVD+-RW drive, etc. With that, I can take advantage of Adobe CS4 64-bit and also handle bigger video editing projects.

There is no reason why Apple can't do a good desktop system for around $1700-$2000 with quad core, lots of RAM, and finally a good video card, which I have not seen offered in Apple systems in ages.

That's a view from a long-time user who is now getting squeezed out of Apple, because the only type of system they have that would fit my needs is going to cost around $3000+.
 
An Apple TV and a media server/home server make sense to me. An earlier poster mentioned that it would fit well in a Time Capsule... which would make a good form factor for a media server in my mind.

My thoughts exactly. So far, most of the people I know who have a Mac mini use them as a home/small business server. Actually, I was waiting for Apple to announce a new Mac mini or Mac home server of some kind at Macworld '09.

I don't know that much about the Atom CPU, but would it be fast enough for serious networking and server-like tasks? Like 50-80MB/sec read/write over gigabit networks. I mean, the current Time Capsule or AirPort Extreme certainly isn't fast in that regard.
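A quick sanity check on that 50-80MB/sec figure (rough numbers: 1 Gbit/s line rate, minus a ballpark ~8% for Ethernet/IP/TCP framing overhead; the overhead factor is an assumption, not a measurement):

```python
# Gigabit Ethernet ceiling, back-of-the-envelope.
line_rate_mb_per_s = 1_000_000_000 / 8 / 1_000_000  # 125 MB/s raw
usable_mb_per_s = line_rate_mb_per_s * 0.92         # ~115 MB/s after framing
print(round(line_rate_mb_per_s), round(usable_mb_per_s))  # prints "125 115"
```

So 50-80MB/sec is comfortably under what the wire itself allows; whether the CPU and disk can keep up is the real question.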
 
I'm sitting on the fence waiting for the next Mac desktops. I need something with way more horsepower than the MacBook I am using now to hold me over (2.16 GHz C2D, 2GB RAM). I need something that can take a boatload of RAM and has quad-core processing power, along with good graphics.

A Mac Pro would fit me perfectly, but I don't have $3000 for a computer. The only choices Apple offers me now are desktops with outdated laptop hardware, and a desktop with high-end workstation hardware (from a year ago).

I hate to say it, but if Apple doesn't get out a good desktop machine within the next month, I am likely going back to Windows, where I can have a 3 GHz quad-core system, 8GB RAM, a 1GB 4870 video card, and 2x640GB hard drives for around $1100. For a hair under $1600, I can have a Core i7 system with 12GB DDR3, 2x640GB hard drives, a 1GB 4870 video card, case, mobo, DVD+-RW drive, etc. With that, I can take advantage of Adobe CS4 64-bit and also handle bigger video editing projects.

There is no reason why Apple can't do a good desktop system for around $1700-$2000 with quad core, lots of RAM, and finally a good video card, which I have not seen offered in Apple systems in ages.

That's a view from a long-time user who is now getting squeezed out of Apple, because the only type of system they have that would fit my needs is going to cost around $3000+.

Welcome to my world! Long-time Apple user here also, who jumped ship a few months ago and is now the proud owner of two XPS boxes, both with quad processors and both under $1000. I also bought the top-of-the-line Studio Hybrid for the family box and we are just tickled to death! As stated in a post I made late last year, I spent tons of my hard-earned money with Apple but jumped ship after the comments made by SJ during the conference call late last year, which can be found here on MR.
 