slow but really cheap

This wouldn't be a traditional desktop, but it could be extremely cheap for a Mac desktop. It could be a "nettop". I'm all for it, just to be able to run a family Mac for less money. I'm still using a 4 1/2 year old PowerBook because I simply can't afford anything new. This could be a great product for me, just not for the vast majority of current Mac users. It would also be a great product for kids, true switchers (thanks to the price cut), and your basic computer user. Most of my friends with Macs only use them for word processing, music, and the internet anyway. Hell, if I can still use Photoshop and do basic movie editing on my PowerBook, I could do it on this.
 
I personally believe netbooks are just a fad. They are today's version of the subnotebooks of the past.

I remember when subnotebooks hit the market 5+ years ago, then died off soon after. Granted, subnotebooks at that time were pricey: around $1,000 for the low end and $3,000 for the high end. Netbooks (today's subnotebooks) made a comeback because of their low cost. But once people realize how much these machines underperform, I don't think anyone will keep buying them. That's my guess, based on my own experience buying one. I bought one and never used it because it was easier to do work on my notebook. So yes, the price of these netbooks is making them fly off the shelves. But after a year of use, people will trade them in for something with a slightly bigger display, a full-size keyboard, and more speed.

Subnotebooks from major vendors years ago.

Toshiba
http://w3.ualg.pt/~rsolivei/libretto.jpg

IBM
http://www.bigkey.com/pic/ibm/laptop/5069_ibm_thinkpad_560e_p166-48-2.1gig.jpg

Sony
http://pdaphonehome.com/forums/atta...oking-phone-swap-my-sony-subnotebook-63_1.jpg
 
I doubt there'll be any AMD processors in a Mac Mini. The reason is that Intel chips are extremely cheap; at the moment a 2.2 GHz Core 2 Duo goes for $87.00 CAD. It has less cache (1 MB), but that should still be fine unless you're into video encoding. Some people don't know this, but the new Intel "Dual Core" processors are actually C2Ds: the same chip as the C2D but with less cache. Intel doesn't advertise this, so people buy its higher-end CPUs. You can Google it and find out. By the way, I'm not referring to the Pentium Ds (those are slow), but the new "Dual Cores" like the E2200 really are C2D silicon. I myself have an E2160 @ 3.2 GHz :].
 
One more thing

This definitely wouldn't cannibalize iMac or Mac Pro sales, for a few reasons. People like iMacs because they're all-in-one systems. They love getting the screen with it and the way it looks, plus it's very powerful for an all-in-one. The Mac Pro will always stay the same because pros need something powerful. People don't buy iMacs because they're the cheapest Mac. Apple thought that would be the reason to buy the Mac mini, but it's not cheap enough to compete with PC manufacturers' cheapest desktops. A slower Mac mini doesn't mean it will be crap, just crap if you try to run pro apps on it. I think this is a great way to boost sales and bring people to Apple who will eventually want the big, expensive computers.
 
I'm not sure about the fad thing. A lot of people might have purchased subnotebooks before as a third or fourth computer if not for the super high prices. Start them at $250, though, and people may find uses for them.
 
I believe I'll be pretty disappointed with this update. Shouldn't a newer model be faster than the one it replaces?

I guess I'll save the money for my new MacBook or iPod tablet. :mad:
 
Check the benchmark table.... It isn't slow.
 

Apple is well known for making thin devices, and that focus on thinness could limit the hardware specs of an Apple netbook.

Second, and I think more importantly, is OS X mobile. Netbooks right now are shoehorning a desktop OS onto a tiny device. What you need/want is an OS designed around a small screen.
I believe that is exactly what will happen. The rumored mini-tablet might first have been designed (per a 2007 AppleInsider rumor) as an Apple MID or something similar with a 5.5" display, but then changed in 2008 into an Apple netbook-like device with a 7"~9" display.

The mini-tablet does not have to use the same OS as the iPhone and iPod touch, though. It will likely, and should, have additional features like multitasking, more editing features, mobile iLife/iWork, and windowed/split-screen views.
 
So you're comparing a high end expensive CPU from 2006 to a console CPU from 2006? You're comparing an 8800GT to a 7800 console GPU.

Well, according to Sony, the "Cell" was a "super computer on a chip" and "Faster than anything else".

Even "low end" processors from 2006 were faster, such as Athlon64 X2s.

Don't forget that the actual 7800 series was a high end PC product at the time as well.

The SPUs are very powerful, and Cell gets over 200 GFLOPS for things like physics and video processing. It's a 2006 console CPU designed to a price point.

Designed to a price point? Sony dumped over a billion dollars into the Cell's development. It's probably the most expensive CPU design ever. Which makes things even funnier, considering it's just a PPC core with a bunch of co-processors.

Also, Sony is the only company to ever claim 200GFLOPs of performance. Neither IBM nor Toshiba, co-developers of the Cell, have come out and said such a thing. Sony has also removed all but one reference to that kind of processing power from the Playstation3 website. It's buried back in the old original press release for the PS3 from 2005. But Sony has been known to remove press releases if they point out product shortcomings. Back when the PS2 HDD was released, there was a big stink at the forums as to why the HDD was lacking all of the features Sony promised. The forum posters grabbed all of the press releases where Sony promised the world and posted them asking why we didn't get those features. Sony's response? Remove the press releases.

Sony also made outrageous claims regarding the RSX in the PS3 that nvidia has never backed up. If you were to believe Sony's claims, the RSX should be almost as powerful as the GeForce GTX 295 when it comes to FLOP performance.

No, it's a GeForce 7800. http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'

That's why your PC is faster, it's got a GeForce 8800, which was a far faster GPU when it came out than the 7800.

Actually, no. The RSX is based on the technology, but not even nvidia backs up those specs. http://en.wikipedia.org/wiki/GeForce_7_Series nvidia's 7800 and 7900 series specs are quite a bit different than Sony's claimed specs, which have also been removed from the Playstation3 website.

If you look at those specs, it actually seems to be a cross between the 7600GT and 7800.

But real world performance would suggest it's closer to the 7600GT, considering the PS3 couldn't even run GTA4 at a solid 30fps at 1024x640, with lower-resolution textures than the Xbox360 version and the patented PlayStation Vaseline Effect smearing it all.

Most developers say that both consoles are fairly equal overall, with pros and cons, basically the 360 could look a bit better, but the PS3 game could have better physics.

Every single multi-platform developer interview I have ever seen has stated that the Xbox360 is the better overall platform and far more capable.

Well now this is really getting off topic, but it's pretty clear both from actual game and developer comments that the PS3 can look better, but is harder to code for (and if care isn't taken it can look worse).

That's false. Developer comments have stated otherwise.

Plus the games themselves prove it as well. Again, GTA4. Case in point. Runs at 1024x640 versus 1280x720 on the Xbox360, plus the Xbox version has higher resolution textures and no blurring effect.

But yeah, the PS3 uses basically a 7900 class GPU.

Actually, Sony's former comments were that it was based on 7800 technology, but (looking at the specs) it has more in common with the 7600GT than the 7800 series.

But just like with the Cell specs, Sony has retracted all but one buried comment regarding the PS3's power.

This is really the same situation we've always seen: at best, consoles are roughly equivalent to PC hardware when they launch, and then quickly fall behind.

Well, equal PC hardware at the time of the PS3's launch could at least push UT3 at 60fps at the same resolution as the PS3, but higher detail settings.

It took several months after the launch of the Xbox 1 before there was a significantly more powerful GPU available on PCs, and the 733MHz Celeron was pretty solid for when it launched.

Actually, the CPU in the Xbox wasn't a "Celeron". Many people mistakenly called it a "Celeron" or a "Pentium/Celeron hybrid". Most people fail to realize that, at that time, the Celeron WAS a Pentium 3. The difference was that the Celeron had half the cache of the Pentium 3, disabled either by choice or because of a defect, but that cache ran at full chip speed. The Pentium 3 had double the cache but ran it at half chip speed. Other than that, the processors were exactly the same. Intel would just take a P3 with defective cache, disable the defective half, run it at full speed, and put a Celeron sticker on it. The Xbox CPU simply had the full cache running at full chip speed.

The Xbox GPU was just a GeForce 3 with an extra pixel shader unit. The GeForce 3 Ti 500 was available before the Xbox and more powerful.

Plus 2001 saw the "Thunderbird" based Athlons running at 1.4GHz. When the Xbox launched in late 2001, we already had Athlon XPs running at 1.5GHz and the GeForce 3 Ti 500, so we had PCs then that were considerably more powerful than the original Xbox itself.

(In the case of the 360, the GeForce 7800/7900 series was already out; in the case of the PS3, the 8800 series was already out.)

According to ATI (and Microsoft hasn't changed or removed specs, like Sony has), the Xbox360 GPU is more powerful than the 7900 series (and real world game performance backs that up). ATI also claims that the Xbox360 GPU has "some DirectX 10 features".

Okay, this is where your argument falls apart. As that other guy mentioned, it's basically a 7900GTX, and while it's not a unified architecture, it's higher end than the 360's GPU.

Actually, no, it's not. Again, Sony's own specs (which have since been recanted and removed, also linked earlier in the thread by another poster) claim it's based on the 7800. But if you go to the Wikipedia links for the GeForce 7 series and the RSX and look at the two, you'll find that it's in between the 7600 and 7800.

Offhand I can think of Bioshock, Mercenaries 2, Burnout Paradise, and Grand Theft Auto 4 (there's some debate with that one, but I'm going by IGN's review).

Grand Theft Auto 4? Are you kidding me? http://www.gamespot.com/features/6202552/p-4.html GTA4 on the PS3 looks absolutely TERRIBLE.

Bioshock? http://gamevideos.1up.com/video/id/21388 Xbox360 has better coloring, but otherwise looks the same. Plus it also came out what? A full year after the Xbox360 version did. Just like Oblivion, it had an extra year in development.

Mercenaries 2? http://www.gametrailers.com/player/39321.html take a look for yourself (HD video) The Xbox360 version has noticeably better color definition.

Burnout Paradise? http://www.gametrailers.com/player/29926.html?r=1&type=wmv The Xbox360 VGA connection (no HDMI at that time) looks significantly better all around.

Plus if you look at PS3 exclusives, and even first generation PS3 games versus first generation 360 games, it's pretty clear it does have more potential.

Really?

Gears of War was a first generation Xbox360 exclusive. It looks FAR better than Resistance.

Halo 3 looks better than Resistance 2, despite being a year older.

Gears of War 2 looks better than Resistance 2 and Killzone 2. Killzone 2 doesn't even look as good as many of the games out there, and it has so much PlayStation Vaseline that you can't even really get a clear look at anything. It doesn't even look half as good as the trailer from 2005, which the developers finally admitted was not in-game footage.

What else is there? Lair? That game was a slideshow. MGS4? It looks no better than what we've seen in average games for years now. In fact, it has lower resolution textures than most average games.

Gran Turismo 5? Sure the car models look fantastic, but you still have those 2 polygon trees and the environmental detail is so low that even GRID running on my MacBook on medium settings has significantly higher environmental detail than that game.

(I suspect a lot of developers initially were just dumping 360 code designed to run on three CPUs onto the PS3's single main CPU, which would pretty much be exactly in line with the performance difference we saw in some early multiplatform games.)

CPU code has nothing to do with the fact that games like GTA4 run at 640p on the PS3 compared to 720p on the Xbox360. That's just a weaker GPU. That's a difference of roughly 266,000 pixels per frame, you know. Absolutely nothing to sneeze at.
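For anyone who wants to check that number, it's straight arithmetic on the two resolutions quoted above; here's a throwaway C snippet (nothing console-specific, purely illustrative):

    #include <stdio.h>

    int main(void) {
        long xbox360 = 1280L * 720;   /* 921,600 pixels per frame */
        long ps3     = 1024L * 640;   /* 655,360 pixels per frame */
        /* Difference: 266,240 pixels, roughly a 29% cut per frame */
        printf("difference: %ld pixels (%.0f%% fewer per frame)\n",
               xbox360 - ps3, 100.0 * (xbox360 - ps3) / xbox360);
        return 0;
    }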

And no more taking this thread off topic ;) That's it on the whole PS3 discussion.
 
GPU power

You won't see these before the release of Snow Leopard. Snow Leopard is the perfect platform for a relatively weak CPU coupled with a comparatively strong GPU. A micro Mac should be able to handle web browsing, email, iWork, (HD) video, and picture management (including fast, high quality scaling, facial recognition, ...) just fine. For the first three, an Atom would be just enough; for the latter you need either a strong CPU or a GPU tightly integrated into your media applications. On Snow Leopard, most people wouldn't really find an Atom Mac to be too slow for their needs.

The only things you couldn't do comfortably: legacy video & audio encoding apps, games, and only partly accelerated pro apps like Photoshop. The former aren't Apple's "dream" mass customer base anyway (Apple would rather they buy from iTunes).
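To make the "GPU tightly integrated into your media applications" point concrete: the plumbing Snow Leopard is expected to ship for this is OpenCL, which lets an app push per-pixel work (scaling, filtering, the heavy parts of facial recognition) onto the GPU instead of a weak Atom. Below is a minimal, purely hypothetical sketch using the plain OpenCL 1.0 C API, a nearest-neighbour 2x image downscale with error handling omitted; it's not anything Apple has shown, just the flavour of how a media app could offload that work:

    /* Hypothetical sketch: offload a 2x nearest-neighbour downscale to the GPU via OpenCL.
       Assumes an OpenCL 1.0 runtime; on Mac OS X 10.6 the header is <OpenCL/opencl.h>. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    static const char *kernel_src =
        "__kernel void downscale(__global const float *in, __global float *out,\n"
        "                        const int in_w) {\n"
        "    int x = get_global_id(0);   /* one work-item per output pixel */\n"
        "    int y = get_global_id(1);\n"
        "    int out_w = in_w / 2;\n"
        "    out[y * out_w + x] = in[(2 * y) * in_w + (2 * x)];\n"
        "}\n";

    int main(void) {
        enum { IN_W = 8, IN_H = 8, OUT_W = IN_W / 2, OUT_H = IN_H / 2 };
        float in[IN_W * IN_H], out[OUT_W * OUT_H];
        for (int i = 0; i < IN_W * IN_H; i++) in[i] = (float)i;  /* dummy "image" */

        cl_platform_id platform;
        cl_device_id device;
        cl_int err;
        clGetPlatformIDs(1, &platform, NULL);
        /* Prefer the GPU, but fall back to the CPU so the sketch still runs anywhere. */
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
        cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, &err);
        clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
        cl_kernel kern = clCreateKernel(prog, "downscale", &err);

        cl_mem in_buf  = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                        sizeof in, in, &err);
        cl_mem out_buf = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof out, NULL, &err);

        int in_w = IN_W;
        clSetKernelArg(kern, 0, sizeof in_buf, &in_buf);
        clSetKernelArg(kern, 1, sizeof out_buf, &out_buf);
        clSetKernelArg(kern, 2, sizeof in_w, &in_w);

        size_t global[2] = { OUT_W, OUT_H };
        clEnqueueNDRangeKernel(queue, kern, 2, NULL, global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(queue, out_buf, CL_TRUE, 0, sizeof out, out, 0, NULL, NULL);

        printf("first output pixel: %.1f (expected 0.0)\n", out[0]);

        clReleaseMemObject(in_buf);
        clReleaseMemObject(out_buf);
        clReleaseKernel(kern);
        clReleaseProgram(prog);
        clReleaseCommandQueue(queue);
        clReleaseContext(ctx);
        return 0;
    }

Whether Apple ends up exposing this through Core Image, QuickTime or iLife directly is anyone's guess; the point is simply that the per-pixel heavy lifting doesn't have to touch the Atom.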
 
And no more taking this thread off topic ;) That's it on the whole PS3 discussion.

I'll ignore that comment and congratulate you on your decent summary :) but you forgot to mention how the limited memory of both systems came into play. The dedicated video memory on the PS3 (they seemed to be designing a spec sheet more than a games console?), coupled with Sony's legendary software engineering, has caused some pretty serious headaches, limitations and delays. Separately, I wish beyond belief that they had both opened up their systems (and I certainly wouldn't be considering computers like the mini if they had); Microsoft specifically designed a system that you couldn't open up, and Sony did something unexplainable with theirs :( That's my simpleton's addition to the subject.


You won't see these before the release of Snow Leopard. Snow Leopard is the perfect platform for a relatively weak CPU coupled with a comparatively strong GPU. A micro Mac should be able to handle web browsing, email, iWork, (HD) video, and picture management (including fast, high quality scaling, facial recognition, ...) just fine. For the first three, an Atom would be just enough; for the latter you need either a strong CPU or a GPU tightly integrated into your media applications. On Snow Leopard, most people wouldn't really find an Atom Mac to be too slow for their needs.

The only things you couldn't do comfortably: legacy video & audio encoding apps, games, and only partly accelerated pro apps like Photoshop. The former aren't Apple's "dream" mass customer base anyway (Apple would rather they buy from iTunes).

You'd need to wait a long time for Snow Leopard to be released and for software to mature. Even then, I can't see a GPU helping much with half the things you mentioned, and it can with half the ones you didn't :)
Simple web browsing, email, iWork etc. are fine on the Atom, as they are on pretty much any CPU made in the last good while, and things like scaling are practically free on any GPU. I can't really see any huge advances coming for average desktop use, despite the silly claims some people make.
I'm not really interested in video decoding until it's done with a nice open spec and not reliant on, and limited by, specific encodes with some proprietary GPU driver. Regardless of the OS libraries, I'm pretty sure all the (decent) work will be done by OSS groups like ffmpeg etc., and then I'll join in with you guys saying "who needs a beefy CPU to encode/decode HD video? The Atom is fine and reasonably priced" and buy into it. Until then I'll stick with a 3+ GHz C2D and the software I want, or a super cheap Sigma-based gizmo.
 
Call me confused, but the quote in the article says that Apple received samples and prototypes of the chipset, NOT that they have ordered supplies. Right?

So basically they took a look, but no one is saying they agreed it was fantastic and "let's get a million of them to use in something."

Or did I read that quote wrong?
 
A Pentium 4 with GMA950 won't play 1080p, whereas an Atom with nVidia Ion can... :eek:

Christ...
Since when are there Mac Minis with a Pentium 4?
Last I remember, it had a Core 2 Duo running at least 1.8 GHz.

And still, your Pentium 4 (with at least 2 GHz) can play 1080p just fine if you use a decent codec. Try CoreAVC.
 
Not anymore. It WAS a custom package when it first launched (the CPU itself wasn't any smaller), but it's been available to everyone for ages now.


Let's agree that it is not custom-designed for Apple anymore (as was the case for the old MBA). It is still a CPU specifically packaged for SFF designs. It is NOT the same CPU as in the MB.
 
If Apple doesn't get into this market, then MacBook sales will die anyway. Why? Because most consumers don't need a $1,400 MacBook when they can surf the internet, check email, run general applications, etc. on a $500 netbook.


Sorry, but there were notebooks available for 600 USD a long time before the netbook mania started. A long time before MacBooks sold really well, actually. Your point doesn't hold.

Netbooks are aimed at and bought for different purposes. Usually they complement a "real" computer. No one would buy a netbook and not own a PC/Mac to store data, do real work, etc.
 
And still, your Pentium 4 (with at least 2 GHz) can play 1080p just fine if you use a decent codec. Try CoreAVC.

CoreAVC is quite decent and pretty much the only solution for older systems, but it is commercial, Windows-only, and cheats at decoding.
 
Every single multi-platform developer interview I have ever seen has stated that the Xbox360 is the better overall platform and far more capable.


I wonder who you were talking to. There is no doubt whatsoever that the PS3 has far more powerful hardware than the Xbox360. Check the numbers. However, it also seems to be the case that programming the PS3 is a pain in the butt.

BTW, the most expensive (and most useless) CPU design for sure is the Itanium.
 
I wonder who you were talking to. There is no doubt whatsoever that the PS3 has far more powerful hardware than the Xbox360. Check the numbers. However, it also seems to be the case that programming the PS3 is a pain in the butt.

I wonder who you were talking to. Or are you just looking at the (wrong) numbers :)
 
CoreAVC is quite decent and pretty much the only solution for older systems, but it is commercial, Windows-only, and cheats at decoding.


I'm sure you can enlighten us on that last statement. I'm interested.

Considering that the poster mentioned a P4, I assumed there was a Windows machine at work anyway. I wish Apple had picked up Core's work. AVC decoding on Intel CPUs has always been a weak spot for them. G5 Quicktime was so much more efficient.
 
Last time I checked, CoreAVC wasn't standards-compliant. It used to be open source too (still waiting for the vaporware BetaPlayer!), if Apple had wanted to help out.
 
This logic doesn't fly now, nor has it ever. As always, by that logic you could also use a Pentium 2, or a G3, or whatever.

And actually it's MUCH more than 10% who use their computers for games or other things that need power. I'm so tired of that myth.

Actually, as far as I can tell, yes: most computer users would be fine with a P2 or G3 (it's only bloated operating systems, games, and pro apps that blew computer requirements up so much).

My wife uses my G4 Mac mini exclusively, with no problems at all.

There are around 60 staff at work with computers; the only ones newer than 5 years old are the ones that replaced PCs that eventually died.
Other than that, we have only replaced the CRTs with LCD screens.

My parents all use computers that a pro user would consider past their expiry date. The only exception is my mum, who won a MacBook Air in a radio competition.

WE ARE NOT MOST PEOPLE. We are the whiny 10%.

BTW, I personally wanted to buy a new high-powered Mac mini, so I would be as disappointed as you guys if it only came out as an Atom-based model.

But from a marketing standpoint, if they release an Atom model for less than $499, I can see the business case for Apple to do it....
 
The Nvidia 9400M is a nice step forward.

The 1.6 GHz Atom is a huge step backwards.

This makes it something I am no longer interested in.
 
What I want in a mini is:

  1. Dual DVI to support multiple monitors
  2. More modern graphics to support modern video games
  3. eSATA port to support large, fast, external drives
  4. Faster dual or quad core CPU
  5. Ability to expand addressable RAM to 8GB or more
  6. Gigabit Ethernet

That's pretty much identical to what I want. And I've been waiting for it for about 2 years now. If Apple fails to finally release such a machine I'm afraid it will be Hackintosh time. I'm not dropping $2500 on a tower.
 