When Is The Mac Pro Getting An Upgrade

MINI! MINI! MINI!

When is the Mac Pro coming out? It looks like the new i7 chips are out. Any ideas?
 
Nope. The other poster is completely correct and you have serious misconceptions about the computer marketplace.

Sorry, but I disagree with you and I see no reason to change my mind based on your ranting. I understand the computer marketplace just fine. I was never suggesting that a Netbook should be used to replace a computer for those who need real computing power.

The origin of this post came from a comment that Apple would never make a Netbook because it would eat into Macbook sales. My opinion was (and still is) that Apple needs to enter the Netbook market because sales of traditional notebooks are going to be threatened anyway. I stand by the assertion that most consumers do not need the full power of a traditional computer. 95% of what consumers want to do can be done with a combination of Netbook power and "Cloud Computing".

The build quality of most of the netbooks is much poorer than that of recent generation $600 notebooks. That's because they are being built to a much lower price point and trying to find every way to shed weight, including going to flimsier construction.

This is ridiculous. Is the build quality of the iPod touch worse than the build quality of the iBook? Is the build quality of the Dell Mini 9 worse than the full-sized Inspiron? I see no evidence for your claim.


SSD drives save power, but have much lower write performance and much more limited capacity. 64GB capacity is pretty sad when my iPod has 120GB.

SSDs have no moving parts and are more durable than hard drives with magnetic heads. They also have much better read speeds; in fact, the newest SSDs, when connected to a PCIe bus, are reaching throughput of over 800Mbps. Your 120GB iPod classic costs as much as, if not more than, a Dell Mini netbook, and the magnetic drive it uses is more susceptible to failure. The future is moving toward SSD storage as prices drop and capacity increases at a faster rate than HDD storage.
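To put those throughput figures in perspective, here is a rough back-of-the-envelope sketch in Python. The 800Mbps number is the one quoted above; the ~240Mbps figure for a small iPod-class hard drive is only an assumption for comparison, not a measurement.

def transfer_seconds(size_gb, throughput_mbit_per_s):
    """Seconds needed to move size_gb at the given line rate."""
    size_bits = size_gb * 1024 ** 3 * 8          # GB -> bits
    return size_bits / (throughput_mbit_per_s * 1_000_000)

dvd_iso_gb = 4.7
print(f"PCIe SSD  (800 Mbps, as claimed): {transfer_seconds(dvd_iso_gb, 800):6.0f} s")
print(f"Small HDD (240 Mbps, assumed):    {transfer_seconds(dvd_iso_gb, 240):6.0f} s")

At those rates a 4.7GB DVD image moves in under a minute on the SSD and takes a few minutes on the small drive.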

Marketers love people like you. They invent terms like "cloud computing" and you're right on board. Ever try "cloud computing" when you're actually in the clouds on a multi-hour plane flight (where there is no Internet connectivity)? Ever try to copy a DVD operating system ISO to "mass storage on the web" using the typical hotel "broadband" connection (which is often one cable modem shared between a hundred or more rooms)? I bet you were one of those guys telling everyone how tablet computers were going to replace notebooks and how the wave of the future was "Internet appliances."

First, I am not right on board with "Cloud Computing" because of marketing. You do not know me, so please do not make statements that presume so. I am not a fan of tablet computers or Internet appliances. Obviously you use the cloud when it is feasible; if it isn't feasible, don't use it. That is what local storage is for. I want to point out that most consumers do not need to be copying DVD OS ISOs. Besides, a better way would be to copy the OS ISO to an SD card. That way, when you need it, you plug it into a card reader and there is your ISO.
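As a minimal sketch of the SD-card idea (the paths below are hypothetical, and this assumes the card is already mounted), copying and verifying the image is one copy plus a checksum:

import hashlib
import shutil

def sha256(path, chunk=1 << 20):
    """Hash a file in 1MB chunks so a large ISO doesn't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

src = "/Users/me/Downloads/install_disc.iso"   # hypothetical source ISO
dst = "/Volumes/SDCARD/install_disc.iso"       # hypothetical mounted SD card

shutil.copy(src, dst)
assert sha256(src) == sha256(dst), "copy did not verify"
print("ISO copied to the SD card and verified")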


I've got a DVD that I would like to watch on my flight from Virginia to California on Monday. Later on, I'd like to rip it and convert it to H.264 to put on my iPod Classic. I'm probably going to be burning some MP3s to play in my rental car, which has an in-dash MP3 player that reads CD-R discs. While I'm on my business trip, I'll probably want to play some first-person shooters (Quake III, Unreal Tournament, etc.) in my hotel room. There's also a multiplayer game I've been hearing about called Armada Online, and it requires at least 1024x768 resolution, so I'll need a netbook that can do that. I'm hoping to edit together a video using footage from the HD camcorder I'm taking. I'll have to work on some work-related documents and I will be typing for several hours, so I need a decent keyboard and display.

So, which netbook do you recommend for the above? It will need at least 1024x768 resolution, a combo DVD/CD-R drive, a full-size keyboard, enough processing power to do video conversions to put on an iPod. It will have to have good enough CPU horsepower and 3D video acceleration to play a first-person shooter. It will need to be capable of light video editing.

Look, if that is what you really want to do while on the go (which seems very contrived, by the way), then you need a $2000 laptop. But most consumers could do similar tasks with a $400 - $500 netbook. For instance, instead of a DVD, why not download the movie and put it on your iPod and netbook? Or rip the DVD in advance and put it on your iPod. The same goes for the music: buy it from the iTunes Store and you can play it on your iPod or your netbook. Why burn to a CD-R? Most MP3-capable car stereos also have connections for iPods, and some even use wireless options to connect to a music library. As for the games, they won't play much better on a $1500 laptop anyway. If Apple designed an accelerometer controller into their version of the netbook, then you could play high-quality games like those for the iPod touch. Hopefully Apple will release a mini MacBook with a full-size keyboard similar to the Dell Mini 9's, a high-res LED display, and a GPU equivalent to the 9400M for light video editing and light 3D gaming.
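For the rip-in-advance step, something like the following Python wrapper around ffmpeg would do it. ffmpeg is assumed to be installed separately, the filenames are hypothetical, and the flags are just one reasonable iPod-Classic-friendly H.264 recipe, not the only one:

import subprocess

def to_ipod_h264(src, dst):
    """Transcode src to baseline H.264, 640px wide, which the iPod Classic can play."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-profile:v", "baseline", "-level", "3.0",
        "-vf", "scale=640:-2",            # keep aspect ratio, cap width at 640
        "-c:a", "aac", "-b:a", "128k",
        dst,
    ], check=True)

to_ipod_h264("movie_rip.mp4", "movie_ipod.mp4")   # hypothetical filenames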

All of those examples are pretty typical mainstream things done by consumers. I didn't get into esoteric examples like 3D rendering, high-end video editing, film and audio restoration, etc.

No, they are not typical consumer tasks, and yes, you did go into the higher-end areas. You mentioned video editing, 3D video game playing, etc., which do require a higher-performance GPU or CPU.

Typical consumers, and I am talking about the typical person off the street, use the computer for checking email, browsing the web, playing video, playing music, personal finance management, light word processing, social networking, telecommuting, teleconferencing, etc.

Look, if you need full computing power on the go, at home, and at work, then a netbook is not the answer. But if Apple designed a netbook using their multi-touch iPhone technology and innovative design aesthetics, they would have a winner on their hands, and many typical consumers would be wondering whether they really need to spend $1300 on a MacBook or just $500 on a mini MacBook.
 
After so many posts, it becomes very easy to speculate fairly accurately. Remember Apple is very conservative (just look at MBP 17).

- No Atom; they will use the SAME C2D processors. No speed bump either.
- No DDR3 RAM; they will stick to DDR2 (as above).
- Nvidia MCP79 chipset, but they will clock it lower than full speed. Maybe the same as the Rev B MBA or even slower. Combined speed still faster than the MBA but slower than the 2.0GHz unibody MacBook.
- Slight case redesign, but NO aluminum sides or top. They would keep the FW 400 port, and potentially drop the DVD drive altogether or make it a BTO option.

Same price. :apple::apple:

I don't believe Apple will use Firewire 400 on any new products. It will be Firewire 800 or nothing, as we have seen with the notebooks.
 
Atom? lol. Sorry, but do they smoke crack while they work? I want a small desktop solution, not the Atom crap. Jesus Christ... Apple guys, go and sell melons on the highway.
 
Atom? lol. Sorry, but do they smoke crack while they work? I want a small desktop solution, not the Atom crap. Jesus Christ... Apple guys, go and sell melons on the highway.
They have, and they are called Pods; everyone's got one. I think the Atomic Mac mini will be a pod-like computer capable of doing anything in a small package. Plus, everyone will be able to afford one, take it with you, etc. It makes more sense to sell something everyone can buy than one only a couple percent of the population can afford. Mac Pro sales numbers have to be dismal compared to the mini and iMac. :apple:
 
How about an Atom-based Apple TV that uses the iPhone OS? It would be controlled by the touch/iPhone and could share processing with it. With an Nvidia chip it would handle H.264 fine, hopefully 1080p.

Gaming, a media hub for your TV, and living room Internet. App Store [all the same apps as the touch/iPhone], but the Apple TV could handle running multiple apps. Same capacity [and type of storage] as the touch/iPhone for the OS and apps.

This would be a more popular 'switcher' model than the mini; the iPhone and touch are in loads of Windows users' hands, and if Apple could combine them into a multipurpose, stripped-down Mac/media hub and gaming machine, it would fly off the shelves. Especially if they could offer it at a netbook-region price point.

Sort out iTunes as the media database so it's better suited to the uses people are putting it to. Add another box that's a RAID media server with a new iTunes server and Time Machine backup server, one that can handle streaming multiple feeds through a decent network.
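To put rough numbers on "streaming multiple feeds through a decent network", here is a quick Python sketch; the per-stream bitrates and the usable-bandwidth factors are assumptions for illustration, not measurements:

streams_mbps = {
    "music (AAC)":         0.256,
    "SD video (H.264)":    2.5,
    "720p video (H.264)":  5.0,
    "1080p video (H.264)": 10.0,
}

total = sum(streams_mbps.values())
print(f"One of each feed at once: {total:.1f} Mbps")
print("Fits wired 100Mbit Ethernet (~70% usable, assumed):", total < 100 * 0.7)
print("Fits 802.11g (~20Mbit real-world, assumed):", total < 20)

Even with one of every feed running, a wired network or decent 802.11n has headroom; older 802.11g gets tight once a 1080p stream is in the mix.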

Then upgrade the mini, as needed, to a decently spec'd [add eSATA], small-box Mac that would handle more than the current iMacs [those will go quad-core, and all three may ship when Snow Leopard/iPhone Snow Leopard launches]. It would not be any larger than the current mini, other than possibly the footprint, in line with the Apple TV. Oh yeah, that'll be the next machine for the halo people; it sits on top of their Apple TV.
 
They could put an Atom into a 15" or 17" iMac and come out with a cheaper one. Just a thought...

I can see it now...

"Introducing the NEW iMac, now slower than your old G3's"... no thanks.

(Yes i know it wont be slower than G3's, but it will definately be slower than the last 2 iMac revisions, which ISNT a very good thing).
 

[Attachment: old_imac.png]
Even "low end" processors from 2006 were faster, such as Athlon64 X2s.

Yep.

Nvidia's 7800 and 7900 series specs are quite a bit different from Sony's claimed specs, which have also been removed from the PlayStation 3 website.

They're virtually identical. The memory interface is different; other than that, it's the same as a 7900GT.

But real world performance would suggest

That it's pretty much a 7900 class part, yes.

it being closer to the 7600GT, considering the PS3 couldn't even run GTA4 at a solid 30fps at 1024x640, with lower-resolution textures than the Xbox360 version and the patented PlayStation Vaseline Effect smearing it all.

And yet, amazingly, IGN's review says it looks better. Also remember that the PlayStation 3 doesn't have a scaler built in.

Every single multi-platform developer interview I have ever seen has stated that the Xbox360 is the better overall platform and far more capable.

The only one I've EVER seen that makes such a claim is some Harmonix person, years ago, which for a variety of reasons isn't worth much. I've NEVER seen anyone else claim the 360 is more capable. Even developers who like it better don't make that claim.

Actually, Sony's former comments were that it was based on 7800 technology but shares more in common (looking at the specs) with the 7600GT than the 7800 series.

Sony never claimed that, and the specs don't back it up. It's basically a 7900GT.

Well, equal PC hardware at the time of the PS3's launch could at least push UT3 at 60fps at the same resolution as the PS3, but higher detail settings.

That's why I said "at best". The 360 and PS3 weren't a match for the best PCs when they launched in '05 and '06 respectively, but they were at least competitive. That's why I said the original Xbox was probably the most powerful console relative to when it launched.

Actually, the CPU in the Xbox wasn't a "Celeron". Many people mistakenly called it a "Celeron" or a "Pentium/Celeron hybrid". Most people fail to realize that, at that time, the Celeron WAS a Pentium 3. The difference was that the Celeron had half the cache of the Pentium 3, disabled either by choice or by defect, but that cache ran at full chip speed.

It was a Celeron. It was the exact same chip that Intel sold as a 733MHz Celeron, except that they allowed it to run at the (by today's standards virtually identical) FSB speed of the equivalent Pentium 3. But yes, Celerons have always been stripped-down versions of Intel's current chip design.

The Pentium 3 had double the cache but ran at half chip speed.

No, it didn't. The Coppermine Pentium 3 and its Celeron equivalent had on-die L2 cache that ran at the same clock speed as the rest of the chip.

The Xbox CPU just had the full system cache running at full chip speed.

No, it didn't. It was a Celeron; it had half the cache of the equivalent Pentium 3.

The Xbox GPU was just a GeForce 3 with an extra pixel shader unit. The GeForce 3 Ti 500 was available before the Xbox and more powerful.

As I said, it was months before something clearly beat the Xbox's GPU. Granted, even the GeForce 2 I ran at the time was similar, and even better in some respects, but the Xbox's GPU was pretty clearly a high-end GPU when it launched, and for several months after. And at least for what I can think of right now, that's the last time that happened. The 360 and PS3 GPUs were both more 'mid range' by the time they launched in '05 and '06, which the Xbox 1's GPU was not.

Plus, 2001 saw the "Thunderbird"-based Athlons running at 1.4GHz. When the Xbox launched in late 2001, we already had Athlon XPs running at 1.5GHz and the GeForce 3 Ti 500, so we had PCs then that were considerably more powerful than the original Xbox itself.

They weren't THAT much better though, which is my point. That's the closest a console's ever come.

According to ATI (and Microsoft hasn't changed or removed specs, like Sony has), the Xbox360 GPU is more powerful than the 7900 series (and real world game performance backs that up).

No, and no. What "real world performance" backs that up? And what claim backs that up? Random made up numbers these console companies always throw out? I'm sure you could find one that makes the 360 look more powerful than a current PC, but it's not. It wasn't on par with high end 2005 class stuff.

ATI also claims that the Xbox360 GPU has "some DirectX 10 features".

And so does the 7900. All that means is that these chips exceed the DirectX spec in some way. The GeForce 3/4 part in the Xbox 1 exceeded DirectX 8, and to my knowledge, every GPU ever made exceeds the current DirectX spec in some way. They're never built exactly to the spec.

Bioshock? http://gamevideos.1up.com/video/id/21388 Xbox360 has better coloring, but otherwise looks the same. Plus it also came out what? A full year after the Xbox360 version did. Just like Oblivion, it had an extra year in development.

I already covered that other game. No, Oblivion and Bioshock did NOT have an extra year of development on the PS3. They had LESS development time on the PS3. Development was not done concurrently, which is what would have to be the case for your claim of an extra year's time.

Side by side Mercenaries 2 videos are pretty clear cut. They either look basically identical in some spots, or the textures are considerably better on the PS3 in others. I've only played the demo of the PS3 Bioshock, and couldn't tell any visual difference, but I'm sure the reviewers are correct that it has better textures in places too.

...The Xbox360 version has noticeably better color definition.

No, it does not. The only claims of better color on the 360 I've ever heard were back when the PS3 was fairly new, and critics had misconfigured systems.

Burnout Paradise? http://www.gametrailers.com/player/29926.html?r=1&type=wmv The Xbox360 VGA connection (no HDMI at that time) looks significantly better all around.

And yet, of the several critics I've heard, not one has claimed that. Quite clearly the opposite.

Gears of War was a first-generation game for the Xbox360 and exclusive. It looks FAR better than Resistance.

No it wasn't, and no it doesn't. At the time I thought Gears was more visually interesting, but not better looking, and it's second gen software.

Halo 3 looks better than Resistance 2, despite being a year older.

Yikes, okay, now it's pretty clear what you are, which explains all these random backwards claims. NO ONE claims Halo 3 looks particularly good. It's fine, but no one claims it looks as good as Gears or Resistance 1, and here you are claiming it looks better than Resistance 2. :D

Hello Fanboi!

Gears of War 2 looks better than Resistance 2 and Killzone 2. Killzone 2 doesn't even look as good as many of the games out there.

And yet, mysteriously, critics (like IGN) who have had time with it say it's the best looking console game to date. Hmm.

MGS4? It looks no better than what we've seen in average games for years now. In fact, it has lower resolution textures than most average games.

Fanboi! Fanboi! Geez you're hilarious. Um, yeah, aside from seeing it with my own eyes, and every critic I've read raving about it, it just looks "average". Actually critics (like the Giant Bomb guys) tend to say things like it's the most flawless graphic execution they've ever seen, with everything, every surface just perfect. Now personally I thought the virtual acting was better done in Mass Effect, but it's still one of the best looking games in that regard too. "Average" is clearly not an apt description for it regardless of whether you like the esoteric stories in those games.

I wonder who you were talking to. There is no doubt whatsoever that the PS3 has far more powerful hardware than the Xbox360. Check the numbers. However, it also seems to be the case that programming the PS3 is a pain in the butt.

BTW, the most expensive (and most useless) CPU design for sure is the Itanium.

He's exposed himself as a Fanboi. His religious feelings explain why he's making these claims. It's too bad, as he's right about the hardware in terms of PCs being better. Unfortunately his religious convictions make him see 360 games as looking better than PS3 games regardless of the reality.

Here in reality land, virtually every 360 and PS3 game I've seen looks amazing, though the very best looking PS3 games do look a bit better than what the 360 can do...not that it's not a great system!

I should have picked up on what he was back when he was claiming the PS3's GPU was two 5200s cobbled together, but I guess I didn't expect to see a zealot here, and I just assumed he didn't realize what it actually was.

The Mac Pro will not use the Core i7 CPUs. It will use the Gainestown CPUs.

You'll benefit from reading this.

Core i7 is stupid.

Umm...Core i7 is phenomenal. And yes, the Mac Pro will use Core i7, unless Apple's going lower end with it. (Yes, technically it will be the server version, blah blah blah, but IMO it makes sense to just refer to things by the main trade name or core name, since there are endless revisions of these things.)
 
To compete with cheap all-in-one nettops that are coming out?

Even if they do release something to compete with cheap all in ones, they're presumably going to keep selling a more normal/capable all in one.

As far as I'm concerned, Atom in anything but a tiny notebook or tablet device is a gimmick and a terrible waste.
 
Even if they do release something to compete with cheap all in ones, they're presumably going to keep selling a more normal/capable all in one.

As far as I'm concerned, Atom in anything but a tiny notebook or tablet device is a gimmick and a terrible waste.
Which is why I'm somewhat perturbed by the users that can't seem to understand this.
 
I can believe the part about the Atom, but I can do without a smaller mini.
What I would hope would happen is:

Atom (low end), Atom (mid range), Core Duo (high end) options

To me that would be great.

Oh, wait a minute, that would suck.

The Atom N270 has the power of a 2.2GHz single-core P4; it would be like downgrading the power of the current mini.
 
The Atom N270 has the power of a 2.2GHz single-core P4; it would be like downgrading the power of the current mini.

It shouldn't be anywhere near a 2.2GHz Pentium 4. Clock for clock it should be slower, especially compared to a Northwood or better. It's very primitive compared to a Pentium Pro or later design (other than SIMD and a few other modern things).
 
The Atom N270 has the power of a 2.2GHz single-core P4; it would be like downgrading the power of the current mini.
No, the N270 is far from a 2.2GHz P4.

Apparently this needs to be repeated every few pages in this thread:

Atom is derived from an Intel architecture predating the PPro. Any comparison between Atom and a desktop chip from the P3 onwards would be pointless, particularly comparing Atom to a Core 2; there are several generations of Intel cores standing between those two, memory subsystem and SIMD extensions notwithstanding.
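To make the clock-for-clock point concrete, a crude model is throughput roughly equal to clock times instructions per clock. The IPC values in this Python sketch are hypothetical placeholders, not measured numbers; they only illustrate why an in-order Atom trails older desktop chips even at a similar clock:

chips = {
    "Atom N270 (in-order)":  {"ghz": 1.6, "ipc": 0.5},   # assumed IPC, not measured
    "Pentium 4 Northwood":   {"ghz": 2.2, "ipc": 0.7},   # assumed IPC, not measured
    "Core 2 Duo (one core)": {"ghz": 2.0, "ipc": 1.2},   # assumed IPC, not measured
}

base = chips["Atom N270 (in-order)"]["ghz"] * chips["Atom N270 (in-order)"]["ipc"]
for name, c in chips.items():
    score = c["ghz"] * c["ipc"]
    print(f"{name:24s} ~ {score / base:.1f}x the Atom's throughput")

Actual per-clock behaviour depends heavily on the workload, so treat this strictly as an illustration of the argument, not a benchmark.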
 