wdlove said:
It's all going to depend on what actual specifications Steve ends up offering. I'm always amazed at this marketing ploy of $499 so that we don't catch that it's $500. This concept bothers me.

It was a concept started by an English gentleman who was upset that there were never enough pennies around to buy a newspaper with. So, he convinced some merchants to start this practice in order to increase the number of pennies in use. Today, the practice persists, but for a different reason...

wileypen said:
If Hitler were alive today, he'd want this computer.
wtf? What the hell does this mean, and what the hell is Shard's Law, and what the hell does this have to do with the new, rumored, Macintosh???? :confused: :confused:
 
Mechcozmo said:
wtf? What the hell does this mean, and what the hell is Shard's Law, and what the hell does this have to do with the new, rumored, Macintosh???? :confused: :confused:

Shard's Law is that computing power doubles every 18 months and the bloatedness of Microsoft Office quadruples every 24 months.
 
Rootman said:
Shard's Law is that computing power doubles every 18 months and the bloatedness of Microsoft Office quadruples every 24 months.
But I bought a 700MHz iBook 3 years ago and it stayed at 700MHz until the hard drive failed, at which point it went to 0MHz - a decrease of 100%. Kinda makes a fool of Shard and his theory, no?
 
Shard's Law is that when you buy a Mac, a better one will be announced the next month, but it won't ship for a long time.
 
Rootman said:
Shard's Law is that when you buy a Mac, a better one will be announced the next month, but it won't ship for a long time.

No no no, it's that the better Mac will only ship quickly if you just bought one, but will take months if you're holding out for it.

~J
 
Sir_Giggles said:
No no. This has become Shard's Law. By stating Godwin's law, you supersede the possibility of enacting said law to end a thread, thus making it impossible to thereby use Godwin's law as the mechanism to proceed with said changes.

Cool, I'm a Law now - thanks, Sir_Giggles! ;) :cool:

(Was just bugging ya before too, hope ya figured that out... ;))
 
tsk, tsk

Photorun said:
... the peecee luser lot ...

This is a really powerful argument - when you use a couple of childish non-words it really helps your case!

Perhaps if you'd develop your arguments rather than resort to simple insulting name-calling we'd be able to understand where we've strayed. :p
 
Rootman said:
Shard's Law is that when you buy a Mac, a better one will be announced the next month, but it won't ship for a long time.

Kagetenshi said:
No no no, it's that the better Mac will only ship quickly if you just bought one, but will take months if you're holding out for it.

I like both of these as well. Once again, thanks guys, I think I'll have to start referencing this thread in the future when the different variants of my Law become applicable. :cool:
 
iMac mini + iRemote?

geekmeet said:
The source of the aforementioned rumour is THINK SECRET.
Nick dePlume.
He is by far the most accurate rumour monger... almost ALL of his prophecies have come to pass.
The fact that he dropped the "sub-$500 mac bomb" upon us makes it even more compelling.
He is saying this device is an imac class COMPUTER.
He did not say it was a DLD(Digital Lifestyle Device).
I must admit I am skeptical about this rumour.
We shall see.

I think the only difference between a DLD and a computer with these specs is whether it sits on your desk and plugs into a monitor, or sits under your TV and plugs into your TV. I see this as a crossover DLD.

In fact, the continuing absence of the iRemote (for airtunes) suggests to me that they might come out together. With this as a web/email/music device, with an iRemote, it would have serious appeal to people in the living room, or in the office.
 
fight the "MHz Myth" - part 2

dcollierp said:
Where can you get a $499 PC with a graphics card that is not "Integrated" with out having to build it yourself?

Have you actually used a recent "Intel Extreme" motherboard? You might be surprised to find out that they'll trounce some of the low-end "non-integrated" chips that Apple's fond of using.

Did you know who was among the first to the market with "integrated graphics"? It was Silicon Graphics, no slouch in the graphics performance arena, which was a pioneer in using system RAM to give the graphics as much memory as it needed.

Of course, it's not a 3D gaming engine, but for the tasks that a $500 PC would be used for it is definitely a cost effective solution.

Having a "litmus test" for "integrated graphics vs discrete graphics" is as silly as blind belief in the "MHz Myth".

If good 3d graphics, damn the price, is important - then don't get a mobo with Intel Extreme.

If you're doing email, surfing, digital photo editing, DV movie editing and the like, however, integrated graphics can be a great way to shave the cost without any perceived loss of performance.

(In my end-of-year "spend-it-or-lose-it" equipment purchases, I bought 24 systems with embedded graphics, and 4 with 256 MiB Quadro FX 3400 PCI Express cards. Didn't waste money when it didn't matter, but bought high end when it did.)
 
if it's a "fact", then show us a link

GFLPraxis said:
The $499 PC's generally have 2 GHz Celerons, when in fact a 2.5 GHz Celeron is outperformed by a 1.8 GHz P4.

Prove it, please! Show us a link.

This does not jibe with other available information.

A current Celeron is in fact a P4 chip - it has a smaller cache and (usually) a slower bus. Its performance will vary from equal to an equivalently clocked P4 (for CPU-intensive applications) to significantly less (for cache- or memory-bound applications).

Your "2.5 GHz == 1.8 GHz" sounds like a possible "worst case" scenario, not a typical case.

(BTW - even at 1.8 GHz doesn't this best the 1.25GHz G4 ???)
 
AidenShaw said:
(BTW - even at 1.8 GHz doesn't this best the 1.25GHz G4 ???)

Nope, not even close. I've found a good rule of thumb is to multiply a G4's clock speed by ~2 to get an equivalent for the P4, though there's some swing for certain types of tasks.

~J
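To put that rule of thumb in concrete terms, here's a trivial Python sketch. The ~2x factor is just the poster's estimate above, not a benchmark result:

```python
def p4_equivalent_mhz(g4_mhz, factor=2.0):
    """Approximate P4 clock needed to match a G4 at g4_mhz,
    using the rough 2x rule of thumb from the post above."""
    return g4_mhz * factor

# By this rule, a 1.25 GHz G4 roughly matches a 2.5 GHz P4:
print(p4_equivalent_mhz(1250))  # 2500.0
```

So a 1.8 GHz P4 would land well short of a 1.25 GHz G4 under this (admittedly rough) conversion.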
 
AidenShaw said:
Prove it, please! Show us a link.

This does not jibe with other available information.

A current Celeron is in fact a P4 chip - it has a smaller cache and (usually) a slower bus. Its performance will vary from equal to an equivalently clocked P4 (for CPU-intensive applications) to significantly less (for cache- or memory-bound applications).

Your "2.5 GHz == 1.8 GHz" sounds like a possible "worst case" scenario, not a typical case.

(BTW - even at 1.8 GHz doesn't this best the 1.25GHz G4 ???)

What he said was perfectly true. There was a good piece about the performance of "budget" processors on anandtech.com, and maybe tomshardware. The first P4 Celerons perform like a dog (a dead one). The new Celeron D family is better - mainly because they have more level 1 as well as level 2 cache.
 
Kagetenshi said:
Nope, not even close. I've found a good rule of thumb is to multiply a G4's clock speed by ~2 to get an equivalent for the P4, though there's some swing for certain types of tasks.

~J

And the "swing" comes mainly on scalar FP tasks (because of the nature of the code) and doesn't affect overall performance that much. Some people really like to dump on the G4! Still, it is lame of Apple to think of putting out a machine with the guts of a now 9-month-old eMac, even if it is a few hundred cheaper. Let's hope they are smarter than that!
 
GFLPraxis said:
The $499 PC's generally have 2 GHz Celerons, when in fact a 2.5 GHz Celeron is outperformed by a 1.8 GHz P4. They also have no graphics card at all, 40 GB hard drives, and the crappiest components available. But hey, 2 GHz looks big and people don't realize just how bad Celerons are, soo...the computer with the big fat number must be better!

Well, I wonder if people are finally waking up from that MHz race craze...

Even AMD and Intel are trying to steer away from the clock numbers (they name the CPUs with "relative numbers" now, like "Celeron 330M" or something).

And with the AMD/Intel clock numbers vs actual work being done, quite a few users now know clock speed is pretty meaningless, even within the same "kind" of CPUs (x86).

Take the Windows client for the World Community Grid for example. They clock the P4/1.5GHz at "100", and my AMD Athlon XP 2400+ (actual clock is 2GHz I think) gets a "192". Now, if clock speed were everything, I should only get 192 if my CPU was running at around 3GHz. I'm also afraid to see what kind of score my Epia M10000 would get (C3/1GHz), but it sure would be (a lot) lower than what it should be for a P4/1GHz. Again, clock speed ain't everything.

P4/1GHz != AMD/1GHz != VIA/1GHz != G4/1GHz != G5/1GHz...

Anyway, we shall see in a few days (I hope).
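For what it's worth, those WCG numbers work out like this - just arithmetic on the scores quoted above, nothing official:

```python
# World Community Grid scores quoted in the post above:
# P4 @ 1.5 GHz scores 100; Athlon XP 2400+ @ 2.0 GHz scores 192.
def score_per_ghz(score, ghz):
    """Crude per-clock efficiency: benchmark points per GHz."""
    return score / ghz

p4_eff = score_per_ghz(100, 1.5)      # ~66.7 points per GHz
athlon_eff = score_per_ghz(192, 2.0)  # 96.0 points per GHz

# If clock speed were everything, a 2.0 GHz chip at P4 efficiency
# would score about 133 - well short of the Athlon's actual 192.
clock_only_prediction = p4_eff * 2.0
print(round(p4_eff, 1), round(athlon_eff, 1), round(clock_only_prediction, 1))
```

The Athlon does roughly 44% more work per clock than the P4 on that workload, which is the whole point about the MHz Myth.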
 
Clock speed schlock speed. For all but the most extreme tasks/power users, any new computer has a fast enough processor for daily usage. I'd love to see Apple unveil a $599 1.6GHz G5 headless, too. But a 1.25GHz G4 is plenty fast for 95% of users. If Apple can find a way to market based on what can be accomplished with the Mac, instead of relying on admittedly older specifications, they can sell a ton of these and double their market within a few years' time.
 
bree said:
Another requirement for this rumor to be true is cheap monitors. You can't sell a computer without at least the option of matching monitors. The problem here is that Apple has effectively dropped all CRT production from its line up, and its cheapest LCD is the 20 incher at $1300 – more than double the estimated price of the headless mac.

(Let me start by saying most of my comments don't directly apply to bree, but to some Mac users in general)

This is something I really don't get from most of the Mac crowd. Why do you think Apple has to sell you everything?

If this really is a low-cost, entry-level option, nothing has to "match". You act as if a computer were a fashion statement! A lot of people want to switch to dump the Microsoft problems, not to have a nicer-looking computer (that's the LAST of our problems; heck, I wouldn't care if Apple only painted a regular PC case white instead of beige).

Buy the Mac mini, buy any monitor you want (VGA connector, maybe DVI too, but DVI is so damn expensive, I wouldn't understand why they'd add that in the first place). Also, don't forget a lot of people already have monitors (switchers already have a lot of gear, remember? Re-using our gear is our main goal, to lower the overall cost of upgrading).

Apple doesn't want to sell $100 CRTs? A lot of companies do. Not everything has to come from Apple. Heck, I don't get why people keep complaining about that one-button mouse. Apple has switched to industry standards. Don't like the mouse? Buy another one. Heck, even a damn Microsoft mouse will work (I prefer Logitech). And OS X supports the scroll wheel, extra buttons, etc.

I bought an iPod. But it wasn't because it's "cool" or looks "fashion", it's because I had 3 other mp3 players before my iPod, and the iTunes integration added to the simple/fast interface (I keep finding new tricks even after one year) is what made me buy the iPod. If it looks good, it's just a bonus.

Regular people are starting to buy Apple stuff, because the alternatives are starting to sink way below "good enough". So please drop the "BMW vs Ford" (Apple vs PC) mentality. Technology is not a fashion nor a statement.

Sorry if I offended anyone, especially Ford owners. ;)
 
Sir_Giggles said:
No no. This has become Shard's Law. By stating Godwin's law, you supersede the possibility of enacting said law to end a thread, thus making it impossible to thereby use Godwin's law as the mechanism to proceed with said changes.

Argh.

Where's my Tylenol?
 
Rootman said:
Shard's Law is that computing power doubles every 18 months and the bloatedness of Microsoft Office quadruples every 24 months.

So, that means that every X months, the computing power required for MS Office goes up by Y%...

Ugh, my head.

Where are my Tylenols again?
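For the morbidly curious, the compound-growth math behind that joke works out like this - a tongue-in-cheek Python sketch, taking the made-up rates at face value:

```python
# "Shard's Law" as stated above: computing power doubles every
# 18 months, Office bloat quadruples every 24 months.
power_growth_per_month = 2 ** (1 / 18)  # ~1.039, i.e. ~3.9% per month
bloat_growth_per_month = 4 ** (1 / 24)  # ~1.059, i.e. ~5.9% per month

# Net "usable" power per unit of Office bloat shrinks every month:
net = power_growth_per_month / bloat_growth_per_month  # ~0.981
print(round((net - 1) * 100, 1))  # ~ -1.9 (percent per month)
```

In other words, by this Law, Office eats hardware gains about 1.9% faster than Moore can supply them. Pass the Tylenol.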
 