Originally posted by Rai
I bought 5,000 shares of Apple stock a few weeks ago, when the rumors about the music service surfaced.

I bought 40 call options (equivalent to 4000 shares) with a $15 strike price a week and a half ago. I have already more than doubled my money. Not a bad return. Now I have to decide whether to cash out or hold on for the long haul. ;)
 
Originally posted by Rai


At 3:30, when I returned, the stock was up 2 dollars.

Around 3:15 something caused this stock to skyrocket. What it was I'm not sure; no more financial news about Apple had been released to financial sites, and the info about iPods and the music service had been known since the beginning of the day.

What time did the story get posted?

But it could have been just market fluctuations.

Well, according to the times at MacRumors, it got posted at the same time as the iTunes story (a little after noon...I assume Eastern Time?).

But I doubt that most investors read MacRumors.

What might be more significant is that the 970 story got posted at 2:50 or so on Slashdot. That is the "biggest" site I have seen it posted on.

Since the 1 million tracks sold PR came out at 8:30 am, it really shouldn't be a response to that. Plus they already knew there would be big sales based on the 275k songs sold in the first 18 hours. So it wasn't that much in the way of new information.

It therefore seems at least conceivable that the stock runup was a response to the 970 rumors. This would be very disturbing, as it would suggest that these results were not just faked as a prank (which is what I had been assuming the motivation was), but rather were part of a plot to game the market. I wonder if anyone bought a bunch of AAPL call options....

Does anyone recall whether there were similar movements in the stock price back in 2001 with the Moto G5 rumors? I have no recollection at all.
 
Wow. Well, these were tested with only 32-bit apps; if Apple made some way for developers to easily bridge 32-bit to 64-bit, it would wipe the floor with the Pentium (which it already does).

Well, Apple seamlessly moved from 68k to PPC; I'm sure this move will be the same, difficult but not impossible.

Wouldn't the apps need to be compiled at 64-bit? Then that's it.
 
Originally posted by mac15
Wow. Well, these were tested with only 32-bit apps; if Apple made some way for developers to easily bridge 32-bit to 64-bit, it would wipe the floor with the Pentium (which it already does).

Well, Apple seamlessly moved from 68k to PPC; I'm sure this move will be the same, difficult but not impossible.

Wouldn't the apps need to be compiled at 64-bit? Then that's it.

They would only need to be recompiled if they were to be 64-bit, which 99% of applications don't need to be, so it's pretty pointless to recompile for the 970.
 
I WANT a 970 PB this summer, so I hope that MB is partially wrong, but mostly right :) They said no 970 in PBs until March, which is too long for me to wait :( On the other hand, I hope we are getting the 970s this summer, even if it is only in Power Macs. Anyway, here's to hoping for a 970 PB with a Mobility Radeon 9600 :)

Tony
 
Originally posted by macrumors12345

Okay, this is getting absurd. No offense, but what is wrong with you people?!? Are you so drunk on the Kool Aid and so desperate for hope of a faster Mac that you will believe anything that anyone tells you, no matter how absurd? ...

You know, I find it hard to swallow that you say 'no offense' but then proceed to make personal attacks.

Now, the reason that what you are suggesting wrt Bryce is completely absurd...

Wow, that is SUCH a realistic scenario. Because I am sure that NOBODY at Apple would EVER have access to a DP G4 to use as an actual control for this test. It is not like the DP G4 is PRODUCED BY APPLE COMPUTER or anything like that!! Clearly they would need to go to an outside source to get these results, even though that would totally compromise the validity of the test.

I can much more easily see the following situation: Somebody copies stuff from BareFeats and makes up the 970 results, but is too dumb to even make them consistent or change them slightly so they are not an exact copy of BareFeats. ...

I was simply trying to present a plausible scenario for this kind of information coming forward. The fact that Bryce didn't show the benefits of the DP G4 was accounted for by the fact that these figures were taken from the BareFeats article. Now, let's see, if I was an informant for a rumor site, would I want to make a big production about getting together a bunch of benchmark tests, or would I prefer to do something quick and dirty that would get the point across? I'd probably go with quick and dirty for any number of reasons, not least of which being that it would be much harder to trace it back to me.


...In fact, according to your story, it would be a WILDLY INACCURATE test for comparing the machines, because he would be comparing the SP version of Bryce running on a DP G4 1.42 to the DP version of Bryce running on a DP 970 1.8....

Yes, this method of comparison (particularly the Bryce comparison) would have some inherent inaccuracies. But that wouldn't invalidate the overall point of the benchmarks. Nor would it invalidate the comparison between the SP 1.4GHz 970 and the DP G4. The DP 1.8GHz 970 would only provide a true comparison to the SP 1.4GHz 970, not properly to the DP G4.


...
Great. This is like saying, "I have no opinion, one way or the other, about whether the world is flat or round." One shouldn't be agnostic about things when the evidence is quite clear. And the weight of the evidence on this one is such that it is very hard to NOT unambiguously conclude that these are fakes.


No, it's not. I'm saying that I DON'T KNOW, and no matter how much you whine, snivel and pule about it, neither do you (unless you actually work at Apple, and/or are the person who provided these benchmarks to MacB). I can see a plausible situation that would lead to these numbers being presented. I can also see that it is quite plausible that they have been faked. I don't know, so I won't pass judgement. I'll wait and see.

So, please get off your high horse, and stop insulting me.
 
Re: Nothing to support this rumor!

Originally posted by siberian
I have never seen Apple release a dramatically new product like a 970 PPC and not radically trim inventory early.
1) Mac staff is getting a 30% rebate on current G4 machines AFAIK.
All those machines are in short supply anywhere else in the world.
And anyway, we're likely to see a dual (or triple) approach with the G4 remaining in the low end line and the portables for some time.
2) As for the numbers: It might be that these are rounded, internal figures that are supposed to give a rough idea of the performance gains.
3) The European market accounts for about 35-40% of Apple's sales.
4) The reason why this is published on a French site may be exactly *because* it's far away from Cupertino. Supervision is probably not as tight in Europe as it is in the States. I could well imagine some of Apple's top brass in Europe enjoying a little leak here and there.
 
Re: Re: Nothing to support this rumor!

Originally posted by dekator
1) Mac staff is getting a 30% rebate on current G4 machines AFAIK.
All those machines are in short supply anywhere else in the world.
And anyway, we're likely to see a dual (or triple) approach with the G4 remaining in the low end line and the portables for some time.
2) As for the numbers: It might be that these are rounded, internal figures that are supposed to give a rough idea of the performance gains.
3) The European market accounts for about 35-40% of Apple's sales.
4) The reason why this is published on a French site may be exactly *because* it's far away from Cupertino. Supervision is probably not as tight in Europe as it is in the States. I could well imagine some of Apple's top brass in Europe enjoying a little leak here and there.


Agreed. Plus, throwing one or more false numbers in for good measure would make it look like a guess, therefore keeping Apple Legal at bay.
 
Originally posted by dekator
Apple's top brass in Europe enjoying a little leak here and there.
I dunno, you'd think they'd wear some Depends.

I'm sorry, I couldn't resist.



------
I imagine that the 970 will have similar performance to the Opteron, with the addition of whatever bus, RAM and other technologies Apple implements.
 
Any posts that include personal attacks or attacks on any nationality will be edited or deleted. Please respect one another. If you cannot refrain from posting these sorts of comments, you will be banned.
 
Re: I like my FRENCH fries, thank you.

On the 970s, I think we can all agree:

[image: sluggo.jpg]


YEAH! (Sorry to anyone who's seen this a billion times. And to the moderators if they snuff it)
 
Originally posted by chetwilliams
I bought 40 call options (equivalent to 4000 shares) with a $15 strike price a week and a half ago. I have already more than doubled my money. Not a bad return. Now I have to decide whether to cash out or hold on for the long haul. ;)

Whether you hold or not is always a personal call. In any case, I'd set a protective stop and not give back too much should this market want to retrace. You can always get in again later when the market pulls back.

I bought on the rumors a few weeks ago as well. I bought May 12.5 Calls at .90 and Jan 12.5 LEAPS at 2.15 on the 16th or 17th of April. Both positions are sitting quite nicely. I'll dump the May calls on the slightest pullback and take the profits. The LEAPS I'm going to keep in the bin for the long haul.

Best of luck in your trading!
 
64-bit vs. 32-bit benchmarks

Originally posted by mac15
Wow. Well, these were tested with only 32-bit apps; if Apple made some way for developers to easily bridge 32-bit to 64-bit, it would wipe the floor with the Pentium (which it already does).

Well, Apple seamlessly moved from 68k to PPC; I'm sure this move will be the same, difficult but not impossible.

Wouldn't the apps need to be compiled at 64-bit? Then that's it.

Please don't perpetuate this myth. Making something 64-bit does not make it faster-- usually the opposite. It CAN be faster if you're doing certain operations, such as large bitfield operations that can't be vectorized, or memory copies if you're not already using the FPU, or if you need the extra bits and had been doing hacks previously... but in general it slows things down. Why? Twice the data to pump around, most of which is going to be unused in an app that is designed for 32 bits.

64 bits increases computational accuracy and/or range, not speed.
 
Re: 64-bit vs. 32-bit benchmarks

Originally posted by Booga
Please don't perpetuate this myth. Making something 64-bit does not make it faster-- usually the opposite. It CAN be faster if you're doing certain operations, such as large bitfield operations that can't be vectorized, or memory copies if you're not already using the FPU, or if you need the extra bits and had been doing hacks previously... but in general it slows things down. Why? Twice the data to pump around, most of which is going to be unused in an app that is designed for 32 bits.

64 bits increases computational accuracy and/or range, not speed.

Right. For example, a 64-bit add using 32-bit instructions takes about 5-6 instructions to carry out, and a 64-bit divide is much worse, so 64-bit only improves speed when working with 64-bit numbers. In my case, on my Xserves running MySQL, I use 64-bit keys for indexes in the database. My speed would greatly improve with the 970. Most applications would not.
 
64 vs 32 bit processing

Boy! It sure is good to get back to some technical discussions. Although being able to make a joke in good taste would be nice, but oh well...

So, Mr. 64-bit smartypants (Booga), would an application like Photoshop or Bryce benefit largely from moving data in 64-bit chunks?

I mean, you say if a program was designed for 32 bits it wouldn't benefit, but what if it was totally rebuilt to be 64-bit? Would only really complex apps benefit from this?

BTW, I haven't gone really processor-deep in my tech reading since I read "Birth of a New Machine", so riddle me this -- does this mean 64-bit instructions are to be used?
 
Re: 64 vs 32 bit processing

Originally posted by BaghdadBob
Boy! It sure is good to get back to some technical discussions. Although being able to make a joke in good taste would be nice, but oh well...

So, Mr. 64-bit smartypants (Booga), would an application like Photoshop or Bryce benefit largely from moving data in 64-bit chunks?

I mean, you say if a program was designed for 32 bits it wouldn't benefit, but what if it was totally rebuilt to be 64-bit? Would only really complex apps benefit from this?

BTW, I haven't gone really processor-deep in my tech reading since I read "Birth of a New Machine", so riddle me this -- does this mean 64-bit instructions are to be used?

Depends. Most people say 32-bit color is enough, so in that sense, no. You can't use a 64-bit add to add two pairs of 32-bit numbers; the carry bits would mess up the results. However, NVIDIA and ATI have been making the move to 64-bit color in their GPUs, and I believe rendering engines for things like Maya are 64-bit; however, they utilize vector engines and the FPU for decimal precision, so they already have that power.

Most applications have no need for 64-bit numbers. Large memory access is only needed in rendering and database applications. And if the OS is 64-bit and the applications are 32-bit, each app can access 4GB of memory; since the OS provides protected memory regions, each app gets a dedicated 4GB memory map. That's where the largest benefit will be seen! Most people don't even realize this.
 
Sorry for the controversy

As for my post suggesting that:

MacBidouille ... was probably laughing their heads off and saying, "Boy, these ... sure are gullible!"

Yes, I made that comment and I truly apologize if it offended anyone. However, it was only a JOKE! In fact I made it somewhat ridiculous (but, I thought, completely inoffensive) only to try to emphasize how over-the-top I believe these rumors are concerning the impending release of a PPC970-based Mac. In any case, let's all try to remain polite and keep a little sense of humor concerning these posts, after all, given the current unknowns that's about all these rumors are worth. So, I urge everyone to calm down a little, and no need to comment further on this issue.
 
One more thing... A rumor this far reaching can either be very very good for Apple, or very very bad.

Let me explain.

With all these tech sites reporting this rumor (Macrumors, OS News, Ars, Slashdot, etc..), it's been quite far reaching. Needless to say, by now a bunch of people have heard about it, mainly the guys who will need such power later down the road. So now, even though many discount the rumor, there may be an expectation of the chip to perform a certain way. If it proves to be less than what was "leaked," people may thumb their noses at Apple and buy the next Pentium. If the rumors prove true, well, then a lot of people may then switch over to the Mac side of things. After all, if the machine is decently priced, and a bunch more powerful than a PC, why not go with it?
 
On 32 vs. 64 bit applications

I worked on the KSR 64-bit supercomputers circa 1992. Simply converting most applications from 32 to 64 bits generally hurts performance, depending on the architecture of the hardware. If 32-bit reads and writes are reasonably efficient, then you can adopt a C programming model where "int" is 32 bits and "long" and pointers are 64 bits and not suffer much of a penalty; on the other hand, that usually has a hidden penalty in the hardware which handles halfword memory writes, which slows down full 64 bit mode. That might be a reasonable tradeoff for a consumer machine, rather than for a dedicated supercomputer; I don't know which way the 970 goes on that. The "I32LP64" model tends to upset a lot of poorly-written C code, due to inappropriate assumptions about the sizes of data objects.

There are some very specific application domains which can greatly benefit from a 64-bit integer size. Some cryptographic algorithms benefit from being able to do logic operations in 64-bit chunks (of course, many algorithms are designed not to need 64-bit arithmetic, and wind up not benefitting if it's available). It also turns out that the TCP/IP checksum benefits from wide word widths, even though it's technically defined as a 16-bit ones-complement sum. And it turns out there's a lot of perfectly ordinary code (filesystems, network protocols) which has been written assuming that 64-bit integers exist in order to simplify writing the code, at the cost of complicating the compiler's job; throw a 64x64 multiply in an inner loop on a 32-bit machine and you can seriously slow an application down -- in a way that real 64-bit integers can positively affect.

The leap from 16 bits to 32 bits was an important one, because there is a huge array of problems interesting to typical computer users which need data sets bigger than 65536 bytes. (*) There are scarcely any applications with an urgent need to address more than 2 gigabytes of RAM, and relatively few which could even sensibly use even that much memory. However, no doubt new applications will be found once developers can count on even cheap systems having 8GB or so (and of course we can always count on programmers' laziness to bloat just about any software ;) ).

(*) I can remember the flame wars between Intel supporters and Motorola supporters back in the mid 80s, where very carefully chosen 16-bit 8086 programs would run faster than equivalent "32-bit" 68000 (**) programs -- as long as you kept the data set small enough that it would run at all, of course.

(**) And of course, there were the endless discussions about whether the 68000 was "really" 16 bits, 24 bits, or 32 bits. Or 16/32 bits. Fortunately, the mighty 68020 made all those arguments moot.
 
Re: On 32 vs. 64 bit applications

Originally posted by jfw
I worked on the KSR 64-bit supercomputers circa 1992. [snip]

There are some very specific application domains which can greatly benefit from a 64-bit integer size. Some cryptographic algorithms benefit from being able to do logic operations in 64-bit chunks (of course, many algorithms are designed not to need 64-bit arithmetic, and wind up not benefitting if it's available). It also turns out that the TCP/IP checksum benefits from wide word widths, even though it's technically defined as a 16-bit ones-complement sum.
You know, we are in 2003 and a lot has changed since 1992; even a Pentium 4 or a G4 can handle 128-bit data chunks in their SIMD units (SSE2 or AltiVec). E.g., a G4 can crunch RC5-72 keys 3 times faster than a G3 at the same clock speed:
http://n0cgi.distributed.net/speed/query.php?cputype=all&arch=2&contest=rc572

Some speed gains previously expected from 64-bit computing have been available on 32-bit CPUs for a few years now, due to the introduction of SIMD units and their larger (64- or 128-bit) registers.

Nevertheless, pure 64-bit integer arithmetic will definitely benefit from a 64-bit CPU.
 
Re: Re: Re: I do not believe any of this

Originally posted by jelloshotsrule
...though what's up with "idiot cheese"??

Simple: if you think that it is actually cheese, you are an idiot. It is a fitting name to me. I have never liked the stuff, not even when I was little. Oh, I guess I should say that I am an American before my countrymen attack me.
 
Re: On 32 vs. 64 bit applications

Originally posted by jfw
There are scarcely any applications with an urgent need to address more than 2 gigabytes of RAM, and relatively few which could even sensibly use even that much memory. However, no doubt new applications will be found once developers can count on even cheap systems having 8GB or so (and of course we can always count on programmers' laziness to bloat just about any software ;) ).

You will find that there are very many uses and needs for large amounts of memory (i.e., more than 2-4 GB).

The problem is not that any one particular task requires that much memory (although I'm in architecture, and it would help us in 3D tasks greatly), but by the time you let the system take as much as it needs, have some music playing, have internet and mail running, and then have 2 or 3 applications that you are using concurrently open... well, you aren't left with that much.

Again, as an architect using 3D I often hit 100% CPU and memory usage. Sure, you might say I could use pen and paper to do what I'm doing, but it would be so close to impossible as to be pointless.

My point: it should be my imagination that limits what is possible, not my computer.

BTW, what effect would 64-bit processing have on speech recognition and AI? Those are two pretty fundamental functions that are bound to benefit from 64-bit (or more) computation.

a.

[edit - fixed grammer, damnit]
 
Yes, but did you know that a single 970 at 1.4GHz is faster than a speeding bullet? Please............please pass this news on to other rumour sites and have them post it. If you/they don't believe me, I can easily throw together a few Excel graphs. ;)
 
Re: Re: I do not believe any of this

In the Apple rumor industry, there are 3 big sites: MacRumors (basically a big site that reports daily rumors), Think Secret, which is always right, and MacBidouille, which is also always right (MDD pics, MDD board pix, etc.).

Trust me, they would not deceive us like that.

Except they are by no means "always right," and neither they nor any site that I can recall has ever been right about benchmarks on an unreleased chip months ahead of its introduction. What you mention are things that came out not very long before the MDDs were released, and everyone knew an Xserve-like Power Mac was due any day.

These are fake, fake, fake. Total waste of time.
 