
doctoree

I have to tell you something else first:

There is a tech company called ElcomSoft; they make state-of-the-art tools for the computer forensics market, which includes recovering the Wi-Fi passwords of encrypted hotspots.
With the release of the latest version of this software they also put out a press release stating that they managed to make their software take advantage of modern GPUs for its brute-force approach to decryption. They state that this improvement makes getting the job done up to 100x faster.

Now, you might ask what this has to do with Snow Leopard? Well, if there is ONE computing task which has nothing to do with graphics, it is brute-force decryption of Wi-Fi passwords. And if this cool Russian company manages to pull this off, I think Apple can pull it off as well.
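
(To see why brute force maps so well to a GPU: every candidate key can be tested independently of every other, so each GPU thread can simply try one candidate. Below is a minimal, hypothetical CUDA sketch of that structure; the toy hash function and the integer keyspace are my own placeholders, not ElcomSoft's actual method.)

```cpp
// Sketch: brute-force keyspace search on a GPU. Each thread tests one
// candidate key; no thread depends on any other, which is exactly the
// kind of work GPUs are built for. The "cipher" is a toy placeholder.
#include <cstdio>
#include <cuda_runtime.h>

// Toy stand-in for a real decryption test (NOT a real WPA check).
__host__ __device__ unsigned int toyHash(unsigned int k)
{
    k ^= k >> 16;
    k *= 0x45d9f3b;
    k ^= k >> 16;
    return k;
}

__global__ void bruteForce(unsigned int start, unsigned int target,
                           unsigned int *found)
{
    unsigned int candidate = start + blockIdx.x * blockDim.x + threadIdx.x;
    if (toyHash(candidate) == target)
        *found = candidate;            // report the matching key
}

int main()
{
    unsigned int *found;
    cudaMallocManaged(&found, sizeof(unsigned int));
    *found = 0xFFFFFFFF;

    unsigned int target = toyHash(123456);   // pretend 123456 is secret

    // Launch ~1 million independent trials in one shot.
    bruteForce<<<4096, 256>>>(0, target, found);
    cudaDeviceSynchronize();

    printf("recovered key: %u\n", *found);
    cudaFree(found);
    return 0;
}
```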

Snow Leopard may make even low-end 2006 MBPs suddenly MUCH faster than even the latest Mac Pro WITHOUT Snow Leopard. This would give Apple another huge advantage over Windows, as MS isn't even planning a comparable solution for their next OS, "Windows 7".

I'm really looking forward to January!

Doc

Edit:
Only recently a new patent application by Apple was discovered. In this application they describe something like a "performance box" into which the Mac (with Snow Leopard installed) "throws" the available performance of the CPU as well as the GPU. What this means is that third-party developers won't even have to alter their apps to take advantage of the GPU, because the performance is just there. So even writing a Word document would "automatically" take advantage of the GPU.
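
(If the patent works the way the article suggests, the core idea is a runtime that accepts work and decides by itself whether a capable GPU is present, falling back to the CPU otherwise. Purely as an illustration of that dispatch idea, here is a small CUDA sketch; scaleAnywhere and everything in it is my own invention, not Apple's design or API.)

```cpp
// Sketch of a "performance box": callers submit work without choosing
// a device; the runtime uses the GPU if one exists, else the CPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scaleKernel(float *data, int n, float s)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= s;
}

void scaleAnywhere(float *data, int n, float s)
{
    int devices = 0;
    if (cudaGetDeviceCount(&devices) == cudaSuccess && devices > 0) {
        float *d;
        cudaMalloc(&d, n * sizeof(float));
        cudaMemcpy(d, data, n * sizeof(float), cudaMemcpyHostToDevice);
        scaleKernel<<<(n + 255) / 256, 256>>>(d, n, s);
        cudaMemcpy(data, d, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(d);
    } else {
        for (int i = 0; i < n; ++i)    // CPU fallback
            data[i] *= s;
    }
}

int main()
{
    float v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    scaleAnywhere(v, 8, 10.0f);        // caller never picks a device
    for (float x : v) printf("%g ", x);
    printf("\n");
    return 0;
}
```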

Edit 2:
Check out the third page of this thread to see what Nvidia officials said about the possibilities:
"seeing up to a 20-200x speed-up in their applications with CUDA over a CPU."

Edit 3:
http://www.physorg.com/news146247669.html "Mathematica Users Get 100x Performance Boost From NVIDIA CUDA"
 
It isn't news that Snow Leopard will be streamlined by using the GPU for other tasks too.
 
Now, you might ask what this has to do with Snow Leopard? Well, if there is ONE computing task which has nothing to do with graphics, it is brute-force decryption of Wi-Fi passwords. And if this cool Russian company manages to pull this off, I think Apple can pull it off as well.

Actually, if there is ONE computing task which is EXTREMELY similar to graphics, it is encryption/decryption. Both are very well suited to the GPU's computing model and stand to gain greatly from GPU acceleration.

iWork won't be sped up by 100x. Neither will the Finder. Time Machine? Nope. Xcode? Eh, no.
 
It isn't news that Snow Leopard will be streamlined by using the GPU for other tasks too.

You are right, but it's new that the speed-up may be so huuugge. And I previously also thought that third-party apps would have to be optimised for this, but they don't.
I'm amazed.

Actually, if there is ONE computing task which is EXTREMELY similar to graphics, it is encryption/decryption. Both are very well suited to the GPU's computing model and stand to gain greatly from GPU acceleration.

iWork won't be sped up by 100x. Neither will the Finder. Time Machine? Nope. Xcode? Eh, no.

I'm not an expert in decryption. But brute force is simply trying out all possible character combinations until you hit the right one. I can't see anything graphical there...

Edit:
Of course I don't expect things like Time Machine to be sped up, because there is the bottleneck of hard drives and other factors. I'm just talking about sheer raw performance, like rendering in Maya for example.
 
I'm not an expert in decryption. But brute force is simply trying out all possible character combinations until you hit the right one. I can't see anything graphical there...

What does graphics have to do with it? GPUs are VERY beastly at certain calculations, and if that type of calculation is supported by the GPU... watch out.
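
(Concretely, the "certain calculations" are ones that split into thousands of identical, independent operations. The textbook case is whole-array arithmetic like the SAXPY sketch below; this is a generic CUDA example of my own, not tied to any product in this thread.)

```cpp
// SAXPY (y = a*x + y) over a large array: every element is independent,
// so thousands of GPU threads can each handle one element at once.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;             // ~1 million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %g (expected 4)\n", y[0]);
    cudaFree(x); cudaFree(y);
    return 0;
}
```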
 
Edit:
Of course I don't expect things like Time Machine to be sped up, because there is the bottleneck of hard drives and other factors. I'm just talking about sheer raw performance, like rendering in Maya for example.

Well, then that would not make Macs 100x faster. That can make any system doing specific operations faster, not just the Mac. Of course, Snow Leopard is limited to the Mac, but GPU calculations have been happening for some time on many platforms. It's all dependent on the calculation at hand and what GPU is installed.
 
What does graphics have to do with it? GPUs are VERY beastly at certain calculations, and if that type of calculation is supported by the GPU... watch out.

What I'm trying to say is just that ElcomSoft proved that GPUs can also speed up tasks which don't have anything to do with graphics at all. And that the speed-up is incredible.
 
What I'm trying to say is just that ElcomSoft proved that GPUs can also speed up tasks which don't have anything to do with graphics at all. And that the speed-up is incredible.

Again, this has been happening for a few years now, so this company isn't proving anything new to anyone. Take a look at Stanford's Folding@Home, and you'll see they've been passing non-graphical calculations off to the GPU as well, at least since 2005 or 2006.
 
Well, then that would not make Macs 100x faster. That can make any system doing specific operations faster, not just the Mac. Of course, Snow Leopard is limited to the Mac, but GPU calculations have been happening for some time on many platforms. It's all dependent on the calculation at hand and what GPU is installed.

But the great thing is that if my theory is right, for the first time this wouldn't be "dependent on the calculation at hand and what GPU is installed" anymore but systemwide instead!
 
But the great thing is that if my theory is right, for the first time this wouldn't be "dependent on the calculation at hand and what GPU is installed" anymore but systemwide instead!

I believe your theory won't hold up, because currently, you need specific GPUs to offload these calculations. You cannot just throw any old GPU at it and get great benefits.
 
What I'm trying to say is just that ElcomSoft proved that GPUs can also speed up tasks which don't have anything to do with graphics at all. And that the speed-up is incredible.

No one disagrees with you. A few small niche applications will benefit remarkably from this (when the app does a kind of computation at which GPUs are fast and the GPU resources are relatively idle). Other apps will see no benefit at all (you won't see much benefit in games, for instance, where the GPU is already being driven as hard as possible to make the graphics themselves work).

But unless what you do is break passwords, the improvement from GPU usage won't be that dramatic very often.

Still, all in all, it's a neat trick, it makes sense (in that GPUs are increasingly powerful and are often sitting relatively idle) and it looks like OS X will implement it fairly well.

P.S. Isn't it kind of sad that you researched this company so heavily, have such high regard for them, but can't even spell "Russian" correctly to describe their national origin? Or will Snow Leopard dramatically improve spell checking performance also?
 
I believe your theory won't hold up, because currently, you need specific GPUs to offload these calculations. You cannot just throw any old GPU at it and get great benefits.

You are only partly right. It doesn't work with older GPUs because they aren't very programmable, but you don't need "special hardware". It works with normal off-the-shelf hardware from Nvidia and probably also ATI. Consumer hardware.
 
You are only partly right. It doesn't work with older GPUs because they aren't very programmable, but you don't need "special hardware". It works with normal off-the-shelf hardware from Nvidia and probably also ATI. Consumer hardware.

I know that, and I am fully correct. I didn't say you needed specialized hardware. But, again, it may only work on certain GPUs, like Nvidia's, and maybe only a certain subset of those GPUs.
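
(This is easy to check in practice: GPU-compute runtimes query each installed device's capabilities and skip hardware that isn't programmable enough. A generic CUDA sketch of that check, my own example:)

```cpp
// List the installed GPUs and their "compute capability"; chips from
// the fixed-function era either don't appear or report versions too
// low for general-purpose compute.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("no CUDA-capable GPU found\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, i);
        printf("device %d: %s, compute capability %d.%d\n",
               i, p.name, p.major, p.minor);
    }
    return 0;
}
```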
 
No one disagrees with you. A few small niche applications will benefit remarkably from this (when the app does a kind of computation at which GPUs are fast and the GPU resources are relatively idle). Other apps will see no benefit at all (you won't see much benefit in games, for instance, where the GPU is already being driven as hard as possible to make the graphics themselves work).

But unless what you do is break passwords, the improvement from GPU usage won't be that dramatic very often.

Still, all in all, it's a neat trick, it makes sense (in that GPUs are increasingly powerful and are often sitting relatively idle) and it looks like OS X will implement it fairly well.

P.S. Isn't it kind of sad that you researched this company so heavily, have such high regard for them, but can't even spell "Russian" correctly to describe their national origin? Or will Snow Leopard dramatically improve spell checking performance also?

My mother tongue is not English, and additionally I'm very tired right now. So please excuse me. But back to the topic: I think you should take a look at the patent application I just mentioned in my first post:
https://www.macrumors.com/2008/10/24/apple-patent-provides-peek-at-snow-leopard-technologies/
As you can see, Apple plans to put everything into one box, making the power of the GPU approachable by all apps, not only optimised ones (like games, After Effects, or Maya previously).

To stop making more writing errors, I will go to bed now and come back later.

Doc
 
My mother tongue is not English, and additionally I'm very tired right now. So please excuse me. But back to the topic: I think you should take a look at the patent application I just mentioned in my first post:
https://www.macrumors.com/2008/10/24/apple-patent-provides-peek-at-snow-leopard-technologies/
As you can see, Apple plans to put everything into one box, making the power of the GPU approachable by all apps, not only optimised ones (like games, After Effects, or Maya previously).

To stop making more writing errors, I will go to bed now and come back later.

Doc

You can't just offload any old instruction to the GPU; otherwise GPUs would simply replace the CPU.
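
(A tiny counter-example of my own: in the loop below, each step needs the previous step's result, so there are no independent pieces to spread across thousands of GPU threads. Work like this stays on the CPU no matter how clever the OS dispatcher is.)

```cpp
// Inherently serial work: a loop-carried dependency. Step i cannot
// start until step i-1 has finished, so a GPU gains nothing here.
#include <cstdio>

int main()
{
    double x = 0.5;
    for (int i = 0; i < 1000; ++i)
        x = 4.0 * x * (1.0 - x);       // needs the previous x
    printf("x = %f\n", x);
    return 0;
}
```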
 
For the first time we have a figure: a 100x speed-up possible in non-graphical, non-optimised apps across the whole OS!

Please stop saying that. It's not accurate, as I stated above. There's a reason there's a CPU and a GPU. The GPU is not replacing the CPU entirely.
 
Please stop saying that. It's not accurate, as I stated above. There's a reason there's a CPU and a GPU. The GPU is not replacing the CPU entirely.

Yes, right. We don't "have a figure" "across the whole OS." We have a figure for one specialized kind of application. Most applications aren't doing things like this.

It's still a very interesting technology and I think it'll end up being used a fair amount when all is said and done, and it will make computers faster without having to upgrade anything.

P.S. Okay, Doctoree, sorry about insulting your English. Go to sleep and come back and talk about it tomorrow. ;)
 
Yes, right. We don't "have a figure" "across the whole OS." We have a figure for one specialized kind of application. Most applications aren't doing things like this.

It's still a very interesting technology and I think it'll end up being used a fair amount when all is said and done, and it will make computers faster without having to upgrade anything.

P.S. Okay, Doctoree, sorry about insulting your English. Go to sleep and come back and talk about it tomorrow. ;)

which is strongly limited by what kind of graphics chip you have, so there won't be a uniform percentage increase on all machines to the same extent
 
Russians also figured out a way to expand our 200GB hard drives to 200TB. This technology will be in Snow Leopard!!!
 
This isn't exactly new. You know where else this works? Folding@Home, as mentioned earlier. People who want to do serious folding use a GTX 280, so yeah, this stuff already works. You don't even NEED the OS to hand work off to the GPU; in fact, it would be better if it didn't, because not all applications will benefit.

Some mathematical number crunching can benefit, and you can run a PPU on an Nvidia video card if you have two in your system. Either way, shush, you're giving people the wrong idea about what this will be capable of.
 