Nice leap to assume this makes the GPU bigger. A company that has spent seven years trying to bring this to mobile GPUs didn't just say "oh, screw it, make the chip bigger and use more resources". They could have done that seven years ago.

The diagram shows that it's an existing Rogue product with RT hardware added. You don't get something for nothing.

This is the first commercially viable hardware implementation of RT hardware. So no, they couldn't.

They can do that now with stencil shadows.

Of course, ray tracing can do it far more accurately and produce a wider variety of effects with less memory overhead, since it more or less emulates radiosity/photon bounces. But it has its costs. Unlike rasterization, ray tracing performance doesn't scale well to higher resolutions. Even assuming they can make a mobile GPU powerful enough to do it, a ray-traced iPad game running at 2048x1536 would probably eat through your battery in about five minutes.

That's the major reason why ray tracing hasn't replaced rasterization just yet. It'd be nice to build models whose surface reflectivity is accurately determined by light bouncing off the material, as opposed to a shader that convincingly fakes it (and requires considerably more video memory to do so), but you'd have a hard time rendering that at even current HD resolutions while maintaining a good framerate, let alone on retina displays.
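
Some rough back-of-the-envelope math shows how quickly the ray budget balloons at that resolution. The per-pixel ray and sample counts below are my own illustrative assumptions, not numbers from the article:

[CODE]
// Sketch: how the ray budget grows with resolution. The counts per
// pixel are illustrative assumptions, not Imagination's figures.
#include <cstdio>

int main() {
    const long long width = 2048, height = 1536;  // iPad retina, from the post above
    const long long samplesPerPixel = 4;          // assumed anti-aliasing samples
    const long long raysPerSample   = 3;          // assumed: primary + shadow + one bounce
    const long long fps             = 30;

    long long raysPerFrame  = width * height * samplesPerPixel * raysPerSample;
    long long raysPerSecond = raysPerFrame * fps;

    printf("rays per frame:  %lld million\n", raysPerFrame / 1000000);
    printf("rays per second: %lld million\n", raysPerSecond / 1000000);
    // ~37M rays/frame, ~1.1 billion rays/s: every added pixel multiplies
    // the whole tree of rays behind it, which is why resolution hurts.
    return 0;
}
[/CODE]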

As I told the other poster, I was talking about why you'd want better shadows, not about things RT can do that rasterization can't.
 
That article is a year old and says this technology can't come to mobile devices for another 4-5 years. Subtract a year because it's a year old, and we're looking at 3-4 years before this technology is available on mobile devices. If Apple aggressively pursues this, we might see it in Fall 2016.

It seems much more likely that Apple will add these things to the Mac Pro as a BTO option this year. Maybe the iMac or the MBP will get it as an option next year.

And what you're looking at is a first rev that was built using older tech. I'm sure it's only a matter of time before they miniaturize that card into something that could fit into an ITX PC, and then into a new Mac Pro.

Though as for it coming to mobile, I'd say it'll be on the outside of that 3-4 year development window. As the second article states, it doesn't require much power to do its thing, but relative to mobile it's still a massively power-hungry piece of tech. They'll have to make it not only small enough, but light enough on the battery, before it'll be viable for a phone or tablet. That'll probably take a bit.
 
You're joking, right? Ray tracing software came out for the Amiga, running a now-ancient Motorola 68000, way back in the late 1980s (see http://en.wikipedia.org/wiki/Sculpt_3D). Do you not remember "the juggler"? http://www.youtube.com/watch?v=-yJNGwIcLtw (story: http://home.comcast.net/~erniew/juggler.html ) It blew everyone's mind back then, and I can't believe there is something to be built into a phone (A PHONE!!!) to do it quickly, on portable battery power and mobile chips. WOW!

Dude, do your homework. I know what I'm talking about. Read the article that you provided: that animation was pre-rendered as still images and then assembled into a video.

I'm talking about real-time ray tracing. It's extremely hardware-intensive.
 
You're joking, right? Ray tracing software came out for the Amiga, running a now-ancient Motorola 68000, way back in the late 1980s (see http://en.wikipedia.org/wiki/Sculpt_3D). Do you not remember "the juggler"? http://www.youtube.com/watch?v=-yJNGwIcLtw (story: http://home.comcast.net/~erniew/juggler.html ) It blew everyone's mind back then, and I can't believe there is something to be built into a phone (A PHONE!!!) to do it quickly, on portable battery power and mobile chips. WOW!

You're mixing things up here. Ray Tracing has been around for decades, it is true. Pixar had a lab full of beefy computers that rendered every frame in Toy Story (the first one) using Ray Tracing. It took them several weeks to render the entire 90 minute movie.

This article is about real time ray tracing - a 90 minute movie should be rendered in 90 minutes. We're talking about going 40,000 times as fast. Moore's law says that making things run that much faster will take about 30 years. Toy Story was made 20 years ago, so that means they managed to beat Moore's law by several years, even if this technology isn't ready for another 3-5 years.

I have software on my computer that does Ray Tracing. It takes about 3 seconds to render a frame the size of my screen. They're talking about doing frames in 0.03 seconds, a 100x improvement.
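
For anyone who wants to check that arithmetic, here's a quick sketch. The one-doubling-every-two-years rate is my assumption for Moore's law, not a hard rule:

[CODE]
// Sanity-checking the numbers in this post, assuming one Moore's-law
// doubling every ~2 years (an assumption, not a law of nature).
#include <cstdio>
#include <cmath>

int main() {
    double speedup   = 40000.0;             // weeks of farm time -> 90 real-time minutes
    double doublings = std::log2(speedup);  // ~15.3 doublings needed
    double years     = doublings * 2.0;     // ~30.6 years at 2 years per doubling

    printf("doublings needed: %.1f\n", doublings);
    printf("years at 2 yr/doubling: %.1f\n", years);

    // And the smaller claim: 3 s/frame down to 0.03 s/frame.
    printf("frame-time speedup: %.0fx\n", 3.0 / 0.03);  // 100x
    return 0;
}
[/CODE]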
 
You're mixing things up here. Ray Tracing has been around for decades, it is true. Pixar had a lab full of beefy computers that rendered every frame in Toy Story (the first one) using Ray Tracing. It took them several weeks to render the entire 90 minute movie.

This article is about real time ray tracing - a 90 minute movie should be rendered in 90 minutes. We're talking about going 40,000 times as fast. Moore's law says that making things run that much faster will take about 30 years. Toy Story was made 20 years ago, so that means they managed to beat Moore's law by several years, even if this technology isn't ready for another 3-5 years.

I have software on my computer that does Ray Tracing. It takes about 3 seconds to render a frame the size of my screen. They're talking about doing frames in 0.03 seconds, a 100x improvement.

And 100x isn't unheard of with dedicated hardware. They claim the hardware does 300 million rays per second. Of course, there will still need to be limits on light sources and reflection bounces.
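
To put that 300 million figure in perspective, here's a rough sketch; the target resolution and framerate are my assumptions, not Imagination's:

[CODE]
// What 300 million rays/second buys per pixel, under assumed
// (not Imagination's) resolution and framerate targets.
#include <cstdio>

int main() {
    const double raysPerSecond = 300e6;           // Imagination's claimed figure
    const double pixels        = 1280.0 * 720.0;  // assumed 720p target
    const double fps           = 30.0;            // assumed framerate

    double raysPerFrame = raysPerSecond / fps;    // 10 million rays per frame
    double raysPerPixel = raysPerFrame / pixels;  // ~10.9 rays per pixel

    printf("rays per frame: %.0f\n", raysPerFrame);
    printf("rays per pixel: %.1f\n", raysPerPixel);
    // ~11 rays/pixel covers a few shadow or reflection rays on top of
    // rasterized shading; nowhere near enough for full path tracing.
    return 0;
}
[/CODE]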
 
I have software on my computer that does Ray Tracing. It takes about 3 seconds to render a frame the size of my screen. They're talking about doing frames in 0.03 seconds, a 100x improvement.

Well, it all depends on what you're doing and how detailed you're going. You have modelling programs that can do quick-and-dirty renders in a split second to give you an idea of how the final image will turn out, and the article I linked to above mentions the new version of Unity being able to spit out real-time global illumination as a work in progress. That's what this tech is designed for.

It'll eventually be leveraged to speed up rendering times on complex projects, and it'll certainly find more than a few uses for games. But for something like this...

[attached image: photoreal.jpg]


...it'll still take hours to render at high resolutions. We won't be seeing detail that fine in real-time ray tracing for years to come. Even Toy Story, which is comparatively simple, is still a ways from being renderable in real time via pure ray tracing.
 
This tells me you don't get the difference between Rasterization and Ray Tracing.

Of course I get the difference. That doesn't mean I find hardware support for ray tracing meaningful at this point. Not only do hybrid solutions suck because of their increased complexity (you need to implement both a rasteriser rendering path AND a ray-tracing rendering path), but the whole thing will very likely have massive problems with large and/or dynamic scenes. Just look at the source code of their samples. Ray tracing will become viable when we have really fast multiprocessors with full virtual resource support.
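
For illustration, here's a minimal structural sketch of my own (not from their samples) of what that dual-path complexity looks like. Every function here is a stub standing in for an entire subsystem you'd have to build and maintain twice:

[CODE]
// Structural sketch of a hybrid frame (stubs, not a real engine):
// a raster path and a ray-tracing path must both exist and agree.
struct Scene {};
struct GBuffer {};  // depth/normals/material output of the raster pass
struct Image {};

// Path 1: the classic rasteriser you still have to implement and maintain.
GBuffer rasterizeGeometry(const Scene&) { return {}; }

// Path 2: ray-traced shadows/reflections layered on the raster output.
Image traceShadowsAndReflections(const Scene&, const GBuffer&) { return {}; }

Image composite(const GBuffer&, const Image&) { return {}; }

Image renderFrame(const Scene& scene) {
    GBuffer gbuf = rasterizeGeometry(scene);                 // raster path
    Image rtFx   = traceShadowsAndReflections(scene, gbuf);  // RT path
    return composite(gbuf, rtFx);                            // merge the two
}

int main() {
    Scene scene;
    renderFrame(scene);  // every material/effect has to work in both paths
    return 0;
}
[/CODE]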
 
More useful than ray tracing would be a real-time global illumination engine.
 
I would say it's highly unrealistic that this is coming anytime soon, too (it takes my MacBook Pro 10 hours to render three minutes of ray-traced video in Maya), but our current graphics will continue to get better and better.

Seeing this post, I instantly thought about Lili, because it's one of the most beautiful games I've played on iOS. I just found out it's coming to Steam, and it's actually a pretty good example of how this new GPU would benefit a game like this, visually anyway.

Lili on iOS / upcoming Steam (PC) port:

[attached image: Lili_Steam_Edition_004.jpg]
 
You're mixing things up here. Ray Tracing has been around for decades, it is true. Pixar had a lab full of beefy computers that rendered every frame in Toy Story (the first one) using Ray Tracing. It took them several weeks to render the entire 90 minute movie.

That is not quite correct. Pixar uses their own render engine called PR-Man (often referred to by its API name, 'RenderMan'), which was originally developed by Ed Catmull, Rob Cook and Loren Carpenter. Back on Toy Story, PR-Man did not have ray-tracing capabilities yet. Instead it utilised the 'REYES' algorithm (an abbreviation for 'Renders Everything You Ever Saw'), which is very much scanline. It's only much later that Pixar started adding ray-tracing algorithms into PR-Man (Cars, I think it was... edit: Finding Nemo, more accurately), turning it into a hybrid engine, which many in the know will argue is what's causing its current decline in visual effects industry use.

There are a lot of misunderstandings and incorrect statements being thrown around in this thread regarding what Ray Tracing is, what it's not and where it's currently at. Ray Tracing is a vast field in CGI that is still enjoying a lot of research and development.
For those who'd like a better understanding, I urge you to read the article The Art of Rendering over on the FX-Guide website. It's considered by many in the industry as one of the better write-ups of the current state of the art.

One exciting example of current day "real-time" ray tracing is the 'Brigade' renderer, which is a Path Tracer. Path Tracing is but one of many ways to do ray tracing and in this case one of the most photographically accurate (albeit computationally also one of the most expensive) as it's a technique that naturally resolves Global Illumination, Depth of Field, Motion Blur, soft shadows, caustics, etc.
Here's an example of it in action in real time. As you can see, we've got a ways to go yet, but then again, as I said, path tracing is one of the most expensive ways to do ray tracing. There are many cheats and shortcuts that can be made to improve render times. So it's not quite as black and white as some here think it is.
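
To make the technique concrete, here's a toy single-pixel path tracer of my own (a sketch of the general approach, not Brigade's code). Note how global illumination and soft lighting fall out of the same "shoot random bounces and average" loop, and why the noise makes it so expensive:

[CODE]
// Toy path tracer for one pixel: a diffuse unit sphere under a constant
// sky. Each sample shoots a camera ray and lets it bounce randomly;
// averaging the samples estimates the pixel's radiance. All numbers
// here are arbitrary illustration values.
#include <cstdio>
#include <cmath>
#include <random>

struct Vec { double x, y, z; };
Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

// Ray vs. unit sphere at the origin; returns hit distance or -1.
double hitSphere(Vec o, Vec d) {
    double b = dot(o, d), c = dot(o, o) - 1.0, disc = b * b - c;
    if (disc < 0) return -1;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double sky = 1.0, albedo = 0.7;  // constant sky light, diffuse surface
    const int numSamples = 10000;

    double radiance = 0.0;
    for (int s = 0; s < numSamples; ++s) {
        Vec o = {0, 0, -3}, d = norm({0.1, 0.1, 1.0});  // one camera ray
        double throughput = 1.0;
        for (int bounce = 0; bounce < 16; ++bounce) {
            double t = hitSphere(o, d);
            if (t < 0) { radiance += throughput * sky; break; }  // escaped to the sky
            o = add(o, mul(d, t));
            Vec n = norm(o);  // surface normal of the unit sphere at the hit point
            // Russian roulette: randomly kill long paths without biasing the average.
            if (uni(rng) > 0.8) break;
            throughput /= 0.8;
            // Uniform hemisphere sample via rejection, weighted by BRDF*cos/pdf.
            Vec r;
            do { r = {2*uni(rng)-1, 2*uni(rng)-1, 2*uni(rng)-1}; } while (dot(r, r) > 1.0);
            d = norm(dot(r, n) > 0 ? r : mul(r, -1.0));
            throughput *= albedo * 2.0 * dot(d, n);  // (albedo/pi)*cos / (1/(2*pi))
        }
    }
    // Noise only shrinks with sqrt(numSamples), which is the cost problem.
    printf("estimated pixel radiance: %.3f\n", radiance / numSamples);
    return 0;
}
[/CODE]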
 
With all these posts about new mobile GPU chips, I'm thinking more and more that there won't be an Apple TV set or an Apple game console; instead, the iPhone will become the console when combined with the Apple TV.

Add this new GPU and a new 64-bit A8/A9 processor, combine them with AirPlay and an MFi game controller, and you already have a game console in the iPhone or iPad.

The next step will be more AAA titles for iPhone!
 
Man, how fast mobile has caught up is amazing. It won't be long before it reaches the state PCs are in, where each generation is only marginally better than the last (Sandy Bridge to Ivy Bridge, Kepler to Maxwell, etc.).
 
Amazing developments! In a year or two, who knows what they could do. A glasses-free 3D iPhone? Kinda like:

[attached image: The_Ultimate_Weapon.jpg]
 
Apple is preparing a new iOS 8 home screen with more perspective and lens flare effects. Do not worry, it will be magical. (Some people may suffer minor seasickness.)
 
That is not quite correct. Pixar uses their own render engine called PR-Man (often referred to by its API name, 'RenderMan'), which was originally developed by Ed Catmull, Rob Cook and Loren Carpenter. Back on Toy Story, PR-Man did not have ray-tracing capabilities yet. Instead it utilised the 'REYES' algorithm (an abbreviation for 'Renders Everything You Ever Saw'), which is very much scanline. It's only much later that Pixar started adding ray-tracing algorithms into PR-Man (Cars, I think it was... edit: Finding Nemo, more accurately), turning it into a hybrid engine, which many in the know will argue is what's causing its current decline in visual effects industry use.

There are a lot of misunderstandings and incorrect statements being thrown around in this thread regarding what Ray Tracing is, what it's not and where it's currently at. Ray Tracing is a vast field in CGI that is still enjoying a lot of research and development.
For those who'd like a better understanding, I urge you to read the article The Art of Rendering over on the FX-Guide website. It's considered by many in the industry as one of the better write-ups of the current state of the art.

One exciting example of current day "real-time" ray tracing is the 'Brigade' renderer, which is a Path Tracer. Path Tracing is but one of many ways to do ray tracing and in this case one of the most photographically accurate (albeit computationally also one of the most expensive) as it's a technique that naturally resolves Global Illumination, Depth of Field, Motion Blur, soft shadows, caustics, etc.
Here's an example of it in action in real time. As you can see, we've got a ways to go yet, but then again, as I said, path tracing is one of the most expensive ways to do ray tracing. There are many cheats and shortcuts that can be made to improve render times. So it's not quite as black and white as some here think it is.

Then you'll like this:

http://www.blendernation.com/2014/0...ndering-with-combined-rendering-technologies/

Volcano-Studios is experimenting with combining different rendering technologies like OpenGL, Blender Internal and Cycles. The result: impressive image quality and very fast rendering.

Cycles, of course, is the ray-tracing based renderer from Blender, with an interactive viewport preview.
 
Current GPUs can do real-time RT, but only with simple scenes. Here's a WebGL example that runs fine on my rMBP: https://www.shadertoy.com/view/4dsGRn

The technology Imagination has developed combines traditional rasterized graphics with specialized "shaders" that are more efficient at ray-tracing math. It seems like a clever way to produce some reflections and shadows without needing a massive amount of computing power.
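
To illustrate what one of those ray-traced effects buys you, here's a minimal shadow-ray sketch of my own (an illustration of the general idea, not Imagination's API). A single ray from a rasterized surface point toward the light answers the shadow question exactly, with none of the resolution artifacts of a shadow map:

[CODE]
// Sketch of a single shadow-ray test against one spherical occluder.
// The scene and numbers are made up for illustration.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

struct Sphere { Vec center; double radius; };

// Does the segment from p toward the light hit the occluder first?
bool inShadow(Vec p, Vec light, const Sphere& occluder) {
    Vec d = norm(sub(light, p));
    Vec o = sub(p, occluder.center);
    double b = dot(o, d), c = dot(o, o) - occluder.radius * occluder.radius;
    double disc = b * b - c;
    if (disc < 0) return false;               // ray misses the occluder entirely
    double t = -b - std::sqrt(disc);
    double tLight = std::sqrt(dot(sub(light, p), sub(light, p)));
    return t > 1e-4 && t < tLight;            // hit lies between point and light
}

int main() {
    Sphere blocker = {{0, 1, 0}, 0.5};
    Vec light = {0, 3, 0};
    printf("point under blocker: %s\n", inShadow({0, 0, 0}, light, blocker) ? "shadow" : "lit");
    printf("point off to the side: %s\n", inShadow({3, 0, 0}, light, blocker) ? "shadow" : "lit");
    return 0;
}
[/CODE]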
 
That'd be a nice addition, but what I really want, mainly for the iPad, is some type of pressure sensitivity along the lines of a Cintiq digitizer. Aside from that, maybe up the screen resolution again, even if it's only for a "Pro" model. Give it better cameras and up to 256GB of flash. I wouldn't even care if it was a tad thicker to make room for a larger battery to power all this. A 13-inch screen, a quad-core mobile chip, better graphics, more RAM, and all under $1599. Am I dreaming?
 
I don't see how this improves upon the iPhone. How many people have played a game and said to themselves, "Gee, this game would be SO much better with more realistic lighting and shadows."?

That's not exactly how Apple works, though, is it? They build products that give a great user experience, even if the user can't pick out the exact features that make it so great.
 
Dude, do your homework. I know what I'm talking about. Read the article that you provided: that animation was pre-rendered as still images and then assembled into a video.

I'm talking about real-time ray tracing. It's extremely hardware-intensive.

I did my homework, and I agree that RT can be extremely hardware-intensive. But who said it has to be maximum-quality RT for real-time photo-realistic renders? Even those old Amigas had in-between render modes where you could preview the animation before you had the system do the full RT (everything) scene. I recall that simple versions of the animation could be animated in real time, and that was on circa-1988 tech. Step forward 25+ years, and maybe we can get more complex versions of middle-ground RT renders that aren't Pixar-level (perfect) detail but are a big step forward from faking it with raster and vector algorithms.

I have zero expectations of pixel-perfect (maximum) RT renders in real time. BUT, even 25% of that would probably be a huge step forward versus the workarounds used to achieve something photo-realistic now. The article itself seems to key on lighting and shadows, with no mention of photo-realistic, Pixar-level scenes in real time. Maybe this will be a better way to put shadows and lighting-related effects in a scene?

I hope there's truth behind the rumor and that an incarnation of this is coming to iDevices. Even if it delivered only semi-accurate shadows, semi-accurate reflections, etc. via an RT algorithm, I bet "semi-accurate" would be superior to the faux-RT best efforts in raster and vector graphics animation employed now.

After all, if there is truth behind this rumor, why bother if it's not yielding some improvement over currently available solutions? I doubt the marketing spin of the words "Ray Trace" alone is enough, and if it were about that, they could probably spin those words with existing iDevice hardware*

*with the right software, just like Amigas did in 198X (without any dedicated RT hardware).
 