Hello all! Long time reader, first time poster - yadda yadda.
I have an early 2009 24" iMac (2.93 GHz model, with a GeForce GT 120 graphics card) running Mountain Lion.
A couple of months ago it started exhibiting some weird behavior when loading World of Warcraft. Namely, every time the game launches, it briefly "flashes" some random image (which I believe is still sitting in GPU RAM) right before the login screen comes up.
So, for example, if I'd been browsing in Mozilla and then decided to play, it'll flash a website I visited ten minutes earlier.
It's not affecting anything else (so far), and I have no weird lines on my screen or anything else that would suggest a faulty GPU, but I'm worried this may be the early stage of something worse. I did have a feeling that in-game performance dropped a little after this started happening, but that's hard to measure with the constant patches and expansions bringing graphical improvements.
For what it's worth, I did some CUDA programming for my graduation thesis some time before this started - could it be that some bad code never properly released its GPU RAM, or is this nothing to be worried about?
I also do not remember this ever happening in Snow Leopard (I skipped 10.7).