7900 GTX Shaders
03-11-2009, 04:57 PM
Does anyone here use GPU-Z? It's a simple but cool little utility that gives you information about your video card and GPU.
In particular, I was wondering about my own video card. If you look at that link, there is a screenshot of the utility. In the lower right, you will see two fields for 'Shader'.
When I use GPU-Z to check my video card (NVIDIA GeForce 7900 GTX), those two fields are blanked out... they are empty.
I was just wondering: is this normal for my card?
03-11-2009, 10:34 PM
I think that's just the speed of the shaders.
Mine is blank as well; I'm running a 4670.
03-12-2009, 10:35 AM
It's blank because NVIDIA GeForce cards before the 8xxx series have no shader clocks. My GeForce 6600, which is two years older than your 7900 GTX, also has no shader clock frequencies.
That only applies to DirectX 10 cards, which use stream processors as opposed to separate pipelines for the different parts of a scene.
Cards before the GeForce 8xxx / Radeon HD 2xxx series used separate pipelines to calculate the different parts of a particular scene. Once DirectX 10 came out, the card manufacturers started using what are called stream processors, which can calculate ANY part of a scene. The card automatically allocates more of them to the more demanding parts of a scene. (If a frame takes lots of vertex processing power, the card uses more stream processors for vertex processing, etc.) The stream processors run off a multiplier of the GPU core clock itself, which is why the "shader clock" is a separate number. This is a new feature in DirectX 10 cards and it is better explained here:
So what you're really doing when you overclock the shaders is changing the multiplier they run at. If you raise the whole core clock without increasing the shaders, you are lowering the multiplier at the same time; if you increase the shaders without raising the core clock, you are just increasing the multiplier.
The ATI/NVIDIA cards before DirectX 10 didn't use a separate multiplier for the shaders, which is why you don't see a value there.
Hope this clears things up.
Oh, and the ATI cards seem to have locked multipliers, which is why you wouldn't see it on them either.
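The core-clock/shader-clock relationship described above can be sketched with a bit of arithmetic. The example clocks below are illustrative assumptions loosely based on a stock DirectX 10 card (575 MHz core, 1350 MHz shader); they are not GPU-Z output.

```python
# Sketch of the shader multiplier idea: shader clock = core clock * multiplier,
# so the multiplier is just shader_mhz / core_mhz.
# Example numbers are assumptions for illustration, not measured values.

def shader_multiplier(core_mhz: float, shader_mhz: float) -> float:
    """Effective multiplier the shaders run at relative to the core."""
    return shader_mhz / core_mhz

core, shader = 575.0, 1350.0
stock = shader_multiplier(core, shader)

# Raising only the core clock while leaving the shader clock alone
# lowers the multiplier at the same time:
core_oc = shader_multiplier(625.0, shader)

# Raising only the shader clock just raises the multiplier:
shader_oc = shader_multiplier(core, 1500.0)

print(f"stock: {stock:.2f}, core OC: {core_oc:.2f}, shader OC: {shader_oc:.2f}")
```

On a pre-DirectX 10 card like the 7900 GTX there is no separate shader clock at all, so there is no multiplier to compute and GPU-Z leaves the field blank.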
03-12-2009, 09:19 PM
I was about to say... my card is DX10.1... lol
03-13-2009, 02:12 AM
But it supports DX10 too, plus there pretty much aren't any games out there that support DX10.1. Bleh, I should check the forums more often...