NVIDIA’s GeForce GT 220: 40nm and DX10.1 for the Low-End
by Ryan Smith on October 12, 2009 6:00 AM EST - Posted in GPUs
DirectX 10.1 on an NVIDIA GPU?
Easily the most interesting thing about the GT 220 and G 210 is that they mark the introduction of DirectX 10.1 functionality on an NVIDIA GPU. It's no secret that NVIDIA does not take a particular interest in DX10.1, and even with this release they still don't. But for these new low-end parts, NVIDIA had a special problem: OEMs.
OEMs like spec sheets. They want parts that conform to certain features so that they can in turn use those features to sell the product to consumers. OEMs don’t want to sell a product with “only” DX10.0 support if their rivals are using DX10.1 parts. Which in turn means that at some point NVIDIA would need to add DX10.1 functionality, or risk losing out on lucrative OEM contracts.
This is compounded by the fact that while Fermi has bypassed DX10.1 entirely for the high-end, Fermi’s low-end offspring are still some time away. Meanwhile AMD will be shipping their low-end DX11 parts in the first half of next year.
So why do the GT 220 and G 210 have DX10.1 functionality? To satisfy the OEMs, and that's about it. NVIDIA's focus is still on DX10 and DX11. DX10.1 functionality was easy to add to the GT200-derived architecture (bear in mind that GT200 already had some DX10.1 functionality), and so it was done for the OEMs. NVIDIA has also mentioned a desire not to be dinged by reviewers and forum-goers for lacking the feature, but we're having a hard time buying that they care about either group nearly as much as they care about what the OEMs think when it comes to this class of parts.
DX10.1 in a nutshell, as seen in our Radeon 3870 Review
At any rate, while we don’t normally benchmark with DX10.1 functionality enabled, we did so today to make sure DX10.1 was working as it should be. Below are our Battleforge results, using DX10 and DX10.1 with Very High SSAO enabled.
The ultimate proof that DX10.1 is a checkbox feature here is performance. Certainly DX10.1 is a faster way to implement certain effects, but running them in the first place still comes at a significant performance penalty. Hardware of this class is simply too slow to make meaningful use of the DX10.1 content that’s out there at this point.
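For readers who want to see what a "DX10.1 part" means at the API level, the check an application makes is whether the driver will hand back a Direct3D 10.1 device at feature level 10_1. The sketch below is our own illustration (not code from NVIDIA or from this review's test suite), with error handling trimmed to the essentials:

    #include <d3d10_1.h>
    #pragma comment(lib, "d3d10_1.lib")

    // Returns true if the installed GPU/driver will create a hardware
    // Direct3D 10.1 device - the capability the GT 220 now advertises.
    bool SupportsD3D101()
    {
        ID3D10Device1* device = NULL;
        HRESULT hr = D3D10CreateDevice1(
            NULL,                         // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,   // real hardware, not the reference rasterizer
            NULL,                         // no software module
            0,                            // no creation flags
            D3D10_FEATURE_LEVEL_10_1,     // the level we are probing for
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr) && device)
        {
            device->Release();            // we only wanted the yes/no answer
            return true;
        }
        return false;
    }

On a DX10.0-only part this call fails, and an application falls back to creating a 10.0 device instead.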
80 Comments
abs0lut3 - Tuesday, October 13, 2009 - link
When is the GT 240 coming out, and when are you going to review it? I had expected the GT 220 to be as low as it comes (reaaallly low end); however, I saw some preliminary reviews on other forums of the GT 240, the supposedly new NVIDIA 40nm mainstream card with GDDR5, and I'm quite fascinated by the results.

MegaSteve - Tuesday, October 20, 2009 - link
No one is going to buy one of these cards by choice - they are going to be thrown into HP, Dell and Acer PCs under a pretty sticker saying they have POWERFUL GRAPHICS or some other garbage. Much the same as them providing 6600 graphics cards instead of 6600GTs; then again, I would probably rather have a 6600GT, because if the first DirectX 10 cards released were any indication, this thing will suck. I am sure this thing will play Blu-ray...

Deanjo - Tuesday, October 13, 2009 - link
"NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers"Not true, they have not enabled it in their Windows drivers.
They are enabled in the linux drivers for a little while now.
ftp://download.nvidia.com/XFree86/Linux-x86_64/190...">ftp://download.nvidia.com/XFree86/Linux-x86_64/190...
VDP_DECODER_PROFILE_MPEG4_PART2_SP, VDP_DECODER_PROFILE_MPEG4_PART2_ASP, VDP_DECODER_PROFILE_DIVX4_QMOBILE, VDP_DECODER_PROFILE_DIVX4_MOBILE, VDP_DECODER_PROFILE_DIVX4_HOME_THEATER, VDP_DECODER_PROFILE_DIVX4_HD_1080P, VDP_DECODER_PROFILE_DIVX5_QMOBILE, VDP_DECODER_PROFILE_DIVX5_MOBILE, VDP_DECODER_PROFILE_DIVX5_HOME_THEATER, VDP_DECODER_PROFILE_DIVX5_HD_1080P

* Complete acceleration.
* Minimum width or height: 3 macroblocks (48 pixels).
* Maximum width or height: 128 macroblocks (2048 pixels).
* Maximum macroblocks: 8192
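For anyone wanting to verify this on their own system, VDPAU exposes exactly this query. The sketch below is our illustration rather than part of Deanjo's post; it assumes the stock VDPAU and Xlib development headers and keeps error handling minimal:

    #include <stdint.h>
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau.h>
    #include <vdpau/vdpau_x11.h>

    int main()
    {
        Display* dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        // Create a VDPAU device on the default screen; get_proc is the
        // lookup used to fetch every other VDPAU entry point.
        VdpDevice device;
        VdpGetProcAddress* get_proc = NULL;
        if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device, &get_proc)
                != VDP_STATUS_OK)
            return 1;

        VdpDecoderQueryCapabilities* query = NULL;
        get_proc(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void**)&query);
        if (!query)
            return 1;

        // Ask whether the driver can decode MPEG-4 Part 2 ASP, and at what limits.
        VdpBool supported = VDP_FALSE;
        uint32_t max_level = 0, max_mbs = 0, max_w = 0, max_h = 0;
        query(device, VDP_DECODER_PROFILE_MPEG4_PART2_ASP,
              &supported, &max_level, &max_mbs, &max_w, &max_h);

        printf("MPEG-4 ASP decode: %s (up to %ux%u)\n",
               supported ? "yes" : "no", max_w, max_h);
        return 0;
    }

Build with something like g++ check_asp.cpp -lvdpau -lX11; on drivers that expose the MPEG-4 Part 2 profiles, supported comes back VDP_TRUE along with the size limits quoted above.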
Deanjo - Tuesday, October 13, 2009 - link
I should also mention that XBMC already supports this in Linux.

Transisto - Tuesday, October 13, 2009 - link
zzzzzzzzzzzzzzzzz...............

Souleet - Monday, October 12, 2009 - link
I guess the only place that's actually selling Palit cards right now is Newegg. http://www.newegg.com/Product/ProductList.aspx?Sub...

MODEL3 - Monday, October 12, 2009 - link
Great prices, lol (either they have old 55nm stock or the 40nm yields are bad or they are crazy, possibly the first)

Some minor corrections:
G 210 ROPs should be 4, not 8 (8 should be the texture units; the GT 220 should have 8 ROPs and 16 texture units).
http://www.tomshardware.co.uk/geforce-gt-220,revie...
(Not because Tom's Hardware is saying so, but because otherwise it doesn't make sense for NV's architects to design such a bandwidth-limited GPU, and based on past architecture design logic - see the quick bandwidth arithmetic after this list)
G 210 standard config GPU core clock is 589MHz, shaders 1402MHz.
(check Nvidia's partner sites)
9600GSO (G94) Memory Bus Width is 256-bit, not 128-bit.
http://www.nvidia.com/object/product_geforce_9600_...
58W should be the figure NV is giving when the GT 220 is paired with GDDR3; with DDR3 the power consumption should be a lot less.
Example for GDDR3 vs DDR3 power consumption:
http://www.techpowerup.com/reviews/Palit/GeForce_G...
http://www.techpowerup.com/reviews/Zotac/GeForce_G...
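To make MODEL3's bandwidth argument concrete (the clocks here are illustrative assumptions, since board vendors shipped several memory speeds): peak memory bandwidth is simply bus width in bytes times the effective transfer rate. A 128-bit (16-byte) bus running its memory at 790MHz (1,580MT/s effective) moves 16 × 1.58 ≈ 25.3GB/s, while a 256-bit part like the 9600GSO at 900MHz (1,800MT/s) moves 32 × 1.8 ≈ 57.6GB/s - which is why bus width and ROP count need to be read together when judging whether a configuration makes sense.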
Souleet - Monday, October 12, 2009 - link
I'm sure there is a cooling solution, but it will probably hurt your wallet. I love ATI, but they need to fire their marketing team and hire some more creative people. NVIDIA needs to stop underestimating ATI and crush them; right now they are just giving ATI a chance to steal some market share back.

Zool - Monday, October 12, 2009 - link
It's 40nm and has only 48 SPs and 8 ROPs/16 TMUs, and still only a 1360MHz shader clock. Is TSMC's 40nm this bad, or what? The 55nm, 128-SP GTS 250 has 1800MHz shaders. Could you please try out some overclocking?
Ryan Smith - Tuesday, October 13, 2009 - link
We've seen vendor overclocked cards as high as 720MHz core, 1566MHz shader, so the manufacturing process isn't the problem. There are specific power and thermal limits NVIDIA wanted to hit, which is why it's clocked where it is.