NVIDIA's Fermi: Architected for Tesla, 3 Billion Transistors in 2010
by Anand Lal Shimpi on September 30, 2009 12:00 AM EST - Posted in GPUs
Final Words
Today's launch is strange. I tried to convince NVIDIA to release more information about Fermi but was met with staunch resistance from the company. NVIDIA claims that pre-announcing Fermi's performance levels would seriously hurt its existing business. It's up to you whether or not you want to believe that.
Last quarter the Tesla business unit made $10M. That's not a whole lot of money for a company that, at its peak, grossed $1B in a single quarter. NVIDIA believes that Fermi is when that will all change. To borrow a horrendously overused phrase, Fermi is the inflection point for NVIDIA's Tesla sales.
By adding ECC support, enabling C++, and making Visual Studio integration easier, NVIDIA believes Fermi will open its Tesla business up to a group of clients that previously wouldn't so much as speak to NVIDIA. ECC is the killer feature there.
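To give a sense of what "enabling C++" means in practice, here is a minimal sketch of device-side C++ (a class with a constructor and member functions, instantiated inside a kernel) of the sort Fermi-class, compute capability 2.0 hardware is built to run. The Accumulator class and sum_kernel below are hypothetical illustrations of the capability, not NVIDIA sample code or an actual Fermi benchmark.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Hypothetical illustration: a small C++ class with a constructor and a
// member function used directly inside device code.
struct Accumulator {
    float sum;
    __device__ Accumulator() : sum(0.0f) {}
    __device__ void add(float v) { sum += v; }
};

__global__ void sum_kernel(const float* data, float* out, int n) {
    Accumulator acc;  // per-thread object constructed in device code
    int idx    = blockIdx.x * blockDim.x + threadIdx.x;
    int stride = gridDim.x * blockDim.x;
    for (int i = idx; i < n; i += stride)
        acc.add(data[i]);
    atomicAdd(out, acc.sum);  // float atomicAdd requires compute capability >= 2.0
}

int main() {
    const int n = 1 << 20;
    std::vector<float> h_data(n, 1.0f);   // n ones, so the expected sum is n

    float *d_data, *d_out;
    cudaMalloc((void**)&d_data, n * sizeof(float));
    cudaMalloc((void**)&d_out, sizeof(float));
    cudaMemcpy(d_data, h_data.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemset(d_out, 0, sizeof(float));

    sum_kernel<<<256, 256>>>(d_data, d_out, n);

    float result = 0.0f;
    cudaMemcpy(&result, d_out, sizeof(float), cudaMemcpyDeviceToHost);
    printf("sum = %.1f (expected %d)\n", result, n);

    cudaFree(d_data);
    cudaFree(d_out);
    return 0;
}

The float atomicAdd used to combine per-thread results is itself a compute capability 2.0 feature; Fermi's fuller C++ support extends to things like virtual functions and device-side new/delete, which are left out here to keep the sketch short.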
While the bulk of NVIDIA's revenue today comes from 3D graphics, NVIDIA believes that Tegra (mobile) and Tesla are the future growth segments for the company. This hints at a very troubling future for GPU makers - are we soon approaching the Atom-ization of graphics cards?
Will 2010 be the beginning of good enough performance in PC games? Display resolutions have pretty much stagnated, and PC games are now developed first on consoles, whose inferior hardware keeps GPU requirements from climbing. The fact that NVIDIA is looking to Tegra and Tesla to grow the company is very telling. Then again, perhaps a brand new approach to graphics is what we'll need to re-invigorate PC game development. Larrabee.
If the TAM for GPUs in HPC is so big, why did NVIDIA make only $10M last quarter? If you ask NVIDIA, it comes down to focus and sales.
According to NVIDIA, its Tesla sales efforts over the past couple of years have been scattered. The focus was on selling to any customer that could potentially see a speedup, in an attempt to gain some traction for the Tesla business.
Jen-Hsun did some yelling and now NVIDIA is a bit more focused in that department. If Tesla revenues increase only linearly from this point, that's simply not going to be enough. I asked NVIDIA whether exponential growth for Tesla was in the cards and, if so, when it would happen. The answer was yes, and with Fermi.
We'll see how that plays out, but if Fermi doesn't significantly increase Tesla revenues then we know that NVIDIA is in serious trouble.
The architecture looks good; Fermi just needs to be priced right. Oh, and the chip needs to hurry up and come out.
415 Comments
palladium - Monday, October 5, 2009 - link
Not quite: http://www.dailytech.com/article.aspx?newsid=16410
Scroll down halfway thru the comments. He re-registered as SilicconDoc and barks about his hatred for red roosters (in an Apple-related article!)
johnsonx - Monday, October 5, 2009 - link
that looks more like someone mocking him
- Sunday, October 4, 2009 - link
According to this very link http://www.anandtech.com/video/showdoc.aspx?i=3573... AMD already presented WORKING SILICON at Computex roughly 4 months ago, on June 3rd. So it took roughly 4 and a half months to prepare drivers, infrastructure and mass production to have enough for the start of Windows 7 and DX11. However, Nvidia wasn't even talking about W7 and DX11, so late Q1 2010 or even later becomes more realistic than December. But there are many more questions ahead: what price point, clock rates and TDP? My impression is that Nvidia has no clue about these questions, and the more I watch this development, the more Fermi resembles the Voodoo5 chip and the V6000 card, which never made it to market because of its much too high TDP.
silverblue - Sunday, October 4, 2009 - link
Nah, I expect nVidia to do everything they can to get this into retail channels because it's the culmination of a lot of hard work. I also expect it to be a monster, but I'm still curious as to how they're going to sort out mainstream options due to their top-down philosophy.
That's not to say ATI's idea of a mid-range card that scales up and down doesn't have its flaws, but with both the 4800 and 5800 series, there's been a card out at the start with a bona fide GPU with nothing disabled (4850, and now 5870), along with a cheaper counterpart with slower RAM and a slightly handicapped core (4830/5850). Higher spec single GPU versions will most likely just benefit from more and/or faster RAM and/or a higher core clock, but the architecture of the core itself will probably be unchanged - can nVidia afford to release a competing version of Fermi without disabling parts of the core? If it's as powerful as we're led to believe, it will certainly warrant a higher price tag than the 5870.
Ahmed0 - Saturday, October 3, 2009 - link
Nvidia wants it to be the jack of all trades. However, they risk being an overpriced master of none. That's probably the reason they give their cards more and more gimmicks to play with each year. They are hoping that the card's value will be greater than the sum of its parts. And that might even be a successful strategy to some extent. In a consumerist world, reputation is everything. They might start overdoing it at some point though.
It's like mobile phones nowadays. You really don't need to have a radio, an mp3 player, a camera or other such extras in it (in fact, my phone isn't able to do anything but call and send messages). But unless you have these features, you aren't considered competition. It gives you the opportunity to call your product "vastly superior" even though from a usability standpoint it isn't.
SymphonyX7 - Saturday, October 3, 2009 - link
Ahh... I see where you're coming from. I've had many classmates who've asked me what laptop to buy, and they're always so giddy when they see laptops with the "Geforce" sticker and say they want it because they want some casual gaming. Yes, even if the GPU is a Geforce 9100M. I recommended them laptops using AMD's Puma platform and many of them asked if that's a good choice (unfortunately here, only the Macbook has a 9400M GPU and it's still outside many of my classmates' budgets). It seems like brand awareness of Nvidia amongst many consumers is still much better than AMD/ATI's. So it's an issue of clever branding then?
Lifted - Saturday, October 3, 2009 - link
A little late for any meaningful discussion over here as AT let the trolls go for 40 or so pages. I doubt many people can be arsed to sort through it now, so you'd be better off going to a forum for a real discussion of Fermi.neomocos - Saturday, October 3, 2009 - link
If you missed it, then here you go ... happy day for all of us. Quote from a comment posted on page 37 by Pastuch:
" Below is an email I got from Anand. Thanks so much for this wonderful site.
-------------------------------------------------------------------
Thank you for your email. SiliconDoc has been banned and we're accelerating the rollout of our new comments rating/reporting system as a result of him and a few other bad apples lately.
A- "
james jwb - Saturday, October 3, 2009 - link
Some may enjoy it, but this unusual freedom that blatant trolls using aggressive, rude language are getting lately is making a mockery of this site. I don't mind it going on for a while, even 20 pages tbh, it is funny, but at some point I'd like to see a message from Gary saying, "K, SiliconDoc, we've laughed enough at your drivel, tchau, banned! :)"
That's what I want to see after reading through 380 bloody comments, not that he's pretty much gotten away with it. And if he has finally been banned, I'd actually love to know about it in the comments section.
/Rant over.
Gary Key - Monday, October 5, 2009 - link
He is gone as are a couple of others. We have a new comments system in final development now that should take care of this problem in the future.