Intel's Larrabee Architecture Disclosure: A Calculated First Move
by Anand Lal Shimpi & Derek Wilson on August 4, 2008 12:00 AM EST - Posted in GPUs
The Future of Larrabee: The Many Core Era
I keep going back to this slide because it really tells us where Intel sees its architectures going:
Today we're in the era of the multi-core array. Next year, Nehalem will bring us 8 cores on a single chip, and it's conceivable that we'll see 10 and 12 core versions in the two years following it. Larrabee isn't actually on this chart; it remains separate until we hit the heterogeneous multi-threaded cores (the last two items on the evolutionary path).
It looks like future Intel desktop chips will be a mixture of these large, Nehalem-like cores surrounded by tons of little Larrabee-like cores. Your future CPUs will be capable of handling whatever is thrown at them, whether that's traditional single-threaded desktop applications, 3D games, physics or other highly parallelizable workloads. It also paints an interesting picture of the future: with proper OS support, all you'd need for a gaming system would be a single Larrabee; you wouldn't need a traditional x86 CPU.
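To put that distinction in concrete terms, here's a minimal C++ sketch (my own illustration, not anything from Intel's disclosure) of the kind of embarrassingly parallel loop those little cores would be fed. The function names, the chunking scheme and the simple thread pool are all assumptions; the point is just that every iteration is independent, so throughput scales with however many cores you can throw at it.

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// One chunk of a simple physics-style integration step: every element
// is updated independently of every other element.
static void integrate_chunk(std::vector<float>& pos,
                            const std::vector<float>& vel,
                            std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        pos[i] += vel[i] * dt;
}

// Fan the loop out across however many hardware threads the chip exposes.
void integrate_all(std::vector<float>& pos, const std::vector<float>& vel,
                   float dt) {
    const std::size_t n = pos.size();
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = n * w / workers;
        std::size_t end   = n * (w + 1) / workers;
        pool.emplace_back(integrate_chunk, std::ref(pos), std::cref(vel),
                          begin, end, dt);
    }
    for (auto& t : pool) t.join();
}
```

The big Nehalem-like cores would still handle the serial, branchy parts of an application; the sea of small cores just needs plenty of work like this to stay busy.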
This future is a long way off, but just as the Pentium M eventually evolved into the architecture behind Intel's desktop microprocessors today, keep an eye on Larrabee, because in five years it could be what you're running everything on.
Changing the Way GPUs Are Launched?
Here's an interesting thought. By the time Larrabee rolls out in 2009/2010, Intel's 45nm process will have reached maturity. It's very possible that Intel could launch Larrabee much like it does its CPUs, with many SKUs covering a broad range of market segments. Intel could decide to launch Larrabee parts from $199 all the way up to $999, instead of the more traditional single GPU launch (perhaps with two SKUs) followed by months of waiting before the technology trickles down to the mainstream.
Intel could take the GPU industry by storm and get Larrabee out into the wild quicker if it launched top to bottom, akin to how its CPU introductions work.
101 Comments
phaxmohdem - Monday, August 4, 2008 - link
Can your mom play Crysis? *burn*

JonnyDough - Monday, August 4, 2008 - link
I suppose she could but I don't think she would want to. Why do you care anyway? Have some sort of weird fetish with moms playing video games or are you just looking for another woman to relate to? Ooooh, burn!
Griswold - Monday, August 4, 2008 - link
He is looking for the one playing his mom, I think.

bigboxes - Monday, August 4, 2008 - link
Yup. He worded it incorrectly. It should have read, "but can it play your mom?" :p

Tilmitt - Monday, August 4, 2008 - link
I'm really disappointed that Intel isn't building a regular GPU. I doubt that bolting a load of unoptimised x86 cores together is going to be able to perform anywhere near as well as a GPU built from the ground up to accelerate graphics, given equal die sizes.

JKflipflop98 - Monday, August 4, 2008 - link
WTF? Did you read the article?

Zoomer - Sunday, August 10, 2008 - link
He had a point. More programmable == more transistors. Can't escape from that fact. Given an equal number of transistors, running the same program, a more programmable solution will always be crushed by fixed-function processors.
JonnyDough - Monday, August 4, 2008 - link
I was wondering that too. This is obviously a push towards a smaller Centrino-type package. Imagine a powerful CPU that can push graphics too. At some point this will save a lot of battery juice in a notebook computer, along with space. It may not be able to play games, but I'm pretty sure it will make for some great basic laptops someday that can run video. Not all college kids and overseas marines want to play video games. Some just want to watch clips of their family back home.

rudolphna - Monday, August 4, 2008 - link
As interesting and cool as this sounds, this is even more bad news for AMD, who was finally making up for lost ground. Granted, it's still probably 2 years away, and hopefully AMD will be back to its old self (Athlon64 era). They are finally getting products that can actually compete. Another challenger, especially from its biggest rival, Intel, cannot be good for them.

bigboxes - Monday, August 4, 2008 - link
What are you talking about? It's been nothing but good news for AMD lately. Sure, let Intel sink a lot of $$ into graphics. Sounds like a win for AMD (in a roundabout way). It's like AMD investing in a graphics maker (ATI) instead of concentrating on what makes them great. Most of the Intel supporters were all over AMD for making that decision. Turn this around and watch Intel invest heavily in graphics and it's a grand slam. I guess it's all about perspective. :)