AMD in Consumer Electronics
The potential of Fusion extends far beyond the PC and into the embedded space. Imagine a very low power, low profile Fusion CPU and you can easily see it being used not only in PCs but in consumer electronics devices as well. The benefit is that your CE devices could run the same applications as your PC, truly encouraging and enabling convergence and cohabitation between CE and PC devices.
Despite both sides attempting to point out how they are different, AMD and Intel actually have very similar views on where the microprocessor industry is headed. Both companies have stated to us that they have no desire to engage in the "core wars", as in we won't see a race to keep adding cores. The explanation for why not is the same one that applied to the GHz race: if you scale exclusively in one direction (clock speed or number of cores), you will eventually run into the same power wall. The true path to performance is a combination of increasing instruction level parallelism, clock speed, and number of cores in line with the demands of the software you're trying to run.
AMD has been a bit more forthcoming than Intel in this respect, indicating that it doesn't believe there's a clear sweet spot, at least for desktop CPUs. AMD doesn't believe there's enough data to conclude whether 3, 4, 6 or 8 cores is the ideal number for desktop processors. From our testing with Intel's V8 platform, an 8-core platform targeted at the high end desktop, it is extremely difficult to find high end desktop applications that benefit from 8 cores over 4. Our instincts tell us that for mainstream desktops, 3 - 4 general purpose x86 cores appear to be the sensible near term target. You could potentially lower the number of cores needed by pairing them with specialized hardware (e.g. an H.264 encode/decode core).
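To put some rough numbers behind that intuition, Amdahl's law is the usual way to reason about core scaling: speedup on n cores is capped by the fraction of the workload that actually parallelizes. Here's a minimal sketch; the parallel fractions are illustrative assumptions, not measured values from our V8 testing:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the workload that can run in parallel across n cores.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative parallel fractions -- assumptions, not measured data.
for p in (0.50, 0.75, 0.90):
    gain_4_to_8 = speedup(p, 8) / speedup(p, 4)
    print(f"p={p:.2f}: 4 cores -> {speedup(p, 4):.2f}x, "
          f"8 cores -> {speedup(p, 8):.2f}x, "
          f"8-vs-4 gain = {gain_4_to_8:.2f}x")
```

Even a workload that is 90% parallel - rare among desktop applications - gains only about 1.5x from doubling 4 cores to 8 under these assumptions, which lines up with how hard it was to find applications that benefited on V8.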
What's particularly interesting is that many of the goals Intel has for the future of its x86 processors are in line with what AMD has planned. For the past couple of IDFs Intel has been talking about bringing to market a < 0.5W x86 core for devices that sit somewhere in size and complexity between a cell phone and a UMPC (e.g. the iPhone). Intel has committed to delivering such a core, called Silverthorne, in 2008, based around a new micro-architecture designed for these ultra low power environments.
AMD confirmed that it too envisions ultra low power x86 cores for use in consumer electronics devices, areas where ARM or other specialized cores are commonly used today. AMD also recognizes that it can't address this market by simply reducing the clock speed of its current processors, and thus it mentioned that it is working on a separate micro-architecture for these ultra low power markets. AMD didn't attach any timeframe or roadmap to its plans, but knowing what we know about Fusion's debut we'd expect a lower power version targeted at the UMPC and CE markets to follow.
Why even think about bringing x86 cores to CE devices like digital TVs or smartphones? AMD offered one clear motivation: the software stack that runs on these devices is going to get more complex. Applications on TVs, cell phones and other CE devices will grow complex enough to demand faster processors. Combine that with the fact that software developers don't want to target multiple processor architectures when delivering software for these devices, and using x86 as the common platform between CE and PC software creates an environment where the same applications and content can be available across any device. The goal of PC/CE convergence is to give users access to any content, on any device, anywhere - if all the devices you're trying to access content and programs on happen to be x86, the process gets much easier.
Why is a new core necessary? Although x86 can be applied to virtually any market segment, the usefulness of a particular core only extends across roughly an order of magnitude of power. For example, AMD's current desktop cores can easily be scaled up or down to hit TDPs in the 10W - 100W range, but they are not well suited to hitting something in the sub-1W range. AMD can address the sub-1W market, but it will require a different core from the one it uses to address the rest of the market. This philosophy is akin to what Intel discovered with Centrino: in order to succeed in the mobile market, you need a mobile specific design. To succeed in the ultra mobile and handtop markets, you need an ultra mobile/handtop specific processor design as well. Both AMD and Intel realize this, and both companies have now publicly stated that they are doing something about it.
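To illustrate why simply downclocking a desktop core can't bridge that order of magnitude, consider that dynamic power scales roughly as C·V²·f, while leakage power barely drops with clock speed at all. The sketch below uses made-up figures for a hypothetical 65W desktop part - none of the numbers are AMD specifications:

```python
# Back-of-the-envelope power scaling for a hypothetical desktop core.
# Dynamic power ~ C * V^2 * f; leakage is roughly frequency-independent.
# All figures below are illustrative assumptions, not AMD specifications.

BASE_DYNAMIC_W = 50.0    # assumed dynamic power at full speed
BASE_LEAKAGE_W = 15.0    # assumed leakage power (doesn't drop with clock)
V_NOM, V_MIN = 1.3, 0.8  # assumed nominal and minimum operating voltages

def power_at(freq_scale: float) -> float:
    """Total power when running at freq_scale of the nominal clock."""
    # Voltage can only be lowered along with frequency down to V_MIN.
    v = max(V_MIN, V_NOM * freq_scale)
    dynamic = BASE_DYNAMIC_W * (v / V_NOM) ** 2 * freq_scale
    return dynamic + BASE_LEAKAGE_W  # leakage stays, setting a floor

for scale in (1.0, 0.5, 0.25, 0.1):
    print(f"{scale:4.0%} clock -> {power_at(scale):5.1f} W")
```

Under these assumptions, leakage keeps total power above 15W no matter how far the clock drops, which is why hitting sub-1W calls for a purpose-built core and process, not a downclocked desktop design.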
Comments
TA152H - Friday, May 11, 2007
Actually, I do have an idea on what AMD had to do with it. You don't. If you know anyone from Microsoft, ask about it. Even publicly, AMD admitted that Microsoft co-developed it with them.
By the way, when was the last time you used AMD software? Do you have any idea what you're talking about, or are you just an angry simpleton?
rADo2 - Friday, May 11, 2007
Oh man, AMD copied, in fact, all Intel patents, due to their "exchange". They copied x86 instruction set, SSE, SSE2, SSE3, and many others. Intel was the first to come up with 64-bit Itanium.
And AMD is/was damn expensive, while it had a window of opportunity. My most expensive CPU ever bought was AMD X2 4400+ ;-)
fic2 - Friday, May 11, 2007
What does the 64-bit Itanium have to do with x86? Totally different instruction set.
And what would the Intel equivalent to your X2 4400+ have cost you at the time? Or was there even an Intel equivalent?
rADo2 - Friday, May 11, 2007
"What does the 64-bit Itanium have to do with x86" -- Intel had 64-bit CPU way before AMD, a true new platform. AMD came up with primitive AMD64 extension, which was not innovative at all, they just doubled registry and added some more.yyrkoon - Friday, May 11, 2007 - link
You mean - a 'primitive' 64-bit CPU that outperformed the Intel CPU in just about every 32-bit application out there. This was also one reason why AMD took the lead for a few years...
fitten - Friday, May 11, 2007
It was actually pretty smart on AMD's part. Intel was trying to lever everyone off of x86 for a variety of reasons. AMD knew that lots of folks didn't like that, so they designed x86-64 and marketed it. Of course people would rather be fully backwards compatible, which is why AMD was successful with it and Intel had to copy it to still compete. So... it's AMD's fault we can't get rid of the x86 albatross again ;)
TA152H - Friday, May 11, 2007
AMD had no choice but to go the way they did; there was nothing smart about it. They lacked the market power to introduce a new instruction set, as well as the software capability to make it a viable platform.
Intel didn't even have the marketing muscle to make it an unqualified success. x86 is bigger than both of them. It's sad.
rADo2 - Friday, May 11, 2007
I bought X2 because I wanted NVIDIA SLI (2x6800, 2x7800, 2x7900, etc.) with dualcore, so Pentium D was not an option (NVIDIA chipsets for Intel are even worse than for AMD, if that is possible).
X2 was more expensive than my current quadcore, Q6600, and performed really BAD in all things except games.
I hated that CPU, while paying about $850 (including VAT) for it. For audio and video processing, it was a horrible CPU, worse than my previous P4 Northwood with HT, bought for $100, not to mention unstable NVIDIA nForce4 boards, SATA problems, NVIDIA firewall problems, etc.
I never want to see AMD again. Intel CPU + Intel chipset = pure goodness.
yyrkoon - Friday, May 11, 2007
So, by your logic, just because a product does not meet your 'standard' (which, by the way, seems to be based on 'un-logic'), you would like to see a company that you do not like go under, thus rendering the company that you hold so dearly in your mind a monopoly.
Pray AMD never goes under, because if they do go away, your next system may cost you 5x as much, and may perform 5x worse, and there will be nothing you can do about it.
Cheers
TA152H - Friday, May 11, 2007
Not only that, but HP had more to do with the design than Intel.