AMD 780G: Preview of the Best Current IGP Solution
by Gary Key on March 10, 2008 12:00 PM EST - Posted in: CPUs
A Whole New Ball Game
As we stated earlier, the AMD 780G/780V is an all-new chipset that consists of an RS780/RS780C Northbridge and SB700 Southbridge. AMD’s intent with this chipset is to offer a better alternative to the NVIDIA GeForce 8200 and, more importantly, to provide a total platform solution that competes directly against the current Intel G31/33/35 products.
Meant for the consumer market, the 780G places a heavy emphasis on multimedia and casual gaming capabilities via the HD 3200 graphics core with a 500MHz core clock. AMD is targeting the 780V chipset at the business crowd, and it features the HD 3100 graphics core. DX10 functionality is retained, but the core clock is reduced to 350MHz, while UVD 2.0, DisplayPort, SurroundView, and Hybrid CrossFire are not supported.
780G Northbridge
AMD has a distinct manufacturing advantage in producing this Northbridge compared to Intel. The primary reason is that the memory controller logic resides on the AMD processor, freeing up considerable area on the NB die for a complete discrete-class graphics solution. Compared to the 80nm manufacturing process used on the 690G, AMD has designed the 780G on the 55nm TSMC half-node process used to produce the current HD 3xxx series of cards. In the process of dropping in the RV610 core, HT 3.0, PCI Express 2.0, UVD, and other enhancements, the transistor count for the 780G has grown to 205 million, compared to the 72 million transistors in the 690G.
The biggest change AMD made in designing the 780G is a new integrated graphics processor featuring a slightly modified RV610 core, the same core that powers the Radeon HD 2400 series of cards. While the RV610 is an entry-level core from AMD, it is less than a year old. Compared to the X700 design in the 690G, AMD has basically skipped two generations of budget GPU designs to incorporate this core into the 780G, naming the result the HD 3200. This is the first time a manufacturer has used a current discrete graphics core directly in an IGP solution. Typically, most GPU designs for IGP products have used a reduced feature set of a previous-generation discrete design.
The Radeon HD 3200 graphics processor features a DX10-compliant, unified Shader Model 4.0 graphics core. This architecture contains 40 stream processors arranged in two shader SIMDs. The core features 128-bit floating point precision for all operations, a command processor for reduced overhead, DXTC and 3Dc+ texture compression, texture support up to 8192x8192, and a fully associative texture and Z/stencil cache.
The single texture and ROP units are capable of handling four texels and four pixels per clock, respectively. This allows up to 16 texture fetches per clock cycle and up to 128 textures per pixel. The one modification AMD made compared to the RV610 core is that the vertex and texture caches are fully associative instead of separate. Other technical goodies include early Z-test, Re-Z, Z range optimization, and Fast Z-clear operation.
There are eight render targets (MRTs) with anti-aliasing support, along with lossless Z and stencil compression. The HD 3200 features multi-sample anti-aliasing (up to 4 samples per pixel), custom filter anti-aliasing, adaptive super-sampling and multi-sampling, along with Super AA when using Hybrid CrossFire. HDR rendering is supported for all anti-aliasing features, although like current generation HD 2400 products, the performance hit for using AA is significant in certain games.
Texture filtering features include 2x/4x/8x/16x adaptive anisotropic filtering modes with up to 128 taps per pixel support. Bicubic, sRGB, and percentage closer filtering are featured along with 128-bit floating point HDR support. Finally, the HD 3200 includes a programmable tessellation unit and an accelerated geometry shader path for geometry amplification.
The core clock operates at an impressive 500MHz, with most BIOSes offering the ability to overclock the core to as much as 1100MHz. Success in overclocking the core depends on several factors such as voltage options, thermal design, and processor choice. We have averaged around 750MHz with the X2 processors and 850MHz with the Phenom on a couple of performance-oriented boards. Unlike with the 690G, increasing the core clock can actually improve frame rates by a noticeable amount, at least in certain games.
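As a rough illustration of why core clock matters on this part, here is a quick sketch of the peak throughput implied by the per-clock figures above at stock and typical overclocked speeds (peak numbers only; real-world performance also depends heavily on shared memory bandwidth):

```python
# Peak fill-rate arithmetic for the HD 3200 (a sketch; ignores memory
# bandwidth and shader limits, which usually bind first on an IGP).
PIXELS_PER_CLOCK = 4   # ROP throughput stated above
TEXELS_PER_CLOCK = 4   # texture unit throughput stated above

def fill_rates(core_mhz):
    """Return (Gpixels/s, Gtexels/s) at a given core clock."""
    return (PIXELS_PER_CLOCK * core_mhz / 1000.0,
            TEXELS_PER_CLOCK * core_mhz / 1000.0)

for clock in (500, 750, 850):   # stock, typical X2 OC, typical Phenom OC
    px, tx = fill_rates(clock)
    print(f"{clock}MHz core: {px:.1f} Gpixels/s, {tx:.1f} Gtexels/s")
```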
AMD uses a highly optimized unified memory architecture (UMA) design: graphics memory is allocated from system memory, with the GPU able to access up to 512MB of it. SidePort memory is an available option depending upon the motherboard supplier and consists of a separate 32-bit DDR memory interface that the GPU can use either instead of or along with the shared memory. Depending on the processor and HT link utilized, this option will generally improve performance by about 5% across the board - not just in games, but in video decoding as well.
The primary advantage of SidePort memory will be in the mobile 780M configuration, where it allows greater power savings and lets a single-channel DDR2-800 setup deliver performance almost equal to that of a dual-channel setup. AMD expects the 8.4 drivers to bring significant gains in video decoding performance with a single-channel memory configuration.
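For a sense of the numbers involved, here is a quick peak-bandwidth sketch; note that AMD does not specify the SidePort memory speed here, so the DDR2-800 figure for it is purely our assumption:

```python
# Peak memory bandwidth for the UMA and SidePort options (a sketch; the
# SidePort clock is an assumption, as it is not specified in this article).
def ddr_bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    return transfer_rate_mts * (bus_width_bits / 8) / 1000.0   # GB/s

print(f"Single-channel DDR2-800 (64-bit): {ddr_bandwidth_gbs(800, 64):.1f} GB/s")    # 6.4
print(f"Dual-channel DDR2-800 (128-bit): {ddr_bandwidth_gbs(800, 128):.1f} GB/s")    # 12.8
print(f"32-bit SidePort, assumed DDR2-800: {ddr_bandwidth_gbs(800, 32):.1f} GB/s")   # 3.2
```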
Hybrid CrossFire comes of age on the 780G with the release of the 8.3 Catalyst driver set. Hybrid CrossFire supports current HD 2400 and HD 3400 based cards. In early testing we have seen increases of up to 30% in games like Half-Life 2: Episode Two with a $45 HD 3450 card. The additional card completely changes the gameplay dynamics of this chipset and allows several recent games to run at 1024x768 high quality or 1280x1024 medium quality settings while keeping frame rates in the 40FPS to 60FPS range. Smooth gameplay at those settings is still not possible in Crysis (as one example), but what reasonably priced video card stands a chance with that game anyway?
AMD integrates their Unified Video Decoder 2.0 (UVD 2.0) capabilities into the 780G. UVD 2.0 offers hardware acceleration for decoding VC-1, H.264 (AVC), WMV, and MPEG-2 sources up to 1080p resolutions. Advanced de-interlacing is available when using a Phenom processor. We generally found CPU utilization rates and output quality to be near or equal to that of the HD 3450.
We were excited to learn that AMD was seeking HDMI 1.3b certification for the RS780, but it turns out that will not happen. The xvYCC color space is fully supported and Deep Color (10-bit) was on the list, but the specification calls for 4:4:4 output and the RS780 supports 4:2:2. On the audio side, the HDMI interface offers support for 2-channel LPCM, Dolby Digital 5.1, or DTS 5.1. Multi-channel LPCM is not supported due to the chipset's available audio bandwidth of 1.6Mb/s.
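To put that bandwidth figure in perspective, here is the raw LPCM bitrate arithmetic for a few common formats (a rough sketch that ignores HDMI packetization overhead):

```python
# Raw LPCM bitrates (a sketch): stereo fits a ~1.6Mb/s budget; multi-channel does not.
def lpcm_mbps(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample / 1_000_000

print(f"2.0 ch, 48kHz/16-bit: {lpcm_mbps(2, 48_000, 16):.2f} Mb/s")   # ~1.54
print(f"7.1 ch, 48kHz/16-bit: {lpcm_mbps(8, 48_000, 16):.2f} Mb/s")   # ~6.14
print(f"7.1 ch, 96kHz/24-bit: {lpcm_mbps(8, 96_000, 24):.2f} Mb/s")   # ~18.43
```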
Rounding out the video capabilities of the 780G are analog output, DVI/HDMI interfaces, internal or external TMDS, and integrated DisplayPort support. The HD 3200 features dual independent displays that allow resolutions, refresh rates, and display data to be completely independent on the two display paths. AMD provides HDCP support with on-chip key storage for the DisplayPort, DVI, and HDMI interfaces, but it is restricted to a single interface during playback operations. ATI SurroundView is fully supported when utilizing a Radeon-based discrete card, allowing the system to drive up to four monitors.
HyperTransport 3.0 capability (a 5.2GT/s+ interface) is included and is extremely important in extracting every last ounce of performance out of the HD 3200 graphics core. AMD recommends the Phenom processor family (Ed: not just to sell out stocks of B2 processors) in order to take full advantage of the performance offered by the HD 3200. With a Phenom onboard, the HD 3200 will perform post-processing on high-definition content, and it makes a difference in image quality and fluidity during 1080p playback. In fact, the graphics core is so sensitive to the HT link speed that simply raising the standard 1.8GHz speed to 2.2GHz on our 9600BE resulted in measurable performance differences in a few applications.
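That sensitivity makes sense once you look at the link bandwidth itself; the following is a quick sketch assuming the usual 16-bit-wide, double-pumped HyperTransport link in each direction:

```python
# HyperTransport link bandwidth per direction (a sketch; assumes a 16-bit,
# double-pumped link, which is the common AM2/AM2+ configuration).
def ht_gbs_per_direction(link_clock_ghz, width_bits=16):
    transfers_per_sec = 2 * link_clock_ghz        # double data rate -> GT/s
    return transfers_per_sec * width_bits / 8     # GB/s per direction

for clock in (1.0, 1.8, 2.2, 2.6):   # HT 1.x Athlon X2, stock 9600BE, our tweak, HT 3.0 max
    print(f"{clock:.1f}GHz HT link: {ht_gbs_per_direction(clock):.1f} GB/s per direction")
# 1.0GHz -> 4.0, 1.8GHz -> 7.2, 2.2GHz -> 8.8, 2.6GHz (5.2GT/s) -> 10.4 GB/s
```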
The Northbridge contains 26 lanes of PCI Express 2.0 connectivity, with 16 lanes reserved for graphics via an x16 link and six available as x1 links for expansion slots or onboard peripherals. AMD reserves the remaining four lanes for the A-Link Express II interface to the SB700.
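A quick sanity check of the lane budget and the peak per-direction bandwidth it implies (a sketch; PCIe 2.0 delivers roughly 500MB/s per lane per direction after 8b/10b encoding overhead):

```python
# Northbridge PCIe 2.0 lane allocation and peak bandwidth (a sketch).
PCIE2_MBS_PER_LANE_PER_DIR = 500   # after 8b/10b encoding overhead

lanes = {
    "x16 graphics link": 16,
    "x1 peripheral links": 6,
    "A-Link Express II to SB700": 4,   # runs at PCIe 1.1 rates in practice (see below)
}
for name, count in lanes.items():
    gbs = count * PCIE2_MBS_PER_LANE_PER_DIR / 1000
    print(f"{name}: {count} lanes, up to {gbs:.1f} GB/s per direction")
print(f"Total lanes: {sum(lanes.values())}")   # 26
```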
SB700 Southbridge
While technically a completely new part, the SB700 appears to us to be more of a big brother to the SB600. It is obvious that AMD set out to correct the ills of the SB600 but once again fell just short of having an outstanding Southbridge. Of course, AMD tells us to wait for the SB750, but we have heard that story before. Anyway, let’s get into what has changed and what has not.
The SB700 features six SATA 3.0Gb/s ports, up from four on the SB600, with the ability to reserve up to two of those ports for eSATA connectivity. Drives can be set up in RAID 0, 1, or 10, but the absence of RAID 5 still perplexes us (although it should be in the SB750).
Also missing in action is a native interface for networking support. AMD continues to use an external PHY and MAC for network operations. Although performance is similar to that of the NVIDIA and Intel solutions, this setup does incur a cost penalty for the motherboard suppliers.
The major improvement in the SB700 is the increase in USB 2.0 performance and the number of ports available. The new dual-channel controller provides 12 USB 2.0 ports plus two dedicated USB 1.1 ports for compatibility reasons.
A single PATA channel provides native IDE support for up to two drives. This channel supports PIO, multi-word DMA, and Ultra DMA 33/66/100/133. HyperFlash support is provided via Windows Vista ReadyBoost and ReadyDrive protocols with an IDE based HyperFlash module.
The SB700 provides four PCI Express lanes for the A-Link Express II interconnect, but unlike the 780G, those four lanes follow the PCI Express 1.1 specification. That means the interconnect bandwidth is capped at 2GB/s, half of what it would be in a PCI Express 2.0 configuration. The SB700 also provides support for five PCI slots.
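The arithmetic behind that cap, assuming the standard per-lane rates for each PCIe generation (a sketch; the 2GB/s figure quoted above is the bidirectional total):

```python
# A-Link Express II bandwidth: four PCIe 1.1 lanes vs. a hypothetical PCIe 2.0 x4 link.
def pcie_x4_bandwidth(mbs_per_lane_per_dir):
    per_dir = 4 * mbs_per_lane_per_dir / 1000     # GB/s per direction
    return per_dir, 2 * per_dir                   # (per direction, bidirectional)

gen1 = pcie_x4_bandwidth(250)   # PCIe 1.1: 250MB/s per lane per direction
gen2 = pcie_x4_bandwidth(500)   # PCIe 2.0: 500MB/s per lane per direction
print(f"PCIe 1.1 x4: {gen1[0]:.0f} GB/s per direction, {gen1[1]:.0f} GB/s total")  # 1 / 2
print(f"PCIe 2.0 x4: {gen2[0]:.0f} GB/s per direction, {gen2[1]:.0f} GB/s total")  # 2 / 4
```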
Finally, there is the High Definition Audio controller, which allows up to 16 channels of audio output per stream. The controller supports up to four codecs with sample rates of up to 192kHz and up to 32 bits per sample.
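As a quick upper-bound sketch, a single stream at those controller maximums would amount to roughly the following raw PCM bitrate (real codecs negotiate far lower channel counts and rates):

```python
# Raw PCM bitrate at the HD Audio controller's per-stream maximums (a sketch).
channels, sample_rate_hz, bits = 16, 192_000, 32
print(f"{channels} ch @ 192kHz/32-bit: {channels * sample_rate_hz * bits / 1e6:.1f} Mb/s")  # ~98.3
```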
Overall, the combination of the 780G and SB700 brings AMD to the forefront in features and performance in the IGP market. Power management was a critical factor in the design of both chipsets with the 55nm fabrication process being a major factor in the power savings department. Both chipsets feature an idle TDP of around 1.5W with load ratings being 15W for the 780G and 4.5W for the SB700.
49 Comments
- Monday, March 10, 2008 - link
Where is the discussion of this chipset as an HTPC? Just a tidbit here and there? I thought that was a major selling point here. With a single core sempron 1.8ghz being enough for an HTPC which NEVER hits 100% cpu usage (see tomshardware.com) you don't need a dual core and can probably hit 60w in your HTPC! Maybe less. Why was this not a major topic in this article? With you claiming the E8300/E8200 in your last article being a HTPC dreamers chip shouldn't you be talking about how low you could go with a sempron 1.8ghz? Isn't that the best HTPC combo out there now? No heat, super low cost running it all year long etc (NOISELESS with a proper heatsink).

Are we still supposed to believe your article about the E8500? While I freely admit chomping at the bit to buy an E8500 to Overclock the crap out of it (I'm pretty happy now with my e4300@3.0 and can't wait for 3.6ghz with e8500, though it will go further probably who needs more than 3.6 today for gaming), it's a piece of junk for an HTPC. Overly expensive ($220? for e8300 that was recommended) compared to a lowly Sempron 1.8 which I can pick up for $34 at newegg. With that kind of savings I can throw in a 8800GT in my main PC as a bonus for avoiding Intel.

What's the point in having an HTPC where the cpu utilization is only 25%? That's complete OVERKILL. I want that as close to 100% as possible to save me money on the chip and then on savings all year long with low watts. With $200 savings on a cpu I can throw in an audigy if needed for special audio applications (since you whined about 780G's audio). A 7.1channel Audigy with HD can be had for $33 at newegg. For an article totally about "MULTIMEDIA OUTPUT QUALITIES" where's the major HTPC slant?
sprockkets - Thursday, March 13, 2008 - link
Dude, buy a 2.2ghz Athlon X2 chip for like $55. You save what, around $20 or less with a Sempron nowadays?
QuickComment - Tuesday, March 11, 2008 - link
It's not 'whining' about the audio. Sticking in a sound card from Creative still won't give 7.1 sound over HDMI. That's important for those that have a HDMI-amp in a home theatre setup.
TheJian - Tuesday, March 11, 2008 - link
That amp doesn't also support digital audio/optical? Are we just talking about trying to do the job within 1 cable here instead of 2? Isn't that kind of being nit picky? To give up video quality to keep it on 1 cable is unacceptable to me (hence I'd never "lean" towards G35 as suggested in the article). I can't even watch if the video sucks.
QuickComment2 - Tuesday, March 11, 2008 - link
No, it's not about 1 cable instead of 2. SPDIF is fine for Dolby Digital and the like, i.e. compressed audio, but not for 7.1 uncompressed audio. For that, you need HDMI. So, this is a real deal-breaker for those serious about audio.
JarredWalton - Monday, March 10, 2008 - link
I don't know about others, but I find video encoding is something I do on a regular basis with my HTPC. No sense storing a full quality 1080i HDTV broadcast using 16GB of storage for two hours when a high quality DivX or H.264 encode can reduce disk usage down to 4GB, not to mention ripping out all the commercials. Or you can take the 3.5GB per hour Windows Media Center encoding and turn that into 700MB per hour.

I've done exactly that type of video encoding on a 1.8GHz Sempron; it's PAINFUL! If you're willing to just spend a lot of money on HDD storage, sure it can be done. Long-term, I'm happier making a permanent "copy" of any shows I want to keep.
The reality is that I don't think many people are buying HTPCs when they can't afford more than a $40 CPU. HTPCs are something most people build as an extra PC to play around with. $50 (only $10 more) gets you twice the CPU performance, just in case you need it. If you can afford a reasonable HTPC case and power supply, I dare say spending $100-$200 on the CPU is a trivial concern.
Single-core older systems are still fine if you have one, but if you're building a new PC you should grab a dual-core CPU, regardless of how you plan to use the system. That's my two cents.
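(For reference, a rough translation of those file sizes into average bitrates; this is a sketch that ignores container and audio overhead:)

```python
# Average bitrate implied by the file sizes mentioned above (a sketch).
def avg_mbps(size_gb, hours):
    return size_gb * 8000 / (hours * 3600)   # GB -> Mb, hours -> seconds

print(f"16GB over 2 hours (1080i broadcast): {avg_mbps(16, 2):.1f} Mb/s")    # ~17.8
print(f"4GB over 2 hours (H.264 re-encode):  {avg_mbps(4, 2):.1f} Mb/s")     # ~4.4
print(f"3.5GB per hour (Media Center):       {avg_mbps(3.5, 1):.1f} Mb/s")   # ~7.8
print(f"700MB per hour (re-encode):          {avg_mbps(0.7, 1):.1f} Mb/s")   # ~1.6
```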
TheJian - Tuesday, March 11, 2008 - link
I guess you guys don't have a big TV. With a 65in, DivX has been out of the question for me. It just turns to crap. I'd do anything regarding editing on my main PC, with the HTPC merely being a cheap player for Blu-ray etc. A network makes it easy to send them to the HTPC. Just set the affinity on one of your cores to vidcoding and I can still play a game on the other. Taking 3.5GB to 700MB looks like crap on a big tv. I've noticed it's watchable on my 46in, but awful on the 65. They look great on my PC, but I've never understood anyone watching anything on their PC. Perhaps a college kid with no room for a TV. Other than that...
JarredWalton - Tuesday, March 11, 2008 - link
SD resolutions at 46" (what I have) or 65" are always going to look lousy. Keeping it in the original format doesn't fix that; it merely uses more space.

My point is that a DivX, x264, or similar encoding of a Blu-ray, HDTV, or similar HD show loses very little in overall quality. I'm not saying take the recording and make it into a 640x360 SD resolution. I'm talking about converting a full bitrate 1080p source into a 1920x1080 DivX HD, x264, etc. file. Sure, there's some loss in quality, but it's still a world better than DVD quality.
It's like comparing a JPEG at 4-6 quality to the same image at 12 quality. If you do a diff, you will find lots of little changes on the lower quality image. If you want to print up a photo, the higher quality is desirable. If you're watching these images go by at 30FPS, though, you won't see much of a loss in overall quality. You'll just use about 1/3 the space and bandwidth.
Obviously, MPEG4 algorithms are *much* more complex than what I just described - which is closer to MPEG2. It's an analogy of how a high quality HD encode compares to original source material. Then again, in the case of HDTV, the original source material is MPEG2 encoded and will often have many artifacts already.
yehuda - Monday, March 10, 2008 - link
Great article. Thanks to Gary and everyone involved! The last paragraph is hilarious.

One thing that bothers me about this launch is the fact that board vendors do not support the dual independent displays feature to its full extent.
If I understand the article correctly, the onboard GPU lets you run two displays off any combination of ports of your choice (VGA, DVI, HDMI or DisplayPort).
However, board vendors do not let you do that with two digital ports. They let you use VGA+DVI or VGA+HDMI, but not DVI+HDMI. At least, this is what I have gathered reading the Gigabyte GA-MA78GM-S2H and Asus M3A78-EMH-HDMI manuals. Please correct me if I'm wrong.
How come tier-1 vendors overlook such a worthy feature? How come AMD lets them get away with it?
Ajax9000 - Tuesday, March 11, 2008 - link
They are appearing. At CeBIT Intel showed off two mini-ITX boards with dual digital:
DQ45EK DVI+DVI
DG45FC DVI+HDMI
http://www.mini-itx.com/2008/03/06/intels-eaglelak...