I see that most of us still underestimate the Wii quite a lot; maybe you need numbers to understand the potential of the Wii's Hollywood.

First of all, let's not forget that the ATI Hollywood is comparable to an ATI HD 2600 or 2400, since it not only supports HDR+AA but, according to one of Nintendo's displacement mapping patents, Hollywood may also have a vertex cache and be able to do vertex texture fetch.


HDR+AA was not possible until the ATI R520 chips came out (the ATI Radeon X1000 cards).


R520: ATI's DirectX 9.0c series of graphics cards, with complete Shader Model 3.0 support. Launched in October 2005, this series brought a number of enhancements, including the floating-point render target technology necessary for HDR rendering with anti-aliasing. Cards released include the X1300 through X1950.
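To see why those floating-point render targets matter for HDR, here is a minimal sketch (my own illustration, not Wii or R520 code) using the well-known Reinhard tone-mapping operator: scene radiance can go well above 1.0, and a buffer that clamps to a [0, 1] range destroys that detail before tone mapping can compress it.

```python
# Toy illustration of HDR rendering: a floating-point buffer keeps
# radiance values above 1.0; an integer-style clamped buffer clips them.

def tonemap_reinhard(x):
    """Reinhard operator: maps [0, inf) radiance into [0, 1)."""
    return x / (1.0 + x)

# Floating-point buffer preserves bright values beyond 1.0 ...
hdr_buffer = [0.25, 1.0, 4.0, 16.0]
ldr = [round(tonemap_reinhard(v), 3) for v in hdr_buffer]
print(ldr)        # bright areas are compressed, not clipped

# ... whereas clamping to [0, 1] first (as a non-float target would)
# destroys all detail above 1.0 before tone mapping can act:
clipped = [round(tonemap_reinhard(min(v, 1.0)), 3) for v in hdr_buffer]
print(clipped)    # everything bright collapses to the same value
```

Note how the clamped version maps every bright pixel to the same output, which is exactly the banding/blow-out HDR rendering avoids.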

ATI Hollywood has demonstrated HDR and anti-aliasing in games like Monster Hunter 3, Resident Evil: The Darkside Chronicles, Cursed Mountain, Silent Hill, etc.



The first thing you'll notice is the spectacular graphics; no, that's no hyperbole. The lighting, the texture detail, anti-aliasing (yes, you read that right, no jaggies), HDR (high dynamic range rendering for us geeks) and other technologies make this one of the prettiest Wii games on the market, period.

The first GPU chips for which GPGPU was both capable and feasible were the ATI R520 (X1000 series) and the chips inside the Nvidia 8800 series.



Among the GPGPU-capable ATI and Nvidia chips are those of the latest generation, like the GeForce 8800 and the Radeon X1000 and HD 2000 series; older chips can be used too, but as they are slower, don't hold your breath waiting for results. Unlike CPUs, which are capable of only a limited degree of parallelism, executing a few SIMD instructions per clock cycle, the latest graphics chips can process hundreds of elements at once. This capability is implemented in hardware in all newer video chips, and it comes from the fact that such a chip is made up of many identical processing units, like the pixel shading units. For example, ATI's Radeon X1900 chip has 48 such units, while the GeForce 8800 has 128, so their parallel processing capabilities are very high.
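The data-parallel model described above can be sketched like this (a toy illustration, not real GPU code): every shading unit runs the same small program, a kernel, on a different data element. Here it is simulated sequentially in Python; on a chip like the X1900, 48 of these kernel invocations would run per step in hardware.

```python
# Toy data-parallel example: one kernel, applied independently to
# every pixel. The independence of each element is what lets a GPU
# run many of these invocations simultaneously on identical units.

def brighten(pixel, factor=1.5):
    """Kernel: the same program every shading unit runs on its own pixel."""
    r, g, b = pixel
    return (min(255, int(r * factor)),
            min(255, int(g * factor)),
            min(255, int(b * factor)))

# One frame's worth of pixels; no element depends on any other.
framebuffer = [(100, 150, 200), (10, 20, 30), (250, 250, 250)]
result = [brighten(p) for p in framebuffer]
print(result)  # -> [(150, 225, 255), (15, 30, 45), (255, 255, 255)]
```

Because each pixel is processed independently, the work divides evenly across however many units the chip has, which is why adding shading units scales throughput almost linearly.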


When it talks about speed, it is not referring to the clock speed, but to the speed of the stream processors or vertex processors that come in the GPUs.


Now, what's the advantage of using a GPU for general-purpose work instead of the CPU?

Check it for yourselves.





But if you think that something like an ATI R600 or R610 would have been too expensive back then, just check this:


ATI HD 2400 production cost = ?

ATI HD 2400 retail price = $59

$59, but that's the price for us as customers, not a special price for a console maker and distributor like Nintendo. And according to some reports out there, the production cost of Hollywood is about $29.60.


An ATI HD 2400 is made at 65nm and its die size is about 85mm2; at 90nm there would be an increase of as much as 1.5x in die size. But since Hollywood only runs at 243MHz while the HD 2400 can go up to 525MHz, there would also be a decrease in die size from lowering the clock speed, since a lower clock target reduces the number of transistors that have to be implemented.
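The die-size arithmetic above can be made explicit. A back-of-the-envelope sketch, using the figures from the text plus one assumption of mine: the 1.5x quoted above is taken as the 65nm-to-90nm area factor, even though ideal geometric scaling would be (90/65)^2, about 1.92x, so 1.5x is the conservative case.

```python
# Back-of-the-envelope estimate of an HD 2400-class die ported to 90nm.
# Figures from the text; the 1.5x factor is the text's, not a datasheet's.

HD2400_DIE_65NM = 85.0   # mm^2, HD 2400 die size at 65nm (from the text)
SCALE_65_TO_90 = 1.5     # area growth factor quoted in the text

die_90nm = HD2400_DIE_65NM * SCALE_65_TO_90
print(f"Estimated HD 2400-class die at 90nm: {die_90nm:.1f} mm^2")

# For comparison, ideal geometric scaling with feature size squared:
ideal = HD2400_DIE_65NM * (90 / 65) ** 2
print(f"Ideal (90/65)^2 scaling: {ideal:.1f} mm^2")
```

Either way the result lands in the 128-163mm2 range, which is the ballpark the Napa die-size discussion below plays in, before any savings from the lower clock target.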

The Hollywood package has two chips, Vegas and Napa:

Vegas: 72mm2

Napa: 94.5mm2

It seems that 4.5mm2 of Napa is occupied by an embedded ARM9 called Starlet (1.45mm2 at 90nm) and additional components like SD controllers, USB, etc.

It is also known, thanks to IGN, that the specific macro used as eDRAM in the Wii, made with 1T-SRAM technology, is the UX6D (1T-SRAM-Q, made using MIM2).


Vegas is supposed to be the chip with the 24MBytes of eDRAM and a DSP;

Napa has about 3MBytes of embedded memory.

1mm2 of UX6D = 0.5618 MBytes

5.4mm2 of UX6D = 3 MBytes

90mm2 - 5.4mm2 = 84.6mm2 of GPU
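Putting the Napa numbers together in one place (a simple restatement of the arithmetic above, using only figures from the text; small differences from the rounded values above come from not rounding the eDRAM area early):

```python
# Napa die-area breakdown, using the figures quoted in the text.

UX6D_MB_PER_MM2 = 0.5618   # MBytes per mm^2 for the UX6D 1T-SRAM-Q macro
NAPA_DIE = 94.5            # mm^2, total Napa die size
STARLET_AND_IO = 4.5       # mm^2, ARM9 "Starlet" plus SD/USB controllers
NAPA_EDRAM_MB = 3.0        # MBytes of embedded memory on Napa

edram_area = NAPA_EDRAM_MB / UX6D_MB_PER_MM2      # ~5.3 mm^2
gpu_logic = NAPA_DIE - STARLET_AND_IO - edram_area
print(f"eDRAM area on Napa: {edram_area:.1f} mm^2")
print(f"GPU logic area on Napa: {gpu_logic:.1f} mm^2")
```

So roughly 84-85mm2 of Napa is left for actual GPU logic, which is the figure the comparison with a down-clocked HD 2400-class die hinges on.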