Monday, October 6, 2008

ASUS EAH2600XT/HTDP/256M



The ATI Radeon HD 2600 delivers a remarkable combination of DirectX 10 gaming performance and hardware processed 1080p video. Connect to big-screen TVs with HDMI including built-in 5.1 surround audio to enjoy Blu-ray and HD DVD movies. Unleash the HD visual effects within Windows Vista and DirectX 10. Plug-n-play CrossFire upgradeability makes it easy to scale up graphics performance to boost the most demanding games; and built-in multi-channel 5.1 surround audio over HDMI establishes a new level of entertainment designed for tech savvy gamers.

ASUS EN8600GT/HTDP/256M GeForce 8600GT


Model
Brand ASUS
Model EN8600GT/HTDP/256M
Interface
Interface PCI Express x16
Chipset
Chipset Manufacturer NVIDIA
GPU GeForce 8600GT
Core clock 540MHz
Stream Processors 32
Memory
Memory Clock 1400MHz
Memory Size 256MB
Memory Interface 128-bit
Memory Type GDDR3
3D API
DirectX DirectX 10
OpenGL OpenGL 2.0
Ports
DVI 2
TV-Out HDTV Out
VIVO No
General
Tuner None
RAMDAC 400 MHz
Max Resolution 2048 x 1536
SLI Supported Yes
Cooler With Fan
Windows Vista Certified for Windows Vista
Features
HDCP Ready Yes
Features Superior Cooling Efficiency - 14°C cooler than reference design boards!
HDCP Compliant
NVIDIA Quantum Technology - Advanced Shader Processors architected for physics computation
Packaging
Package Contents EN8600GT/HTDP/256M
Driver Disk
User Manual
HDTV Cable
DVI to VGA/D-sub Adapter
CD Wallet


Review
The 8600 GT has a proven, attractive price/performance ratio. It is SLI capable and provides adequate frame rates in the latest games for a mere $120. The cooler looks solidly built yet does not take up much space inside the case.

XFX GeForce 8600GT 256 Mb DDR3 PVT84JUDD3


Model
Brand XFX
Model PVT84JUDD3
Interface
Interface PCI Express x16
Chipset
Chipset Manufacturer NVIDIA
GPU GeForce 8600GT
Core clock 620MHz
Memory
Memory Clock 1600MHz
Memory Size 256MB
Memory Interface 128-bit
Memory Type GDDR3
3D API
DirectX DirectX 10
OpenGL OpenGL 2.0
Ports
DVI 2
TV-Out HDTV Out
VIVO No
General
Tuner None
RAMDAC 400 MHz
Max Resolution 2560 x 1600
RoHS Compliant Yes
SLI Supported Yes
Cooler With Fan
Dual-Link DVI Supported Yes
Features
Features Full Microsoft DirectX 10 Support
NVIDIA Lumenex Engine
Packaging
Package Contents PVT84JUDD3
Driver Disk
User Manual
S-Video Cable
DVI to VGA/D-sub Adapter


Review
I am writing this review for those of us who would like DirectX 10-capable graphics on a tight budget, or who refuse to pay the current high prices for an 8800-series card. I understand that these two cards are in different classes, and that is the point. If you are considering moving from an NVIDIA 7950 to an 8600 GT simply to have DirectX 10, you may want to reconsider: you WILL sacrifice overall performance. Also keep in mind that your processor and RAM drastically affect any video card. Most DirectX 10 games coming out in the near future will rely on a dual-core processor and 2 GB of RAM for high graphics settings. In my opinion, wait until you can afford what you really want/need. But if you are coming from an ultra low-end graphics card and need a significant improvement, the 8600 GT is one of the best solutions out there.

This card was incredibly easy to install. I downloaded the latest drivers from the XFX website (the disc had an older driver version), uninstalled the old graphics card drivers, shut down, pulled the old card, installed the new card, booted up, and pointed Windows to the new drivers manually. Total time: 15-20 minutes! This card is also VERY quiet. Even though I came from a passive (fanless) GeForce 7300 setup, the fan on this card is so quiet I can barely hear it; I had to verify visually that it was even spinning. I tried Battlefield 2142 with most settings on High, which yielded 60-70 fps at 1024x768 and 40-50 fps at 1280x1024 (using the +szx 1280 +szy 1024 launch options).

MSI NX8600GT-T2D256E OC


Model
Brand MSI
Model NX8600GT-T2D256E OC
Interface
Interface PCI Express x16
Chipset
Chipset Manufacturer NVIDIA
GPU GeForce 8600GT
Core clock 580MHz
Memory
Memory Clock 1600MHz
Memory Size 256MB
Memory Interface 128-bit
Memory Type GDDR3
3D API
DirectX DirectX 10
OpenGL OpenGL 2.0
Ports
DVI 2
TV-Out HDTV / S-Video Out
VIVO No
General
Tuner None
RAMDAC 400 MHz
Max Resolution 2560 x 1600
SLI Supported Yes
Cooler With Fan
Dual-Link DVI Supported Yes
Features
Features NVIDIA CineFX 5.0 Shading Architecture
NVIDIA Intellisample 4.0 Technology
NVIDIA UltraShadow II Technology
NVIDIA PureVideo Technology
Packaging
Package Contents NX8600GT-T2D256E OC
HDTV Cable
S-Video Cable
2 x DVI to VGA/D-sub Adapter



Review
I achieved an average of 63 fps in Half-Life 2: Lost Coast at 1440x900, max details with 8x AF, 4x AA, and vertical sync ON. At the same settings a 7600GT reached 44 fps and a 7900GS reached 49 fps, and water and HDR lighting look much better than on the NVIDIA 7000 series. The 128-bit memory bus makes no practical difference, since the card outperforms the 7900GS and its 256-bit GDDR3.

Diamond 9250 256 Mb S9250PCI256SB



Model
Brand DIAMOND
Model S9250PCI256SB
Interface
Interface PCI
Chipset
Chipset Manufacturer ATI
GPU Radeon 9250
PixelPipelines 4
Memory
Memory Size 256MB
Memory Interface 128-bit
Memory Type DDR
3D API
DirectX DirectX 8
OpenGL OpenGL
Ports
D-SUB 1
DVI 1
TV-Out S-Video Out
VIVO No
General
Tuner None
Max Resolution 2048 x 1536
RoHS Compliant Yes
Cooler With Fan
Operating Systems Supported Windows 2000/XP
Packaging
Package Contents S9250PCI256SB
2 x Driver Disk
S-Video Cable


Review
I bought this for a relative who desperately needed an upgrade from his GeForce MX400 256MB PCI card. His motherboard lacked any AGP or PCIe slots, so finding a good, relatively inexpensive PCI card was wonderful. He has had it for nearly four months at the time of this review, and it outperforms an ATI Radeon 9200 512MB AGP card by a wide margin. I would definitely drop the cash to buy another one for any ultra-cheap PC rig.

Monday, September 15, 2008

XFX 7950 GX2 Quad SLI

XFX 7950 GX2 Quad SLI Review


Features of the XFX 7950 GX2 Quad SLI Graphics Cards

No Master Card Required

Unlike ATI's CrossFire, neither Quad SLI nor normal SLI requires a special master card. You simply pair two Nvidia graphics cards that use the same GPU with an SLI-compatible motherboard and you are ready to go. Your graphics cards don't even need to have the same clock speeds, as this review demonstrates: I am using an XFX 7950 GX2 XXX and an XFX 7950 GX2 Extreme edition. The cards sport the same GPU but different GPU and memory clock speeds. The moral of the story is that as long as your GPUs are the same, you can run SLI.

Specifications

I have reviewed the XFX 7950 GX2 XXX graphics card previously and you can see that full review here. The second graphics card in my Quad SLI rig is the XFX 7950 GX2 Extreme edition, which is identical to the XXX version save for a slightly lower core clock of 520 MHz and memory clock of 1.3 GHz. I won't go over the many other features of the XFX 7950 GX2 here again; read the previous review for the gory details.

XFX 7950 GX2 Quad SLI in Use

Test System Specs

My test system for the Quad SLI review has the following specs:

  • CPU: AMD Athlon 64 X2 5000+
  • Mainboard: Asus M2N32-SLI Deluxe
  • RAM: OCZ PC2 8000 4-4-4-8 2 x 1GB
  • Graphics Cards: 1x XFX 7950 GX2 XXX; 1x XFX 7950 GX2 Extreme
  • PSU: PC Power & Cooling TurboCool 1KW
  • HDD: Seagate 750GB
  • Sound Card: Creative X-Fi
  • Display: Eizo CE240W 24" LCD
  • Display Driver: ForceWare 91.45

Benchmarks

Since I have reviewed the XFX 7950 GX2 XXX graphics card individually before, this time around I will just talk about the Quad SLI tests. I chose my standard bench suite consisting of Fear, Quake 4 and 3DMark06.

Fear

The first test I ran was Fear. With the review of the ATI X1900 CrossFire rig fresh in my mind, I wanted to see how Nvidia's new beastie measured up to ATI's offering. I allowed Fear to choose the settings for the first run and ended up with the following settings:

  • Single player physics: Med
  • Multiplayer physics: Med
  • Software sounds: Med
  • Particle bouncing: Med
  • Shell casings: On
  • World detail: Max
  • Corpse detail: Max
  • Effects details: Max
  • Model decals: Max
  • Water resolution: Med
  • Reflections & displays: Max
  • Volumetric lights: On
  • Volumetric light density: Med
  • FSAA: Off
  • Light detail: Max
  • Enable shadows: On
  • Soft shadow: Off
  • Texture filter: AF 4x
  • Texture resolution: Max
  • Video resolution: Med
  • Resolution: 1024 x 768
  • Shader: Max
At these settings the Quad SLI beast laughed in the face of Fear proving the only thing to Fear is Quad SLI. The minimum frame rate at these settings was 45 FPS, average frame rate was 101 FPS and the maximum frame rate was 218 FPS. Never once did the frame rates dip below the magic 40 FPS number.

Naturally, with Quad SLI being the best graphics platform around and Fear being one of the most taxing games on hardware, I turned everything to the max and ran the test loop again at 1600 x 1200. XFX's 7950 GX2 Quad SLI rig kicked Fear squarely in the testes: even with fully maxed out settings, the run never once dipped below 40 FPS! That's right, Fear with everything on ran on the Quad SLI rig with a minimum frame rate of 45 FPS, an average of 88 frickin' FPS and a max of 192 FPS!

3DMark06

For you synthetic fans out there, Quad SLI handed ATI the shortest-lived 3DMark06 record in my lab, soundly beating the score set just before it in the X1900 CrossFire review. XFX's Quad SLI rig burned through 3DMark06 at default settings and scored 8448 3DMarks. The details broke out like this:

  • Return to Proxycon: 32.301
  • Firefly Forest: 33.222
  • Deep Freeze: 41.018
  • CPU1: 0.613
  • CPU2: 0.960
To test the rig at my display's native resolution of 1920 x 1200, I ran 3DMark06 again at that resolution with all other settings the same. The final score was 7178 3DMarks!

Quake 4

The final benchmark that I ran on the XFX 7950 GX2 Quad SLI rig was Quake 4. I ran Quake 4 at ultra quality with 8x AA and 4x AF, SMP enabled, at a resolution of 1024 x 768 for the first test run. The average frame rate at these settings was 129.3 FPS. For the final Quake 4 test I ran the game at 1920 x 1200 with all other settings the same. The average frame rate was 41.1 FPS.

XFX NVIDIA 8800 GTX SLI

XFX NVIDIA 8800 GTX

Here are the benchmarks for the 8800 GTX SLI system. The test machine uses the following components:
  • CPU- Intel X6800
  • RAM- Crucial PC2 8000 2 x 1GB
  • HDD- 1x 74GB Raptor, 1x 750 GB Seagate
  • Mainboard: EVGA nForce 680i
  • PSU: Top Power 1000W

For the benchmarks I used 3DMark06, FEAR, Oblivion and Battlefield 2142. The first test up was 3DMark06, run at default settings in both the application and the NVIDIA control panel, with the following results:

  • Total 3DMarks- 13,012
  • SM2.0- 5866
  • HDR/SM3- 7018
  • CPU- 2492
  • Return to Proxycon- 47.769
  • Firefly Forest- 49.990
  • Canyon Flight- 78.886
  • Deep Freeze- 61.469
  • CPU1- 0.0796
  • CPU2- 1.249

For comparison, a single 8800 GTX on the same system (save for an abit AW9D-Max mainboard) scored 10,355 3DMarks.

The next test I ran was Battlefield 2142. BF2142 is a very popular game that, while not overly taxing on hardware, scales well performance-wise across different graphics cards. Gamers who play BF2142 on slow machines and have to turn their settings down might as well be playing a different title, graphics-wise, than the gamer who runs BF2142 maxed out. For this test I ran the game at a screen resolution of 1920 x 1200 with all game settings at max and the NVIDIA control panel set to 16xQ AA and 16x AF. The frame rates were recorded with FRAPS and are as follows:

  • Min- 47
  • Avg- 58
  • Max- 74

I noticed no problems whatsoever in the gameplay at these seriously high settings. Everything ran as smoothly and fluidly as it would at 640 x 480 on lesser rigs.

The next game title I tested with was FEAR, a very demanding game graphically. For this test I turned all in-game settings to max at a 1920 x 1200 screen resolution and used 16xQ AA and 16x AF from the NVIDIA control panel. These settings would be impossible on almost any other gaming machine, but not so on my 8800 GTX system; look at this frame rate data:

  • Min- 31
  • Avg- 50
  • Max- 93

Those numbers are phenomenal, even more so when you stop to think that the settings I am using here are actually higher than what the game alone can run. I could have played FEAR all night at these settings and never had a problem with frame rates.

Normally I don’t test with Oblivion, but for grins I started it up at the same super high settings of 1920 x 1200, 16xQ AA and 16x AF in an outdoor environment with lots of water, trees and moving grass. I was floored to get frame rates with 8800 GTX SLI that I typically only see on other machines at much lower settings: Fraps recorded an average frame rate of a bit over 43 frames per second. Most machines struggle to hit that frame rate in Oblivion at medium settings.

Overall, XFX 8800 GTX SLI will play any game out there at settings higher than what the game alone can offer and not blink an eye. There is no doubt that this is the best performing graphics platform ever. Nothing else on the market comes close to the performance you get with a pair of XFX 8800 GTX graphics cards in your gaming machine.

ATI PowerColor HD 2900 XT 512MB

ATI PowerColor HD 2900 XT



Features & Specifications

The ATI PowerColor HD 2900 XT has a core clock of 740MHz and a memory clock of 1650MHz. One feature ATI offers on the HD 2900 XT that NVIDIA doesn’t offer on any of their graphics cards is an onboard sound card. The sound card lets you use the included DVI to HDMI adapter to carry both video and audio, meaning you only need one connection to your HDTV or PC display and no separate sound card. The ATI PowerColor HD 2900 XT is also HDCP compliant, so you could use it in a seriously powerful media center computer that can play any movie or PC video game on the current market.

Test System

Before we jump into the performance of the PowerColor HD 2900 XT, let’s have a look at the test system I am using for this review:

  • CPU- Intel QX6700
  • Mainboard- XFX 680i
  • RAM- PNY XLR8 PC2-9384 2GB
  • Display- Dell 30” 3007WFP-HC LCD
  • OS- Windows XP Pro SP2

Benchmarks & Performance

For benchmarking the ATI PowerColor HD 2900 XT I am using 3DMark06, FEAR, Battlefield 2142 and Quake 4. The first test up for the ATI PowerColor HD 2900 XT is 3DMark06.

3DMark06

I ran 3DMark06 at default settings in the application and default settings in the ATI drivers. The 3DMark06 test results were as follows:

  • Total 3DMarks- 11544
  • SM2.0- 4443
  • HDR/SM3.0- 5001
  • CPU- 4105
  • Return to Proxycon- 35.091
  • Firefly Forest- 38.938
  • Canyon Flight- 44.887
  • Deep Freeze- 55.130
  • CPU1- 1.359
  • CPU2- 1.984

To get an idea of how the ATI PowerColor HD 2900 XT stands up against cards from NVIDIA: the PNY 8800 Ultra Overclocked graphics card I tested scored 13100 3DMarks, and the XFX 8800 GTX XXX I reviewed scored 11138. That puts the 3DMark06 performance of the ATI PowerColor HD 2900 XT slightly ahead of the overclocked 8800 GTX XXX from XFX and significantly behind the overclocked 8800 Ultra from PNY.
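
As a rough sanity check on those standings, here is the arithmetic behind "slightly ahead" and "significantly behind"; the scores come from the reviews quoted above, but the percentage framing is mine, not the reviewer's:

```python
# Relative 3DMark06 standings, computed from the scores quoted in the review.
hd2900xt = 11544  # ATI PowerColor HD 2900 XT
gtx_xxx = 11138   # XFX 8800 GTX XXX
ultra_oc = 13100  # PNY 8800 Ultra Overclocked

ahead_of_gtx = 100 * (hd2900xt - gtx_xxx) / gtx_xxx
behind_ultra = 100 * (ultra_oc - hd2900xt) / hd2900xt
print(f"{ahead_of_gtx:.1f}% ahead of the 8800 GTX XXX")
print(f"{behind_ultra:.1f}% behind the 8800 Ultra OC")
```

That works out to a few percent over the GTX XXX and a double-digit gap to the Ultra, matching the review's characterization.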

FEAR

The next test up was FEAR which I ran at a screen resolution of 2560 x 1600 with all settings on max and with soft shadows, 16X AF, and 4X AA. Using the FEAR in game test loop I recorded the following numbers:

  • Min- 12fps
  • Avg- 33 fps
  • Max- 72 fps

The percentage breakdown showed frame rates under 25 fps 16% of the time, between 25 and 40 fps 56% of the time, and greater than 40 fps 28% of the time.
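
Percent-of-time buckets like these can be derived from per-second FPS samples. Here is a minimal sketch under that assumption; the sample data below is invented for illustration, and the review's actual percentages come from FEAR's built-in test loop, not from this script:

```python
# Bucket per-second FPS samples into the three ranges FEAR's test reports:
# under 25 fps, 25-40 fps, and over 40 fps, as percentages of total time.
def bucket_percentages(fps_samples):
    total = len(fps_samples)
    below = sum(1 for f in fps_samples if f < 25)
    mid = sum(1 for f in fps_samples if 25 <= f <= 40)
    above = sum(1 for f in fps_samples if f > 40)
    return (100 * below / total, 100 * mid / total, 100 * above / total)

samples = [12, 22, 30, 35, 38, 28, 33, 45, 55, 72]  # hypothetical run
low, mid, high = bucket_percentages(samples)
print(f"<25 fps: {low:.0f}%  25-40 fps: {mid:.0f}%  >40 fps: {high:.0f}%")
```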

Battlefield 2142

Next up is one of my favorite games, Battlefield 2142. I ran BF2142 at a screen resolution of 2048 x 1536 with 4x AA and all settings on high. Using Fraps to record frame rate data on a single player map called “Fall of Berlin” yielded the following frame rate data:

  • Min- 20 fps
  • Avg- 44.73 fps
  • Max- 72 fps

Quake 4

The final test for the ATI PowerColor HD 2900 XT was Quake 4, which is still a graphically demanding game. I ran Quake 4 at 1920 x 1200 with ultra detail and 16x AA, with multi-core support turned on. Again I used Fraps to record frame rate data on a single player game, starting from the first scenes until you are asked to go back for the medic. Fraps recorded the following frame rate data:

  • Min- 5 fps
  • Avg- 44.021 fps
  • Max- 66 fps

Overall the game was very playable at these levels of detail. Don’t be fooled by the low of 5 fps: it only happened when the game had to load more data entering new areas, and it lasted a few seconds at most. The ATI PowerColor HD 2900 XT is a good performing graphics card that ATI fans will love to put into their gaming machines.

ASUS EAH3850 Top ATI HD 3850

ASUS EAH3850 Top


Specifications

The ASUS EAH3850 Top video card is overclocked by around 8% right out of the box, according to ASUS. The card has a core frequency of 730MHz and a memory frequency of 1.9GHz, and ASUS uses 256MB of GDDR3 RAM on my test card. Like all the Radeon cards from ATI, the ASUS EAH3850 Top has a built-in sound card, allowing you to get video and sound over a single HDMI cable. The ASUS EAH3850 Top has no HDMI out, but ships with a DVI to HDMI adapter.

Test Machine

For this review I am using a new ATI test machine built on the new AMD Spider architecture. The test machine has the following specifications:

  • CPU- AMD Phenom 2.4 GHz
  • Mainboard- ASUS M3A32-MVP Deluxe
  • PSU- 700W Seasonic
  • RAM- Corsair PC2-8500 2GB
  • OS- Windows Vista Ultimate 32 bit

Benchmarks & Testing

To test the ASUS EAH3850 Top I will be using 3DMark06, Crysis, Quake Wars and Bioshock. The first test up for the ASUS EAH3850 Top is 3DMark06.

3DMark06

I ran 3DMark06 at default settings in the Catalyst control panel and default settings in the application. The ASUS EAH3850 Top scored as follows:

  • Total 3DMarks- 10257
  • SM2.0- 3979
  • HDR/SM3.0- 4547
  • CPU- 3833
  • Return to Proxycon- 32.301
  • Firefly Forest- 34.007
  • Canyon Flight- 42.040
  • Deep Freeze- 48.896
  • CPU1- 1.121
  • CPU2- 1.633

Since the processor in the test machine I use for NVIDIA-based graphics cards is significantly faster than the AMD Phenom in this machine, direct comparisons can't be made. I have tested a stock-clocked ATI HD 3850 in the same Spider test machine, and it scored 9664 3DMarks.

Bioshock

The first gaming test up is Bioshock. I ran the game at a screen resolution of 1920 x 1200 with all settings on high in DirectX 10 mode. I used Fraps to record frame rate data starting directly after getting the first plasmid. Fraps recorded the following frame rate data:

  • Min- 19 fps
  • Avg- 32.757 fps
  • Max- 61 fps

Quake Wars

The next game up was Quake Wars, which I ran at a screen resolution of 1920 x 1200 with all settings on high. I used Fraps to record frame rate data on a single player campaign on the Africa refinery map. Fraps recorded the following frame rate data:

  • Min- 50 fps
  • Avg- 59.915 fps
  • Max- 61 fps

Crysis

The final game test was the system killer Crysis, a game that brings even the fastest gaming systems to their knees. I ran Crysis at a screen resolution of 1280 x 1024 with no AA and all settings on medium. I used Fraps to record frame rate data from the initial insertion point on the beach after the parachute jump until engaging the first enemies. Fraps recorded the following frame rate data:

  • Min- 26 fps
  • Avg- 49.917 fps
  • Max- 72 fps

Final Thoughts

When all the benchmarking was done, the ASUS EAH3850 Top had performed well. Any gamer who prefers ATI products over NVIDIA will want to check out the ASUS EAH3850 Top, one of the few factory overclocked ATI graphics cards you will find. The card also ships with the game Company of Heroes: Opposing Fronts, making for a great bundle.

XFX 9600 GT

XFX 9600 GT Graphics Card


Features & Specifications

The XFX 9600 GT Graphics Card we are looking at today is a stock video card that is not overclocked at all. The stock clock speed for the XFX 9600 GT Graphics Card is 650MHz on the core, 1625MHz on the shader clock and 1.8GHz on the memory. The XFX 9600 GT Graphics Card has 512MB of DDR3 RAM and a 256-bit memory bus. The card is compatible with PCI-E 2.0 as well as standard PCI-E.

Outputs include HDTV out and a pair of DVI outs capable of resolutions up to 2560 x 1600. The XFX 9600 GT Graphics Card is also HDCP compliant for playing back protected HD content. The MSRP for the XFX 9600 GT Graphics Card is $179.99. A minimum 400W power supply with one 6-pin power adapter is required.

Test Machine

The test machine used in this review has the following specifications:

  • CPU- Intel QX6850
  • Mainboard- XFX 680i
  • PSU- PC Power and Cooling 1KW
  • Display- Dell 30-inch
  • OS- Windows XP Pro
  • RAM- 2GB DDR2 PC6400

Benchmarks & Testing

To test the performance of the XFX 9600 GT Graphics Card I will be using 3DMark06, Crysis, Bioshock and Quake Wars. The first test up is 3DMark06.

3DMark06

I ran 3DMark06 at default settings in the NVIDIA control panel as well as the application. The XFX 9600 GT Graphics Card scored as follows on 3DMark06:

  • Total 3DMarks- 11397
  • SM2.0- 4665
  • HDR/SM3.0- 4403
  • CPU- 4707
  • Return to Proxycon- 37.492
  • Firefly Forest- 40.252
  • Canyon Flight- 39.817
  • Deep Freeze- 48.244
  • CPU1- 1.565
  • CPU2- 2.264

To compare against other recently tested video cards: the ASUS EAH3850 Top scored 10257 3DMarks (on a different test machine), the XFX 8800 GTS 512MB scored 14012 3DMarks, and the XFX 8800 GT XXX scored 13546 3DMarks.

Crysis

I ran Crysis at a screen resolution of 1920 x 1200 with all settings on medium and 4x AA. I used Fraps to record frame rate data while playing a single player game starting from the original beach insertion until the fight with the first enemy soldiers on the beach. Fraps recorded the following frame rate data:

  • Min- 9 fps
  • Avg- 16.082 fps
  • Max- 28 fps

Bioshock

I ran Bioshock at a screen resolution of 1920 x 1200 with all settings on high. I used Fraps to record frame rate data on the Fontaine Fisheries level, where you must leave your weapons behind in the pneumo. The numbers were as follows:

  • Min- 25 fps
  • Avg- 40.674 fps
  • Max- 69 fps

Quake Wars

The final game test for the XFX 9600 GT is Quake Wars. I ran the game at a resolution of 1920 x 1200 with all settings on high and 2x AA. I used Fraps to record frame rate data on a single player mission at the Africa refinery. Fraps recorded the following frame rate data:

  • Min- 44 fps
  • Avg- 59.697 fps
  • Max- 63 fps

Final Thoughts

The XFX 9600 GT performed very well in testing and easily beat the similarly priced, mildly overclocked ASUS EAH3850 Top. Factory overclocked versions of the NVIDIA 9600 GT are coming and will significantly increase the performance of the 9600 GT. As it stands now, it will be hard to find a better graphics card in the $150 to $200 price range than the XFX 9600 GT.

BFG GTX 280 OCX 1GB

No matter how many video card reviews we do, we know there will always be something new and exciting just over the horizon. Just imagine: the now-legendary 8800GT was released almost a year ago, and between then and now we have seen a flurry of releases from both Nvidia and ATI. While Team Red has progressed from their 3800-series directly to the new 4800-series, Nvidia has gone down a somewhat more winding road. The first 8800-series was augmented by the 8800GT and 8800GTS 512MB, which were shortly joined by the 9800GTX, 9600GT and eventually the 9800 GX2. Most of these cards are still in play but have now been joined by Nvidia’s new assault on the high-end range with both the GeForce GTX 280 and GTX 260.

Almost since their release, the two new GTX 200-series cards have faced extremely tough competition from ATI in the form of the HD4800-series. Consumers have rejoiced to see the renewed performance war take a significant toll on Nvidia’s pricing structure: cards retailing for over $600 a few weeks prior now sometimes go for under $450. Now, with last week’s formal introduction of the HD 4870 X2, Nvidia has officially lost the performance crown to a card that costs about $100 less than the GTX 280 did when it was first introduced. However, even though they no longer have the top dog on the block, Nvidia is hanging tough with their current cards while cutting prices a bit further, so not all is lost…not by a long shot.

As the GTX 280 matures, Nvidia’s board partners have been able to eke a bit more performance out of their cards and have released products carrying higher and higher overclocks. While many enthusiasts may scoff at pre-overclocked cards, they hold an allure for many people since they offer increased performance right out of the box, without the trial and error of overclocking yourself. Over the last few years BFG has always been at the forefront of the pre-overclocked craze, and with their OCX cards they take things to the next level. We should mention that in our conversations with BFG, they stated that creating a highly overclocked GTX 280 isn’t as easy as it seems, since the massive amount of heat generated by the core directly limits the final overclock. That being said, in this review we will be looking at their GTX 280 OCX, the highest-clocked GTX 280 in their lineup that keeps the stock cooler. The only higher-clocked 280 sports a copper waterblock, so it will be interesting to see how this particular air-cooled card copes with the increased heat output of the overclocked core.

While availability of this card seems extremely limited here in Canada, our friends south of the border have it a bit better, with stock at several large retailers. Believe it or not, where this card was once retailing for somewhere north of $650, prices have come down enough that the GTX 280 OCX can be had for as little as $450. Coupled with BFG’s lifetime warranty and newly-implemented Trade-Up program, $450 represents a surprising value in the grand scheme of things.

If the BFG GTX 280 OCX can perform up to our expectations, it may be a real winner for those of you who want some pre-overclocked goodness. Its performance, however, has yet to be shown, so let’s get this review under way!

ATI Radeon HD 4670 512MB GDDR3

Every now and then both Nvidia and ATI release cards which are not necessarily at the forefront of performance, but which offer something to those of us who don’t want to spend $200 or even $150 on a component we will replace in a year. In the cutthroat world of graphics cards, these little guys are often forgotten or chalked up as also-rans in the grand scheme of things. Even many of us reviewers seem to forget that there is a burgeoning market out there for cards which perform adequately while not costing a month’s worth of lunch money. Granted, readers love seeing the dizzying performance numbers the current high-end offerings achieve, but there is a lot more going on behind the glitz and glitter. That is what we will be focusing on in this review: the little guys who make up the bread and butter of the ATI balance sheet. While the card reviewed here today won’t wow any of you with blistering performance numbers, it may just open your eyes to what else this diverse market has to offer.

Flush with success from their highly regarded HD 4800-series of cards, ATI knew they needed to ride that momentum and attack all parts of the market with the new R700-series architecture. The first step was an assault on the mid-level market with the HD 4870 and HD 4850, followed by their high-end offering, the HD 4870 X2. Now, in order to consolidate their stranglehold on nearly every price point, ATI is targeting the entry level with their HD 4600-series.

Yesterday marked the official release of the HD 4670 and the HD 4650 into the sub-$100 category which was up until now the sole domain of the Nvidia 9500 GT and before it the 8600-series. These cards are not for those of us who want the best of the best; they are for people who want a card to be a jack of all trades which can play games at lower resolutions while offering kick-ass HD video decoding. Both of these cards will have full HDMI and DisplayPort capabilities for HTPC aficionados along with a core based on the RV770 for casual gamers. All of this means that both cards should appeal to quite a few potential customers from a price / performance standpoint.

Even though two cards launch today, in this review we will be looking at the HD 4670 512MB GDDR3 in its reference design. It is being released to fight the already-released 9500 GT GDDR3 on nearly every front, from power consumption to performance to price. Indeed, since this HD 4670 512MB should retail for about $85, it is already ahead on price, considering the 9500 GT GDDR3 currently goes for around $95 when all is said and done. Another thing we should point out is that the card we were sent is an engineering sample, so when cards reach retailers, expect quite a few different PCB layouts and output configurations. There will also be 1GB versions available as time goes on, so the HD 4670 in one version or another should suit the vast majority of you looking for a budget graphics card.

All in all, this should prove to be a unique review for us since it is not every day that we take a look at a budget video card in detail. We usually see them trailing in our charts but to have one in the limelight is actually pretty exciting. So, let’s see what this pint-sized card has to offer.


Palit Radeon HD4870 512MB

In the last few months the mid and high end graphics card market has once again become a battleground between the two heavyweights: AMD (the artist formerly known as ATI) and Nvidia. There was a time when ATI slipped a bit with the release of their R600, and it seemed like Nvidia was going to run away with the graphics card market once and for all with the 8800-series. Luckily for all of us, things began to brighten considerably for the boys in red when they introduced the RV670 core with the HD3870, HD3850 and eventually the HD3870X2. These cards were greeted with enthusiasm and competed well against their Nvidia equivalents, but in the end the RV670 core was nothing more than a die shrink of the infamous R600. Hot on the heels of what can only be called an extremely successful product launch, particularly with the HD3870, AMD is now poised to give us a completely new architecture in the form of the RV770 core, which adorns both the HD4850 and the HD4870. In this review we will be looking at the HD4870, which launches today.

In the hard-fought war between GPU manufacturers, both Nvidia and ATI know it is best not to bring a knife to a gunfight, or you will get stomped pretty hard by the competition. Interestingly, even though they are targeting basically the same customers, the two companies have taken decidedly different approaches to the market. On one hand, Nvidia releases ultra high-end, power-hungry cards like the GTX 280 that cater to those few gamers and enthusiasts willing to pony up $650 and more for a graphics card. On the other hand, AMD figures that the majority of gamers don’t want to spend $650 on a graphics card in this day and age, and so have taken a very different approach with the HD4870: a card that is supposed to play the majority of games on the market at high IQ settings while retailing for around $300. This is a pretty lofty goal, but it is well within the realm of possibility, since the HD4850 (our review is due out shortly after this one) has proven to live up to everyone’s expectations and then some. With words like “recession” on everyone’s mind and the cost of living going through the roof due to record-high gas prices, it looks like AMD has targeted the prices of these cards at just the right place.

With ATI catapulting this product into the void left at the $300 price point by Nvidia’s seemingly knee-jerk move of reducing the price of their 9800 GTX to around $200, you would think they would want to get it out as soon as possible. While they may be champing at the bit to get the HD4870 to market, today marks its soft launch, with the majority of product only being available in fits and spurts between now and the start of July. Believe it or not though, there is the very distinct possibility that this will turn into a hard launch, since there are quite a few of these cards out there.

While many of you may be used to the Sapphire, Diamond and VisionTek cards on the market, which do a good job of representing the ATI and AMD names here in Canada, there is one other player whose HD4870 we will be reviewing today: Palit. There is not much about this card that really stands out from the competition, since it is a stock-clocked product, other than the fact that it is made by the world’s largest video card manufacturer. Yes, that’s right: the company that hasn’t been heard of much here in North America is at the top of the pile when it comes to sales in Asia as well as Europe. Even though they have been largely absent from the North American market, they are taking us by storm with widespread availability of their cards at most leading retailers.

The HD4870 512MB represents quite a few firsts in the consumer graphics card world: the first single chip 1.2 teraflop card, the first implementation of GDDR5 and the first single chip AMD graphics card above the $290 price-point in quite some time. It definitely seems to have a lot going for it so without further ado, let’s get on with this review.

Asus EN9600GT



The Asus EN9600GT is one of the first graphics cards to use nVidia's new GeForce 9600 GT chipset. In previous generations, midrange GeForce cards like the 7600 GT and 8600 GT series were best suited for gamers who didn't mind turning down the details or resolution in a game to get a smooth frame rate. The 9600 GT series resets those expectations, offering very good gaming performance with the details cranked up, even at moderate-to-high resolutions.

The EN9600GT's GPU runs at 650MHz, with 512MB of GDDR3 memory running at 900MHz. Interestingly, the GeForce 9600 GT still lacks support for DirectX 10.1 (DX10.1), supporting the original DX10 instead. This omission isn't likely to ever be a critical issue, as the changes from DX10 to 10.1 are very minor, but it's surprising nevertheless. The EN9600GT supports High-bandwidth Digital Content Protection (HDCP), ensuring full-resolution output of Blu-ray, HD DVD, and other protected high-definition formats.

This single-slot PCI Express (PCIe) x16 card uses a six-pin PCIe power connector and takes advantage of (but doesn't require) PCIe 2.0. It has a pair of dual-link DVI outputs as well as a component-video/S-Video connector. The card also includes a DVI-to-HDMI converter for connection to an HDTV, until now a rarity on GeForce cards. A small cable is included to connect the card to the digital audio connector on your motherboard, allowing the card to pass sound through the HDMI cable. The cooling fan is relatively quiet, making the EN9600GT suitable for use in a living-room PC.

The 9600 GT offers a serious performance boost over its 8600 GT predecessor, particularly at higher resolutions. Its F.E.A.R. scores of 89 frames per second (fps) at 1,280x1,024 and 54fps at 1,920x1,200 are more than double the 45fps and 20fps that the 8600 GT turned in at those resolutions. The 9600 GT even manages a playable 33fps in F.E.A.R. on a 30-inch monitor running at 2,560x1,600.

In our DX10 tests, the EN9600GT tops the similarly priced ATI Radeon HD 3850, clocking 38fps in World in Conflict and 52fps in Company of Heroes, compared to 25fps and 42fps for the HD 3850. Upping the resolution to 1,920x1,200 makes the difference even more dramatic, with the EN9600GT turning in 16fps at Very High settings in World in Conflict and 25.6fps in Company of Heroes, compared to 9fps and 11.5fps for the HD 3850.

The EN9600GT supports dual-card Scalable Link Interface (SLI) on compatible motherboards; nowadays, that's typically motherboards using the nVidia 680i or 780i chipset. The card lacks the additional connector needed to support three-card 3-Way SLI, a feature that nVidia seems to be reserving for its highest-end cards. Adding a second 9600 GT card provides a significant frame-rate boost, pushing our 1,280x1,024 Company of Heroes frame rate to 92.4fps and our World in Conflict results to 50fps.

The EN9600GT features nVidia's PureVideo HD, which lightens the load on the CPU during DVD and Blu-ray playback by offering full hardware decoding of MPEG2 and H.264, and partial acceleration of VC-1 video. The 9-series cards add support for faster dual-stream decode acceleration, dynamic contrast enhancement, and automatic enhancement of green, blue, and skin tone colors. Video playback was smooth with both standard- and high-definition content, with excellent color reproduction.

Offering a level of 3D performance that until recently cost $100 more, this first entry in the 9 series is a welcome addition to the GeForce family.


XFX 8800 GTX XXX Edition


Today we are looking at XFX's overclocked edition of the GeForce 8800 GTX, the XXX. If you thought the standard GTX was fast, you haven’t seen anything yet. Armed with an additional 55MHz on the core and 200MHz on the memory, you can practically hear the card begging you for more.

XFX was the first company to release overclocked 8800s, and they are certain to have your mouth watering. What's more, they also have an 8800 GTS XXX running at 550MHz core and 1.8GHz memory, dialed up a full 50MHz and 200MHz respectively. However, we're focusing solely on the big dog, the 8800 GTX XXX, and a no-holds-barred gaming experience. Just like a sexy bartender you'd like to get triple X with serving up free Coronas as fast as you can drink them, it's everything you could dream of.

Unfortunately, the Coronas are the only thing you’re getting for free; at about $635 from your favorite e-tailer, this might be the most expensive XXX show you’ve ever paid for. The XFX 8800 GTX in overclocked form justifies the higher price with, of course, higher performance. The XXX edition picks up where NVIDIA’s standard 8800 GTX left off, upping the core speed of the reference model from 575MHz to 630MHz and memory speed from 1.8GHz to a full 2GHz. It comes with the same blistering performance in DirectX 9 gaming and full compliance with Microsoft’s imminent release of DirectX 10, which promises to deliver the ultimate 3D experience. The same 768MB of memory on a 384-bit bus lives under that sleek black heat sink, which still occupies two slots. Fan speeds do not seem to have changed at all; the XXX edition is still extremely quiet in both 2D and 3D operation. However, those three slots at the end are pumping a lot of heat onto the rest of your PCI and PCI-E cards. Gone are those awesome-looking black PCBs; the standard green boards are the only option at this time. The DVI ports on the card are both Dual Link, supporting up to 2560x1600 resolution, so you can drive both of those massive 30” LCDs without any worries.
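To put the factory overclock in perspective, here is a quick sketch that works out the absolute and percentage gains from the review's quoted clocks (the helper function is hypothetical, written just for this illustration):

```python
# Illustrative check of the XXX Edition's factory overclock,
# using the clock speeds quoted in the review above.

def overclock_delta(stock_mhz, oc_mhz):
    """Return the absolute (MHz) and percentage increase of an overclock."""
    delta = oc_mhz - stock_mhz
    return delta, 100.0 * delta / stock_mhz

core = overclock_delta(575, 630)      # reference 575MHz -> XXX 630MHz
memory = overclock_delta(1800, 2000)  # reference 1.8GHz -> XXX 2.0GHz (effective)

print(f"Core:   +{core[0]}MHz ({core[1]:.1f}%)")
print(f"Memory: +{memory[0]}MHz ({memory[1]:.1f}%)")
```

Roughly a 10% bump on the core and 11% on the memory, which lines up with the 55MHz and 200MHz figures XFX advertises.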

The bundle is no-nonsense; you’ve got the manual and install guides, two DVI to VGA adaptors, and the TV out cable. Personally, this fits my tastes just fine because I don’t see a reason to include some game or software that likely will never find its way on to my PC and if it keeps the price down all the better.

The box art that XFX has come up with is a nice black and white box with the XFX bear done up in armor. It is a design that is easily distinguishable for anyone familiar with past XFX products and is not overdone with a dizzying array of colors and graphics. The graphics card name is clearly displayed in the middle of the package so as not to confuse it with a lesser model.

Now that we know what separates the XXX Edition from the reference 8800 GTX let’s see how it stacks up where it really counts, performance.

ASUS ROG EN9800GT



Following the mammoth success of the ROG (Republic of Gamers) EN9600GT MATRIX graphics card, ASUS today released the new ASUS ROG EN9800GT MATRIX/HTDI/512M. Combining cutting-edge technology and features, this innovative piece of hardware unleashes the true power of graphics cards – allowing gamers to enjoy unrivaled gaming experiences. It achieves all this by incorporating several unique features that include:

  • Integrated hardware and software for total graphics card control
  • Accurate adjustments of the GPU and memory voltages
  • Complete monitoring of the GPU/memory/Power IC/ambient temperatures
  • Complete monitoring of the GPU/memory/board power consumption
  • Automatic control of fan speeds according to advanced GPU loading detection
  • Power savings that surpass generic boards
  • Customizable functions to add a new level of control for gamers

  • The Perfect Answer to Gaming Needs
    A stylish design that comprises daring lines, a futuristic-looking Hybrid Cooler and a black circuit board allows astute gamers to instantly identify the ASUS ROG EN9800GT MATRIX as a piece of top-notch gaming hardware; while the ROG logo prominently displayed on the cooler is a mark of promise that this graphics card will deliver extreme gaming performance. Equipped with the ASUS Super Hybrid Engine, Hybrid Cooler technology and iTracker, 15% performance boosts can be achieved in 3D mode; while 30% less power consumption and much quieter cooling can also be guaranteed in 2D mode – allowing gamers to fulfill their every gaming requirement.
  • Auto Hardware Detection and Adjustments with Super Hybrid Engine
    The new Super Hybrid Engine technology showcases its intelligence via a two-fold process. First it gathers detailed information about GPU loading and the temperatures of the GPU, memory and power IC; the Super Hybrid Engine then calculates an optimized solution for the best performance. Furthermore, all of this happens in real time, without requiring any tinkering from users – making it easy to achieve multi-level GPU and memory voltage adjustments for the ultimate graphical performance or maximum energy savings.
  • Hybrid Cooler for Auto-managed Proactive Cooling
    The intelligent Hybrid Cooler thermal innovation draws its inspiration from hybrid cars of the same name. With a revolutionary combination of fan and heatsink solutions on one card, the Hybrid Cooler targets guaranteed performance with power savings in mind. When required, the fan and heatsink work together for optimum cooling, while the fan is automatically controlled according to advanced GPU loading detection – providing users with more proactive cooling and power savings in the event of lower graphics utilization – much like real hybrid cars.
  • iTracker Offers Easy User Selection for the Most Suitable Usage
    The iTracker feature comes built into the Super Hybrid Engine technology, and offers 4 default profiles for entry-level gamers: namely the Optimized Mode, Gaming Mode, Power Saving Mode, and Default Mode. This handy application is also able to display real-time graphics card information that includes the GPU/shader/memory clock, GPU/memory voltage, GPU loading, GPU/memory/power IC/ambient temperature, and fan speed. In addition, the 5th profile – the User Defined Mode – allows timely adjustments that include GPU/shader/memory clock, GPU/memory voltage, and fan speeds to suit different individual requirements.
  • Up to 15% Faster in 3D Mode
    GPU and memory voltages boost from 1.2V up to 1.301V and from 1.9V up to 1.92V respectively when the ROG EN9800GT MATRIX is running in 3D mode. This allows the GPU, shader and memory clocks to be boosted from 600MHz up to 750MHz, from 1800MHz up to 2000MHz, and from 1512MHz up to 1753MHz respectively. Such unprecedented performance upgrades help raise the ROG EN9800GT MATRIX’s 3DMark Vantage Extreme Mode score from 2013 to 2308 – an unbelievable 15% speed improvement when compared to any other reference designed board*. Gamers can now feel the adrenaline rush of superb gaming performance – only with the ASUS ROG EN9800GT MATRIX/HTDI/512M.
  • Up to 30% Power Saved and Quiet Cooling in 2D Mode
    While in 2D mode, less power is required, and the ROG EN9800GT MATRIX is able to reduce GPU and memory voltages from 1.2V down to 0.9V, and from 1.9V to 1.8V respectively, for great energy savings. Furthermore, the Auto Phase Switch technology optimizes the power supplied by the power phase for maximum power efficiency. Combining the exceptional voltage reduction with the Auto Phase Switch technology, the ROG EN9800GT MATRIX's power consumption is lowered from 46.27W to 32.44W – an astonishing 30% saving when compared to any other reference designed board**. Additionally, the Hybrid Cooler reduces fan rotation speeds in 2D mode – allowing users to enjoy much quieter cooling for a comfortable computing environment.
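ASUS's two headline percentages can be sanity-checked from the raw figures in the press material. A minimal sketch, using only the numbers quoted above (the function name is hypothetical):

```python
# Sanity check of ASUS's quoted gains for the ROG EN9800GT MATRIX,
# using the 3DMark scores and power figures from the press material.

def percent_change(before, after):
    """Percentage change from `before` to `after`."""
    return 100.0 * (after - before) / before

# 3DMark Vantage Extreme score, 2013 -> 2308
perf_gain = percent_change(2013, 2308)

# 2D power draw, 46.27W -> 32.44W (negated: a drop, reported as a saving)
power_saving = -percent_change(46.27, 32.44)

print(f"3D performance gain: {perf_gain:.1f}%")
print(f"2D power saving:     {power_saving:.1f}%")
```

The results come out to roughly 14.7% and 29.9%, so the advertised "15% faster" and "30% power saved" are rounded but honest.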