Years ago, Procter & Gamble, R.J. Reynolds, and other consumer products giants discovered that the best way to increase their sales was to go into competition with themselves. Instead of making only one or two brands of laundry detergent or breakfast cereal or floor wax or cigarettes, they started making dozens of brands, all similar, but each appealing to a different type of buyer. Just as important, the flood of brands meant the giants could stifle competitors by monopolizing scarce retail shelf space.
The two major video card makers, ATI and NVIDIA, have taken this scheme to ridiculous extremes. Each produces two dozen or more video chipsets, from old, slow models that are used in $20 video cards to models that are used in $700 gaming cards. Any particular chipset may be available in variants that run at different clock speeds, and any of these may be used to build video adapters with different amounts, types, and speeds of memory, and so on. Many different companies use ATI and/or NVIDIA video chipsets to build their own branded video cards, further increasing the number of choices and the level of confusion. (When we checked NewEgg.com while writing this paragraph, we found an incredible 569 different video cards for sale: 238 PCIe, 283 AGP, and even 48 PCI models.)
By unit volume, Intel is by far the largest maker of video adapters, but all of their adapters are integrated into system chipsets. Intel video adapters provide excellent 2D display quality, but only pedestrian 3D graphics performance. In late 2005, Intel announced that it would no longer produce low-end chipsets with integrated graphics, leaving that segment of the market to other chipset makers such as ATI and SiS. We believe that Intel made this decision because they understood that their current integrated graphics products were inadequate to support the advanced video requirements of Windows Vista, and we expect Intel to remedy that problem in 2006 and 2007 with new series of chipsets that include integrated graphics sufficient for Vista.
We were about to say that even we can't keep everything straight, but in fact it's worse than that. Based on conversations with some of our contacts at ATI and NVIDIA, we don't think they can keep everything straight. So which video card should you buy? Here are some guidelines:
If you don't play 3D games, 3D graphics performance doesn't matter, unless you plan to upgrade to Windows Vista. Any current video card, even a $35 model, is fast enough to run productivity software and similar 2D applications under Windows XP or Linux.
If you're a casual gamer, spending $75 to $100 buys you much faster 3D graphics performance. If you're a serious gamer, plan to spend at least $150 to $250 to be able to play recent games at reasonable frame rates. If you're a rabid gamer, well, the sky's the limit.
If you are upgrading a system to be compatible with Windows Vista, choose a card with at least midrange 3D graphics performance, something like an NVIDIA 6800GT or better with at least 128 MB of video memory.
Rather than replace a failed video card or integrated video, consider spending a bit more to replace the motherboard instead. The integrated video on current motherboards is more than sufficient for anything other than gaming (or Vista), and by replacing the motherboard you also get a newer chipset, a new BIOS, new features such as SATA, and a new motherboard that has a warranty and will probably last for years.
DON'T BUY A VIDEO-SLOTLESS MOTHERBOARD
Make sure that any motherboard you buy allows embedded video to be disabled and provides an AGP or PCIe slot. That way, you can upgrade the video later if you need to.
If you buy a standalone video adapter, remember that video is just one part of your system. It makes no sense to buy a fast gaming video card for an older system. You'll see some video performance benefit, certainly, but the card can't reach anything near its real potential when hampered by the relatively slow processor, memory, and other components in the older system. Other than for gaming or to add extra features, such as TV capture, we recommend spending no more than $50 on a video adapter to upgrade an older system.
INEXPENSIVE DOESN'T NECESSARILY MEAN SLOW
If you need better 3D graphics performance than embedded video provides but you don't want to spend much, look at "obsolescent" 3D video adapters, those a couple of generations out of date. For example, in late 2005 ATI RADEON 9250-series adapters were available for $30 or so. The RADEON 9250 or a similar obsolescent NVIDIA adapter can't compare in graphics performance to a current $400 gaming card, or even to a $100 midrange adapter, but for many people it's just the right compromise between cost and performance.
If you buy an older adapter, make sure to verify the level of DirectX it supports. The RADEON 9250, for example, supports DX8. If you plan to play a game that requires DX9, the older adapter will be of little benefit. This problem is self-limiting, though. You are unlikely to want to run 3D applications that require the most recent version of DirectX on an older card. Such applications require more graphics power than older cards can provide.
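The version check described above amounts to a simple comparison. Here is a minimal sketch in Python; the card table is illustrative only (the RADEON 9250's DX8 support comes from the text, and the RADEON 9600 entry is our assumption):

```python
# Illustrative check of whether an older adapter meets a game's minimum
# DirectX requirement. The card table is simplified: the RADEON 9250's
# DX8 support is from the text; the RADEON 9600 entry is an assumption.
CARD_DIRECTX = {
    "RADEON 9250": 8,
    "RADEON 9600": 9,
}

def meets_requirement(card, required_dx):
    """Return True if the card's supported DirectX version is sufficient."""
    return CARD_DIRECTX.get(card, 0) >= required_dx

print(meets_requirement("RADEON 9250", 9))  # False: a DX8 card can't satisfy a DX9 game
```

The point of the comparison is the asymmetry: a newer DirectX version on the card is fine, but a game that requires a newer version than the card supports gets little or no benefit from the upgrade.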
If you decide to install a standalone video adapter, buy only a Universal AGP 3.0 or an x16 PCIe adapter. Check the motherboard manual to verify which type or types of adapter it supports, and then buy accordingly. Most recent AGP motherboards accept either 0.8V AGP 3.0 cards only, or both 1.5V AGP 2.0 and 0.8V AGP 3.0 cards interchangeably. Fortunately, the PCI Express standard has not yet fragmented, so any PCIe adapter works with any PCIe motherboard.
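The AGP voltage rules above boil down to a small compatibility lookup. This is a simplified sketch (the labels and table are our hypothetical shorthand, not a complete model of AGP keying):

```python
# Simplified AGP card/slot compatibility lookup (hypothetical shorthand).
# A Universal AGP 3.0 card is keyed for both 1.5V (AGP 2.0) and
# 0.8V (AGP 3.0) slots; single-voltage cards fit only matching slots.
CARD_VOLTAGES = {
    "agp2_1.5v": {"1.5V"},
    "agp3_0.8v": {"0.8V"},
    "universal_agp3": {"1.5V", "0.8V"},
}

def card_fits_slot(card, slot_voltage):
    """Return True if the card is keyed for the slot's signaling voltage."""
    return slot_voltage in CARD_VOLTAGES[card]

print(card_fits_slot("universal_agp3", "1.5V"))  # True
print(card_fits_slot("agp2_1.5v", "0.8V"))       # False
```

The table makes plain why we recommend Universal AGP 3.0 cards: they are the only row that fits both slot voltages, which is exactly the safety margin you want when the motherboard manual is ambiguous.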
Display quality is subjective and very difficult to quantify, but a real issue nonetheless. Matrox video adapters have always been the standard by which we judge 2D display quality, but Matrox adapters are no longer competitive in terms of 3D performance. The three major video chipset companies are ATI and NVIDIA, both of which provide chipsets that are used both for standalone AGP adapters and for embedded video, and Intel, whose video adapters are available only in integrated form. ATI and Intel video adapters have always provided excellent 2D display quality, just a half-step behind Matrox, so we don't hesitate to recommend either of those brands to anyone who is concerned about 2D quality. Older NVIDIA adapters, especially high-performance models, often favored 3D performance at the expense of 2D image quality. NVIDIA 6000-series and later video adapters have excellent 2D display quality.
Video cards used to be among the most reliable components of a PC. This is changing, not because manufacturers are cutting corners, but because new high-performance video cards are pushing hardware technology to the limit. Having a video card die after only six months or a year is now relatively common, particularly for those who push the card past its limit by overclocking it in pursuit of the highest possible performance. We've seen video cards with 90-day warranties, which is completely unacceptable. Regard one year as an absolute minimum, and longer is better.
Video adapters differ in the resolutions and refresh rates they support. It's important to make sure that the video adapter you buy supports resolutions and refresh rates appropriate for your display.
Resolution describes the maximum number of pixels the adapter can display, horizontally and vertically. For example, an inexpensive display adapter may support resolutions of 640x480 (640 pixels horizontally by 480 pixels vertically), 800x600, 1024x768, 1280x1024, and 1600x1200. If you have a large CRT monitor that supports 1920x1440 resolution (and you use that resolution), the maximum resolution of this video card is too low to be optimum for your monitor. Not all video adapters support all intermediate resolutions. For example, although the adapter we're discussing supports 1024x768 and 1280x1024, it may not support the intermediate 1152x864 resolution. If you have a display that is optimized for that resolution, you may want to look for a different video adapter that also supports 1152x864.
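To see what these resolutions mean in raw numbers, the pixel-count and framebuffer arithmetic is simple. A quick sketch, assuming 32-bit color (4 bytes per pixel):

```python
# Pixel count and framebuffer size for the resolutions discussed,
# assuming 32-bit color (4 bytes per pixel).
resolutions = [(640, 480), (800, 600), (1024, 768),
               (1152, 864), (1280, 1024), (1600, 1200), (1920, 1440)]
bytes_per_pixel = 4

for width, height in resolutions:
    pixels = width * height
    mb_per_frame = pixels * bytes_per_pixel / (1024 * 1024)
    print(f"{width}x{height}: {pixels:,} pixels, {mb_per_frame:.1f} MB per frame")
```

Even 1920x1440 needs only about 10.5 MB for a single 32-bit frame, which is why 2D work runs happily on inexpensive adapters; it's 3D rendering, with its textures and multiple buffers, that consumes large amounts of video memory.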
Refresh rate is the number of times the image is renewed per second. If the refresh rate is too low, the image on a CRT monitor flickers. (Flat-panel LCD displays operate at low refresh rates without flickering.) The minimum refresh rate necessary to avoid flicker depends on many factors, including the phosphors used in the monitor, the size and resolution of the monitor, ambient lighting conditions, and your own vision. Most people consider 72 Hz the minimum acceptable refresh rate on small to midsize CRT monitors, and 85 Hz is better. On large CRT monitors, a 100 Hz or higher refresh rate may be necessary to avoid flicker. The maximum refresh rate supported by a video adapter is related to the resolution you use. For example, a particular video adapter may support a 120 Hz refresh rate at 1024x768, but only 85 Hz at 1280x1024 and 60 Hz at 1600x1200. Despite the fact that the adapter technically supports 1600x1200, most people will find the 60 Hz refresh rate unacceptable, so in practical terms that adapter is limited to 1280x1024.
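The refresh-rate ceilings described above ultimately reflect the pixel clock the adapter can sustain at a given resolution. Here is a rough back-of-the-envelope sketch; the 1.32 blanking-overhead factor is a conventional approximation we've assumed, not an exact figure:

```python
# Approximate pixel clock needed for a resolution/refresh combination.
# The blanking factor covers horizontal/vertical retrace overhead;
# 1.32 is a rough conventional approximation, not an exact figure.
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.32):
    return width * height * refresh_hz * blanking / 1e6

for width, height, hz in [(1024, 768, 120), (1280, 1024, 85), (1600, 1200, 60)]:
    mhz = pixel_clock_mhz(width, height, hz)
    print(f"{width}x{height} @ {hz} Hz needs roughly {mhz:.0f} MHz")
```

The three combinations from the example adapter all land in roughly the same 125 to 152 MHz range, which is exactly why supported refresh rate falls as resolution rises: the adapter is bumping against the same pixel-clock ceiling each time.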
Most analog CRT monitors use the familiar high-density DB-15 VGA connector, although a few high-end models also support RGB component video. Flat-panel displays (FPDs) use a variety of connectors, including the analog VGA connector (typically used by low-end FPDs) or one of the three different types of DVI connectors described earlier in this section. Midrange and higher FPDs normally provide a DVI-D or DVI-I digital connector, and may also provide a DB-15 analog connector. If you plan to run dual displays, make sure the video adapter you choose has dual connectors of the type(s) needed by your displays, and that those connectors can be used simultaneously.
Advice from Jim Cooley
Before you buy a dual-DVI video adapter, verify the type of cables it uses. Some video adapters that support dual-DVI displays don't use two ordinary DVI cables. Instead, they require a special dual-DVI cable with a custom connector on the video adapter end. A dual-DVI cable for my Matrox video adapter set me back $100.
USE A DIGITAL VIDEO CARD WITH A DIGITAL DISPLAY
If you plan to use an FPD, whenever possible choose an FPD and a video adapter that both support digital (DVI-I or DVI-D) connectors. FPD image quality is hit-or-miss if you use an analog connection, because video data originates in digital form, is converted to analog by the video adapter, and is then converted back to digital by the FPD. These multiple AD/DA conversions can introduce various artifacts, none of which is pleasing. The inferiority of an analog video connection is particularly noticeable with 19" and larger flat-panel displays.
Also, some analog video adapters simply don't "play nice" with some FPDs, and there's enough variation from one sample to another that about the only way to know for sure is to try it. This problem seems to be worse with DB-15 analog connections than with DVI-A connections, but the only sure way to avoid it is to use a digital path end-to-end.