A video adapter, also called a graphics adapter, renders video data provided by the processor into a form that the monitor can display. Many motherboards provide an integrated video adapter. Most also provide a special video expansion slot that accepts a standalone video adapter card. A standalone video adapter is called a video card or graphics card. Upgrading video is fast, easy, and generally inexpensive. There are several good reasons to do so, including:
- Your current video card or integrated video has failed.
- You've installed a larger display and require higher-resolution video than the old adapter provides.
- You need a Digital Visual Interface (DVI) connector to drive a flat-panel LCD display.
- You want to record television programs or other video sources to your hard drive.
- You want to watch DVD-Video discs or other fast-motion video sources without jerkiness.
- Your current video adapter is too slow for or incompatible with the games you want to play.
DirectX (DX) is a Microsoft graphics standard that's particularly important to gamers. The oldest video cards still being produced support DX7. Entry-level video adapters usually support DX8. Current midrange and high-end video adapters support DX9.0a, DX9.0b, or DX9.0c. DX10, originally called Windows Graphics Foundation, will be released with Windows Vista.
In this section, we tell you what you need to know to choose and install a video adapter, and to troubleshoot video adapter problems.