Graphics (aka Video) cards

As mentioned on other pages, you may well have your sound and graphics integrated into your motherboard, in which case there is nothing else to buy! Let's assume you have neither.

[Picture: a GeForce 2 graphics card]
To see anything at all, your computer sends visual information to your monitor via your graphics card. This plugs into a special brown expansion socket on the motherboard, called an AGP socket (Accelerated Graphics Port). The card can be very simple (for 2D office use) or very, very expensive, costing more than all the other components, if you intend to play the latest games and want the ultimate in realism and 3D speed.

The amount of memory on the graphics card also determines the screen resolution and colour depth you can display. This is simple maths. A 1024 x 768 screen (the most common resolution today) has 1024 times 768 pixels, so at one byte per pixel it needs a little under 1Mb of memory. One byte per pixel only gets you 256 colours, though; each extra byte per pixel buys more colours, so to display 1024 x 768 in anything from 64,000 to the full 16 million colours you need roughly 1.5Mb to 2.25Mb of frame buffer, which in practice means a 4Mb card. You need much more if your graphics card stores textures or does a lot of processing for the next frame you are about to see: 32Mb, 64Mb, 128Mb or higher is not unusual for the top-end games cards. The table below lists the common modes, and a rough worked example follows it.

Output name   Screen size    No. of colours   Memory req.   Notes
VGA           640 x 480      16               1Mb           Old standard; Windows XP doesn't even use it any more
SVGA          800 x 600      256              2Mb           For old 14"/15" monitors that just can't manage a better resolution
XGA           1024 x 768     64K - 16M        4Mb           Common
SXGA          1280 x 1024    16 million       8Mb           Not unusual
Super XGA+    1400 x 1050    16 million       8Mb           Rare
Ultra XGA     1600 x 1200    16 million       8Mb           You need a 19" screen to use this realistically
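
As a rough illustration of the arithmetic behind the memory column, here is a small Python sketch that works out the raw frame buffer size for each mode at different colour depths. It deliberately ignores double-buffering, Z-buffers and texture storage, which is why real cards carry far more memory than these bare figures suggest; the mode list is simply the one from the table.

    # Raw frame buffer size in megabytes: width x height x bytes per pixel.
    # Ignores double-buffering, Z-buffers and texture storage, which is why
    # real cards carry far more memory than these figures.
    MODES = {
        "VGA":        (640, 480),
        "SVGA":       (800, 600),
        "XGA":        (1024, 768),
        "SXGA":       (1280, 1024),
        "Super XGA+": (1400, 1050),
        "Ultra XGA":  (1600, 1200),
    }

    def frame_buffer_mb(width, height, bytes_per_pixel):
        return width * height * bytes_per_pixel / (1024 * 1024)

    for name, (width, height) in MODES.items():
        # 1 byte/pixel = 256 colours, 2 bytes = 65,536, 3 bytes = 16.7 million.
        for bytes_per_pixel in (1, 2, 3):
            size = frame_buffer_mb(width, height, bytes_per_pixel)
            print(f"{name}: {width} x {height} at {bytes_per_pixel} byte(s)/pixel "
                  f"needs {size:.2f}Mb")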


If your computer comes with an "integrated" video card, it will use "shared memory". That is, it borrows some of the computer's main memory to use as video memory, eliminating the need for dedicated (fast) video RAM. There are two drawbacks to shared RAM: the video controller accesses shared RAM more slowly than dedicated video RAM, and the memory taken by the video card reduces the total system RAM available to your programs. These days you might have 256Mb or more in your computer, so "giving away" 32Mb to your video card is probably not the big deal it once was.
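
As a quick back-of-the-envelope sketch (using the 256Mb and 32Mb figures above purely as an example):

    # Example only: how much main memory is left for Windows and your
    # programs once an integrated video controller takes its share.
    total_ram_mb = 256
    shared_video_mb = 32

    available_mb = total_ram_mb - shared_video_mb
    fraction_given_away = shared_video_mb / total_ram_mb
    print(f"{available_mb}Mb left for programs "
          f"({fraction_given_away:.1%} of RAM given to the video controller)")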

The benefit of shared RAM is that it allows PC manufacturers to cut costs by leaving out dedicated video RAM. That makes it a cheaper option for you too, and you might not even notice (or care!). Personally, however, I would think carefully about integrated graphics, not only for the reasons above but because, even more limiting, you probably will not have an empty AGP socket waiting on the motherboard in which to insert a better graphics card later. The motherboard manufacturer has saved that cost - but limited your expansion potential. Of course, if you have integrated graphics and an empty AGP slot for future upgrades then you have the best of both worlds.

Refreshing
The refresh rate is the number of times the picture on your monitor is redrawn each second. Although you might think your monitor shows a static picture, what is actually happening is that a tiny dot of light is whizzing back and forth across, and up and down, your screen many times a second. Because your eyes (and brain, actually) are fooled, you think you are seeing a continuous picture. This all works just like a TV does.

Now, in order for you to be completely fooled, and not see a flickery old picture like some black and white movie from the Charlie Chaplin era, the picture must be refreshed at least 72 times a second (written as 72Hz, because Hz is short for "Hertz", the scientist after whom this unit of measurement is named). Which raises the question: why do so many monitor manufacturers allow a refresh rate of just 60Hz for some screen resolutions?

I'll tell you why. It's so monitor manufacturers (and hence complete PC system suppliers) can boast that their PC can display "up to 1280 x 1024" - what they don't then say is that this resolution is only achieved at a refresh rate of just 60Hz. If you don't suffer from seizures before looking at such a screen for any length of time, you will afterwards. In the UK, at least, Health and Safety regulations prohibit such a low refresh rate in the workplace, which should tell you something. Tip: ignore any resolution that cannot be displayed at 72Hz or above.

Some people are more sensitive to flicker than others. Large expanses of white (a blank word processing document, for example) also exacerbate the perception of flicker. The larger the monitor, the higher the refresh rate ought to be: 72Hz is fine on a 15"-17" monitor for most people, but use that same refresh rate on a 19" monitor and you will probably get a headache after a while. These days, the accepted "flicker free" refresh rate for all people, at all resolutions, on all monitor sizes is 85Hz. Make sure your graphics card and monitor can both support this at the resolution you intend to work at.
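
If you want to apply that tip systematically, here is a tiny Python sketch that keeps only the display modes meeting a minimum refresh rate. The mode list is invented purely for illustration, so substitute the figures from your own card and monitor specifications.

    # Filter out display modes that can't manage a comfortable refresh rate.
    # The mode list here is made up purely for illustration.
    MIN_REFRESH_HZ = 72          # use 85 if you want truly "flicker free"

    modes = [
        ("1024 x 768", 85),
        ("1280 x 1024", 75),
        ("1600 x 1200", 60),     # looks good in the advert, headache in practice
    ]

    usable = [(resolution, hz) for resolution, hz in modes if hz >= MIN_REFRESH_HZ]
    for resolution, hz in usable:
        print(f"{resolution} at {hz}Hz is usable")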

Frames per second
The holy grail of graphics cards is speed: how many complete frames (as in movie frames) a second can it display? This is not to be confused with the refresh rate mentioned above. The refresh rate is how often the monitor redraws whatever picture it has been given; the frame rate is how often the graphics card can produce a new picture for it to draw. Frames are complete, individual snapshots, just like the ones on movie celluloid.

Less than 20 and your game quickly becomes unplayable because it is too stop-start and jerky. Over 60 is just wasting power (and hence money) because the eye can't tell the difference any more. At least I think that's true. The standard reference for testing frames per second is a game called Quake. Reviewers drool over any card generating more than 200 frames per second, and some processor/graphics card combinations are breaking the 300 frames per second mark. Maybe I'm missing something here?

Anyhow, any frame rate from 30 to 300 means your game's realism, in terms of smooth, highly detailed action, is just liquid gold (hey, poetic, yes?). Effectively, your game runs like a top quality 3D animated film of the Jurassic Park type, not Tom and Jerry.
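
To make the idea concrete, here is a minimal Python sketch of how a game or benchmark typically counts frames per second. The draw_frame() function is just a stand-in for whatever rendering work the real program does each frame.

    import time

    def draw_frame():
        # Stand-in for the rendering work a real game does for each frame.
        time.sleep(0.01)

    # Count how many complete frames can be produced in one second.
    frames = 0
    start = time.time()
    while time.time() - start < 1.0:
        draw_frame()
        frames += 1

    print(f"{frames} frames per second")
    # Below about 20 the game feels jerky; frames beyond the monitor's
    # refresh rate are produced but never actually displayed.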

The following table shows two animated pictures; the left one is jerky and moves a lot between frames, while the right one is smooth(er) and much easier on the eye (look, this is only a simulation, right?).


If you are not a gamer (or will only dabble in the odd quick shoot-em-up now and then, and you're just not prepared to spend loads on a graphics card) then 32Mb is adequate, and the graphics card will cost less than £50. The more you pay, the more processing the card can reasonably be expected to do, making games much more realistic, with reflections, texture smoothing and so on. So a person's face displayed in a game on a £200 card will look more realistic than on a card costing £50. But that doesn't mean you can't play the game just as well with the cheaper card.

Recommendation: buy a cheap 32Mb GeForce 2 card, unless you have specific requirements, in which case you're probably already aware of the latest cards from nVidia (GeForce 4) and ATI (Radeon 9000). Make sure you can get drivers for the operating system you intend to run. And remember that it is the chipset on the graphics card that determines its capabilities, not the card manufacturer.
