For optimal confusion value, there are various breeds of DVI using loosely the same connector. In summary:
DVI-A - this is basically the analogue VGA signal that we all know and love on a funky new connector.
DVI-D - Digital signal for driving flat-panel monitors without a needless digital-analogue-digital conversion. Uses different pins to DVI-A. 'Dual link' provides a second digital signal on yet more pins to give sufficient bandwidth for driving *very* high resolution displays (rough numbers below).
DVI-I - Both of the above on the same connector (the digital bit may be single or dual link). This is the one that you typically find on graphics cards.
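(Rough numbers on the dual-link point, from memory, so treat them as approximate: a single DVI link is specced to a 165 MHz pixel clock. 1920 x 1200 x 60 Hz works out to about 138 Mpixels/s of visible data, and once the blanking intervals are added you are right up against that 165 MHz ceiling - so roughly 1920x1200 at 60 Hz is the most a single link can carry. Dual link adds a second set of data pairs on the extra pins, doubling the bandwidth, which is what 2560x1600-class monitors need.)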
Since a graphics card with a DVI-I output has all the usual VGA signals on it, it's a trivial matter of using a plug adaptor to connect a standard VGA monitor.
Don't get too hung up on integrated graphics; basic <£20 graphics cards are perfectly good if you don't care about 3D performance, and tend to be passively cooled (graphics card fans are evil noisy glued-on things which tend to clog with ( ... )
OK cool.
If it just says DVI - on, say, the Dabs site - will that be DVI-I?
Don't get too hung up on integrated graphics; basic <£20 graphics cards are perfectly good if you don't care about 3D performance, and tend to be passively cooled
aha that is useful to know. Less heat = less fans = less noise = better was indeed an equation I had in mind, but I think somehow I had got the idea that they all had fans, or at least that heat from the graphics card would be something I had to worry about.
Re northbridge fan - I think I am avoiding this already, yes, but may come back for further advice if I discover I don't know what I'm doing :-)
thanks!
My motherboard has a DVI-I connector on it, so they are out there.
Oh right OK. Not really a priority for this machine though I think.
Got my eye on this one now, on the grounds that it's (a) DVI-I (b) not too pricey :-)
You can get graphics cards with both VGA and DVI ports. I use both on my card to run dual monitor, so you could run your current monitor and a future monitor without even needing an adapter. Also, my flat screen TFT monitors have the usual VGA cables. I have to use a DVI adapter to plug one of them into my graphics card.
Hmm, so does that mean that although it's TFT, it only accepts an analog signal?
Connecting DVI to VGA is easy; it's VGA to DVI that doesn't always work well. Others may disagree, but I personally think it's better to have a DVI to DVI connection with a TFT monitor - the difference on some models is considerable.
There are graphics cards whose DVI port is DVI-D (digital only), but they're rare, and they tend to have a VGA (HD15) connector on the same card anyway.
Any cable described as 'DVI' with no further explanation will be a single link, DVI-D. This will be fine in 90% of cases - the number of displays or situations needing a dual-link cable is small, and in my experience it's just as effective to connect a DVI-A device using VGA.
What he said. My home machine has a nice Philips 17" TFT display. It takes DVI or VGA input, and my machine can put out either. With VGA the display is OK; with DVI it's lovely.