Video – Analog vs. Digital
Posted on June 24, 2013 by KVMG-CMS | 1 comment
Video is a big deal. Your computer probably has a DVD drive and the software to play DVD video. It might even have a Blu-ray drive. You could have bought the latest, most detailed game on the market. Your computer can send those high-definition (hi-def) signals to a monitor or television.
Many computers come with multiple video output ports and many monitors and televisions come with multiple video inputs. You want to get the best signal. How do you choose which connection to use?
VGA (analog video) has been an active video standard for personal computers for a very long time. DVI, along with HDMI and DisplayPort (digital video), is making a strong case against the aging analog port, and the digital vs. analog debate will go on for quite some time. It's easy to assume that VGA, because it is older, is inferior and obsolete, but you might be surprised to find that VGA is still superior in some ways.
VGA is an analog technology. Computers work with a digital signal internally, which requires the graphics card to convert the video signal to analog form before sending it out over a VGA connection.
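For the curious, here is a minimal sketch of what that conversion amounts to conceptually: the card's DAC maps each digital color value to a voltage on the wire (a VGA color channel swings from roughly 0 to 0.7 V). The function and scaling below are purely illustrative, not actual driver code.

```python
# Minimal conceptual sketch (not real driver code): how a graphics card's DAC
# maps an 8-bit digital color value to the analog voltage a VGA cable carries.

V_MAX = 0.7  # approximate peak voltage of a VGA color signal, in volts

def digital_to_analog(pixel_value: int) -> float:
    """Map an 8-bit color value (0-255) to an analog level (0-0.7 V)."""
    if not 0 <= pixel_value <= 255:
        raise ValueError("expected an 8-bit value")
    return pixel_value / 255 * V_MAX

# Example: a mid-gray pixel becomes roughly 0.35 V on the wire.
print(f"{digital_to_analog(128):.3f} V")
```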
Because VGA is an analog standard, however, you are able to push through a much higher resolution, across a greater range of signal bandwidth, than is possible with DVI or HDMI in their present configurations. The reason for this is the flexibility of analog vs. digital video connections. Compared to digital signals, analog is capable of pushing more data through a smaller connection due to an increased signal density. One analogy is comparing a glass of water to a glass of ice cubes: the cubes are very defined pieces of information that take up more space than the equivalent amount of water.
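To put rough numbers on that claim, the sketch below estimates the pixel clock a few display modes require and compares it to the 165 MHz ceiling of a single DVI link. The ~20% blanking overhead is an assumed round figure (real modes vary), while VGA has no fixed digital clock limit; its ceiling is set by the DAC, the cable, and the monitor.

```python
# Rough back-of-the-envelope comparison. Assumptions: ~20% blanking overhead,
# and the published 165 MHz pixel-clock limit of single-link DVI.

DVI_SINGLE_LINK_MHZ = 165  # maximum pixel clock for a single DVI link
BLANKING_OVERHEAD = 1.20   # assumed extra pixels for horizontal/vertical blanking

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock a display mode requires, in MHz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for width, height, hz in [(1920, 1080, 60), (2048, 1536, 60), (2560, 1600, 60)]:
    clock = pixel_clock_mhz(width, height, hz)
    verdict = "fits" if clock <= DVI_SINGLE_LINK_MHZ else "exceeds"
    print(f"{width}x{height}@{hz}Hz ~ {clock:.0f} MHz -> {verdict} single-link DVI")
```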
The downside to analog signals is their susceptibility to interference from outside sources. If you were to place an active VGA cable next to a transmitter broadcasting at the right frequency, it would introduce noise into the connection that can result in a distorted signal. Longer VGA cables often present a more distorted image than shorter ones due to the increased area exposed to potential noise. One solution to this problem is better shielding and thicker cabling, which help block these outside influences. For this reason, buying a quality VGA cable is recommended over a budget option in settings where video quality is a must.
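As a toy illustration of why cable length and shielding matter on an analog run, the snippet below adds a little random noise per foot of cable to an ideal voltage; the figures are made up purely to show the trend, not measured from any real cable.

```python
# Toy simulation (illustrative only): noise picked up along an analog cable
# shifts the voltage that arrives at the monitor, and a longer, poorly
# shielded run accumulates more of it.

import random

def received_level(ideal_volts, cable_feet, noise_per_foot=0.002):
    """Ideal analog level plus accumulated noise, in volts (made-up figures)."""
    noise = sum(random.uniform(-noise_per_foot, noise_per_foot)
                for _ in range(cable_feet))
    return ideal_volts + noise

random.seed(1)
for length in (6, 50, 100):
    print(f"{length:3d} ft cable: ideal 0.350 V, "
          f"received {received_level(0.35, length):.3f} V")
```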
DVI is a newer technology, but it is hardly unproven, with more than ten years of use behind it. It is digital, so no conversion is necessary before the signal is sent to the monitor.
A DVI signal is crisper and generally free of noise or distortion thanks to the bit-specific nature of digital transmission. Data stays in digital form all the way from the computer to the monitor, with no digital-to-analog conversion along the way. LCD monitors work in a digital mode and support the DVI format, although some also accept an analog signal, which is then converted back to digital.
With DVI you don't have to worry about buying the highest quality cable to achieve a quality video connection. Professional environments aside, just about any standard DVI cable (free of obvious defects) should give you a clear image regardless of how much signal noise is present in the room.
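The sketch below illustrates the "clear signal or no signal" behavior noted in the table further down: a digital receiver only has to decide whether each bit sits above or below a threshold, so moderate noise is rejected outright rather than blurring the picture. The voltages and threshold here are illustrative, not taken from the DVI/TMDS specification.

```python
# Simplified illustration of the digital "cliff": moderate noise is rejected
# entirely, while severe noise breaks the link outright instead of gently
# blurring the image. (Levels are illustrative, not from the DVI spec.)

def recover_bit(received_volts, threshold=0.5):
    """Decide whether a received level represents a 0 or a 1."""
    return 1 if received_volts >= threshold else 0

# A transmitted 1 (nominally 1.0 V) survives moderate noise unchanged...
print(recover_bit(1.0 - 0.3))  # -> 1, picture stays pixel-perfect
# ...but once noise pushes it past the threshold, the bit (and usually the
# whole link) fails rather than producing a slightly fuzzier image.
print(recover_bit(1.0 - 0.6))  # -> 0, a hard error
```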
The run-down of VGA vs. DVI is summarized in the following table:
| | VGA | DVI |
| --- | --- | --- |
| Distance | 100 feet; varies by equipment & resolution | 16 feet without a booster |
| Clarity | May ghost* with high resolution or distance | Clear signal or no signal |
| Audio | No audio | No audio, although most drivers include audio for HDMI adapters |
| Connects to | VGA; DVI-I or DVI-A via adapters | DVI; HDMI or VGA via adapters |
| EMI** susceptibility | Very susceptible to interference | Little vulnerability |
| Cable cost | Less expensive | More expensive |
* Ghosting – The same image, but offset a little from the original causing a blurred image
** EMI – Electromagnetic Interference, created by electronic devices
At one time, a digital signal offered better image quality compared to analog technology. However, analog signal processing technology has improved over the years and the difference in quality is now minimal.
The two connection types treat the video signal very differently, but at resolutions up to 1920x1080 they produce very similar pictures in most situations.
Theoretically, VGA and DVI both deliver the same signal. DVI provides exactly the same quality at any distance from 1 to 16 feet. VGA, because it is analog, may lose some signal strength along the way, which can lead to an image that is not as crisp. These factors depend on the graphics card, the monitor, and any interference picked up by the cable.
If you have DVI - stick with it!
If your computer(s) and the monitor (and KVM switch, if you're using one) feature both DVI and VGA ports - use DVI.
DVI's signal is digital, so no conversion takes place, which means no loss of signal and no lag. DVI is also less susceptible to interference and works well with just about any standard cable, providing a clear image regardless of how much signal noise is present in the room.
But, in a mixed environment...
If you are able to use either VGA or DVI, and cost or other considerations (such as a mixed DVI-I and VGA computer environment using a KVM switch) make you lean towards using VGA, it is very likely that at resolutions of 1920x1080 or less you will not be compromising your video quality by using VGA.
This has been very helpful! I wanted to connect my desktop and laptop (both supporting VGA as well as DVI) to a KVM switch and was wondering whether I should pay more for one supporting DVI. Based on your recommendation I chose a VGA KVM switch, saved a ton, and the video is crystal clear.