What Is Better, DVI or VGA?
Video Graphics Array (VGA) and Digital Visual Interface (DVI) are the two most common computer display connection technologies. Many desktop video cards and monitors support either connection, and while both remain in standard use, the newer DVI offers more features and broader compatibility.
Digital vs. Analog
Analog signals, such as those recorded on video tapes or carried by the original RCA connectors, degrade during transmission. With analog cables, longer or heavily tangled cords deliver a less crisp signal than shorter, well-organized ones. Digital signals are far less susceptible to this gradual degradation: as long as the signal can be decoded at the receiving end, the image arrives intact.
VGA describes a type of connection, as well as the signal and cables that carry it. Introduced in 1987, the VGA connector served as the standard PC video connector for the next decade. VGA cables transmit an analog RGB signal through a 15-pin connector.
DVI came onto the market in 1999. As the name suggests, DVI transmits digital video signals, though some of its variants also carry analog signals. Even with a digital signal, degradation can still occur over very long DVI cables. DVI is backward compatible with the older VGA and, via a simple adapter, with newer connections such as HDMI.
DVI and VGA both encompass related connections that fall under the same general technical specifications. Micro-DVI and Mini-DVI differ only in physical size, while DVI-A, DVI-D and DVI-I describe DVI cables with analog-only, digital-only and integrated (both) capability, respectively. SVGA, XGA and SXGA actually name higher display resolutions; cables sold under those labels use the same 15-pin connector as standard VGA.
Many monitors and computers feature both DVI and VGA connectors. If your monitor and computer use different connections, inexpensive adapters can convert between the two plug formats. If you have the option, a DVI connection is the better choice for compatibility and signal quality.