What is VGA?

You’ve probably seen a VGA video connector on the back of a TV set, video projector, or even on a computer. We’ll explain what it is, where it came from, and help you decide if you should use it.

Video Graphics Array: An Analog Video Standard

VGA (short for Video Graphics Array) is an analog video standard created by IBM in 1987 for the IBM PS/2 series of computers. Since then, the computer industry has extended the standard and included it in millions of products. VGA uses a high-density 15-pin D-sub connector (called DE-15), often with two thumb screws to secure the connector in place.

Until the arrival of DVI (then HDMI shortly after that), computers commonly used VGA connections between a video card and a monitor. Roughly speaking, VGA was most popular on IBM PC-compatible computers from around 1990 to 2005, although some laptops continued to include the connector well into the digital video era, likely because of the port’s common use in TV sets and video projectors used for presentations in businesses, schools, and universities.

VGA Standard vs. VGA Connector

It’s important to note that “VGA” can mean different things based on context. For example, there’s a difference between the VGA graphics standard and the VGA connector itself. Traditionally, the strict definition of VGA includes a specific set of video modes, such as 640×480 with 16 colors or 320×200 with 256 colors. Those 16 or 256 colors are pulled from a palette of 262,144 colors.
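
If you’re wondering where that 262,144 figure comes from, the VGA palette hardware stores six bits each for red, green, and blue, so there are 64 × 64 × 64 possible combinations. Here’s a quick back-of-the-envelope sketch in Python, purely as an illustration of the arithmetic:

# VGA's palette DAC uses 6 bits per color channel (red, green, blue),
# so each channel has 2**6 = 64 possible levels.
levels_per_channel = 2 ** 6             # 64

# The full palette is every combination of the three channels.
total_colors = levels_per_channel ** 3
print(total_colors)                     # 262144

# A 256-color mode like Mode 13h shows 256 of those entries at once,
# while the 640x480 mode shows only 16.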

The default 256-color VGA palette, as used in Mode 13h.

But what we often call the “VGA connector” on more recent devices such as computers, TV sets, and video projectors can often support much higher resolutions and color depths than the strict definition of the VGA video standard. Historically, those higher resolutions went by names such as “Super VGA” or “XGA.” VGA is also backward compatible with earlier IBM video standards such as CGA and EGA. Over time, it’s become easier to lump them all under “VGA” because they all use the original VGA connector.

Should I Use VGA or Something Else?

Whether you should use VGA or not depends on the application and what you have available. Generally speaking, if you’re using a modern computer or TV set, you’ll want to reach for a digital connection standard such as HDMI first, which will give you a much higher level of detail and sharper picture quality.

If you only have a choice between VGA and older analog video standards such as composite video or S-Video, you should choose VGA, as it will result in improved picture quality.

In some cases, such as legacy projector installations in lecture halls, churches, or business conference rooms, you might not have any choice other than VGA. If you don’t have a VGA port on your machine (which is typically the case these days with newer hardware), you can purchase an HDMI to VGA adapter.

For example, for the past four years, we’ve used a Rankie 1080P HDMI to VGA adapter that includes an audio output port. It supports a wide array of resolutions and works seamlessly, with no need for drivers. To power it, you’ll need to plug its included USB cord into a port on your device.

You’ll also need a high-quality VGA cable to run between the HDMI to VGA adapter and the VGA-capable display. You can find good VGA cables easily on Amazon. Good luck!