HDMI vs. DisplayPort vs. DVI vs. VGA vs. USB-C – Which To Use?

What cable should you use to connect your display with your computer? What cable supports 144Hz at 1080p or 1440p? What about 240Hz? Let's find out.

Which cable should you use to connect your monitor to your computer?

Having a high-performance GPU is a waste of money without a first-rate monitor to display those graphics, but both of these are useless without a cable that can deliver the performance you desire.

It’s entirely possible that you have heard of some, if not all, of these terms, but they can still sound a bit baffling.

Even for professionals, the advantages and drawbacks of each cable can seem ambiguous. It’s important that you understand how all of these types work so that you can make an informed choice.


VGA (Video Graphics Array)

VGA Cable
A VGA to VGA cable

Let’s start with the oldest tech in the book. VGA debuted all the way back in 1987, and it had a fairly substantial run overall.

If you’ve ever connected an older monitor to a PC, chances are you used this cable. The VGA connector attaches to the GPU’s port and is held in place by small thumbscrews.

When compared to the other cables available today, this is the only one that carries an analog signal. The main drawback is that it’s prone to interference, but at least you will be able to connect an older device such as a projector.

As this is a rather old technology, the signal deteriorates with cable length and with conversion to digital, which modern monitors require; VGA was originally made for CRT (cathode-ray tube) displays.

Although VGA can technically support 1080p at 60Hz, you’re better off using almost anything else.

DVI (Digital Visual Interface)

DVI Cable
A DVI to DVI cable

Although DVI appeared about ten years after VGA, it too is on the verge of obsolescence. Like VGA, it has pins and screws to keep it in place, but DVI uses a digital signal instead. Although some devices added audio support over DVI, it is still considered antiquated compared to more contemporary interfaces such as HDMI or USB-C.

DVI comes in several distinct variants.

There is DVI-A, which is for analog signals only, and DVI-D, exclusively for digital signals. Then there’s DVI-I, which is capable of both analog and digital signal transmission. It’s also compatible with a VGA interface, the same as DVI-A, which was rather impressive when it debuted but is far less beneficial nowadays.

DVI-I also has a dual-link configuration that increases the bitrate. DVI-D has a dual-link variation as well and is compatible with HDMI when using an adapter, so, with the right adapters, DVI can connect to VGA and HDMI devices alike.

Side note: DVI-DL refers to dual-link DVI versions, whether this is DVI-D or DVI-I.

Still, with the changes brought by technological advancement, DVI is no longer the optimal option available. Like VGA, it is physically unwieldy, and it can’t carry the extra data, such as Ethernet, that newer interfaces support.

HDMI (High-Definition Multimedia Interface)

HDMI Cable
An HDMI to HDMI cable

HDMI can now be considered relatively old but, despite debuting roughly four years after DVI, it is still widely accepted. It has been used for the previous and current generations of gaming consoles; both the PlayStation 5 and Xbox Series X come with HDMI 2.1, currently the latest version.

HDMI 1.4

It would be inaccurate to pretend that the same HDMI tech from 2003 is still being used today. The HDMI we use now is an enhancement that retained the name but improved upon the initial version in almost every way.

There are two main versions of HDMI in use today: HDMI 1.4 and HDMI 2.0. As the numbering suggests, the latter brings substantial improvements, but not enough that electronics manufacturers have given up on the old 1.4 standard.

The largest advantage of HDMI is that it’s probably the most widely available video connector currently on the market. As mentioned earlier, it is used across multiple generations of gaming consoles, TVs, graphics cards, and virtually all contemporary AV technology.

One of the reasons why HDMI 1.4 is so widely used is its adaptability. Its support for multi-channel audio is probably what initially propelled it above the previous standards set by VGA and DVI. Its additional features are also a substantial positive, such as Ethernet data support, superior color depth, and diverse video format support.

The largest drawback of HDMI 1.4 is related to this video format support. While it can technically support 4K, it can only do so at 30Hz. Sure, having a sharp image is cool, but most gamers prefer higher FPS over resolution, although, ideally, you should have both.

HDMI Cable connected to a laptop
HDMI is very widely used today

HDMI 2.0 and 2.1

Enter HDMI 2.0. It addressed the refresh rate issue, among others, further cementing HDMI’s position at the top of the market.

One of the largest enhancements was the increase in bandwidth from HDMI 1.4’s 10.2 Gb/s to 18 Gb/s, roughly a 75% jump. Another significant 2.0 innovation is its support for HDR, which expands the color range and is particularly evident in dark and bright scenes.

Although HDMI 1.4 and 2.0 are the most prevalent connectors, there is also HDMI 2.1, released in 2017, which improves on its predecessors. HDMI 2.1 supports Variable Refresh Rate (VRR), which works with compatible monitors and offers a standardized alternative to NVIDIA’s G-Sync and AMD’s FreeSync. Those still have their uses, but VRR is a straightforward option.

Another handy thing that HDMI 2.1 offers is support for a preposterous 10K resolution, and it doesn’t even sacrifice the refresh rate, which can reach 120 Hertz.

Although it’s laudable that it offers the possibility, it’s safe to say that running a game at such a high resolution with a high refresh rate is probably not even possible with the best hardware available today.

An RTX 3090 or RX 6900 XT would certainly labor to produce 10 FPS at 10K resolution.

HDMI 2.1 also upgraded audio, introducing eARC (enhanced Audio Return Channel) for substantially improved surround sound.

This is where the technological advancement of HDMI gets impressive. In order to support all those extra features, 2.1 requires 48 Gb/s of bandwidth, and that is what it delivers. Its speed is roughly 2.7 times that of its predecessor and exceeds the transfer rate of some M.2 PCIe SSDs.
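To put these bandwidth figures in perspective, here is a rough back-of-the-envelope sketch in Python. It estimates raw pixel data rates only, ignoring blanking intervals and link-encoding overhead, so real cables need noticeably more headroom than these numbers suggest:

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Rough uncompressed video data rate in Gb/s.

    Ignores blanking intervals and link-encoding overhead, so the
    real bandwidth a cable must supply is somewhat higher.
    """
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Modes discussed in this article, at 24-bit color:
print(video_bandwidth_gbps(1920, 1080, 144))  # 1080p @ 144 Hz -> ~7.17 Gb/s
print(video_bandwidth_gbps(2560, 1440, 144))  # 1440p @ 144 Hz -> ~12.74 Gb/s
print(video_bandwidth_gbps(3840, 2160, 144))  # 4K @ 144 Hz    -> ~28.67 Gb/s
```

Even these optimistic numbers show why 4K at 144Hz pushes past HDMI 2.0’s 18 Gb/s and sits close to DisplayPort 1.4’s limit, while HDMI 2.1’s 48 Gb/s has room to spare.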

DisplayPort

DisplayPort Cable
A DisplayPort to DisplayPort cable

At the moment, the most prevalent DisplayPort out there is version 1.4, which logically succeeded versions 1.2 and 1.3, both extremely popular in their time.

DisplayPort offers locking latches for improved peace of mind that you don’t get with HDMI when connecting a device.

An HDMI cable can lose contact, or even damage the port, if it’s bumped or plugged in at an unusual angle. DisplayPort addresses that with a retention system similar in spirit to the screws on DVI and VGA, though far less obtrusive.

For the longest time, DisplayPort was a superior alternative to HDMI 2.0, but then HDMI 2.1 debuted and changed everything.

That is not to say that DisplayPort 1.4 isn’t great. In fact, it’s probably the best option out there for gaming PC monitors due to its longevity on the market and subsequent broader adoption among graphics card manufacturers.

Mini DisplayPort to DisplayPort Cable
A Mini-DisplayPort to DisplayPort adapter cable

DisplayPort 1.3 introduced the world to 4K resolution at 120 Hertz, something HDMI was unable to offer until three years later with the HDMI 2.1 release.

To provide 4K gaming at 144 Hertz, DisplayPort 1.4 offers 25.92 Gb/s of effective bandwidth, which exceeds both HDMI 1.4 and 2.0 but trails the 2.1 version. This might be an unfair comparison, as DP 1.4 was released two years before HDMI 2.1, but the newer HDMI standard still holds the bandwidth edge.

There is some debate taking place regarding the future of DisplayPort, as version 2.0 was supposed to be released in 2017, and the specifics about it were scarce for a long while. Officially, the standard was released in 2019, but it isn’t expected to reach the consumer market until late 2021 or even 2022.

If you think the new DisplayPort 2.0 standard will be used for the next generation of graphics cards, you aren’t the only one. There has been substantial speculation about whether AMD or NVIDIA will look to future-proof their cards or stick with the old reliable DP 1.4.

USB-C

USB C Cable
A USB-C to USB-C cable

USB-C was standardized by the USB Implementers Forum, but Apple’s early adoption led many to believe it would be exclusive to their products. However, this perception changed as USB-C became available on more and more devices.

Like HDMI, USB-C is something of a jack of all trades because of its adaptability. Unlike older connectors, such as the standard USB 3.0 Type-A port, USB-C is reversible and can be plugged in regardless of orientation. It can also connect devices across platforms, meaning smartphones, tablets, laptops, and PCs.

As anticipated from newer technology, USB-C can transmit not only video and audio but also data and power, which is why you will frequently see it used for phone and laptop chargers.

A standard USB-C port running USB 3.1 Gen 2 offers 10 Gb/s of bandwidth, and USB Power Delivery allows charging at up to 100 Watts, while Intel’s Thunderbolt 3, which uses the same connector, grants a colossal 40 Gb/s.

Macbook Pro with USB C ports
Apple’s MacBook laptops come with Thunderbolt 3 ports, which use the USB-C connector and support all of its capabilities.

The largest downside of USB-C is that it doesn’t support Adaptive-Sync technology, meaning you won’t be able to run either AMD’s FreeSync or NVIDIA’s G-Sync. This is understandable as USB-C isn’t the optimal option for gaming and fills the all-around functionality role in the market.

This means if your goal is smooth and seamless gaming, USB-C is not the right choice for you.

It’s straightforward to understand why someone might be bewildered by all these different interfaces. Let’s talk about some of their best uses for gaming.

What Cable Do I Need For 144Hz 1080p Resolution?

multiple ports

If you prefer superior FPS, then you’re likely looking at this section. Ideally, you should use DisplayPort, but if your GPU is older, it’s conceivable that it doesn’t have a DisplayPort. This is a shame because even its earlier versions (including 1.0) supported 144Hz at 1080p.

The second-best option is HDMI 2.0, but if you don’t have access to a card with a DisplayPort, you probably don’t have HDMI 2.0 either. Don’t worry, though: your graphics card likely still has an HDMI port; it’s just probably HDMI 1.4 or earlier.

It might sound like a last resort, but dual-link DVI is actually a fairly good option for 144Hz at 1080p. Its shortcoming is that it doesn’t carry audio, but you’re almost guaranteed to have a DVI port if you have neither DisplayPort nor HDMI 2.0.

What Cable Do I Need For 144Hz 1440p Resolution?

multiple connectors

If you’re looking to get even more in the appearance department and are reluctant to sacrifice the frame rate, here are some excellent options for you.

Again, the ideal choice is DisplayPort, with versions from 1.2 onwards supporting 1440p at 144Hz.

Although 1440p at 144Hz is technically possible with HDMI 2.0, the optimal HDMI option for this resolution and refresh rate is HDMI 2.1; earlier versions simply lack the bandwidth.
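The recommendations in these two sections can be condensed into a small lookup table. This is a hypothetical helper sketched in Python; the cable names and supported modes mirror this article’s guidance rather than an exhaustive specification sheet:

```python
# Cable-to-mode table following the guidance above; a simplified,
# hypothetical summary, not a complete specification table.
SUPPORTED_MODES = {
    "DisplayPort 1.2+": {("1080p", 144), ("1440p", 144)},
    "HDMI 2.1":         {("1080p", 144), ("1440p", 144)},
    "HDMI 2.0":         {("1080p", 144)},
    "Dual-link DVI-D":  {("1080p", 144)},  # no audio over DVI
}

def cables_for(resolution, refresh_hz):
    """Return the cables from the table that handle the given mode."""
    return [name for name, modes in SUPPORTED_MODES.items()
            if (resolution, refresh_hz) in modes]

print(cables_for("1440p", 144))  # -> ['DisplayPort 1.2+', 'HDMI 2.1']
```

Extending the table with more modes (such as 4K at 144Hz for HDMI 2.1) is a matter of adding entries; the lookup itself stays the same.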

Aleksandar Cosic

Alex is a Computer Science student and a former game designer. That has enabled him to develop skills in critical thinking and fair analysis. As a CS student, Aleksandar has very in-depth technical knowledge about computers, and he also likes to stay current with new technologies.