Which cable should you use to connect your monitor to your computer?
Having a high-performance GPU is a waste of money without a high-quality monitor to display those graphics, but both of these are useless without a cable that can deliver the performance you desire.
It’s entirely possible that you have heard of some, if not all, of the cable types covered below (VGA, DVI, HDMI, DisplayPort, and USB-C), but they can still sound a bit confusing.
Even for professionals, the pros and cons of each cable can seem vague. It’s important that you understand how all of these types work so that you can make an informed choice.
VGA (Video Graphics Array)
Let’s start with the oldest tech in the book. VGA debuted all the way back in 1987, and it had a pretty solid run overall.
If you have ever connected a monitor to an older PC, you probably used VGA to do it. A VGA cable attaches to the GPU and is held in place by small screws.
Compared to the other cables available today, VGA is the only one that carries an analog signal: the image is sent as a continuous electrical waveform rather than as digital data. The main drawback is that it’s prone to interference and distortion, but on the plus side, it lets you connect older devices such as projectors.
As this is a pretty old technology, the signal degrades with cable length, and it must be converted to digital to drive modern monitors, since VGA was originally designed for CRT (cathode-ray tube) displays.
Although VGA can technically support 1080p at 60Hz, it would be best to use anything else.
DVI (Digital Visual Interface)
Although DVI appeared about ten years after VGA, it too is on the verge of obsolescence. Like VGA, it has pins and screws to keep it in place, but DVI carries a digital signal instead. Audio support was added on some DVI devices, but the interface is still considered outdated compared to more modern options such as HDMI or USB-C.
DVI comes in several different variants.
There is DVI-A, which is for analog signals only, and DVI-D, exclusively for digital signals. Then there’s DVI-I, which is capable of both analog and digital signal transmission. It’s also compatible with a VGA interface, the same as DVI-A, which was pretty neat when it debuted but is far less useful nowadays.
DVI-I also has a dual-link setup that increases the bitrate. DVI-D has a dual-link variant as well and is compatible with HDMI via an adapter, so with the right adapters, a DVI port can connect to VGA, DVI, and HDMI displays.
Side note: DVI-DL refers to dual-link DVI versions, whether this is DVI-D or DVI-I.
Still, with the changes brought by technological advancement, DVI is no longer the best option available. Like VGA, it is physically bulky, and unlike newer interfaces, it can’t carry general-purpose data.
HDMI (High-Definition Multimedia Interface)
HDMI can now be considered pretty old, but, despite debuting roughly four years after DVI, it is still widely used. It has powered both the previous and current generations of gaming consoles; the PlayStation 5 and Xbox Series X both come with HDMI 2.1, the latest version at the time of writing.
It would be inaccurate to pretend that the same HDMI tech from 2003 is still being used today. The HDMI we use now is an upgrade that retained the name but improved upon the original in almost every way.
There are two main versions of HDMI that are in use today: HDMI 1.4 and HDMI 2.0. As the name suggests, the latter brings significant improvements, but not enough that electronics manufacturers have given up on the old 1.4 standards.
The biggest advantage of HDMI is that it’s probably the most widely available video connector currently on the market. As mentioned earlier, it is used across multiple generations of gaming consoles, TVs, graphics cards, and virtually all modern AV equipment.
One of the reasons why HDMI 1.4 is so widely used is its versatility. Its support of multi-channel audio is probably what initially propelled it above the previous standards set by VGA and DVI. Its extra options are also a major positive, such as ethernet data support, superior color data, and diverse video format support.
The biggest downside of HDMI 1.4 is related to this video format support. While it can technically support 4K, it can only do so at 30Hz. Sure, having a crisp image is cool, but most gamers prefer higher FPS over resolution, although, ideally, you should have both.
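A quick back-of-the-envelope calculation shows why HDMI 1.4 tops out at 30Hz for 4K. The sketch below is deliberately simplified: it ignores blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher than these raw figures.

```python
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate uncompressed video data rate in Gb/s.

    Simplified: ignores blanking intervals and link-encoding
    overhead, so real-world requirements are somewhat higher.
    """
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_GBPS = 10.2  # HDMI 1.4 bandwidth as quoted in this article

# 4K at 30Hz fits comfortably within HDMI 1.4's 10.2 Gb/s...
print(round(data_rate_gbps(3840, 2160, 30), 1))  # ~6.0 Gb/s
# ...but 4K at 60Hz already exceeds it, even before overhead.
print(round(data_rate_gbps(3840, 2160, 60), 1))  # ~11.9 Gb/s
```

Even with this generous simplification, doubling the refresh rate pushes 4K past what HDMI 1.4 can carry, which is why the spec caps 4K at 30Hz.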
HDMI 2.0 and 2.1
Enter HDMI 2.0. It addressed the FPS issue and many others to further cement HDMI’s place at the top of the totem pole.
One of the biggest improvements was the increase in bandwidth from HDMI 1.4’s 10.2 Gb/s to an almost doubled 18 Gb/s. Another cool 2.0 innovation is its support for HDR, which expands color ranges and is particularly noticeable in dark and bright scenes.
Although HDMI 1.4 and 2.0 are the most widely used versions, there is also HDMI 2.1, released in 2017, which improves on its predecessors. HDMI 2.1 supports Variable Refresh Rate (VRR), which works with compatible monitors and reduces the need for NVIDIA’s G-Sync and AMD’s FreeSync. Those technologies still have their uses, but VRR is a simpler option.
Another cool thing that HDMI 2.1 offers is the preposterous 10K resolution, and it doesn’t even sacrifice the refresh rate, which is as high as 120 Hertz.
Although it’s admirable that it offers the possibility, it’s safe to say that running a game at such a high resolution with a high refresh rate is probably not even possible with the best hardware available today.
An RTX 3090 or RX 6900 XT would definitely struggle to output even 10 FPS at 10K resolution.
HDMI 2.1 also provided upgrades to audio, making significant improvements to the surround sound.
This is where the technological advancement of HDMI gets crazy. In order to support all those extra features, 2.1 requires 48 Gb/s bandwidth, and that is what it delivers. That is roughly 2.7 times the bandwidth of its predecessor and more than some M.2 PCIe SSDs can manage.
DisplayPort (DP)
At the moment, the most popular DisplayPort version out there is 1.4, which succeeded versions 1.2 and 1.3, both very popular in their time.
DisplayPort offers securing latches for added peace of mind that you don’t get with HDMI when plugging in a device.
An HDMI plug can lose contact, or even damage the port on either end, if it’s bumped or inserted at an odd angle. DisplayPort addresses this with a retention system similar in spirit to the screws on DVI and VGA, though far less intrusive.
For the longest time, DisplayPort was a superior alternative to HDMI 2.0, but then HDMI 2.1 debuted and changed everything.
That is not to say that DisplayPort 1.4 isn’t great. In fact, it’s probably the best option out there for gaming PC monitors due to its longevity on the market and subsequent wider acceptance among graphics card manufacturers.
DisplayPort 1.3 introduced the world to 4K resolution at 120 Hertz, something HDMI was unable to offer until three years later with the HDMI 2.1 release.
To provide 4K gaming at 144 Hertz, DisplayPort 1.4 offers 25.92 Gb/s of bandwidth, which outperforms both HDMI 1.4 and 2.0 but lags behind 2.1. This might be an unfair comparison, as DP 1.4 was released two years before HDMI 2.1, but on raw bandwidth, HDMI currently has the edge.
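To put the numbers side by side, here is a small sketch ranking the interfaces by the bandwidth figures quoted in this article:

```python
# Bandwidth figures as quoted in this article (Gb/s)
bandwidths_gbps = {
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "DisplayPort 1.4": 25.92,
    "HDMI 2.1": 48.0,
}

# DisplayPort 1.4 outruns HDMI 2.0 but trails HDMI 2.1
for name, gbps in sorted(bandwidths_gbps.items(), key=lambda kv: kv[1]):
    print(f"{name}: {gbps} Gb/s")
```

Sorted from slowest to fastest, DisplayPort 1.4 lands squarely between the two current HDMI versions.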
There is some discussion taking place regarding the future of DisplayPort, as version 2.0 was supposed to be released in 2017, and the details about it were scarce for a long while. Officially, the standard was released in 2019, but it isn’t expected to reach the consumer market until late 2021 or even 2022.
If you think the new DisplayPort 2.0 standard will be used for the next generation of graphics cards, you aren’t the only one. There has been rampant speculation about whether AMD or NVIDIA will look to future-proof their cards or stick with the old reliable DP 1.4.
USB-C
USB-C was actually developed by the USB Implementers Forum, but Apple was among the first to adopt it, and for a while many believed the connector would remain exclusive to Apple devices. However, this perception changed as USB-C became available on more and more devices.
Like HDMI, USB-C is something of a jack of all trades because of its accessibility. Unlike older connectors, such as the rectangular USB-A plug used by USB 3.0, USB-C is reversible and can be plugged in either way up. It can also connect devices across platforms: smartphones, tablets, laptops, and PCs.
As expected from newer tech, USB-C can transmit both video and audio but also data and power, which is why you will often see it used for items such as phone or laptop chargers.
The standard USB-C port offers 10 Gb/s bandwidth and 85 Watts charging capability, but Intel’s Thunderbolt 3 port grants USB-C a whopping 40 Gb/s and a 100 Watts charge.
The biggest downside of USB-C is that it doesn’t support Adaptive-Sync technology, meaning you won’t be able to run either AMD’s FreeSync or NVIDIA’s G-Sync. This is understandable as USB-C isn’t the best option for gaming and fills the all-around functionality role in the market.
This means if your goal is smooth and simple gaming, USB-C is not the right choice for you.
It’s easy to understand why someone might be confused by all these different interfaces. Let’s talk about some of their best uses for gaming.
What Cable Do I Need For 144Hz 1080p Resolution?
If you prefer higher FPS, then you’re likely looking at this section. Ideally, you should use DisplayPort, but if your GPU is older, it may not have a DisplayPort output. This is a shame because even its older versions (including 1.0) supported 144Hz at 1080p.
The second-best option is HDMI 2.0, but if your card doesn’t have a DisplayPort, it probably doesn’t have HDMI 2.0 either. Don’t worry, though: the graphics card most likely still has an HDMI port; it’s just probably HDMI 1.4, or perhaps even older.
It might sound like this is the last resort, but DVI is actually a pretty good option for 144Hz at 1080p. Its downside is the fact that it doesn’t support audio, but you’re almost guaranteed to have a DVI port if you have neither DisplayPort nor HDMI 2.0.
What Cable Do I Need For 144Hz 1440p Resolution?
If you’re looking to get even more in the looks department and are unwilling to sacrifice the frame rate, here are some excellent options for you.
Again, the best option is DisplayPort, with versions from 1.2 onwards supporting 1440p at 144Hz.
Although 1440p at 144Hz is technically possible with HDMI 2.0, the best HDMI option for this resolution and refresh rate is HDMI 2.1; versions older than 2.0 can’t handle it at all.
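The recommendations above can be condensed into a small lookup. This is purely an illustrative summary of the guidance in this article, not an exhaustive spec table; the function name and data structure are my own invention.

```python
# Illustrative summary of this article's cable recommendations,
# in order of preference. Not an exhaustive spec lookup.
RECOMMENDED_CABLES = {
    ("1080p", 144): ["DisplayPort (any version)", "HDMI 2.0", "Dual-link DVI-D"],
    ("1440p", 144): ["DisplayPort 1.2 or newer", "HDMI 2.1", "HDMI 2.0 (at a stretch)"],
}

def cables_for(resolution, refresh_hz):
    """Return recommended cable options, best first ([] if not covered)."""
    return RECOMMENDED_CABLES.get((resolution, refresh_hz), [])

print(cables_for("1080p", 144)[0])  # DisplayPort (any version)
```

In both scenarios, DisplayPort comes out on top, with HDMI and dual-link DVI-D as fallbacks depending on what ports your card actually has.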