Which cable should you use to connect your display to your computer? A high-performance GPU and a high-resolution monitor are only as good as the cable between them, so you need one that can deliver the performance you're after.
Chances are you've heard of some, if not all, of these terms, but they might sound a bit confusing. Even for professionals, the exact pros and cons can feel a little murky. Understanding how all of these interfaces work will let you make an informed choice yourself.
VGA (Video Graphics Array)
Let's start with the oldest tech in the book. VGA was introduced all the way back in 1987 and had a pretty solid run overall.
If you've ever connected a monitor to an older PC, you've likely seen this port and possibly used it to connect the two. The VGA connector attaches to the GPU's output and is held in place by small screws.
Of the cables discussed here, VGA is the only one that carries an analog signal, encoding the image as continuous voltage levels rather than digital data. The drawback is that analog signals are pretty prone to distortion, but at least you'll be able to connect older devices like projectors.
Since it's a pretty old technology, the signal degrades with cable length and with conversion to a digital signal, which modern monitors require, as VGA was originally made for CRT (cathode-ray tube) displays.
Although it can technically support 1080p at 60Hz, you'd be best served by almost anything else.
DVI (Digital Visual Interface)
Although DVI came some ten years after VGA, it too is on the verge of obsolescence. Much like VGA, it has pins and screws to keep the connector in place, but DVI carries a digital signal instead. It launched without audio support, and its age certainly shows when compared to more modern interfaces like HDMI or USB-C.
Fun fact about DVI: it comes in several different variants.
We have DVI-A, which handles analog signals only, and DVI-D, which is exclusively digital. Then there's DVI-I, capable of both analog and digital transmission. Like DVI-A, it's compatible with the VGA interface, which was pretty neat when it came out but isn't as useful nowadays.
DVI-I also comes in a dual-link configuration that doubles the bitrate. DVI-D has a dual-link version too, and it can carry an HDMI signal through an adapter, so between its variants, DVI can interoperate with VGA, HDMI, and of course other DVI devices.
Side note: DVI-DL refers to dual-link DVI versions, be it DVI-D or DVI-I.
Still, having been outpaced by newer technology, DVI is not the best option out there. Like VGA, it's physically bulky and can't transfer data.
HDMI (High-Definition Multimedia Interface)
HDMI can be considered pretty old, but despite coming out roughly four years after DVI, it's still widely accepted. It's been used for both previous and current generations of gaming consoles, with rumors that both Sony and Microsoft will stick with the popular standard for their next-gen consoles.
Truthfully, it would be misleading to pretend that the HDMI tech of 2003 is the same one used today. Today's HDMI is an upgrade that retained the name while improving upon the original in almost every possible way.
There are two main versions of the HDMI standard that are mostly in use today, HDMI 1.4 and HDMI 2.0. As the name suggests, the latter does bring significant improvements, but not enough to have electronics manufacturers give up the old 1.4 standard.
The biggest advantage of HDMI is that it's probably the most widely available video connector currently on the market. As mentioned earlier, it's used across generations of gaming consoles, but also in TVs, graphics cards, and virtually any modern AV equipment.
One reason HDMI 1.4 is so widely used is its versatility. Its multi-channel audio support is probably what initially gave it wings and propelled it above the previous standards set by VGA and DVI. Its extra options are also a big plus, such as Ethernet data support, richer color formats, and extremely wide video format support.
The biggest downside of HDMI 1.4 is related to this video format support. While it can technically support 4K, it can only do so at a measly 30 FPS. Sure, a crisp image is cool, but most people prefer a higher frame rate over resolution, although it's ideal to have both.
Enter HDMI 2.0. It addressed this and many more issues to further cement HDMI’s spot at the top of the totem pole.
One of the biggest improvements was the increase in bandwidth, from HDMI 1.4's 10.2 Gb/s to a nearly doubled 18 Gb/s. Another cool 2.0 innovation is support for HDR, which expands the color range and is particularly noticeable in dark and bright scenes.
Although HDMI 1.4 and 2.0 are the most widely used versions, there's also HDMI 2.1, released in 2017, which improves on its predecessors. HDMI 2.1 supports Variable Refresh Rate (VRR), which, paired with an appropriate monitor, offers an alternative to Nvidia's G-Sync and AMD's FreeSync. Those still have their uses; VRR is just the simpler option.
Another cool thing HDMI 2.1 offers is a preposterous 10K resolution, without even sacrificing the frame rate, which can reach as high as 120 Hertz. Admirable as that is, it's safe to say that running a game at such a high resolution and refresh rate would require hardware costing several thousand dollars.
HDMI 2.1 also upgraded the audio department, with eARC (enhanced Audio Return Channel) bringing significant improvements to surround sound.
This is where HDMI's technological advancement gets crazy. To support all those extra features, 2.1 requires a 48 Gb/s bandwidth, and that is exactly what it delivers. That's roughly 2.5 times the speed of its predecessor, and actually more than some M.2 PCIe SSDs.
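These bandwidth figures can be sanity-checked with some quick arithmetic. The sketch below estimates the raw pixel data rate for a given mode; it deliberately ignores blanking intervals and link-encoding overhead, so real-world requirements run somewhat higher, and the modes chosen are just illustrative.

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gb/s (8-bit RGB = 24 bits per pixel)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 30 Hz -- fits within HDMI 1.4's 10.2 Gb/s link
print(f"4K@30:      {raw_bandwidth_gbps(3840, 2160, 30):.1f} Gb/s")
# 4K at 60 Hz -- needs HDMI 2.0's 18 Gb/s link
print(f"4K@60:      {raw_bandwidth_gbps(3840, 2160, 60):.1f} Gb/s")
# 4K at 120 Hz with 10-bit HDR color -- HDMI 2.1 territory
print(f"4K@120 HDR: {raw_bandwidth_gbps(3840, 2160, 120, 30):.1f} Gb/s")
```

Roughly 6, 12, and 30 Gb/s respectively, which lines up with why each mode needs the HDMI generation it does.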
DisplayPort

At the moment, the most popular DisplayPort version out there is 1.4, which logically superseded versions 1.2 and 1.3, both very popular in their time.
DisplayPort offers a locking latch for the added peace of mind you just don't get with HDMI. A cable losing contact at a touch or slight movement can cause problems on both ends, and that has happened with HDMI cables plugged in at an awkward angle. DisplayPort takes care of that with a retention system similar to DVI and VGA's, though far less intrusive.
For the longest time, DisplayPort was a better alternative to HDMI 2.0, but then HDMI 2.1 came out and changed everything.
That’s not to say that DisplayPort 1.4 isn’t great. In fact, it’s probably the best option out there for gaming PC monitors due to its longevity on the market and consequent wider acceptance among the graphics card manufacturers.
DisplayPort 1.3 introduced the world to 4K resolution at 120 Hertz, something HDMI wasn’t able to offer until three years later with their HDMI 2.1 release.
To offer 4K gaming at 144 Hertz, DisplayPort 1.4 provides a 25.92 Gb/s bandwidth, which outperforms both HDMI 1.4 and 2.0 but lags behind HDMI 2.1. That's hardly surprising, as DP 1.4 came out two years before HDMI 2.1, but it does currently give HDMI the edge in raw bandwidth.
There’s a bit of a discussion going on about the future of DisplayPort, as version 2.0 was supposed to be released in 2017 and the details about it were sparse for a long while. Officially, the standard was released in 2019, but it’s not expected to get to the consumer market until late 2020.
If you think this new DisplayPort 2.0 standard will be used for the next generation of graphics cards, you’re not the only one. There has been rampant speculation about whether AMD or Nvidia will look to future-proof their cards or stick to the old reliable DP 1.4.
USB-C

USB-C was popularized early on by Apple's devices, and for a while many believed it was exclusive to them. That perception changed as the connector appeared on more and more devices.
Much like HDMI, USB-C is something of a jack of all trades. Unlike other connectors, such as the standard USB Type-A port, USB-C is reversible and can be plugged in either way up. It also connects devices across platforms: smartphones, tablets, laptops, and PCs.
As expected from a newer tech like USB-C, it can transmit both video and audio, but also data and power.
The standard USB-C port offers 10 Gb/s of bandwidth and 85 Watts of charging capability, but Intel's Thunderbolt 3 pushes USB-C to a whopping 40 Gb/s and a 100 Watt charge.
The biggest downside of USB-C is that it doesn't support Adaptive-Sync technology, meaning you won't be able to run either AMD's FreeSync or Nvidia's G-Sync. This is somewhat understandable, as USB-C isn't aimed at gaming; it fills the all-around-functionality niche in the market.
It’s easy to understand why one might be confused with all these different interfaces. Let’s talk about some of the best uses for gaming.
What Cable Do I Need For 144Hz 1080p Resolution?
If higher FPS is what you're after, then you're likely looking at this option. Ideally, you'd want to use DisplayPort, but if your GPU is older, it may not have one. This is a shame, since even older DisplayPort versions, all the way back to 1.0, supported 144Hz at 1080p.
The second-best option is HDMI 2.0, but if you don't have access to a card with a DisplayPort, you likely don't have access to HDMI 2.0 either. Don't be fooled by the presence of an HDMI port on the graphics card; it's probably an HDMI 1.4 port, or even older.
It may sound like a last resort, but DVI is actually a pretty good option for 144Hz at 1080p, provided it's a dual-link connection. Its downside is that it doesn't carry audio, but you're almost guaranteed to have a DVI port if you have neither DisplayPort nor HDMI 2.0.
What Cable Do I Need For 144Hz 1440p Resolution?
If you’re looking to get even more in the looks department and are not willing to sacrifice the frame rate, here are some excellent options for you.
Again, the best option is DisplayPort with versions from 1.2 and up supporting 1440p at 144Hz.
Although it's technically possible with HDMI 2.0, the best HDMI option for this resolution and refresh rate is HDMI 2.1; older HDMI versions are essentially useless here.
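The recommendations in both refresh-rate sections above come down to the same arithmetic: multiply resolution, refresh rate, and color depth to get a rough raw data rate. A quick sketch (ignoring blanking intervals and encoding overhead, assuming 24-bit color):

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gb/s, ignoring blanking and encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 1080p at 144 Hz: modest enough for dual-link DVI, HDMI 2.0, or any DisplayPort
print(f"1080p@144: {raw_bandwidth_gbps(1920, 1080, 144):.1f} Gb/s")
# 1440p at 144 Hz: comfortably within DisplayPort 1.2 and up
print(f"1440p@144: {raw_bandwidth_gbps(2560, 1440, 144):.1f} Gb/s")
```

Roughly 7 Gb/s and 13 Gb/s respectively, which is why 1080p at 144Hz runs on fairly old hardware while 1440p at 144Hz starts to rule out the older HDMI versions.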