Pairing a high-performance GPU with a high-resolution monitor requires the right cable to deliver the performance you expect.
Chances are you've heard at least some, if not all, of these terms, but to the average user they can sound strange at best. Even for professionals, the exact pros and cons may feel a little murky.
We believe that to understand why one option works better than another, it's best to understand how all of these interfaces work, so you can make an informed choice yourself.
VGA (Video Graphics Array)
Let's start with the oldest tech in the discussion. VGA was introduced all the way back in 1987 and, overall, it had a pretty solid run.
If you've ever connected a monitor to a PC, you've likely at least seen the port, and possibly used it to connect the two. The VGA connector attaches to the GPU and is held in place by small screws.
Of the cables discussed today, this is the only one that carries an analog signal, meaning the image travels as continuously varying voltages rather than digital bits. The drawback is that it's quite prone to distortion, but at least you'll be able to connect older devices like projectors.
Since it's an old technology, the signal degrades with cable length and with conversion to digital, which modern monitors require, as VGA was originally made for CRT (cathode-ray tube) displays.
Although it can technically support 1080p at 60Hz, you'd be better off using almost anything else.
DVI (Digital Visual Interface)
Although DVI came some ten years after VGA, it too is on the verge of obsolescence. Like VGA, it has pins and screws to keep it in place, but unlike VGA, DVI carries a digital signal.
DVI has never supported audio, and its age certainly shows when compared to more modern interfaces like HDMI or USB-C.
A fun fact about DVI is that it comes in several different variants.
There's DVI-A, which carries an analog signal only, and DVI-D, which is digital only. Then there's DVI-I, capable of both analog and digital transmission. Like DVI-A, it's compatible with the VGA interface, which was pretty neat when it came out but isn't as useful nowadays.
Both DVI-I and DVI-D also come in dual-link configurations, which roughly double the bandwidth. DVI-D can additionally carry an HDMI signal through an adapter, so between its variants, DVI can interface with VGA, DVI, and HDMI.
Side note: DVI-DL refers to the dual-link DVI versions, whether DVI-D or DVI-I.
Still, technology has moved on, and DVI is no longer among the best options out there. Like VGA, it's physically bulky, and it can't transfer data.
HDMI (High-Definition Multimedia Interface)
HDMI can be considered pretty old, but despite coming out roughly four years after DVI, it's still widely used and accepted. It's been the standard for both the previous and current generations of gaming consoles, with rumors that both Sony and Microsoft will stick with it for their next-gen consoles.
Truthfully, it would be misleading to pretend that the HDMI tech of 2003 is the same one used today; the modern standard retains the name but improves on the original in almost every way.
There are two main versions of the HDMI standard in widespread use today – HDMI 1.4 and HDMI 2.0. As the version numbers suggest, the latter brings significant improvements, but not enough for electronics manufacturers to abandon the older 1.4 standard.
The biggest advantage of HDMI is that it's probably the most widely available video connector currently on the market. As mentioned earlier, it's used across generations of gaming consoles, as well as TVs, graphics cards, and virtually any modern AV equipment.
One of the reasons HDMI 1.4 is so widely used and accepted is its versatility. Its support for multi-channel audio is probably what initially gave it wings and propelled it above the previous standards, VGA and DVI. Another reason is its extra options, like Ethernet data support, but the key is probably its superior color depth and extremely wide video format support.
The biggest downside of HDMI 1.4 lies exactly in that video format support. While it can technically handle 4K, it can only do so at a measly 30 FPS. Sure, a crisp image is cool, but most people prefer higher frame rates over resolution, although it's ideal to have both.
Enter HDMI 2.0. It addressed this and many other issues to further cement HDMI's spot at the top of the totem pole. One of the biggest improvements was the increase in bandwidth, from HDMI 1.4's 10.2 Gb/s to a nearly doubled 18 Gb/s. Another cool 2.0 innovation is support for HDR, which expands the color range and is particularly noticeable in dark and bright scenes.
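As a back-of-the-envelope sketch of why these bandwidth figures matter, you can estimate the raw data rate a video mode needs by multiplying resolution, refresh rate, and color depth. The calculation below ignores blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher, but the comparison still illustrates why 4K at 60Hz is out of reach for HDMI 1.4:

```python
def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Rough uncompressed video data rate in Gb/s.

    Ignores blanking intervals and link-encoding overhead,
    so real cables need a bit more headroom than this.
    """
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K (3840x2160) at 24-bit color
print(round(required_gbps(3840, 2160, 30), 1))  # 6.0  -> fits within HDMI 1.4's 10.2 Gb/s
print(round(required_gbps(3840, 2160, 60), 1))  # 11.9 -> too much for 1.4, fine for 2.0's 18 Gb/s
```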
Although HDMI 1.4 and 2.0 are the most widely used versions, there's also HDMI 2.1, released in 2017, which improves on its predecessors. One of the coolest things 2.1 brings is support for Variable Refresh Rate (VRR), which, on compatible monitors, offers an alternative to Nvidia's G-Sync and AMD's FreeSync. Those still have their uses; VRR is just a simpler, vendor-neutral option.
Another cool thing HDMI 2.1 offers is a preposterous 10K resolution, without even sacrificing frame rate, at up to 120 Hertz. Although it's admirable that the possibility exists, it's safe to say that running a game at such a resolution and refresh rate would require hardware costing thousands of dollars.
HDMI 2.1 is also an upgrade in the audio department, most notably through eARC (enhanced Audio Return Channel), which significantly improves surround sound support.
This is where HDMI's technological advancement gets crazy. To support all those extra features, 2.1 requires 48 Gb/s of bandwidth, and that is exactly what it delivers. That speed is roughly 2.7 times that of its predecessor and is actually more than some M.2 PCIe SSDs can manage.
DisplayPort
At the moment, the most popular DisplayPort version out there is 1.4, which logically superseded versions 1.2 and 1.3, both very popular in their time.
DisplayPort offers securing latches for added peace of mind when plugging in a device, something you just don't get with HDMI. An HDMI cable plugged in at a weird angle can lose contact at a touch or slight movement, potentially damaging the connectors on both sides. DisplayPort takes care of that with a system similar to DVI's and VGA's screws, although much less intrusive.
For the longest time, DisplayPort was a better alternative to HDMI 2.0, but then HDMI 2.1 came out and changed everything.
That's not to say DisplayPort 1.4 isn't great. In fact, it's probably the best option out there for gaming PC monitors, due to its longevity on the market and, thus, wider acceptance among graphics card manufacturers.
DisplayPort 1.3 introduced the world to 4K resolution at 120 Hertz, something HDMI wasn’t able to offer until three years later with their HDMI 2.1 release.
To offer 4K gaming at 144 Hertz, DisplayPort 1.4 provides 25.92 Gb/s of effective bandwidth, which outperforms both HDMI 1.4 and 2.0 but lags behind 2.1. That's hardly surprising, since DP 1.4 came out two years before HDMI 2.1, but one could argue it still gives HDMI the edge in raw throughput.
There is some discussion about the future of DisplayPort, as version 2.0 was originally expected in 2017 and details about it were sparse for a long while. The standard was officially released in 2019, but it isn't expected to reach the consumer market until late 2020.
If you guessed that this new DisplayPort 2.0 standard will be used in the next generation of graphics cards, you're not the only one. There has been rampant speculation about whether AMD and Nvidia will look to future-proof their cards or stick with the old reliable DP 1.4.
USB-C
USB-C was popularized early on by Apple's laptops, and many believed for a while that it was exclusive to their devices. That perception changed as USB-C appeared on more and more devices.
Much like HDMI, USB-C is a jack of all trades in terms of accessibility. Unlike older connectors, such as a standard USB 3.0 Type-A port, USB-C is reversible, so it can be plugged in either way up. The versatility doesn't stop there: it connects devices across platforms, meaning smartphones, tablets, laptops, and PCs.
As expected from a newer tech like USB-C, it can transmit both video and audio, but also data and power.
The standard USB-C port offers 10 Gb/s of bandwidth and 85 Watts of charging capability, while Intel's Thunderbolt 3 implementation allows USB-C a whopping 40 Gb/s and a 100-Watt charge.
The biggest downside of USB-C is that it doesn't support Adaptive-Sync technology, meaning you'll be able to run neither AMD's FreeSync nor Nvidia's G-Sync. This is somewhat understandable, as USB-C isn't aimed at gaming; it fills the all-around-functionality niche on the market.
It's easy to see why all these different interfaces might be confusing, and even more so which one best suits each purpose, so let's talk about the best options for gaming.
What Cable Do I Need For 144Hz 1080p Resolution?
If higher FPS is what you prefer, then you're likely looking at this option. Ideally, you'd want to use DisplayPort, but if your GPU is older, it may not have one. That's a shame, since even older versions of DisplayPort (as far back as 1.0) supported 144Hz at 1080p.
The second-best option is HDMI 2.0, but if you don't have a card with DisplayPort, you likely don't have HDMI 2.0 either. Don't be confused: your graphics card may well have an HDMI port, but it's probably an HDMI 1.4 port or older.
It may sound like a last resort, but DVI is actually a pretty good option for 144Hz at 1080p. Its drawback is that it doesn't support audio, but you're almost guaranteed to have a DVI port if you have neither DisplayPort nor HDMI 2.0.
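To see why DVI can keep up here, consider the commonly cited effective data rates of roughly 3.96 Gb/s for single-link DVI and 7.92 Gb/s for dual-link (treat these exact figures as assumptions, and note they ignore blanking overhead). A quick sanity check suggests only the dual-link variant mentioned earlier has the headroom for 1080p at 144Hz:

```python
# Commonly cited effective DVI data rates in Gb/s (assumed figures)
SINGLE_LINK_DVI = 3.96
DUAL_LINK_DVI = 7.92

# Raw data rate for 1080p at 144 Hz with 24-bit color
needed = 1920 * 1080 * 144 * 24 / 1e9

print(round(needed, 2))           # 7.17 Gb/s
print(needed <= SINGLE_LINK_DVI)  # False: single-link falls short
print(needed <= DUAL_LINK_DVI)    # True: dual-link DVI manages it
```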
What Cable Do I Need For 144Hz 1440p Resolution?
If you’re looking to get even more in the looks department and are not willing to sacrifice the frame rate, here are some excellent options for you.
Again, the best option is DisplayPort with versions from 1.2 and up supporting 1440p at 144Hz.
Although it's technically possible with HDMI 2.0, the best HDMI option for this resolution and refresh rate is HDMI 2.1; older HDMI versions are essentially useless here.
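Tying the recommendations together, here's a rough sketch that checks a video mode against the nominal bandwidth figures cited in this article. It treats those figures as simple upper bounds and ignores encoding overhead, blanking, and compression tricks like chroma subsampling, so it's only a first approximation:

```python
# Nominal maximum bandwidths in Gb/s, as cited in this article
CABLES = {
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "HDMI 2.1": 48.0,
    "DisplayPort 1.4": 25.92,
}

def fits(width, height, refresh_hz, bandwidth_gbps, bpp=24):
    # Crude check: raw pixel data only, no overhead accounted for
    return width * height * refresh_hz * bpp / 1e9 <= bandwidth_gbps

for name, bw in CABLES.items():
    res_4k60 = "yes" if fits(3840, 2160, 60, bw) else "no"
    res_1440p144 = "yes" if fits(2560, 1440, 144, bw) else "no"
    print(f"{name}: 4K@60 {res_4k60}, 1440p@144 {res_1440p144}")
```

In practice, features like Display Stream Compression and chroma subsampling let cables carry modes beyond these raw limits, which is how DisplayPort 1.4 manages 4K at 144Hz.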