Why Your Next TV Needs HDMI 2.1
With a new generation of consoles now just over the horizon, you may be hearing some chatter about HDMI 2.1. So what's it all about?
If you're super new to TV tech, HDMI stands for High-Definition Multimedia Interface. For the layman, it's the connection a TV or monitor uses to talk to consoles, DVD players and the like, carrying both picture and sound. The HDMI version in use dictates the maximum resolution and refresh rate you can view.
Most current TVs use HDMI 2.0, which tops out at 4K resolution at 60 frames per second (FPS). That's fine for current-gen gaming, but it won't deliver the full advantages of an Xbox Series X, a PlayStation 5 or an RTX 3080 if you're hooking up a PC.
With HDMI 2.1, you can access a 4K resolution at 120 FPS, along with some other handy features. It can automatically switch your TV to game mode thanks to something called Auto Low Latency Mode, which kicks in when a compatible console or graphics card is detected and cuts down input lag.
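If you're curious why the version bump matters, here's a rough back-of-the-envelope sketch in Python. It ignores blanking intervals, signalling overhead and compression, assumes standard 8-bit colour, and compares the raw pixel data of 4K at 60 and 120 FPS against the roughly 18 Gbps and 48 Gbps of bandwidth quoted for HDMI 2.0 and 2.1. Treat the numbers as illustrative rather than exact.

```python
# Back-of-the-envelope check: raw 4K pixel data rate vs. HDMI link bandwidth.
# Simplified: ignores blanking intervals, encoding overhead and chroma
# subsampling, and assumes 8-bit colour (24 bits per pixel).

WIDTH, HEIGHT = 3840, 2160        # 4K UHD resolution
BITS_PER_PIXEL = 24               # 8-bit RGB, no HDR

HDMI_BANDWIDTH_GBPS = {
    "HDMI 2.0": 18.0,             # headline bandwidth quoted for each spec
    "HDMI 2.1": 48.0,
}

def raw_video_gbps(fps: int) -> float:
    """Raw pixel data rate in gigabits per second at a given frame rate."""
    return WIDTH * HEIGHT * BITS_PER_PIXEL * fps / 1e9

for fps in (60, 120):
    needed = raw_video_gbps(fps)
    print(f"4K @ {fps} FPS needs roughly {needed:.1f} Gbps of pixel data")
    for version, available in HDMI_BANDWIDTH_GBPS.items():
        verdict = "fits" if needed <= available else "does not fit"
        print(f"  {version} ({available:.0f} Gbps): {verdict}")
```

In practice the usable data rate sits a little below those headline figures once signalling overhead is accounted for, but the takeaway is the same: 4K at 120 FPS simply carries more data than an HDMI 2.0 link can handle.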
But most importantly, HDMI 2.1 supports Variable Refresh Rate (VRR), which is crucial for preventing screen tearing, a jarring visual defect that occurs when a display is out of sync with the video source.
You can see what I mean in an example video below.