
HDMI 2.2 explained: answers to the six most pressing questions
by Samuel Buchmann
A new HDMI standard was presented at CES. Find out when it’ll be coming to devices, what it’s good for and whether you’ll need new cables.
The HDMI Forum presented its new HDMI 2.2 standard at CES. It offers a transfer rate of up to 96 Gbps – twice as much as HDMI 2.1 – and supposedly also eliminates audio synchronisation issues. Read on to find answers to the six most pressing questions about the change:
What can HDMI 2.2 do better?
The doubled bandwidth of 96 gigabits per second means high resolutions can be transmitted at higher frame rates than before. For example, 4K at 240 hertz with 10-bit colour depth can now be transmitted without compression.
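A back-of-the-envelope calculation shows why that combination needs the new standard. The sketch below counts raw pixel data only and ignores blanking intervals and link-layer encoding overhead, so real-world requirements are somewhat higher:

```python
# Rough uncompressed video bandwidth estimate (pixel data only;
# blanking intervals and link encoding overhead are ignored).

def raw_bitrate_gbps(width, height, hz, bits_per_channel):
    """Raw RGB pixel data rate in gigabits per second."""
    return width * height * hz * bits_per_channel * 3 / 1e9

rate = raw_bitrate_gbps(3840, 2160, 240, 10)   # 4K, 240 Hz, 10-bit
print(f"4K 240 Hz, 10-bit: {rate:.1f} Gbps")   # ~59.7 Gbps
print("Fits HDMI 2.1 (48 Gbps)?", rate <= 48)  # False
print("Fits HDMI 2.2 (96 Gbps)?", rate <= 96)  # True
```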
Resolutions and refresh rates this high have so far mainly been confined to gaming monitors. But more and more TVs now reach 144 or 165 hertz, pushing the required bit rate beyond what HDMI 2.1 can carry uncompressed. To still get a signal through, current TVs and monitors use visually lossless Display Stream Compression (DSC). HDMI 2.2 raises that ceiling, so such signals fit through the cable without compression.
What do gamers and film fans get out of it?
For gamers, nothing so far. HDMI 2.1 can already transmit 4K at 240 hertz with DSC. HDMI 2.2 would theoretically offer better image quality thanks to uncompressed transmission – but blind tests show the difference is invisible in practice. That's because DSC only compresses by around a factor of three, which is comparatively mild. Streaming services, for comparison, often compress their material by a factor of around 100.
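A factor of three is easy to sanity-check against the link rates, using the same simplifying assumptions as above:

```python
# How a ~3:1 DSC ratio changes the required bit rate
# (rough figures, raw pixel data only).

raw = 3840 * 2160 * 240 * 10 * 3 / 1e9  # ~59.7 Gbps uncompressed
dsc = raw / 3                           # DSC at roughly 3:1
print(f"uncompressed: {raw:.1f} Gbps")  # too much for HDMI 2.1
print(f"with DSC:     {dsc:.1f} Gbps")  # well under 48 Gbps
```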
For films and series, HDMI 2.1 will remain fine, as these are usually shot at 24 frames per second. The 48 Gbps of the previous standard is enough for that – even in uncompressed 8K, where there's hardly any source material yet anyway. The de facto standard for film material is still UHD, and that's unlikely to change in the foreseeable future.
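The same back-of-the-envelope maths backs this up: even 8K at cinema frame rates stays well inside the old limit (again ignoring blanking and encoding overhead):

```python
# 8K film material at 24 fps, 10-bit colour (rough estimate).
rate = 7680 * 4320 * 24 * 10 * 3 / 1e9
print(f"8K 24 fps, 10-bit: {rate:.1f} Gbps")   # ~23.9 Gbps
print("Fits HDMI 2.1 (48 Gbps)?", rate <= 48)  # True
```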
What is the latency indication protocol?
The new standard includes a latency indication protocol (LIP). According to the press release, it improves the synchronisation of image and sound, especially in multi-stage setups with soundbars and AV receivers. Simply put, LIP is designed to ensure lip movements always match the sound output.
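The HDMI Forum hasn't published implementation details here, but the basic idea can be sketched: if every device in the chain reports how much delay it adds to the video, the source can hold the audio back by the same amount. The toy model below is purely illustrative; the device names and numbers are made up, and the actual LIP message format is defined by the HDMI 2.2 spec, not reproduced here:

```python
# Illustrative only: the idea behind a latency indication protocol,
# NOT the actual LIP handshake or message format.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    video_latency_ms: float  # processing delay this device adds to video

# Hypothetical chain: source -> AV receiver -> TV
chain = [Device("AV receiver", 5.0), Device("TV", 33.0)]

# If each device reports its latency, the source knows how long the
# video takes end to end and can delay the audio to match.
total_video_latency = sum(d.video_latency_ms for d in chain)
print(f"Delay audio by {total_video_latency:.0f} ms to stay in sync")
```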
Do I need new cables?
Only if the higher bandwidth makes a difference to you. The physical HDMI socket stays the same and is backwards compatible. So if you buy a TV in three years' time that only has HDMI 2.2 ports but don't actually need the extra performance, you can keep using your old cables.
However, if you really want to transmit uncompressed 4K at 240 hertz or another high bit rate, you'll have to buy new cables. You'll recognise these by the Ultra96 label.
How does HDMI 2.2 compare to DisplayPort?
HDMI 2.2's bandwidth is somewhat higher, but DisplayPort (DP) is ahead of HDMI timewise. The DP 2.1 standard was introduced back in 2022 and has been used in monitors and graphics cards since last year. Confusingly, however, there are different performance levels under the DP 2.1 label: UHBR10 tops out at 40 Gbps, UHBR13.5 at 54 Gbps and UHBR20 at 80 Gbps. Only the new Nvidia RTX 50 series and a handful of monitors support UHBR20, which is required for uncompressed 4K at 240 hertz.
In PCs, DisplayPort is still more popular than HDMI. Many current graphics cards, for example, have three DP outputs and only one HDMI. The previously widespread DP 1.4 has a lower bandwidth (32.4 Gbps) than HDMI 2.1 (48 Gbps), but with DSC that's also enough for 4K at 240 hertz. Only very few displays exceed the limits of DP 1.4 and HDMI 2.1 – such as the Samsung Neo G9 with its dual-UHD resolution (7680 × 2160 pixels) at 240 hertz.
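Setting the headline link rates from this article against the raw bandwidth of 4K at 240 hertz and 10 bit shows why only the top tiers manage it uncompressed. These are nominal link rates; actual usable payload is slightly lower:

```python
# Which links can carry uncompressed 4K 240 Hz, 10-bit?
# Nominal link rates as quoted in the text; payload overheads ignored.

LINKS_GBPS = {
    "DP 1.4": 32.4,
    "DP 2.1 UHBR10": 40,
    "HDMI 2.1": 48,
    "DP 2.1 UHBR13.5": 54,
    "DP 2.1 UHBR20": 80,
    "HDMI 2.2": 96,
}

need = 3840 * 2160 * 240 * 10 * 3 / 1e9  # ~59.7 Gbps raw
for name, rate in LINKS_GBPS.items():
    verdict = "uncompressed ok" if rate >= need else "needs DSC"
    print(f"{name:16} {rate:5.1f} Gbps -> {verdict}")
```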
When will HDMI 2.2 arrive?
It'll probably take years for HDMI 2.2 to be widely used. The HDMI Forum has announced that the new connection will be available to manufacturers in the first half of 2025. After that, however, it'll take time for them to integrate it into their devices. For reference: HDMI 2.1 was announced in early 2017, but the first Nvidia graphics card with the connection didn't appear until September 2020 – and its introduction was far from smooth. We also cover this topic in the latest episode of the Tech Affair podcast (podcast in German).