Back in mid-2014, we jumped head-first into the 4K world with the purchase of the Samsung UN55HU8550, one of the very first TVs released featuring the new HDMI 2.0 specification. The goal: to view real 4K content. The first 4K TVs that hit the market in 2013 were dead on arrival, saddled with older HDMI 1.4 connectors that virtually guaranteed they’d never display true 4K content. Sure, there’s always streaming, but it only took a few minutes of watching Netflix 4K over a 50Mbps cable Internet connection to conclude that it wasn’t worth chasing: streamed 4K looks a whole lot like typical 1080p Blu-ray, and we already had plenty of Blu-rays (thanks to Netflix, as a matter of fact). So we felt good about investing in a TV with HDMI 2.0 connections, because we wanted our content hard-wired to our set, not streamed to us.

Of course, we knew that 4K Blu-rays were just a pipe dream at the time (it turns out that Ultra HD Blu-ray, as it’s now called, is in fact coming, and will land late this year). But what if we wanted real 4K content before this new dream format arrived? Well, there was always the option of displaying content from a PC, and as regular readers of TBG probably know, we’re big fans of PC tech here. We were aware that you can get 4K onto a screen using DisplayPort, but for some unknown reason (or maybe a well-known reason known only to people in the know), DisplayPort isn’t available on HDTVs. Interesting. OK, so if we wanted big-screen 4K content, we had to use HDMI 2.0.
No problem, right? Wrong. The first HDMI 2.0-equipped video cards wouldn’t be released until September 2014, in the form of the Nvidia GeForce GTX 970 and 980. OK, we said, we’d wait. In the meantime, there were various tricks to get 4K content on a TV using HDMI 1.4. Nvidia itself provided one of these through a driver hack, which allowed its then-current cards (as of mid-2014) to display 4K content, but only with reduced 4:2:2 color sampling rather than full 4:4:4. HDMI 1.4 simply has insufficient bandwidth to transmit the full-quality signal, so something had to be cut. For video content, 4:2:2 color sampling isn’t a big problem, but for desktop use it’s distracting, as text can disappear entirely depending on the background color. Furthermore, we found that switching between applications often meant losing the signal altogether, which we attribute to the driver failing to keep up with (and downsample) applications’ requests for 4K 4:4:4 output, knocking out the display until everything was shut down and restarted.
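To see why chroma subsampling hurts desktop text in particular, here’s a minimal sketch (ours, not anything a TV or GPU literally runs; the simple pair-averaging filter is an illustrative assumption) of what 4:2:2 does: the chroma channels are stored at half horizontal resolution, so a one-pixel-wide color detail gets smeared across its neighbor on reconstruction.

```python
# Minimal illustration of 4:2:2-style horizontal chroma subsampling.
# Luma (Y) keeps full resolution; chroma (Cb/Cr) is stored at half
# horizontal resolution, then reconstructed on display. The simple
# averaging/duplication used here is an illustrative assumption, not
# the exact filter any particular device uses.

def subsample_422(chroma_row):
    """Average each horizontal pair of chroma samples (half resolution)."""
    return [(chroma_row[i] + chroma_row[i + 1]) / 2
            for i in range(0, len(chroma_row), 2)]

def upsample_422(half_row):
    """Reconstruct full resolution by repeating each stored sample."""
    return [c for c in half_row for _ in (0, 1)]

# A one-pixel-wide colored detail on a gray background (one chroma
# channel, values 0..255):
original = [128, 128, 128, 255, 128, 128, 128, 128]
roundtrip = upsample_422(subsample_422(original))
print(roundtrip)  # [128, 128, 191.5, 191.5, 128, 128, 128, 128]
```

The single 255 spike comes back as two washed-out 191.5s. Video footage rarely has single-pixel color transitions, which is why 4:2:2 is fine for movies, but colored text on a colored background is exactly that kind of signal.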
Another pretty fool-proof option for getting 4K content onto a TV is pushing it out at 30Hz, which of course halves the bandwidth requirement relative to 4K at 60Hz. That works on a whole range of cards, including AMD, Nvidia, and Intel’s integrated graphics, but it comes with distinct drawbacks, namely laggy cursor movement and a pretty bad gaming experience. For movies, it’s fine, since they’re mastered at 24 frames per second, but there isn’t much 4K movie content out there.
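The bandwidth arithmetic behind both workarounds is easy to sketch. This is our back-of-the-envelope calculation, counting raw pixel data only; real HDMI links also carry blanking intervals and 8b/10b coding overhead, so actual requirements run somewhat higher than these figures.

```python
# Back-of-the-envelope HDMI bandwidth math (active pixels only; real
# links also carry blanking intervals and 8b/10b coding overhead, so
# actual link requirements are higher than these raw figures).

def raw_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel-data rate in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

# At 8 bits per channel: 4:4:4 is 24 bits/pixel, while 4:2:2
# averages 16 bits/pixel (chroma stored at half resolution).
modes = {
    "4K 60Hz 4:4:4": raw_gbps(3840, 2160, 60, 24),
    "4K 60Hz 4:2:2": raw_gbps(3840, 2160, 60, 16),
    "4K 30Hz 4:4:4": raw_gbps(3840, 2160, 30, 24),
}
for name, gbps in modes.items():
    print(f"{name}: {gbps:.1f} Gbps")
```

Full 4K/60Hz/4:4:4 works out to roughly 11.9 Gbps of pixel data, which is beyond what HDMI 1.4 can usefully carry (about 8.2 Gbps of video throughput) but comfortably within HDMI 2.0’s roughly 14.4 Gbps. Cutting chroma to 4:2:2 trims a third off the requirement, and halving the refresh rate to 30Hz cuts it in half, which is exactly what the two HDMI 1.4 tricks above traded away.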
So Nvidia’s newer cards would be the only solution. In the meantime, a strange thing happened over at Samsung: it made a mid-year change to the HU8550 specification, equipping models produced after August 2014 with a converter incapable of 4:4:4. Wait, what? Yeah, really. Perhaps Samsung figured no one was going to get full 4K out of their 4K sets anyway. Luckily, we had one of the early TH01 models, manufactured in May 2014, rather than the newer TH02 revision. This is probably the one and only time being an early adopter has actually paid off! We should note that our cursory research on the 2015 equivalent of the HU8550, the JS8500, hasn’t definitively determined whether it has the higher-quality converter of the original HU8550.
Rare bird of a TV in hand, the last piece of the puzzle was a 4K-capable, HDMI 2.0-equipped AV receiver, because we’re running a full-blown AV system, after all, not just a computer and a screen. We went with the Pioneer Elite SC-81, which besides being up to date HDMI-wise also happens to have excellent sound quality and incredibly efficient digital amplifiers. That’s a win-win-win in our book! Hooked up to our EVGA GeForce GTX 980 FTW 4GB video card, we were finally ready to display real 4K, 60Hz, 4:4:4 content on our 4K TV…
And the good news is that after just a bit more fiddling than we anticipated, we got it to work! The last step was finding an obscure setting on our Samsung HDTV that enables “UHD HDMI Color.” Once that control panel “switch” was flipped (which apparently makes non-HDMI 2.0 sources go bonkers and stop transmitting), we got what we came for: real 4K. We are now mousing around our desktop in full color and enjoying a 60Hz/4K gaming experience on the big screen, and it’s glorious. There’s just no comparison between real 4K content and 1080p content, and because PC games can natively render in 4K without scaling, this is in fact the real deal. It took longer than we expected, but it was worth the wait.
Alas, the GTX 980 is a pretty powerful video card, but 4K gaming is quite a challenge even for it. So we’re enjoying some of our older games for now, hoping that the rumored GTX 980 Ti arrives sooner rather than later! As for UHD Blu-ray? Well, we’re prepared for the eventuality that our 2014-era TV and receiver may not be modern enough to support it. In fact, we already know that our receiver doesn’t support the HDCP 2.2 standard required for UHD Blu-ray content protection, and the single HDMI input on our TV that supports HDCP 2.2 happens to be the one limited to 4:2:2 subsampling, so no full-color UHD Blu-rays for us! That’s the grim reality you face when you’re skating on the cutting edge of AV, it seems.