Ultra High Definition is finally starting to take hold in the market: equipment prices are beginning to drop into the range of buyers of normal means, and more content is becoming available for everyone’s viewing enjoyment. But what does the new technology entail, and how does one navigate all the new specifications, acronyms, and jargon, along with all the recycled, updated, and bastardized terminology from the previous generation of HDTV? That is where Audioholics comes in with some critical review to help you, the reader, sort through it all.

We will start our journey of discovery in this first article by attempting to identify both the wheat and the chaff so that we can later separate them and establish what defines Ultra High Definition TV. This will involve examining the standards and specifications used for UHD TV. In a subsequent article, we will move through definitions and explanations of some of the relevant technology and hardware performance. In our final article, we will discuss what performance characteristics to look for in your own personal quest for UHD, how the consumer electronics and film industries work against owners of slightly less recent gear that could otherwise still function in this new UHD environment, and how one might go about defying this built-in obsolescence.

Muddled Definitions, Partial Specifications, and Marketing BS

There is plenty of potential for confusion in the marketplace for UHD TV, built upon the previous generation of confusion about HDTV. This confusion is generated by liberally applied marketing jargon from manufacturers attempting to create brand distinction for features that are otherwise functionally identical, by a multitude of specifications and specification bodies (some of which feature optional specifications), and by changing terminology for effectively the same performance characteristics. All of this is further exacerbated by manufacturers who do not provide standardized or complete technical information about their products, sometimes even using marketing jargon to obfuscate performance deficiencies, exaggerate actual performance, or suggest functionality that is not there, and by retail sales workers who are confused themselves. Taken together, this makes it difficult for most consumers to navigate the growing plethora of new products and make informed buying choices.

There are numerous examples of these sources of confusion that consumers have had to wade through over time. One, caused by manufacturers, marketers, and retailers, occurred when LED backlighting for LCD televisions was first introduced: the new LED-backlit LCD displays were called LED TVs, while the older Cold Cathode Fluorescent (CCFL) backlit models continued to be called LCD TVs. Both CCFL and LED are backlight technologies used for LCD displays, but the sales staff at your local Best Buy would argue that point with you with all the flexibility of thought and understanding of a rock. Elsewhere, they will likely just look at you funny and carry on as if you had not said anything.

Another example is HDMI, where portions of the specification are optional, and, until recently, products were labeled by HDMI version number but were not required by the HDMI licensing body to even list which portions of the specification they supported. Let me be very clear about this: optional specifications are not specifications, and referring to them as such is inherently an oxymoron, with emphasis on the moronic. Further, with the increasing bandwidth of each version of the HDMI specification, the qualitative descriptions of High Speed and Standard HDMI cables have been sliding along with the specifications. The good news for cables is that any cable that can transmit a picture will work with equipment of any HDMI version number: either the cable works or it does not, and you only have to replace it if needed. The bad news is on the equipment side, where you often have to have matching standards. Even if a device is built such that it could support the signal processing involved or provide the necessary bandwidth, or if functionality is renamed and/or regrouped in a subsequent specification so that an older feature is treated as a new one, the device will not work, because everything is digital and devices can be programmed with which features they are allowed to support.

The Blu-ray specification is yet another example. Between being released before it was complete in order to compete with the now defunct HD-DVD standard, and rolling out the parts that did not make the original standard piecemeal using a band-aid referred to as profiles, it also allows partial compliance. Features such as audio codecs, secondary decoders, internet connectivity, built-in local storage, and even compatibility with preceding formats, such as DVD and CD, were all optional. The absence of many of these optional features became the bane of early adopters, who found not only that their expensive players could not support all the features later available on cheaper players, but that they eventually could not play the latest movies once manufacturers stopped updating them with the latest Hollywood-mandated copy protection revisions. AV gear released in the earliest days of Blu-ray supported uncompressed LPCM decoding as the only mandatory high resolution audio encoding. What some still do not realize is that the optional Dolby TrueHD and DTS-HD Master Audio codecs that arrived on later AV gear are just lossless compressions of the LPCM encoding that is always included on every Blu-ray. With the Blu-ray specification, even backwards compatibility with CD and DVD playback is optional, though it is usually supported.

The latest example is, of course, 4K TV which is now also being called Ultra High Definition (UHD) TV.

The Easy Part: The Resolution

Defining the resolution of 4K/UHD is the easy part: at 3840 x 2160 pixels, 4K Ultra High Definition provides a picture with twice as many pixels, both horizontally and vertically, as what has come to be called Full HD (FHD) at 1920 x 1080 pixels, also known as 1080p. This means the picture has four times the overall resolution of FHD, approximately 8.29 megapixels versus approximately 2.07 megapixels, and an enormous jump over traditional 480i SDTV with a mere 0.307 megapixels, only half of which are actually lit up on any given frame thanks to interlacing.
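The pixel arithmetic above is easy to verify yourself. A minimal sketch (the 480i figure assumes a 640 x 480 grid, matching the 0.307 megapixel number quoted above):

```python
# Pixel-count comparison between the formats discussed above.
formats = {
    "UHD (2160p)": (3840, 2160),
    "FHD (1080p)": (1920, 1080),
    "SD (480i)":   (640, 480),   # assumed grid; interlacing lights half per field
}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")

# Twice the pixels in each direction means four times the total.
assert 3840 * 2160 == 4 * (1920 * 1080)
```

Running this prints 8.29, 2.07, and 0.31 megapixels respectively, confirming the four-to-one ratio between UHD and FHD.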

From here, it gets a bit more murky.

The Less Easy Part: The Resolution, and Some Other Stuff Thrown In

The latest form of the type of confusion discussed above concerns ultra high definition television, which has been called many things in marketing and product literature: 4K TV, 4K HDTV, Ultra HD, UHD, 4K UHD, UHD TV, UHD Premium, et cetera. Initially, the most prominent terminology was to call the increased resolution 4K, followed by TV or HDTV.

So why 4K rather than 2160p, which most consumers are familiar with and can relate to at this point in time?

Apparently, in search of hyped-up ways to describe the improved resolution, the marketing types initially latched onto 4K cinema. But while 4K cinema and UHD TV both happen to have 2160 vertical pixels, the marketers, not being technical, were apparently unaware that 4K HDTV does not actually have 4000 pixels in the other direction, the one responsible for the 4K nomenclature.

The difference here is that cinema presentations and HDTVs do not share the same aspect ratio, so the resolutions are inherently different. Cinema 4K resolution is 4096 x 2160 pixels, an aspect ratio of about 1.89:1, while a 4K HDTV has a resolution of 3840 x 2160 pixels, which is 1.78:1, commonly referred to as 16:9, and obviously does not actually have 4000 (i.e. 4K) pixels in the horizontal direction. Adding to this confusion, the 4K naming convention switches from vertical lines of resolution, as in 720p and 1080i/p HDTV or traditional 480i SDTV, to horizontal lines of resolution. At least the now superfluous p has been dropped: there is no more i in the current video standards, as there is only 2160p in 4K.
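The aspect-ratio math behind the naming mismatch is straightforward to check:

```python
# Aspect ratios behind the "4K" naming confusion: cinema 4K vs consumer UHD TV.
cinema_4k = (4096, 2160)   # genuinely ~4000 horizontal pixels
uhd_tv    = (3840, 2160)   # consumer "4K" UHD, 16:9

for name, (w, h) in [("Cinema 4K", cinema_4k), ("UHD TV", uhd_tv)]:
    print(f"{name}: {w}x{h}, aspect ratio {w / h:.2f}:1")

# UHD TV is exactly 16:9, i.e. 1.78:1; cinema 4K works out to roughly 1.89:1.
```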

As of late, the various manufacturers associations seem to have settled on Ultra HD or UHD, although the 4K moniker has been retained in some form or another in the marketing of many products by individual manufacturers. Now that High Dynamic Range (HDR) is being thrown into the mix mid-cycle, so to speak, you will also find all of the above as variations on UHD Premium, the name now intended to separate HDR-capable TVs from those not capable of HDR playback, and one that the marketing from various manufacturers likewise fails to follow in any consistent way. HDR also has nothing to do with 4K TV resolution; it is simply that 4K is the latest commonly available resolution, and it happened to coincide with the introduction of HDR. There is no reason one could not have an HDR 1080p FHDTV, or even an HDR 480i SDTV, other than that no one is currently making them, although that could change given current bandwidth limits on broadcast TV, digital encoding, and some of the specifications floating around, as discussed below. One should also keep in mind that the definition of UHD also includes the forthcoming 8K resolution, meaning that UHD as defined can run the gamut from 1080p (i.e. 2K) to 8K resolution as long as varying amounts of HDR are thrown into the mix.

Speaking of HDR, this acronym is its own muddle of rearranged definitions covering more than just dynamic range. In addition to setting minimum limits for the brightness and darkness of screen output, i.e. dynamic range, the competing standards, HDR10 and Dolby Vision, both based on the SMPTE Perceptual Quantizer (PQ) transfer function, also include increased color bit depth and Wide Color Gamut (WCG), both of which are an optional part of some of the base UHD TV specifications; again, more on that below.

Editorial Note: Deep Color & Extended Gamut
Back in the day, Deep Color and Extended Gamut YCC, also known as xvYCC as well as by Sony’s proprietary name for the format, x.v.Color, were the terms du jour. Deep Color specified an increase in color bit depth from the standard 8 bits per color to either 10 or 12 bits. This is now included in HDR10 at 10 bits and Dolby Vision at 12 bits. While not exactly the same color space as that used in xvYCC, HDR10 and Dolby Vision specify the Rec. 2020 color space, an improvement on the older Rec. 709 color space that was constrained by the limitations of CRT RGB saturation; interestingly, current TVs cannot fully cover the new gamut. More on that below as well. With xvYCC now apparently abandoned, the extra color gamut available in older xvYCC-compatible devices will not be of any benefit with sources following the newer standards.
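To put the bit-depth jump in perspective, each extra bit per channel quadruples the number of gradations available; the difference between 8-bit and 10-bit video is what separates visible banding in sky gradients from smooth ones:

```python
# How color bit depth translates into distinct values: levels per channel,
# and total colors across the three channels (subsampling ignored).
for bits in (8, 10, 12):
    levels = 2 ** bits
    total = levels ** 3
    print(f"{bits}-bit: {levels} levels per channel, {total:,} total colors")
```

This prints 256 levels (about 16.8 million colors) for legacy 8-bit, 1024 levels (about 1.07 billion colors) for HDR10’s 10-bit, and 4096 levels (about 68.7 billion colors) for Dolby Vision’s 12-bit.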

To add just a bit more confusion, a third HDR standard is floating around called Hybrid Log-Gamma (HLG), developed by the BBC. HLG makes use of a modified gamma curve derived from the standard gamma curve used by most, if not all, current video display devices. This makes UHD content encoded with HLG backwards compatible with commonplace SDR equipment and content, though HLG is not yet commonplace itself.
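The "hybrid" in the name is literal: below a knee point the curve is an ordinary square-root gamma (which is why SDR sets render it acceptably), while above the knee a logarithmic segment carries the extra highlight range. A sketch of the HLG opto-electrical transfer function, using the published constants from ARIB STD-B67 / ITU-R BT.2100:

```python
import math

# HLG OETF constants as published in ARIB STD-B67 / ITU-R BT.2100.
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # SDR-like gamma region
    return A * math.log(12 * e - B) + C      # logarithmic highlight region

print(round(hlg_oetf(1 / 12), 3))  # signal value at the knee: 0.5
print(round(hlg_oetf(1.0), 3))     # peak scene light maps to ~1.0
```

The constants are chosen so the two segments join smoothly at the knee, which is what lets an HLG signal double as a plausible SDR signal.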

Standards and Standards Bodies Galore

So how did we get to this semantic quagmire?

Standards bodies seem to be a dime a dozen in consumer electronics: the Consumer Technology Association, the UHD Alliance/Ultra HD Forum, DigitalEurope, and God knows how many others. It seems that as soon as one group tries to decide on and set some sort of standard, another group comes along thinking it can come up with something better. Usually it cannot; the result is just different, and likely incompatible, sometimes deliberately. And when not going defunct or rearranging themselves into new forms, these organizations also seem to like to change their names periodically, just to keep consumers guessing.

Myriad consumer electronics organizations and standards bodies exist all over the world, all pushing their own standards, many of which at least partially overlap in actual functionality if not implementation. Worse still is when several separate standards bodies join, Voltron-like, into a giant meta standards body to publish standards of standards. Television broadcast standards, television manufacturer standards, digital cinema standards, video recording standards, video playback standards, digital encoding/decoding standards, international standards, regional standards, open standards, proprietary standards… the list goes on and on.

Let’s just say that, worldwide, consumer electronics standards are a cluster…

Broadcast TV has historically had a hand in some of this mess. Around the world, numerous broadcast standards were developed independently over time, and these drive the basic design of TVs in their respective regions, which are then layered with the requirements of both the current standards and compatibility with legacy standards. So even if current standards do somewhat converge, the legacy requirements keep the underlying variation in play.

In North America, digital television broadcasting conforms to the Advanced Television Systems Committee (ATSC) standards, based largely on recommendations from the International Telecommunication Union (ITU), but is still confined by the legacy analog broadcast requirements of the National Television System Committee (NTSC). In Europe, the European Telecommunications Standards Institute (ETSI), the European Broadcasting Union (EBU), formerly the International Broadcasting Union (IBU), and the European Committee for Electrotechnical Standardization (CENELEC) all morphed together into the Digital Video Broadcasting (DVB) standards, which still have to make do with the detritus of legacy analog broadcasting standards such as Phase Alternating Line (PAL) and Sequential Color with Memory (SECAM, acronym derived from the original French). Then there are the Integrated Services Digital Broadcasting (ISDB) standard in Japan and the Digital Terrestrial Multimedia Broadcast (DTMB) standard in China, just to name a few, along with each of their legacies.

To be fair, some of the differences in broadcast standards were driven by outside factors. The different screen refresh rates, with North America at ratios of 30/60 Hz and Europe at 25/50 Hz, were driven by local AC power generation at those corresponding frequencies. Other differences have come about from the independent development of technologies, inevitably exacerbated by a lack of communication (particularly in the past), local preference, national pride, and good old corporate desire for control and profit.

Piled on top of all of this are the additional standards from other industries, such as movie production and cinema projection, required to actually get content onto a TV. This means an infusion of, or overlap with, other standards to comply with motion picture production, such as the Society of Motion Picture and Television Engineers (SMPTE) and Digital Cinema Initiatives (DCI) specifications.

In more recent times, many of the newer standards accumulating in the actual video products making their way into consumers’ homes have at least been somewhat unified by recommendations from the International Telecommunication Union, a UN body responsible for worldwide coordination of information and communication technologies, including global use of the shared electromagnetic radio spectrum, communication satellite orbits, and communications infrastructure, and which develops and promotes corresponding technical standards. ITU membership is open to UN member states as well as private corporations such as telecom carriers, regional telecommunications agencies, and research organizations, who act as nonvoting members.

Documents from organizations like the ITU can be thought of along the lines of worldwide model codes that are, in turn, followed and adapted to differing extents by other organizations closer to actual product manufacturing or content production, along with the necessary recording, broadcasting, and reproduction devices. It is at this level that new organizations spring up every time there is a change in technology, each attempting to usurp the previous standards group, usually in an effort to control licensing fees.

Case in point: the Consumer Technology Association (CTA), formerly the Consumer Electronics Association (CEA), originally an independent branch of the Electronic Industries Alliance (EIA), an American National Standards Institute (ANSI) accredited standards body dating back to the 1920s, now seems to be in competition with the more recent UHD Alliance/Ultra HD Forum that sprang up in 2015. Both are now decreeing fairly similar standards for UHD TV in the United States based on ITU recommendations, but for some reason, someone felt we needed another organization, apparently one that does not have roots directly in communications infrastructure engineering.

Editorial Note: History of Standards
The EIA has a long history of producing legally recognized standards used in the electronics and communications industries for decades, many of which are American National Standards Institute (ANSI) recognized documents. From power distribution infrastructure to broadcast and cellular telephone infrastructure to electronic and microelectronic components, the various branches of the EIA, such as the TIA, ECA, and JEDEC, have been active in setting technology standards. While the EIA as an overall organization ceased operations in 2011, the various sections still exist to support their particular industries. The significance of ANSI accreditation is that ANSI was founded as an interdisciplinary standards body in 1918 by five professional engineering societies: the American Institute of Electrical Engineers (AIEE, a predecessor of today’s IEEE), the American Society of Mechanical Engineers (ASME), the American Society of Civil Engineers (ASCE), the American Institute of Mining, Metallurgical, and Petroleum Engineers (AIME), and the American Society for Testing and Materials (ASTM). Many of these societies produce model codes and standards that have been incorporated into law as manufacturing, production, and building codes. Unfortunately, this model of standards-producing bodies does not seem to hold sway in consumer electronics. Standards do not come from professional organizations formed by independent engineers and academics with joint input from manufacturers to solve problems; they come from organizations formed by corporations to control profits. The altruism of solving common problems in the name of progress is supplanted by the desire for licensing fees, as frequently the majority of engineers involved are employed by those corporations and likely operate under pressure to represent corporate interests over their profession as a whole.
This fundamental difference can be seen in the standards produced by ANSI and other legally recognized and accredited bodies: copies of the actual standards may cost money to obtain, but there is no money grab making users of the standards pay for what they create based upon those standards. If such were the case, most of the infrastructure and facilities that we depend on in our daily lives, including our homes, would likely cost much more than they do now to cover those additional licensing fees, and they would be an absolute mess of incompatibility and cost to either maintain or improve over their usable service lives.

On the other hand, the CTA does not seem to really be following this more engineering-oriented model, if it ever did, and of late it cannot even seem to muster a comprehensive set of recommendations for the future of UHD, though it is offering licensing. The CTA, which claims to represent over 2200 member companies, initially developed a somewhat limited set of recommendations for UHD TV in 2012, with some subsequent extensions, only to be upstaged by the recently formed, initially several-dozen-member-strong Ultra HD Forum, many of whose members likely overlap with CTA membership, and which seems to have taken the lead with more substantial requirements.

To make matters more confusing about who exactly is in charge, the Ultra HD Forum also seems to have a parallel, complementary organization called the UHD Alliance, with many of the same members. The arrangement seems to be something along the lines of the Forum setting standards, particularly for video production and distribution, and the Alliance maintaining consumer marketing logos, product certification, and the like, along with collecting the licensing fees, for TV sets capable of displaying Forum-specification content. The UHD Alliance is behind the UHD Premium branding per the Ultra HD Forum specifications, which mostly separates HDR-capable from non-HDR-capable consumer UHD video equipment. Why two separate but apparently intertwined groups are required to do this is anyone’s guess. Maybe it is to pretend that the standards portion is somehow independent of the money-grabbing portion.

The Ultra HD Forum states that its purpose is to assist in solving real-world deployment issues of UHD video, including HDR, High Frame Rate (HFR), and WCG video, along with next-generation object-based audio. It seems that could have been covered by some existing group to which the members of the newly formed group likely already belong. But hey, what’s one more organization added to the mix?

In the meantime, rather than updating its own standards, the CTA has taken a “yeah, what they said” stance, announcing, along with a few member companies, that it agrees with and shares the goals of the UHD Alliance and its new UHD Premium standard shortly after it was announced. Or maybe it was the Ultra HD Forum? UHD Forum? Ultra HD Alliance? One of those.

Furthering the push for legitimacy of its specifications, the UHD Alliance/Ultra HD Forum actually requires independent testing to receive certification for meeting its standards and to be able to use its logos on products. It seems that should have already been happening with some specification somewhere all along, but apparently it wasn’t.