High definition has lost its luster and sparkle, at least in name. I was at a SMPTE conference in New York recently, speaking with a codec IP vendor about his product, and as I was taking notes I wrote "4K HD" and he corrected me -- Digital Cinema, he said, is 4K, and HD is 1920 x 1080, or "almost 2K" in filmmakers' parlance. There is no such thing as "4K HD." (The resolution of 2K digital cinema is 2048 x 1080 at 24 frames per second, and 4K is 4096 x 2160.)
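For readers who want to see why "almost 2K" is fair, and just how far beyond HD the 4K format reaches, here is a quick sketch of the pixel arithmetic using only the resolutions quoted above:

```python
# Pixel counts for the formats named in the column.
# Resolutions are as quoted above; the comparison is simple arithmetic.
formats = {
    "HD (1080)": (1920, 1080),
    "2K DCI":    (2048, 1080),
    "4K DCI":    (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels")

# HD is "almost 2K": same height, and 1920 is within ~6% of 2048.
# 4K doubles both dimensions of 2K, so it carries exactly 4x the pixels:
assert 4096 * 2160 == 4 * (2048 * 1080)
```

So a 4K frame holds roughly 8.8 million pixels to HD's 2.1 million, which is the gap the vendor was insisting on.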
He was technically correct, of course, but this was the first time I'd ever actually been accused of using the term "HD" sloppily. It reminded me a bit of the early days of portable video, when college professors I knew, and some of their disciples, would become enraged whenever anyone referred to the videotape as "film."
Including the word "high" in an acronym recalls the experience of radio wave naming, progressing from high frequency (the HF band) to very high (VHF) to ultra high (UHF), after which, upon entering the gigahertz range, the namers of things wisely gave up on "high" and switched to numbers.
What happens with the naming of HD beyond 1080p is still up for grabs. I have used the phrase "ultra hi-def" on numerous occasions, skipping right past "very hi-def," which frankly just doesn't sound very hi-def. NHK's futuristic TV system, with resolution several multiples better than today's HD, is called "Super Hi-Vision" (see "NHK demos 8K x 4K image sensor"). We already went down that naming-convention road in the 1980s and '90s with Super-VHS, Super Betamax, and Super Audio CDs -- I'm not sure it's wise to repeat.
But as far as the professional production community is concerned, HD refers to 1080p, 1080i, and 720p. Anything better has another name. HD is a distribution format.