
The ridonculous era of overscan and HDMI

A project log for Potentially Useful/Obscure Linux Stuff

Things that I've found useful/hard-to-find in my Linux endeavors...

Eric Hertz 05/14/2019 at 03:51 • 1 Comment

This is a rant, that's all it is...

Seriously...

So here's the deal... TVs have historically displayed the image in such a way that the edges would be covered by the TV's housing. Historically, that was because TV-signals weren't perfectly-synchronized at the very beginning of each scan-line... so the [left] edge was pretty ugly (and maybe the right, as well). Also, people kinda preferred the "rounded" look of their old TV-sets, which meant that quite a bit of the once-square (by-design, not in reality) transmitted-image was hidden at the corners by, again, the TV's housing.

Frankly, it didn't really matter, back then, if you lost a tiny bit of the televised image; it was way better than looking at garbage on the edges.

OK, Great... But we live in an era of LCDs, Pixels, HDMI, etc. Frankly, in this era, I consider a display-purchase in terms of price-per-pixel. I sure-as-heck don't want my pixels to go to waste! (or, frankly, worse: get mangled!)

We also live in an era where "native-resolution" is *considerably* sharper than any scaled-resolution, even with the fancy new sub-pixel anti-aliasing (or whatever the new buzz-word may be) available today... The fact is, there are X horizontal pixels, and Y vertical, on today's screens.

Note that this varies *dramatically* from the old-school display-technologies, *cough* CRTs *cough*... which, frankly, were quite a bit more "analog". As long as the circuitry could handle it, it wouldn't appear much different whether you displayed 300 rows or 350 rows... the image just spread across the available screen.

But, now, our era is such that displaying 1080 rows on an allegedly 1080-row display means first upscaling the image from 1080 to, say, 1120 rows, then cropping off the upper 20 and lower 20. This is called "overscanning." The idea is to mimic the old TV-technology of pushing the edges of the image out behind the TV's housing, so you won't see the "garbage" at the edges. Except, yahknow, we're in a digital era; if there *is* any "garbage" at those edges, it's because it was *placed* there, intentionally (e.g., toward the end of the analog era, that "garbage" contained things like closed-captioning). Regardless, in the digital era, it's *completely* irrelevant, and merely exists as a "feature" that most people *do not want*.

There's a bunch of math involved, but basically it boils down to a *much* less-crisp image. Whereas *without* this artificial "overscanning" *every* pixel transmitted (from the TV-station, or the video-file, or the *cough* COMPUTER *cough*) corresponded to a single pixel on the screen itself, now we have pixels which are one-and-some-fraction tall/wide. And how do we display some-fraction of a pixel? By anti-aliasing, or "sub-pixel" goofiness, which may [or may not] be so sophisticated as to consider each "pixel" and the physical position of its red, green, and blue "subpixels," and somehow create a fractional-pixel (in which direction{s}?) by using a subset of RGB that may, in some cases, turn out to be GBR... [Which, in fact, when displaying black text on a white background, may appear as, e.g., red-shift at the edges, but that's another topic. Although, now that I think about it, it's kinda ironic that in this digital era, we're essentially experiencing the bane of the analog-display-technologies *again*, e.g. NTSC="Never The Same Color," along with the whole reason people once preferred monochrome "Hercules" displays over color CGA for text-editing, and more. We're experiencing it again! History Repeats! DESPITE HAVING THE TECHNOLOGY NOT TO!]

Regardless, what it boils down to is that even when displaying a 1920x1080 image via HDMI (which is inherently digital) on a 1080p-native display, what you're really viewing is something more like 1880x1040 stretched across 1920x1080 pixels. The roughly 20-pixel "border" on *all* sides is completely non-displayed, a.k.a. "invisible". And each of the *visible* transmitted pixels lands on something like 1.02-1.04 physical pixels of the display (quick math below).
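
(Quick math, actually done this time, assuming my ~20-pixels-lost-per-edge estimate:)

    # how badly each transmitted pixel gets smeared, if the panel really shows only
    # the central 1880x1040 of a 1920x1080 signal (the 20-per-edge crop is a guess):
    echo "scale=4; 1920/1880" | bc    # ~1.021 physical pixels per pixel, horizontally
    echo "scale=4; 1080/1040" | bc    # ~1.038 vertically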

Now... I'm sure this is a *common* irritation; The Great Goog will tell you as much... Just search for "overscan hdmi."

The VAST MAJORITY of results on how to relieve this problem say to adjust the settings on the TV itself. Makes a heck of a lot of sense... But somehow, even in the 1080p era, a few displays slipped through which *cannot* disable overscanning. I happen to have one.

----

So, let's look at this situation differently...

I'm trying to connect a computer to my TV, which has HDMI (an inherently digital interface, which inherently sends *every* pixel as a unique entity) and allegedly has 1920x1080 pixels. Yet, I cannot see my taskbar, and when I maximize a window, I can't see either of the scroll-bars.

Is it that my display is actually less than 1920x1080? Or is it scaling up my 1920x1080 image to something, again, like 1960x1120, then cropping every edge?

Again, The Great Goog (and all the searched forums) insists that the solution is to turn off overscan *on the TV*... But somehow, I've managed to come across one of the few that don't have any option like that (e.g. "pixel-for-pixel" etc.). This may be why this was only $30 at Goodwill. 

But, yahknow what? RANDR 1.4 (TODO: LINK from xorg; yahknow, RANDR, the extension that 'xrandr' gives access to from the command-line) explicitly states that one of its new features is to *eliminate* such borders, explicitly for such televisions... (TODO: Friggin' copy-paste that quote. Seriously).

Thing is... HOW is it done...?

(and, by the way, it *isn't* done with my setup... because apparently my vid-card's driver doesn't offer that capability).

So, to recap: My TV overscans, inherently. Something like 20+ pixels of the image are lost at every edge. It's *plausible* my TV cheated, and isn't actually 1920x1080, but more like 1880x1040... It's equally-plausible it's actually 1920x1080, but uses scaling to achieve this overscan "feature". Regardless, I can't see my friggin' taskbar nor scroll-bars.

There are a few solutions on the 'net, involving using xrandr with --transform or --scale, but, frankly, those don't work with my setup.
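
For reference, here's roughly the sort of incantation those posts suggest. This is a sketch only: "HDMI-1" is a placeholder for whatever output-name xrandr reports, the 20-pixels-per-edge crop is just my estimate for this TV, and, as I said, it doesn't do anything useful with my particular driver.

    # squeeze the full 1920x1080 desktop into the central 1880x1040 of the signal,
    # so the TV's overscan crops only blank border instead of taskbar/scroll-bars.
    # the matrix maps output pixels to framebuffer pixels: scale = 1920/1880 and
    # 1080/1040; the offsets pull the desktop in ~20px from each edge. tune by eye.
    xrandr --output HDMI-1 --transform 1.0213,0,-20.43,0,1.0385,-20.77,0,0,1
    # to undo it:
    xrandr --output HDMI-1 --transform none
    # (some drivers instead expose "underscan" output properties; check with:)
    xrandr --props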

And, frankly, I haven't come up with a reasonable solution, yet. The *obvious* solution involves using the RANDR 1.4 "border" technique, but, again, my vid-card doesn't support it.

But, again, the bigger question is *HOW* it achieves this...

WHATEVER solution I come up with will result in fractional (IOW UGLY!) pixels. Unless, maybe, my display isn't actually 1920x1080, but something smaller, in which case, I might be able to force some "border" or "translation" which results in a *native* resolution that matches the display.

Frankly, I don't really care...

What I do care about is *how* this is achieved...

If I understand correctly, at the digital-level, EVERY PIXEL is transmitted *uniquely* via HDMI... So, if xrandr is somehow capable of sending a 1920x1080 *timing* to the TV, while squeezing (allegedly) 1920x1080 pixels into the TV's *NON*-cropped space, then, technically, xrandr would be attempting to squeeze more pixels into that space than it physically has. So, now, we've got a 1920x1080 image squeezed into 1880x1040 pixel-space, which, again, gets scaled back up to 1920x1080 on the display itself.

BTW: each time a scaling is performed, the image gets blurrier, but that's another topic.

I'd Much Rather: Know my display's *native* resolution, and work with that... But if, again, it really is 1920x1080, and overscanning is inherent, then the only way to get 1920x1080 actual pixels onto the panel is to transmit a slightly higher resolution, which will be overscanned...
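
(One way to at least ask the TV what it *claims* to be is to dump its EDID; no guarantee the EDID is honest about the actual panel, mind you. A sketch: the sysfs path is a placeholder that varies by card/connector, and it assumes the edid-decode tool is installed.)

    # list the connectors, then decode the EDID blob the TV reports:
    ls /sys/class/drm/*/edid
    edid-decode /sys/class/drm/card0-HDMI-A-1/edid
    # or just eyeball the raw hex that xrandr reports as an output property:
    xrandr --props | grep -A16 EDID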

Here's where things get really stupid.

I mean STUPID.

This TV (and surely most) has already demonstrated that it's more than capable of scaling... Send it a 720p image, and it'll upscale to (allegedly) 1080p. Send it a 1080p image, and it'll upscale a few pixels off-screen on every edge.

So, the logical conclusion is to send, e.g. a 1980x1140 image, with borders, and the screen should scale that down to 1920x1080 *with borders* and all will be right with the world.
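
(In theory, something like the following would offer the TV such a mode. A sketch only: "HDMI-1" is a placeholder output-name, I've rounded up to 1984x1144 since CVT wants widths in multiples of 8, and the timing numbers in the --newmode line are purely illustrative; use whatever cvt actually prints.)

    # generate timings for a mode slightly bigger than 1920x1080 on every edge:
    cvt 1984 1144 60                  # prints a "Modeline" line with the timings
    # copy everything after the word "Modeline" into --newmode (the numbers here
    # are illustrative placeholders; use the ones cvt printed):
    xrandr --newmode "1984x1144_60"  188.75  1984 2112 2320 2656  1144 1147 1157 1184 -hsync +vsync
    xrandr --addmode HDMI-1 "1984x1144_60"
    xrandr --output HDMI-1 --mode "1984x1144_60"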

But No!

Displays like these seem to expect "standard" resolutions!

Wait, what?!

The dang thing can scale and scale again, but it can't handle an un-programmed resolution?!

I mean, what're we talking about, here...?

Mathematically, *every* resolution imaginable could be handled with a simple math equation... So, what'd they do, program a shitton of if-then or "switch" statements for each *expected* resolution? WTH?!

I might expect this of old displays... a la the VGA era, wherein the display itself had to determine where pixel-data started and ended... But this is the HDMI era...

I dunno how much y'all know about HDMI, so I'll tell you: HDMI carries a "Data Enable" signal. When that signal is active, there is active pixel-data. This is KEY.

Back when our LCD displays had to accept VGA input, they had to *scan* the input-signal to try to detect where pixel-data began and ended. That's because... well, VGA didn't send that information. It was *expected* that your CRT would display a black border (where pixels could've been displayed) before the actual pixels came-through.

Note: This is the *Opposite* of the old Televisions, which, again, displayed the image *before* it was visible on the CRT...

BECAUSE: EVEN IN THE VGA ERA (again, still analog), We Were Capable Of Aligning Pixels Vertically!

But NOW, In the HDMI ERA, we're pretending we're not capable of that, and "overscanning" to compensate for some imaginary problem DECADES-resolved.

Meanwhile, THINGS ARE EASIER NOW. We actually TRANSMIT a signal that says "PIXEL DATA EXISTS HERE." And, yet, we ignore it.

-------------------

THIS is where I get UTTERLY PISSED OFF....

This display--and it's surely not the only one, otherwise RANDR 1.4 wouldn't make mention of it as its *first* major change--is fully-capable of scaling up AND scaling down, AND pushing those scalings out of its range... yet it can't handle a simple input-resolution that's *slightly* out of its expected range? RIDICULOUS.

Again, we're talking about a tremendous amount of *MATH* involved in its upscaling/downscaling, yet its *reception* uses nothing but "switch" statements, *despite* being *TOLD* when pixel-data is available.

Beyond Belief.

....

Man, I wish I could come up with an analogy...

For some reason, Weird Al's "(This Song's Just) Six Words Long" comes to mind.

.

I mean... the thing is... ALL the computation-power is there. In fact, the friggin' libraries are there. There's *nothing* stopping it, except software.

This is the godforsaken era in which we live...

On the plus-side: I got a 1080p 30-something-inch display for $29 at the thrift-store... I can deal with it.

....

On that note: My old 1600x1024 SGI display required a similar hack... Oddly, that one was *easier* because it used VGA (analog) rather than HDMI... 

Lacking a "Data Enable" signal, a controller receiving VGA has to *detect* the pixel-data. The trick, there, was to initially send a standard signal (1600x1200, as I recall), synchronize the display, then switch to 1600x1024 with otherwise the exact same timings.... (so extend the vertical porches by 1200-1024=176). The *display* thought it was receiving 1600x1200, and I was able to stretch it, via configuration options, to fill the screen.

The irony, now, being that HDMI actually *sends* the "data enable" signal, telling *exactly* where the pixel-data is, yet the display can't handle it because it's *unexpected*... apparently *MATH*, the same dang math it does for its weird-ass scaling, is beyond it.

--------

I probably had some other intent with this rant... I don't recall...

xrandr, it may help...

Frankly, if I wound up with an 1880x1040 resolution that actually matched up pixel-for-pixel, that'd be great. As it stands, it seems downright ridiculous to use 1280x1024 on this display. And it doesn't matter anyhow, because it *still* overscans that shizzle.

Here's what it really boils down to....

I need to rant, and this particular thing is something I don't particularly care about.

Computers are stupid.

Discussions

Dr. Cockroach wrote 05/14/2019 at 08:33:

Wow, you got that right my friend ;-)
