Tuesday, December 4, 2012

More thoughts on the mediocre standard definition video

I'm absolutely sure that I have seen better Standard Definition (480i) video than my Mac is now producing through the Belkin adapter (to get HDMI) and the Monoprice adapter (to get analog component).  It's very soft looking, and if I get up close, I can see weird artifacts.

480i done correctly can be a beautiful format.  It was all most people ever knew in video until about the year 2000.  For 10 years I was the proud owner of a $3299 list price Sony TV made in 1997, a 32XBR100.  This was probably the best NTSC-only TV ever made.  It did not support any extended format features.  It did not even do progressive scanning.  But it did a perfect job of decoding NTSC from either composite (!) or S-Video inputs.  The composite inputs were decoded by one of the best Digital 3D Comb Filters ever made.  (I think the one in my later 34XBR960 is also very good.)  The 3D comb filter eliminated interlace artifacts.  In 2008, when I got a very nice 550 series Samsung 42" LCD to replace it, I was extremely disappointed that it could not eliminate interlace artifacts (from component video inputs) as well as the 32XBR100 had.  I then bought an ancient Faroudja de-interlacer (which originally sold for $15,000) and a DVDO scaler to get back what I once had.  But neither of those was as good as the Sony either.  Neither of those expensive converters actually has a 3D comb filter.  But the DVDO is a nice unit and I still use it today to digitize analog sources onto my digital TV network.
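
For what it's worth, the basic trick behind a 3D comb filter is easy to sketch, even if doing it well in hardware is not.  Here is a toy illustration in Python (the principle only, not Sony's actual circuit), assuming a static scene and an NTSC-style composite signal whose chroma subcarrier inverts phase from one frame to the next:

    import numpy as np

    # On a static scene, successive NTSC composite frames look like
    #   frame N   = Y + C
    #   frame N+1 = Y - C
    # because the color subcarrier phase inverts from frame to frame.  Summing
    # the frames cancels chroma, differencing cancels luma, so a 3D comb can
    # separate the two with almost no loss of detail.
    def frame_comb_static(prev_frame, curr_frame):
        luma   = (curr_frame + prev_frame) / 2.0   # chroma cancels out
        chroma = (curr_frame - prev_frame) / 2.0   # luma cancels out
        return luma, chroma

    # Fake data: a luma ramp plus a subcarrier term that flips sign each frame.
    ramp = np.linspace(0.0, 100.0, 720)
    subcarrier = 20.0 * np.sin(np.arange(720) * 2.0 * np.pi / 3.77)
    frame_a = np.tile(ramp + subcarrier, (480, 1))
    frame_b = np.tile(ramp - subcarrier, (480, 1))
    y, c = frame_comb_static(frame_a, frame_b)   # y recovers the ramp, c the subcarrier

    # A real 3D comb also detects motion and falls back to a 2-D (line) comb
    # wherever the picture changes, since the cancellation only holds for
    # pixels that are identical in both frames.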

And don't get me started on LCD vs CRT.  For years, LCD has lagged behind the bright-to-dark dynamic range of old CRT sets by a country mile.  LCD has no problem with bright, but it has a problem with near-black, the so-called "shadow detail."  With fancy LED lighting schemes, LCD may finally be catching up.  I don't buy TVs often, so I'm not quite sure how good they are now.  People I talk to generally don't get that LCD is not the same as high definition.  It's just that they both came on the scene at about the same time.  And almost nobody has seen CRT TVs as good as the "XBR squared" models I have had, the 32XBR100 and the 34XBR960.

But now that we have High Definition TV, few if any manufacturers do a decent job with Standard Definition anymore.  I think that's basically the problem here.  They just can't be bothered to do it well, and they might say that nobody demands it anymore.

The 480i component video I now get is not nearly as sharp as the 480i S-Video I had been getting from my Titanium PowerBook, made in 2003, which had an S-Video jack right on the side.  That was also going through the DVDO scaler I am still using now.

So where is the problem now?  I doubt that anything bad is being done by the Belkin adapter, which converts Mini DisplayPort to HDMI.  It looks to me like this is not really a conversion at all; the Mini DisplayPort is simply carrying essentially the same signals as an HDMI cable.

It could be that there is a bandwidth limitation in the Monoprice adapter, which converts HDMI to component video.  But since this adapter is capable of 1080p, you would think it would have no problem with 480i.  And the box says it has 10-bit converters, which should be good too.  Many early DVD players, even the most excellent ones (like my $1500 Sony DVD-7000), had only 8-bit converters.
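
Some back-of-the-envelope arithmetic makes the point.  This little Python calculation (active picture only, blanking ignored, 59.94 Hz rounded to 60) shows that if the adapter's converters can keep up with 1080p, then 480i is less than a tenth of the load:

    rate_480i  = 720 * 480 * 30    # 480i: 60 fields/s, i.e. 30 full frames/s
    rate_1080p = 1920 * 1080 * 60  # 1080p at 60 frames/s

    print(f"480i  ~ {rate_480i / 1e6:.1f} Mpixels/s")    # about 10.4
    print(f"1080p ~ {rate_1080p / 1e6:.1f} Mpixels/s")   # about 124.4
    print(f"ratio ~ {rate_1080p / rate_480i:.0f}x")      # about 12x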

I have a suspicion a lot of the loss is occurring in the Mac itself, via a very lossy 480i codec.  Actually, I just read this website explaining video codecs, and 480i is not a codec.  But 480x720i is a codec, and that is the one being used.  Video codecs are usually at least a little lossy.  I suspect one or more of the following:

1) Apple didn't much bother to make a best-possible 480x720i codec.  After all, who uses that anymore?  And whoever does can't be that critical if they are still using such an old (the original) format.

2) Apple didn't want to make a good 480x720i codec.  Standard Definition video is the one video source that can be copied by all VCRs and most other video copying devices.  My Sony RDR-HX900, one of the last consumer video recorders Sony ever made, only accepts interlaced inputs in Standard Definition.  While not all electronics manufacturers have interests aligned with the movie industry, none of them want to offend the movie industry either.  And one way they avoid offending it is by not making it easy (in fact, making it increasingly hard over time) to copy video of any kind, even when there is no active copy restriction system like HDCP in effect.  Manufacturers remember how Sony was sued by the movie industry (and Sony now *is* the movie industry, or at least one of its biggest players) for making the Betamax, which could be used to record broadcast TV and movies.

3) Apple softened the 480x720i codec specifically so it would not look as bad on standard definition displays that have lousy comb filters.  Actually, only composite video in standard definition even needs a comb filter; neither S-Video nor component video requires one.  But just in case someone might be using composite video (after all, what does anyone do with 480i anymore?), the codec is softened to reduce artifacts when viewed through a composite connection, something like the sketch below.
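
Purely as speculation, here is the kind of softening I mean, sketched in Python.  Luma detail near the 3.58 MHz color subcarrier is exactly what a composite connection turns into dot crawl and cross-color, so a short horizontal low-pass that removes energy around that frequency makes composite look cleaner, while blurring real detail that S-Video and component would have carried just fine:

    import numpy as np

    # At the standard 13.5 MHz luma sample rate, the 3.58 MHz NTSC subcarrier
    # is roughly 3.8 samples per cycle, so even a short moving average lands
    # close to it and suppresses most of that energy.
    def soften_line(luma_line, taps=4):
        kernel = np.ones(taps) / taps              # crude moving-average low-pass
        return np.convolve(luma_line, kernel, mode="same")

    # Example: a sharp one-pixel edge gets smeared across about 4 pixels.
    line = np.zeros(720)
    line[360:] = 100.0
    softened = soften_line(line)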

If indeed the problem is the lossy codec that Apple uses in the Mac, I'm not sure what I can do about that.  Basically nothing, unless I can find a better replacement, which I am afraid is unlikely.  Now in theory I could have the Mac output 1080p, and have some other device downscale to 480i.  But guess what... nobody makes scalers that downscale to 480i!!!  Or at least I have not found one.  It seems like converter makers also follow the written and unwritten rules to make 480i less and less available.  My DVDO will produce 480p (even from a 1080p source) but will not produce 480i.
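
The processing itself is not exotic.  Here is a rough sketch in Python of what a 1080p-to-480i downconversion amounts to (hypothetical, single channel, nearest-neighbor scaling, since as I said I have not found hardware that actually does this):

    import numpy as np

    # Every two progressive 1080p frames become one interlaced 480i frame:
    # scale both to 720x480, then weave the even lines of one with the odd
    # lines of the other so each original frame contributes one field.
    def scale_to_480(frame_1080):
        rows = np.arange(480) * 1080 // 480        # nearest-neighbor row picks
        cols = np.arange(720) * 1920 // 720        # nearest-neighbor column picks
        return frame_1080[np.ix_(rows, cols)]      # 1080x1920 in, 480x720 out

    def weave_480i(frame_a, frame_b):
        a = scale_to_480(frame_a)
        b = scale_to_480(frame_b)
        out = np.empty((480, 720), dtype=a.dtype)
        out[0::2] = a[0::2]                        # top field from the earlier frame
        out[1::2] = b[1::2]                        # bottom field from the later frame
        return out

Real hardware would of course do proper filtered scaling and handle color, but nothing about the job calls for exotic silicon; it just isn't a product anyone bothers to sell.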

But it's also possible the main limitations are in the Monoprice HDMI-to-component adapter.  And in that case, I can simply use another adapter.  I will do that very soon.
