Wednesday, June 27, 2012

Ground loop hum from HDMI even over CAT6

One reason I had been using a fiber optic connection between kitchen and living room was electrical isolation.  The fiber optic connection could not in any way create a ground loop.   I decided not to use another fiber optic system because it would have cost a minimum of $300 more than the CAT6 alternative, and possibly much more, and because the CAT6 connections are cheap enough I could extend them to even more rooms.

I was hoping that the CAT6 digital video balun, which transforms the HDMI transmission into one that goes over a pair of CAT-6 wires, and then back to HDMI, would not induce the ground loop, given that the two baluns would isolate the ground.

Unfortunately, it did not work that way.  Once I hooked up the new video connection system using CAT6 and baluns, I did get a slight amount of ground loop hum in the living room system, regardless of which analog input was chosen.  When I disconnected the RCAs from the TV, or disconnected the HDMI cable from the TV, the hum went away.

So I went through my box of cables and could not find any more Jensen Isomax CI2RR audio isolation transformers with RCA jacks.  I did have one spare unit last year, but I took it for the computer I use at work, which creates horrible hum because the computer is plugged into an online UPS but the amplifier isn't.  I had been using a cheap Radio Shack isolation transformer there, but apparently the current running through this ground circuit while the UPS is powering the computer was great enough to burn out the Radio Shack unit.  So far the Jensen has not suffered the same fate.

So for the last week I had been using a spare Radio Shack isolation transformer.  It did the isolation job fine, though I wondered if it had the best audio fidelity.  (From measurements, I know it gets rather high bass distortion if levels are high enough.  The Jensen is almost perfect, as it should be for about 10x the price of the Radio Shack unit.)

So today I have ordered another CI2RR isolation transformer, along with an 18 inch DVI cable (for the HDMI audio inserter I use on my hard drive recorder, which has no HDMI outputs), and a VS-1SS S-video isolation transformer I can use with the Denon 5900 in the living room, which I use for audio discs.  The living room TV has only one HDMI input, so normally it will be connected to my kitchen video hub.  On special occasions, I could hook the Denon to the TV using a DVI/HDMI connection, but it's not necessary normally because I can use the Oppo BDP-95 in the kitchen to watch video discs instead, and that will always run through the new CAT6 HDMI connection.  Full remote control of the Oppo works great too, through the Radio Shack Remote Wireless Extender transmitter I have in the living room.

Tuesday, June 12, 2012

Fun with FM and TV

I have been finding that my Kitchen system has more clarity on FM when playing in the "Direct Stereo" mode of my Yamaha receiver.  That mode bypasses analog-to-digital conversion, DSP processing for EQ, crossover, and surround, and digital-to-analog conversion.  I know that the Direct Stereo mode bypasses all this because I tested it many years ago.  It's quite clear when digital processing is being done, because square waves coming through the unit have that distinctive pre and post ringing at ultrasonic frequencies.
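That pre and post ringing can be illustrated with a generic linear-phase (windowed-sinc) lowpass, which is only a sketch of the idea and not the receiver's actual filter: the step response ripples both before and after the delayed edge, and for a symmetric FIR the pre-ringing exactly mirrors the post-ringing.

```python
import math

def windowed_sinc_lowpass(cutoff, num_taps):
    """Linear-phase FIR lowpass: a sinc truncated by a Hamming window.
    'cutoff' is a fraction of the sampling rate (0 < cutoff < 0.5)."""
    m = num_taps - 1
    taps = []
    for i in range(num_taps):
        x = i - m / 2
        sinc = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * i / m)
        taps.append(sinc * window)
    total = sum(taps)
    return [t / total for t in taps]  # normalize to unity DC gain

def filter_signal(signal, taps):
    """Direct-form FIR convolution (signal zero-padded at the start)."""
    return [sum(taps[k] * signal[n - k]
                for k in range(len(taps)) if n - k >= 0)
            for n in range(len(signal))]

taps = windowed_sinc_lowpass(0.2, 101)
step = [0.0] * 200 + [1.0] * 200      # rising edge at sample 200
out = filter_signal(step, taps)
# The filter delays the edge by (101 - 1) / 2 = 50 samples, so the edge
# lands at sample 250 -- and the ripples before it mirror the ripples
# after it: out[n] + out[499 - n] == 1 throughout the transition region.
```

The mirror symmetry is exactly the "pre-echo" signature: a purely analog path has no mechanism to ring before the edge arrives.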

Usually Direct Stereo sounds better, and I think it's mainly because of the crude crossover the DSP modes use, which crosses over to the subwoofer at 100 Hz or 80 Hz, most likely at 12 dB/octave.  In the bedroom, a larger room, I cross over to the subwoofer at 60 Hz with a 48 dB/octave slope.
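For a rough feel for what those slopes mean, here is the textbook slope-times-octaves approximation (a stopband estimate only, not a measurement of my actual crossovers):

```python
import math

def rolloff_db(freq_hz, crossover_hz, slope_db_per_octave):
    """Approximate stopband attenuation of a crossover filter:
    slope (dB/octave) times the number of octaves past the corner."""
    octaves = math.log2(freq_hz / crossover_hz)
    return slope_db_per_octave * octaves

# A 12 dB/octave low-pass at 100 Hz only attenuates 200 Hz content
# by about 12 dB, while a 48 dB/octave filter at 60 Hz knocks
# 120 Hz content down by about 48 dB.
print(rolloff_db(200, 100, 12))   # 12.0
print(rolloff_db(120, 60, 48))    # 48.0
```

So the shallow 12 dB/octave slope lets quite a lot of midbass leak into the subwoofer, which is one plausible reason the crossover sounds crude.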

I had usually kept the receiver on "2 channel stereo" because, despite the name, that is the mode that uses DSP to implement the crossover for the subwoofer.  That still works best if the deep bass is the most important part of the sound.


I've also tried some surround options.  Concert Hall uses DSP processing to get a concert hall effect out of 5 (or 7) speakers plus subwoofer.  It makes normal stereo seem a bit wider.  It has the most impressive effect on a mono announcer: he sounds like he is speaking from the stage in a big concert hall (duh).


Over the second weekend in May I finally got my living room TV hooked up again to my central video system (in the Kitchen).  That means I can watch satellite TV and hard drive recordings and computer and DVDs and even Blu-ray discs from the living room TV.  That was what I had set up in mid 2009, but late in 2011 the OWLink optical transmitter for HDMI to the bedroom failed.  I replaced that transmitter with the one that had previously been used for the living room, and ever since then, there has been no video system connectivity to the living room.  (The bedroom is the most important video link; I use that connection daily.)  For showing my monthly movies in the living room, I've typically used the Denon 5900 player there, and fortunately all the movies I've shown since November have been brand new (no concern about dirt from rental movies).

As discussed in this blog, I investigated several options.  I could buy a new OWLink kit for about $450.  They are very hard to find (out of production for several years), but they can be found.  There were other optical options as low as $299.

I decided to try a CAT6 balun instead.  They are much more widely available and more reasonably priced, from many brands, and the connecting CAT6 wire is a commodity product which can be readily replaced or extended.  I got the basic kind that uses two CAT6 wires.  This has essentially the same number of wires as the HDMI cable itself, so the system is relatively low cost.  Much more expensive systems manage to cram all the information into a single CAT6.

Setup was actually quite easy.  The hardest part was running the two CAT6 wires, but even that was fairly quick.  I had bought the 50' CAT6 wires from two different Best Buy stores, but they worked fine together.  I also hooked everything up to make sure it worked before running the CAT6 cables behind the couch and under the kitchen counters.  That pre-testing made running the cables harder because the cable became a huge snarl rather than a small roll that could be simply unrolled.

Another trick was getting rid of the audio hum.  After the HDMI from the kitchen, through the balun, was connected to the living room TV, the TV's audio output went into Analog 1 of the TACT preamp.  There was a bad hum after hooking up the HDMI because of the ground difference between the two powerline circuits.  So I used a Radio Shack isolation transformer on the audio line, and it fixed the problem.  (Now if I could only get KPAC to fix their hum problem.)  I plan to replace the Radio Shack isolator with one of the Jensen transformers I use in other locations.  The Jensen transformers are very good, but then this is only TV audio.  If this were a high resolution audio line, I'd figure out some other solution.  The main other solution would be going back to fiber optic HDMI.  I was hoping I wouldn't have to do that, and it has turned out that I don't need to, but I did need to use an isolation transformer.

I also took the opportunity to add the DVI-plus-audio-to-HDMI adapter to the DVI line from the Anchor Bay DVDO.  That means the audio signal is sent down the HDMI connections to the living room and bedroom from the hard drive recorder.  I had been using that for several months, but it got taken off during the massive rewiring on the previous weekend that moved the DVDO and HDMI switch into the main rack for reliable remote control from all rooms.

I also replaced the 15' HDMI cable (borrowed from the Mac-to-Bedroom-TV connection) with a new 12' HDMI from Blue Jeans cable.  That means I got the Mac-to-Bedroom-TV cable back, for use in the bedroom as intended.

The system currently lacks a "rental" DVD player because that unit, a Denon 2910, is still under the table, where there is no remote control (I've given up trying to get remote signals down there through wireless transmission; that was the whole point of the rearrangement) and no HDMI connection to the main switch.  Probably that player should be moved to the main rack also, so it can be controlled by remote.

Friday, June 8, 2012

Resolution needs more bits

Resolution is not "signal to noise ratio (SNR)."  Resolution refers to something we can't easily measure directly, but infer, in analog systems.  Analog amplifiers have the potential ability to reproduce every voltage level from zero to maximum, subject to limitations (noise, distortion, etc.) which are additive in nature.  If the world were truly continuous, this would mean they can reproduce an infinite number of potential voltage levels, for between any two levels we could specify, there would be another.

It is already known that we can hear a signal even when it is embedded in noise that is equally loud, so long as there is some other characteristic we can use to distinguish the signal from the noise, such as its frequency content.  If the noise were pure Gaussian, and the signal a pure tone, it is easy to see how this could be done electronically as well as by ear.  I can't think of a general rule which would describe the limits of our ability to do this.  For quite some time, it has been claimed that we can still hear the signal even if the noise is 15 dB louder, but if our hearing were as good as possible for any acoustic sensor, and we had some pre-knowledge of either the signal or the noise, the noise could be much higher still relative to the signal.
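A sketch of the electronic version: with pre-knowledge of the signal (here, its exact frequency), a simple correlator pulls a tone cleanly out of Gaussian noise of equal RMS level (0 dB SNR).  The one-second length and the frequencies below are illustrative choices, not anything from a real system.

```python
import math, random

def tone_level(samples, freq_hz, rate_hz):
    """Correlate against sine and cosine at a known frequency (a crude
    matched filter); returns the detected tone amplitude."""
    n = len(samples)
    s = sum(x * math.sin(2 * math.pi * freq_hz * i / rate_hz)
            for i, x in enumerate(samples))
    c = sum(x * math.cos(2 * math.pi * freq_hz * i / rate_hz)
            for i, x in enumerate(samples))
    return 2 * math.hypot(s, c) / n

random.seed(1)
rate, n, tone = 48000, 48000, 1000
noise_rms = 1 / math.sqrt(2)   # equal to the RMS of a unit sine: 0 dB SNR
mix = [math.sin(2 * math.pi * tone * i / rate) + random.gauss(0, noise_rms)
       for i in range(n)]
# The correlator recovers the tone's amplitude (close to 1.0) at 1 kHz,
# while an off-frequency probe at 1.5 kHz sees almost nothing.
```

The longer the correlation runs, the deeper below the noise the tone could sit and still be detected, which is why "pre-knowledge" matters so much in these claims.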

Digital audio systems have been designed to have SNR nearly as good as the best possible analog equipment, and far better than most, with a potential 96 dB SNR at 16 bits.  Perfectionists like me have sought to use 24 bit digital audio systems that have a potential 144 dB SNR, which is better than the best available analog amplifiers.  That has to be good enough, right?
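Those 96 dB and 144 dB figures are just the standard roughly-6-dB-per-bit rule:

```python
import math

def quantization_snr_db(bits):
    """Dynamic range of an ideal N-bit quantizer: 20*log10(2^N),
    about 6.02 dB per bit.  (The common extra +1.76 dB sine-wave term
    is omitted here, matching the round 96/144 dB figures quoted.)"""
    return 20 * math.log10(2 ** bits)

print(round(quantization_snr_db(16)))  # 96
print(round(quantization_snr_db(24)))  # 144
```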

Well, no, not if the goal is to have the same resolution as analog systems.  I can't prove that we need this extra resolution, but I have some suspicion that we do, and I think it's interesting to think about what that kind of performance would require.

The answer seems to be that we need enough bits to encode resolution down to some appropriate quantum level.  Assuming the world is like that described by quantum mechanics, and guessing an appropriate quantum level to be about 10^-33 volt, the number of bits required for 0-1 V would be about 110 (log2(10), or about 3.32 bits per decimal digit, times 33 decimal digits).
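A quick sketch of the arithmetic (note that at log2(10) ≈ 3.32 bits per decimal digit, 33 decimal digits comes to about 110 bits; the 10^-33 volt step is of course just a guess):

```python
import math

def bits_for_step(full_scale_volts, step_volts):
    """Bits needed so the LSB is no larger than the given step size."""
    return math.ceil(math.log2(full_scale_volts / step_volts))

# Assuming a smallest meaningful step of ~1e-33 V over a 0-1 V range:
print(bits_for_step(1.0, 1e-33))   # 110
```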

I can imagine a 24 bit encoder expanded to full 110 bit capacity.  Alternatively, even with off-the-shelf parts, one could cascade multiple 24 bit encoders.  Suppose a successive approximation method is used; then we could simply make sure that 110 iterations of successive approximation are done.  Now we can't prove we have 110 bit accuracy, but I wasn't so much worried about accuracy as resolution.
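The successive approximation idea in code form, as an idealized sketch (real comparators, and even Python floats, run out of usable resolution long before quantum-level steps, so the demo uses a small bit count):

```python
def sar_encode(voltage, full_scale=1.0, bits=24):
    """Idealized successive-approximation conversion: one comparator
    decision per bit, halving the search window each iteration.
    'Just run more iterations' is the resolution-extension idea above."""
    code, estimate = 0, 0.0
    step = full_scale / 2
    for _ in range(bits):
        code <<= 1
        if voltage >= estimate + step:   # the comparator decision
            estimate += step
            code |= 1
        step /= 2
    return code

# 0.5 V over a 0-1 V range at 4 bits lands exactly on the midpoint code:
print(bin(sar_encode(0.5, 1.0, 4)))   # 0b1000
```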

Now such a digital encoder would probably not perform as well as an ordinary 24 bit one with regard to signal to noise ratio.  But often analog systems don't do that well either.

I say not to worry that SNR is not improving, or is even getting slightly worse.

Now I've only been considering the quantization accuracy here.  What if we consider time, making the sampling interval as small as possible?  Well, that gets us into trouble very quickly.  It is clear we have no hope of digitally encoding at something like a quantum rate (10^-30 sec), and we would have no hope of storing so much data either.

My guess is that, for us, the amplitude accuracy is more important than the timing.  I've generally felt that a 96 kHz sampling rate is sufficient.  For what it's worth, audio research published in JAES has claimed that jitter, for example, isn't audible until we get to the hundreds of nanoseconds, about 1000 times worse than the level modern digital equipment performs at.
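That factor of 1000 is easy to sanity-check with the usual jitter-limited-SNR rule of thumb for a full-scale sine; the frequencies and jitter values below are illustrative, not from any particular measurement:

```python
import math

def jitter_limited_snr_db(signal_freq_hz, jitter_rms_s):
    """Best-case SNR of a sampler whose only error is clock jitter:
    SNR = -20*log10(2*pi*f*tj), the standard full-scale-sine rule."""
    return -20 * math.log10(2 * math.pi * signal_freq_hz * jitter_rms_s)

# ~100 ns of jitter on a 20 kHz tone leaves only ~38 dB of SNR, while
# the ~100 ps typical of modern gear allows ~98 dB.
print(round(jitter_limited_snr_db(20_000, 100e-9)))    # 38
print(round(jitter_limited_snr_db(20_000, 100e-12)))   # 98
```

Each factor-of-10 reduction in jitter buys 20 dB, which is why the audibility threshold and modern equipment end up three orders of magnitude apart.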

A 110 bit/96 kHz stereo datastream would require only a bit more than twice the data rate of a currently used 24 bit/192 kHz datastream.  It would be quite feasible to implement, if you were making digital converters anyway.  A major problem would be fighting off the nags who say it isn't necessary and has no provable benefit.
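The data-rate comparison is simple arithmetic (using the roughly 110 bit figure that 33 decimal digits at log2(10) ≈ 3.32 bits per digit works out to):

```python
def stereo_rate_mbps(bits, sample_rate_hz, channels=2):
    """Raw PCM datastream rate in megabits per second."""
    return bits * sample_rate_hz * channels / 1e6

proposed = stereo_rate_mbps(110, 96_000)   # ~21.1 Mbit/s
current = stereo_rate_mbps(24, 192_000)    # ~9.2 Mbit/s
print(proposed / current)                  # ~2.29
```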