Legacy Connections, Power Strips, Bit Rate
Why is it that most A/V receivers still have 2-channel RCA inputs? Why doesn't a company make a receiver that has only HDMI, maybe with a few component-video and digital-audio ins and outs? Why the need to include legacy stuff?
Because lots of people still have legacy products, and an AVR is intended to be the central switching station in any home-entertainment system. It's the same for HDTVs, which still have composite and S-video inputs on them.
When you plug components into a surge protector (or any power strip, for that matter), do the peripherals continue to draw vampire (standby) power after the switch on the power strip has been turned off, or does the power stop completely? I have read both accounts. Plugging and unplugging from the wall can take a toll on the outlet, so I would prefer to just flip the switch on the strip if it powers them all down, but I want to be sure that no power is going into the components.
As far as I know, turning off a power strip stops all power from flowing to any of the connected devices. However, this might not be such a good idea, especially if the devices have something in memory such as the date and time or other settings that rely on a trickle of power when they're off. Also, if you turn the components on and off with the power strip's switch, they might receive a momentary jolt of power that could shorten the life of their delicate electronics. Finally, if you suddenly stop power from flowing to a device that's on, it may think there's been a power outage and reset itself, requiring you to turn it on separately anyway.
In my view, the trickle of power is negligible and well worth paying a few pennies for in order to keep any memory powered and prevent the possibility of power-on surges. It would be much better to get a good universal remote and power the devices on and off that way.
A Bit of Confusion
I really appreciated your discussion with Leo Laporte a while back in which you were explaining how analog audio signals are sampled to make a digital file. Let me see if I got what you were saying before I get to my question. You start with an analog signal, and you sample it 44,100 times per second for CD, measuring the instantaneous level at each sample point.
Where I get lost is when we throw in the bit rate. I create my podcast recordings at 64Kbps, but I don't know what that actually means. We've measured the amplitude 44,100 times per second, so what's left to do at 64Kbps?
You got it right that to convert an analog audio signal to digital, the signal is instantaneously sampled 44,100 times per second (the sampling rate, often written 44.1kHz). Also, the instantaneous level of each sample point is represented by a 16-bit binary number (the bit depth). BTW, these are the specs for CDs; Blu-rays often use different sampling rates and bit depths.
Bit rate is a measure of how much data must be able to flow through the system to play a digital-audio file, so it is relevant during playback, not sampling. The bit rate required to play a 2-channel signal that was sampled at 44.1kHz/16 bits is 1.4 megabits per second (1.4Mbps). This is uncompressed digital audio, which is technically called PCM (pulse code modulation).
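The 1.4Mbps figure is just arithmetic: sampling rate times bit depth times the number of channels. A quick back-of-the-envelope sketch:

```python
# Uncompressed (PCM) bit rate for CD audio:
# sampling rate x bit depth x number of channels.
sampling_rate = 44_100   # samples per second (44.1kHz)
bit_depth = 16           # bits per sample
channels = 2             # stereo

bit_rate = sampling_rate * bit_depth * channels
print(bit_rate)                        # 1411200 bits per second
print(round(bit_rate / 1_000_000, 1))  # 1.4 (Mbps)
```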
However, podcasts must use a much lower bit rate than this so they can be quickly downloaded or streamed from the Internet, so the PCM data is compressed by discarding 90 percent or more of the original sampled data. If this is done properly, you still perceive most of the intended audio, but the data rate required to play it can be reduced to only 64 kilobits per second (Kbps) or even less.
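You can see just how aggressive that compression is by comparing the two bit rates mentioned above. A rough sketch:

```python
# How much of the CD-quality PCM data a 64Kbps podcast actually keeps.
pcm_bit_rate = 1_411_200   # uncompressed stereo PCM, bits per second
podcast_bit_rate = 64_000  # 64Kbps podcast encoding

kept = podcast_bit_rate / pcm_bit_rate
print(f"{kept:.1%} of the data kept, {1 - kept:.1%} discarded")
# -> 4.5% of the data kept, 95.5% discarded
```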
There are many different types of digital-audio compression; MP3 is the most common. So if you see a digital audio file that is identified as PCM, you know it's uncompressed, whereas an MP3 file is always compressed so it requires a lower bit rate to play back.
When you record audio for your podcast, the computer is compressing the digital audio as it goes, and you can typically specify the bit rate you want to use when the file is played back. In general, the lower the bit rate after compression, the lower the audio quality.
You might want to try a little experiment: record something at the lowest possible bit rate and then record it again at a higher bit rate, then listen to both recordings and see if you can hear the difference in sound quality.
If you have a home-theater question, please send it to email@example.com.