Someone on the Free Republic forum mentioned to me that the "I Love Lucy" Show might have been seen by aliens around some far star by now.
|No, they can't watch "Lucy" (from "Galaxy Quest")|
Uh, no. Don't believe everything you see in the movies. Take it from a radio engineer: they can't.
Not every signal is designed for detection in a weak-signal environment, and the black-and-white NTSC television signal during "Lucy's" 1951-1957 run definitely was not.
The transmitted signals of the original "I Love Lucy" episodes are now 63.57 light years (LY) from Earth. The signal-to-noise ratio of these signals is so low -- and has been for a long time -- that no physically realizable receiver could be built by our putative alien friends out there to receive them. (With the physics and engineering we know, granted.)
Suppose "I Love Lucy" was transmitted by a station running 10 kilowatts (EIRP), pointed toward our aliens' system on October 15, 1951. That's not much of a stretch; there were many VHF TV stations of that power class. Let's suppose it was on VHF TV Channel 6 (no longer used), which occupied 82-88 MHz; we'll use 87.8 MHz, the top of the channel, for the arithmetic.
Let's further suppose our aliens, being far in advance of humanity, built an antenna dish in orbit 100 km (62 mi) in diameter, designed to point itself straight at us. A dish of that size -- which no human effort could construct today -- has a calculated gain of 97.3 dB at 87.8 MHz. Compare your lowly log-periodic TV antenna at 11 dB of gain; our aliens' dish is 10^((97.3-11)/10) = 426.6 MILLION times better at "pulling in" signals than your average TV antenna.
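If you want to check the dish-gain number yourself, it falls out of the standard parabolic-aperture formula G = eta * (pi * D / lambda)^2. The 0.63 aperture efficiency below is my assumption, not something stated above; it happens to reproduce the 97.3 dB figure:

```python
import math

# Parabolic-dish gain sketch: G = eta * (pi * D / lam)^2
c = 299_792_458.0      # speed of light, m/s
f = 87.8e6             # top of VHF Channel 6, Hz
lam = c / f            # wavelength, ~3.41 m

D = 100_000.0          # alien dish diameter, m (100 km)
eta = 0.63             # ASSUMED aperture efficiency (my guess)

gain_db = 10 * math.log10(eta * (math.pi * D / lam) ** 2)
print(f"{gain_db:.1f} dB")   # -> 97.3 dB
```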
At a distance of 63 LY, the free-space path loss is incredibly high: 366.9 dB at 87.8 MHz. Even with their monster antenna, the signal the aliens receive would be vanishingly weak: -199.6 dBm. With a signal this weak, there would be no possible way for the receiver to detect the amplitude-modulated video signal in the 6 MHz-wide TV channel, even if you immersed the receiver's first amplifier stage in liquid helium, a few degrees above absolute zero. The thermal noise floor at -270°C (about 3 K), in that 6 MHz bandwidth, is -125.8 dBm. An NTSC video signal requires at least 50 dB signal-to-noise ratio (SNR) for studio-quality video, and roughly 20 dB for a snowy, ghost-filled picture. Our aliens get only -199.6 - (-125.8) = -73.8 dB SNR.
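Here's the whole link budget in a few lines of Python, for anyone who wants to audit it. The 10 kW (70 dBm) EIRP is what the -199.6 dBm figure works out to; the rest is standard free-space path loss plus kTB thermal noise:

```python
import math

c = 299_792_458.0            # speed of light, m/s
k_B = 1.380649e-23           # Boltzmann constant, J/K
LY = 9.4607e15               # one light year, m

f = 87.8e6                   # carrier frequency, Hz
lam = c / f                  # wavelength, m
d = 63.57 * LY               # distance to the aliens, m

fspl_db = 20 * math.log10(4 * math.pi * d / lam)   # free-space path loss
eirp_dbm = 10 * math.log10(10_000 / 1e-3)          # 10 kW -> 70 dBm
gain_db = 97.3                                     # the 100 km dish

prx_dbm = eirp_dbm - fspl_db + gain_db             # received power

T = 3.15                     # first amp at -270 C, in kelvin
B = 6e6                      # NTSC channel bandwidth, Hz
noise_dbm = 10 * math.log10(k_B * T * B / 1e-3)    # thermal noise floor

snr_db = prx_dbm - noise_dbm
print(f"FSPL {fspl_db:.1f} dB, Prx {prx_dbm:.1f} dBm, SNR {snr_db:.1f} dB")
# -> FSPL 366.9 dB, Prx -199.6 dBm, SNR -73.8 dB
```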
No "Lucy" for them!
Even fictional Alpha Centaurians (4.3 LY away) couldn't have seen the first "Lucy" episode when it swept past them in early 1956; they'd receive it on their 62-mile-wide orbital dish at about -176.2 dBm, which gives them an unusable -50.4 dB SNR. Remember, they need +20 dB SNR for even snow-filled reception.
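The Alpha Centauri case is the same arithmetic with a shorter distance; a quick sketch (again assuming the 10 kW EIRP and 97.3 dB dish from before):

```python
import math

c = 299_792_458.0
k_B = 1.380649e-23
LY = 9.4607e15

lam = c / 87.8e6                                     # wavelength, m
d = 4.3 * LY                                         # Alpha Centauri distance

fspl_db = 20 * math.log10(4 * math.pi * d / lam)     # ~343.5 dB
prx_dbm = 70.0 - fspl_db + 97.3                      # EIRP - loss + dish gain
noise_dbm = 10 * math.log10(k_B * 3.15 * 6e6 / 1e-3) # kTB floor, ~-125.8 dBm
snr_db = prx_dbm - noise_dbm
print(f"Prx {prx_dbm:.1f} dBm, SNR {snr_db:.1f} dB")
# -> Prx -176.2 dBm, SNR -50.4 dB
```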
Suppose they just wanted to hear Lucy's cackle, instead of seeing her smiling face. Well, NTSC TV sound is modulated using FM, occupying roughly 100 kHz of bandwidth with ±25 kHz deviation. Even if the aliens knew this ahead of time, and used frequency-compressive feedback in their receiver to narrow the detection bandwidth to 60 kHz, that buys a noise-floor (and thus SNR) improvement of only 10 × log10(6 MHz / 0.06 MHz) = 20 dB. Even the Alpha Centaurians in 1956 would have had a -30.4 dB SNR on Lucy's voice. And FM needs about a 20 dB SNR just to beat scratchy old AM voice quality.
If the aliens aren't in our Solar System, or very nearby, they aren't going to see our TV transmissions, historic or modern. And them's the facts.