The FM Stereo System That Lost

Remember Betamax, HD DVD, Cinerama? How about the Crosby FM Stereo system?

In the late 1950s, former RCA engineer Murray G. Crosby invented a method of broadcasting FM stereo that worked (and sounded) better in fringe areas than the Zenith/GE system under consideration at the time. Only one problem, though – it was not compatible with the subsidiary communications authorization (SCA) services (think Muzak) that some FM stations were experimenting with to bolster their bottom lines. (Remember that this was before the FCC Report and Order requiring licensees to broadcast separate programming on their AM and FM outlets.)

Finally, on April 19, 1961, after lobbying by the background music industry and the FM stations that held special experimental authorizations for SCA services, the FCC selected the Zenith/GE system as the FM stereo standard. That is why, to this day, your car radio reception degrades to AM-type static as a station goes out of range.

(For details, see Broadcasting Magazine, 10/06/1958, page 64.)
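
For the technically curious, here’s a back-of-the-envelope sketch (in Python, with made-up test tones) of the Zenith/GE multiplex that won. The point to notice: the L–R difference signal rides on a 38 kHz double-sideband suppressed-carrier subcarrier – AM, in other words – so as the signal weakens, the decoded difference channel picks up noise well before the mono L+R sum does. Hence the AM-type static.

```python
import numpy as np

# Rough sketch of the Zenith/GE FM stereo multiplex. The L+R sum is sent at
# baseband, a 19 kHz pilot marks the subcarrier phase, and the L-R difference
# rides on a 38 kHz DSB suppressed-carrier (AM) subcarrier -- which is why the
# stereo difference channel is the first thing to get noisy in fringe areas.

fs = 192_000              # sample rate, comfortably above the ~53 kHz MPX bandwidth
t = np.arange(fs) / fs    # one second of samples

left = np.sin(2 * np.pi * 440 * t)     # toy program audio: 440 Hz in the left channel
right = np.sin(2 * np.pi * 554 * t)    # 554 Hz in the right channel

pilot = 0.1 * np.sin(2 * np.pi * 19_000 * t)   # 19 kHz pilot, ~10% injection
subcarrier = np.sin(2 * np.pi * 38_000 * t)    # 38 kHz, phase-locked to 2x the pilot

# Composite MPX baseband that frequency-modulates the transmitter:
mpx = 0.45 * (left + right) + pilot + 0.45 * (left - right) * subcarrier

# A receiver recovers L-R by synchronous (AM) detection against 2x the pilot,
# then matrixes L = (L+R) + (L-R) and R = (L+R) - (L-R). (Needs lowpass filtering.)
recovered_diff = 2.0 * mpx * subcarrier
```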

We’ve Discovered 10-Bit Video!

“At home there are some 10-bit TVs and displays that can display HDR video.” – Mark Walton in Ars Technica

Well, my 1984 Zenith color TV could display 10-bit video quality. Not when watching an OTA broadcaster like the Almost Broadcasting Company (which used 8-bit D-2 videotape), but most likely when watching the Discovery Channel (they used 10-bit Digital Betacam), and definitely when watching my 12-inch analog videodiscs. (I know because my post facility mastered some of those discs.)

Now I know that my Zenith had maybe 350 lines of resolution (if I was lucky), but it didn’t have just 256 steps (8 bits) of brightness. In fact, because it (and the whole transmission chain at the time) was analog, it had no steps at all. Brightness flowed smoothly from 0.339 volts (black) to exactly one volt (white). None of this stair-step stuff you see at the end of some commercials, where the finely graduated background color looks more like van Gogh’s “Starry Night.” So in this 4K and HDR world we now live in, it’s CRAZY to have newly produced specs (like the NABA-DPP Common Technical Specs) that call for delivery of 8-bit files instead of 10-bit files. What a rip-off for consumers who purchase HDR sets expecting to get better OTA and cable TV pictures.
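
To see what those 256 steps do to a smooth gradient, here’s a quick NumPy sketch (the frame width and the ramp are just illustrative numbers, not any particular standard):

```python
import numpy as np

# Quantize a smooth black-to-white ramp the way an 8-bit vs. a 10-bit pipeline
# would. 8 bits gives 2**8 = 256 levels; 10 bits gives 2**10 = 1024 -- steps
# four times finer across the same range, which is what hides the banding.

width = 1920
ramp = np.linspace(0.0, 1.0, width)        # the ideal, analog-style gradient

ramp_8bit = np.round(ramp * 255) / 255     # 256 discrete brightness steps
ramp_10bit = np.round(ramp * 1023) / 1023  # 1024 discrete brightness steps

print(len(np.unique(ramp_8bit)))   # 256  -> visible stair-step bands on a gradient
print(len(np.unique(ramp_10bit)))  # 1024 -> steps ~4x finer, usually below visibility
```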

UHD: 8 Bits vs 10

[Image: simulation of an 8-bit picture]

Finally realizing that more pixels alone are not enough, the UHD Alliance has come out with specs for what it calls “Ultra HD Premium.” Besides HDR and a wide color gamut, the performance metrics require that the video bit depth be 10 bits. (I assume they are talking about the luminance channel here, as most video is encoded as luminance plus color-difference signals.) However, if you look inside almost any broadcast/cable/satellite transmission facility, you’ll see that most are using 8-bit mezzanine formats such as XDCAM HD 50 for server storage. Why is this so bad? Imagine paying $1K or more for a 4K monitor and seeing background colors like the thumbnail above. (Or at left, depending on the screen size of your device.) And no, the graphic artist did not design it that way.
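
Here’s a minimal sketch of why the 8-bit mezzanine caps the whole chain, assuming a simple truncate-and-pad conversion (real converters may round or dither, but they still can’t invent the missing levels):

```python
import numpy as np

# Once a 10-bit master is truncated to 8 bits for server storage, padding it
# back to 10 bits for an "Ultra HD Premium" display cannot restore the lost
# intermediate code values.

master_10bit = np.arange(1024)        # every legal 10-bit code value

mezzanine_8bit = master_10bit >> 2    # store at 8 bits: the two LSBs are gone
restored_10bit = mezzanine_8bit << 2  # play out as "10-bit" by shifting back up

print(len(np.unique(master_10bit)))    # 1024 distinct levels going in
print(len(np.unique(restored_10bit)))  # 256 distinct levels coming out -- banding
```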

While the upcoming 4K Blu-ray Disc format (which requires a new 4K Blu-ray Disc player) and some streaming services (like Netflix) will be capable of delivering Ultra HD Premium content, don’t expect any from broadcast or cable services any time soon. The future over-the-air ATSC 3.0 standard will support Ultra HD Premium services, but you’ll need some future ‘to be announced’ converter box to make it work with today’s 4K displays. Gee, it’s fun being on the bleeding edge of technology, isn’t it?

Will TR-03 Stay In Sync?

[Image: BNC-to-RJ-45 connectors]

Last week the Video Services Forum released its draft recommendation for “elementary stream” IP media (read that as non-embedded audio). While touted by some as a low-latency, low-payload transport protocol, with video, audio, and ancillary data being transported separately, I think it once again opens the Pandora’s box of “out of sync” audio and video. I know that Thomas Edwards of Fox Networks said: “…when every packet is time stamped accurately, we should have better synchronization between media streams than SDI solutions could provide.” I thought that audio and video packets in MPEG-2 transport streams were time stamped as well, but that hasn’t prevented them from becoming annoyingly out of sync.
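
For what it’s worth, here’s a sketch of the re-alignment a TR-03-style receiver would have to do, assuming the usual 90 kHz video and 48 kHz audio RTP clock rates (the timestamp values below are made up for illustration): each essence stream carries its own RTP timestamps against a shared reference clock, and the receiver has to buffer enough to absorb whatever skew it measures.

```python
# Each separately transported essence stream timestamps its packets against a
# common (PTP-derived) reference clock; dividing by the per-essence RTP clock
# rate puts video and audio back on one timeline at the receiver.

VIDEO_CLOCK_HZ = 90_000   # RTP clock rate commonly used for video
AUDIO_CLOCK_HZ = 48_000   # RTP clock rate for 48 kHz audio

def rtp_to_seconds(rtp_timestamp: int, clock_hz: int) -> float:
    """Map an RTP timestamp to seconds on the shared reference clock."""
    return rtp_timestamp / clock_hz

# Hypothetical timestamps sampled from the two streams at the 'same' instant:
video_ts = 2_703_015      # ticks of the 90 kHz video clock
audio_ts = 1_441_728      # ticks of the 48 kHz audio clock

skew = rtp_to_seconds(video_ts, VIDEO_CLOCK_HZ) - rtp_to_seconds(audio_ts, AUDIO_CLOCK_HZ)
print(f"audio/video skew: {skew * 1000:.1f} ms")  # the receiver must buffer this out
```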

I say if you have to implement IP transport now, plan to use SMPTE ST 2022-6. (’Cause it’s nice when audio and video arrive together.) And let’s see what happens when someone else actually implements TR-03 in a control room near you.