
My Career in the BBC, Part 2
by Richard Russell

In Part 1 I described my career in Designs Department, at Western House, until its merger with Equipment Department and the move to Avenue House, Chiswick, in 1988. The change in location didn't suit me very well, because my journey time from home increased significantly and involved a change of train at Waterloo, but I soon got used to it.

Studio lighting
Leopard Control Panel
At around the time of transition from Designs Department to D&ED a major project was under way to develop a replacement for the old Thorn Q-File lighting control system and console, which at one time had been installed in most of the BBC's studios. The replacement was to be called, perhaps with some lack of originality, Q 2. Although I was not directly involved with the design of the console itself – that was done by a team led by David King – I was asked to develop an ancillary piece of equipment that would be used when dynamic (e.g. flashing) light effects were required, for example in shows like Top of the Pops.

In such programmes commercial lighting effects generators would be used, but in order to connect them into the studio lights a patching system was required. This was effectively a router, which would take in the relatively few analogue channels from the effects devices (up to 240) and feed them selectively to the studio dimmers – up to 1536 of them. Each connection could be set to a different 'gain' and the equipment could store up to 14 different 'patches' (combinations of routes and gains). The outputs used the standard DMX-512 digital interface, although they were de-rated to a maximum of 192 channels each to give improved dynamic performance.
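Purely by way of illustration - this is a present-day sketch written for this page, not the Leopard firmware - the fragment below models the idea of a stored 'patch': a set of input-to-output routes, each with its own gain, using the channel counts quoted above. The names and numbers in the example calls at the end are invented.

NUM_INPUTS = 240      # analogue channels from the effects devices
NUM_OUTPUTS = 1536    # studio dimmer channels
NUM_PATCHES = 14      # stored combinations of routes and gains

class Patch:
    def __init__(self):
        # routes[output_channel] = (input_channel, gain), with gain in 0.0..1.0
        self.routes = {}

    def connect(self, input_ch, output_ch, gain=1.0):
        if not (0 <= input_ch < NUM_INPUTS and 0 <= output_ch < NUM_OUTPUTS):
            raise ValueError("channel out of range")
        self.routes[output_ch] = (input_ch, gain)

    def apply(self, input_levels):
        # Compute dimmer levels from the current effects-device levels
        output = [0.0] * NUM_OUTPUTS
        for out_ch, (in_ch, gain) in self.routes.items():
            output[out_ch] = input_levels[in_ch] * gain
        return output

patches = [Patch() for _ in range(NUM_PATCHES)]          # the 14 stored patches
patches[0].connect(input_ch=3, output_ch=100, gain=0.75)
dimmer_levels = patches[0].apply([0.5] * NUM_INPUTS)     # dimmer 100 is now at 0.375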

My patching system, the EP5S/31, was given the name LEOPARD, notionally standing for Lighting Equipment Online Patching and Routing Device but also easy to remember because this leopard really did change its 'spots'! An account of the Q 2 and Leopard systems, from the viewpoint of a user, can be read on the TV Studio History web site under the heading The curious saga of the QII.

Clocks revisited
Having been promoted to Senior Design Engineer a couple of years before, my role now shifted to managing large projects with other engineers working for me. However, I made sure I would still be involved with 'real' hardware and software design! Thus it was that in around 1989 I had a rare opportunity: to redesign, using new technology, something that I had first developed several years earlier.

I previously described how I produced the first ever fully electronic on-screen clock, which by now had been used on both BBC1 and BBC2 for the best part of ten years. As part of a revamp of the BBC's 'network identities' something better was needed, which didn't have the limitations of that earlier design (in particular hands restricted to simple white rectangles and quite poor anti-aliasing). Whereas the original design had used some analogue techniques, the new one would generate the image entirely digitally.

Since it had not yet been decided what clock-face design would be required, I was determined that my new generator should be as flexible as possible. The static background would be a full-frame Rec.601 image, the hands could have any shape or colour, and the 'movement profile' of the second hand would be programmable. Additionally, the hands could cast shadows on the background. This flexibility necessitated quite a complex piece of kit, and I was assisted in its development by Lindsay Cornell (drop shadow effects), Gonzalo Gili (clock pulse generator), Andy Johnson (background store) and Richard Stevens (digital to analogue converter). I did the most interesting bits myself: the hand generators and the control software.
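As a small aside on what a programmable 'movement profile' means in practice, here is a sketch (written for this page; the GNAT itself was hardware plus assembly language) that computes the three hand angles from the time of day, with the second hand's motion within each second supplied as a pluggable function. The 'recoil' profile is simply an invented example.

import math

def continuous_sweep(frac):
    # The hand moves uniformly through the second
    return frac

def stepping(frac):
    # Conventional movement: the hand jumps at the start of each second
    return 0.0

def recoil(frac):
    # Invented example of a 'recoil' movement: the hand jumps past the next
    # mark, swings back and settles on it (crudely imitating a long-case clock)
    if frac == 0.0:
        return 0.0
    return 1.0 + 0.15 * math.exp(-8.0 * frac) * math.cos(12.0 * frac)

def hand_angles(hours, minutes, seconds, frac, profile=continuous_sweep):
    # Angles in degrees, clockwise from the 12 o'clock position
    s = seconds + profile(frac)                 # effective seconds within the minute
    second_angle = s * 6.0                      # 360 degrees / 60 seconds
    minute_angle = (minutes + s / 60.0) * 6.0
    hour_angle = ((hours % 12) + minutes / 60.0 + s / 3600.0) * 30.0
    return hour_angle, minute_angle, second_angle

print(hand_angles(10, 9, 30, 0.5, profile=stepping))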

The clock face demonstrated at IBC 1990
The new clock was christened GNAT (Generator, Network Analogue Time; more formally the GE6SM/574) and it was decided to demonstrate it at the 1990 International Broadcasting Convention in Brighton. To show off its capabilities to the full we created a clock face (with the help of colleagues at TV Centre) that would look exactly like an old long-case clock. Although not shown in the adjacent picture, the real thing had a second hand with a 'recoil' motion to add to the realism.

Predictably, perhaps, when it came to the television service deciding what clock face they wanted, none of the fancy capabilities I had built into the generator were used! They wanted simple rectangular hands again, with no shadows, and we couldn't even persuade them to use a 'continuous sweep' second hand. However the image quality was vastly improved over the old design, and a later BBC2 version did at least have the three hands in different colours!

The GNAT clocks remained in service for another ten years or so, until changes in presentation style, and the advent of Digital Satellite and Digital Terrestrial broadcasting with their attendant delays, resulted in on-screen clocks going out of fashion. However you can have your own personal GNAT clock, implemented as a Windows screensaver, by downloading it from here.

Secret projects
My next two projects, in their own different ways, both had an element of secrecy about them. The first is still considered, to this day, to be so sensitive that I cannot even hint at what it was, but I can talk about the second. At this time, in about 1991, conditional access TV services on satellite had started up. These could be decoded only with a smart card supplied by the broadcaster, and only on receipt of the appropriate payment for the programme or channel. The BBC decided that this was something it wanted to do too, so a service called BBC Select was proposed.

This was well before the advent of direct digital broadcasting to the home, so the programmes had to be 'scrambled' (so as not to be watchable without the appropriate authorisation) using a technique compatible with standard analogue (PAL) transmissions. The method favoured by the satellite broadcasters was Line Cut and Rotate (LCR); this involved 'cutting' each TV line at some point and 'swapping' the sections before and after the cut. By moving the 'cut point' from line to line on a pseudo-random basis the picture could be rendered unwatchable except via a decoder capable of reversing the operation.
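The principle is easy to demonstrate in a few lines (this is just a toy model written for this page, not the VideoCrypt system itself, and the key handling in particular is grossly simplified): each line of samples is rotated about a pseudo-random cut point, and a decoder knowing the same cut points simply rotates it back.

import random

def lcr_scramble(frame, key, min_cut=16):
    # Rotate each line (a list of samples) about a pseudo-random cut point.
    # A decoder seeded with the same key can compute the same cut points.
    rng = random.Random(key)
    scrambled, cuts = [], []
    for line in frame:
        cut = rng.randrange(min_cut, len(line) - min_cut)
        cuts.append(cut)
        scrambled.append(line[cut:] + line[:cut])   # swap the sections either side of the cut
    return scrambled, cuts

def lcr_descramble(frame, cuts):
    # Reverse the rotation using the same cut points
    return [line[len(line) - cut:] + line[:len(line) - cut]
            for line, cut in zip(frame, cuts)]

original = [list(range(720)) for _ in range(576)]
scrambled, cuts = lcr_scramble(original, key=12345)
assert lcr_descramble(scrambled, cuts) == original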

Test Card F scrambled using Line Shuffling
Unfortunately, whilst this method worked well for satellite transmissions, it was unsuitable for terrestrial TV because of reflections (ghosts). Virtually all off-air analogue TV (VHF or UHF) suffers to a greater or lesser degree from reflections, often from nearby buildings, trees etc. Normally they are not serious enough to be a problem but, with LCR, the action of the decoder 'reassembling' the lines meant that the reflections would be moved to a different place on each line. This covered the picture with objectionable flickery noise. Therefore an alternative method of scrambling was needed - Line Shuffling. With this, the lines making up the picture would be reordered (in a different sequence on each frame) and the decoder would restore them to their original positions. This required more storage in the decoder than Line Cut and Rotate, but worked well in the presence of reflections.
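Again purely as an illustration (and not the actual BBC Select algorithm), line shuffling amounts to applying a keyed permutation to the lines of each frame, with the inverse permutation in the decoder; notice that the decoder now needs a whole frame of storage rather than just part of a line.

import random

def shuffle_lines(frame, key, frame_number):
    # Reorder the lines, with a different permutation on each frame;
    # derive a simple per-frame seed from the key (toy key schedule)
    rng = random.Random(key * 100000 + frame_number)
    order = list(range(len(frame)))
    rng.shuffle(order)                    # order[new_position] = original line number
    return [frame[i] for i in order], order

def unshuffle_lines(scrambled, order):
    # Decoder: put every line back in its original position
    restored = [None] * len(scrambled)
    for new_pos, original_pos in enumerate(order):
        restored[original_pos] = scrambled[new_pos]
    return restored

frame = [[n] * 720 for n in range(576)]
shuffled, order = shuffle_lines(frame, key=42, frame_number=0)
assert unshuffle_lines(shuffled, order) == frame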

The decoder for the BBC Select service was to be developed by a commercial partner, but building the encoder was my job. As well as shuffling the lines under control of signals from a Security Encoder Computer it had (optionally) to scramble the associated audio and to insert VideoCrypt data in the signal to be broadcast. I well remember the occasion when it was first tried for real - in a test transmission in the middle of the night. I had set up my prototype scrambler (CD2SM/532) in the lab at Avenue House, unattended, under control of a computer that would switch its modes at certain times. I knew that if it didn't work correctly I, sitting at home watching, wouldn't be able to do anything about it. However all was well and the BBC Select service was a technical success.

Audio work
Audio has never been one of my particular interests, and I have mostly escaped working on it! However, as mentioned above, a requirement of BBC Select was (optionally) to scramble the audio content as well as the pictures. Once again the method used had to be compatible with ordinary analogue (FM) transmission, and the one chosen was spectrum inversion. Conceptually this is simple to understand: low audio frequencies become high frequencies and high frequencies become low ones. For obvious reasons the detailed parameters were chosen to make the domestic decoder as simple as possible, at the expense of a more complex encoder. Typically the decoder would use analogue mixing techniques but the encoder would need to work on the digital (AES/EBU) audio used by the broadcaster.
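A toy demonstration of the idea (the actual BBC Select parameters, which I haven't given here, invert the spectrum about a carrier chosen to suit a simple analogue decoder): multiplying alternate samples of a digital signal by -1 shifts its spectrum by half the sampling rate, so a tone at frequency f re-appears at fs/2 - f, and applying the same operation again restores the original exactly.

import numpy as np

def invert_spectrum(samples):
    # Multiply alternate samples by -1: a tone at f moves to fs/2 - f,
    # so low frequencies become high and high frequencies become low
    return samples * ((-1.0) ** np.arange(len(samples)))

fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)        # a 1 kHz tone in...
scrambled = invert_spectrum(tone)          # ...comes out at 23 kHz
assert np.allclose(invert_spectrum(scrambled), tone)   # the decoder is the same operation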

Despite my lack of experience in this area, three things helped me. Firstly, I had attended a residential course at BBC Wood Norton, run by Motorola, to learn all about programming the DSP56000 Digital Signal Processor; secondly, I was able to use a ready-made hardware unit (the UN26S/111, designed for a different project) containing four of these processors; and thirdly, my experience as a radio amateur had given me some insight into how to generate single sideband - a very similar process to spectrum inversion. So it was that I was able to write the DSP assembly-language code for use in the audio scrambler.

Just before the production BBC Select scramblers were about to be commissioned, I went on my first cruise, to the Caribbean. It was rare for me to take a long holiday, and I felt it was only reasonable to provide the BBC with contact details - not expecting that they would be needed. However about half way through the cruise, with the CD2SM/532 far from my thoughts, a fax from the BBC appeared in my cabin! It was a cry for help because, on testing the audio scrambling, the output level had been found to be 6dB down (i.e. half what it should be) and they wanted to know how to fix it. This puzzled me, because I thought I had checked it, and I was very concerned at the idea of my code being modified in my absence. It then dawned on me that the input to the scrambler was stereo, the two channels being summed internally; if only one of the channels had been connected, the output would indeed be 6dB low. So I hurriedly wrote a fax in reply suggesting that this might be the cause, and delivered it to the ship's radio room. A day or so later I heard from the BBC again - I was right, and I was able to enjoy the rest of the cruise in peace!

By now I was evidently considered something of a (reluctant) digital-audio expert since in about 1993 I got involved in a very unusual activity for the BBC at the time: a project carried out entirely for an external customer. For reasons that I can't now remember, Design & Equipment Department had decided to tender for a Ministry of Defence contract - to design a 'Speech Clarifier' to be used as part of the new Practice Emergency Training Facility for military Air Traffic Control. The purpose of the Speech Clarifier was to receive voice signals from military aircraft, via a number of different outstations, and process them in such a way as to maximise their intelligibility in the presence of fading, interference etc. I probably should not go into too much detail, but once again the existing audio processing unit (UN26S/111) and the experience I had acquired of programming the DSP56000, came in very handy.

Success out of failure
Queen's Award for Technological Achievement
My knowledge of the DSP56000 was put to good use again when I was seconded onto a project whose aim was to develop a piece of equipment that could replay video material in slow-motion, without having to use a slow-motion camera! Research Department had been experimenting with Phase Correlation as a means of measuring motion in a picture, and believed it to be so good that it could predict the intermediate positions of moving objects, between two video frames, with sufficient accuracy. In a rare cooperative effort between the departments, Development Group was asked to engineer it as a product that could be exploited commercially.

My small role was to write the software that would determine 'global' motion, i.e. overall pan, tilt, zoom and/or rotation, generally resulting from actions of the cameraman. Basically this involved examining all the motion 'vectors' measured by the hardware, working out which corresponded to global motion, and calculating the camera parameters from them.
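The production software (and the hardware it ran on) was nothing like this, but the essence of the calculation can be sketched as a least-squares fit of a four-parameter pan/tilt/zoom/rotation model to the measured vectors; in practice a robust iteration was also needed to reject vectors belonging to independently moving objects.

import numpy as np

def fit_global_motion(points, vectors):
    # points:  N x 2 positions (x, y) relative to the picture centre
    # vectors: N x 2 measured displacements (dx, dy) at those positions
    # Model: dx = tx + a*x - b*y,  dy = ty + b*x + a*y,
    # where a = zoom*cos(rot) - 1 and b = zoom*sin(rot)
    x, y = points[:, 0], points[:, 1]
    ones, zeros = np.ones_like(x), np.zeros_like(x)
    A = np.vstack([np.column_stack([ones, zeros, x, -y]),
                   np.column_stack([zeros, ones, y,  x])])
    rhs = np.concatenate([vectors[:, 0], vectors[:, 1]])
    (tx, ty, a, b), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    zoom = np.hypot(a + 1.0, b)
    rotation = np.degrees(np.arctan2(b, a + 1.0))
    return tx, ty, zoom, rotation        # pan, tilt, zoom factor, rotation in degrees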

In the event it turned out that the project had been too ambitious, and it proved impossible to generate the intermediate pictures with sufficient accuracy, but the technology was redeveloped into a different application: standards conversion. This also requires accurate motion measurement, but isn't quite so challenging. The end product was arguably the best standards converter in the world: Snell & Wilcox's Alchemist PhC. The development won a Queen's Award for Technological Achievement in 1998.

This was not to be the last time I would program the DSP56000, but the next application will be described a little later.


More upheaval
Site of proposed Development building at KW
By 1993 D&ED had been at Avenue House for about five years and was reasonably settled, but financial constraints on the BBC were about to deal another blow. Avenue House would close, along with the BBC's in-house manufacturing capability, and Design Group would move to Kingswood Warren. At first we thought that development staff would escape relatively unscathed, and occupy a new building at KW, but it was later decided to merge Design Group and Research Department (with significant job losses) to form a new Research and Development Department that would fit in the existing accommodation.

The change of departmental structure also had a major effect on the nature of the work. No longer would it be considered normal for the BBC to design and develop its own equipment; the commercial sector would be expected to meet the majority of our needs. So although the combined department had Development as part of its title, hardware would be built only rarely, when there was no alternative.


Virtual Production
In the new organisation (which was established before the physical move to KW in 1994) I became part of Studio Group, working on a collaborative European project called MONALISA. This was an early experiment in 3D Virtual Production, that is the generation of a synthetic - but realistic - background image which moves in synchronism with the movement of a TV camera. Effectively it is like Chroma Key but with the ability for the camera to move freely. One requirement of this project was that the 'virtual background' should be able to contain live video images, such as a virtual TV screen or a window through which the 'real world' could be seen.

Whilst the graphics computers of the day (specialised machines rather than general purpose PCs) could 'render' the synthetic virtual background with acceptable quality, they couldn't manipulate live video in real time. It was therefore decided that a hardware video manipulator (a kind of special-purpose Digital Video Effects device) would be developed, which could shrink (or enlarge) and distort the live video so it could be directly inserted into a suitably shaped 'hole' rendered in the background by the graphics computer. The development of this video manipulator was my job.

Although DVEs capable of this kind of video manipulation already existed, often as part of a vision mixer, they were not suitable for this application; they were over-complicated (being designed to do things like wrap pictures around a cylinder, or turn over a corner) and the resultant quality was not good enough. I therefore designed a special-purpose device, the CO6S/514, able to perform the kinds of perspective transformation required whilst maintaining a very high picture quality.
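The CO6S/514 was of course dedicated hardware, but the kind of mapping it performed can be sketched in software: apply a 3x3 perspective (homography) matrix by inverse mapping, sampling the source with bilinear interpolation so the output has no holes. The matrix H here stands for whatever transformation the graphics computer's rendered 'hole' calls for; the example at the end simply shrinks a picture to half size.

import numpy as np

def warp_perspective(src, H, out_shape):
    # Map every output pixel back into the source through the inverse of the
    # 3x3 homography H, then sample the source with bilinear interpolation
    h_out, w_out = out_shape
    srcf = src.astype(float)
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    sx, sy, sw = Hinv @ np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    sx, sy = sx / sw, sy / sw                          # the perspective divide
    out = np.zeros(out_shape)
    h_src, w_src = srcf.shape
    valid = (sx >= 0) & (sx < w_src - 1) & (sy >= 0) & (sy < h_src - 1)
    x0, y0 = np.floor(sx[valid]).astype(int), np.floor(sy[valid]).astype(int)
    fx, fy = sx[valid] - x0, sy[valid] - y0
    top = srcf[y0, x0] * (1 - fx) + srcf[y0, x0 + 1] * fx
    bot = srcf[y0 + 1, x0] * (1 - fx) + srcf[y0 + 1, x0 + 1] * fx
    out[ys.ravel()[valid], xs.ravel()[valid]] = top * (1 - fy) + bot * fy
    return out

shrunk = warp_perspective(np.random.rand(576, 720), np.diag([0.5, 0.5, 1.0]), (576, 720))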

Virtual Scenario in use on Match of the Day
Once again fate lent a hand, and the unit I designed never got used for its original purpose, but since it was just as capable of enlarging a picture as shrinking it, another application emerged. Instead of inserting the manipulated picture in a synthetic background, it would itself be used as the entire background.

Typically a static background image, either synthetic or real, would be fed into the CO6S/514 DVE, which would enlarge and distort it according to the pan, tilt and zoom of a studio camera. Conventional blue or green screen Chroma Key (Colour Separation Overlay) techniques would be used to superimpose the studio foreground (presenters, props etc.) on the virtual background, achieving what we would now call 2D Virtual Production.

The CO6S/514 won the Video R&D Achievement of the Year award at the International Broadcasting Awards in 1996. It was licensed to Radamec Broadcast Systems (who manufactured the camera pan, tilt and zoom sensors) and marketed as their Virtual Scenario system for several years. It sold in large numbers around the world.


Synthetic defocussing
Picture with artificially defocussed background
The same picture with defocussed foreground
One of the reasons why Virtual Production rarely looks entirely 'real' is the fact that the background image isn't affected when the camera's focus setting changes. To a degree the tendency of the Virtual Scenario background to go soft when the camera zoomed in compensated for this, but for maximum realism it is desirable for the virtual parts of the picture to defocus in a natural way. This can be achieved (at the cost of lots of CPU power) with a fully-rendered 3D model, but not in a 2D system like Virtual Scenario.

I was therefore tasked, in about 1996, to develop a sister unit to the CO6S/514 which would defocus the background image in response to a signal from the camera's focus sensor. For best results this new unit, the FL4S/596, would be fed with both the sharp background image and a depth map - a monochrome (greyscale) image in which brightness represents distance from the camera. Although a completely realistic defocussing effect cannot be achieved with a picture shot from a single point, it was possible to simulate it reasonably well using a special technique developed for the project.

Here you can see the effect created. Both pictures shown are generated from the same (sharp) image and a depth map representing the distance of each point from the camera. In the first the virtual camera is focussed on the foreground and in the second it is focussed on the background. With the real FL4S/596 the virtual focus depth and aperture (depth of field) can be varied continuously in real time.
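The special technique developed for the FL4S/596 isn't something I can do justice to here, but a naive software imitation conveys the idea: blur each pixel by an amount proportional to how far its depth lies from the chosen focus plane, scaled by the aperture. (The function and parameter names below are my own inventions; a proper implementation also has to stop sharp foreground objects 'bleeding' into the blurred background, which is where much of the cleverness lies.)

import numpy as np
from scipy.ndimage import uniform_filter

def defocus(image, depth_map, focus_depth, aperture, max_radius=8):
    # Pre-compute the image blurred at every radius we might need
    blurred = [image.astype(float)]
    for r in range(1, max_radius + 1):
        blurred.append(uniform_filter(image.astype(float), size=2 * r + 1))
    # Per-pixel blur radius grows with distance from the focus plane,
    # scaled by the aperture (larger aperture = shallower depth of field)
    radius = np.clip(np.round(aperture * np.abs(depth_map - focus_depth)),
                     0, max_radius).astype(int)
    out = np.empty_like(blurred[0])
    for r in range(max_radius + 1):
        out[radius == r] = blurred[r][radius == r]
    return out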

This development was also licensed to Radamec Broadcast Systems and marketed under the name D-Focus.

Another shortcoming of Virtual Production can be the misregistration between the real and virtual parts of a picture (particularly at the edges and corners) caused by lens distortion. To tackle this, I later reconfigured the CO6S/514 2D DVE, with different firmware, as a lens undistorting unit. Connected at the output of a studio camera, and fed with signals from the zoom and focus sensors, it would correct the distortion and allow the real and virtual elements to be combined more accurately.
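As a rough software equivalent (the reconfigured CO6S/514 used a rather more complete model, driven by the zoom and focus sensors), the sketch below corrects simple radial (barrel or pincushion) distortion with a single coefficient k1, again by inverse mapping; k1 and the optical centre here are placeholders that would come from calibration.

import numpy as np

def undistort(image, k1, centre=None):
    # For each corrected output pixel at normalised radius r from the optical
    # centre, fetch the sample from radius r * (1 + k1 * r^2) in the distorted
    # source (nearest-neighbour sampling for brevity)
    h, w = image.shape
    cy, cx = centre if centre is not None else (h / 2.0, w / 2.0)
    ys, xs = np.mgrid[0:h, 0:w]
    dx, dy = (xs - cx) / w, (ys - cy) / w            # normalise by picture width
    scale = 1.0 + k1 * (dx * dx + dy * dy)
    sx = np.clip(np.round(cx + dx * scale * w), 0, w - 1).astype(int)
    sy = np.clip(np.round(cy + dy * scale * w), 0, h - 1).astype(int)
    return image[sy, sx]

corrected = undistort(np.random.rand(576, 720), k1=-0.08)   # k1 from a lens calibration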


Camera tracking
The aim of the MONALISA project had been full 3D Virtual Production, that is the ability to move the studio camera in all directions rather than just pan, tilt and zoom as in the 2D variety. To make this work it is necessary to measure the camera's position and orientation very precisely, to a resolution of better than 1 millimetre in position and 0.01 degrees in angle. The difficulty of making such a measurement, in real-time and with sufficient accuracy, had been the main stumbling block to implementing a practical system. Although a few commercial camera tracking systems had been demonstrated, they didn't work very well and were generally limited to small areas - not a full-sized studio of the kind used by the BBC.

Spotter camera (rear of studio camera) and targets
My colleague Graham Thomas, who had been leading the BBC's contribution to the MONALISA project, came up with a solution to this difficult problem. The studio camera would be fitted with a small, monochrome, upward-looking 'spotter' camera which would view special fixed targets mounted in the studio ceiling or lighting grid. Knowing where each target was mounted, and where it appeared in the spotter camera's image, it would be possible to calculate the required position and orientation information (so long as at least four targets were in view). Software simulations confirmed that the required accuracy could be achieved by this means.
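The mathematics is essentially the classic 'pose from known points' problem. The sketch below is emphatically not the Free-d algorithm - it simply leans on OpenCV's general-purpose solvePnP as a stand-in, and every number in it (target positions, image coordinates, camera matrix) is invented - but it shows the shape of the calculation: at least four known targets in, camera position and orientation out.

import numpy as np
import cv2

targets_3d = np.array([[0.0, 0.0, 4.0],      # target positions on the lighting grid
                       [1.2, 0.0, 4.0],      # (studio coordinates, metres - invented)
                       [0.0, 1.2, 4.0],
                       [1.2, 1.2, 4.0]])
targets_2d = np.array([[312.4, 250.1],       # where they appear in the spotter
                       [498.7, 248.9],       # camera's image (pixels - invented)
                       [310.8, 430.5],
                       [500.2, 432.0]])
camera_matrix = np.array([[800.0,   0.0, 384.0],   # spotter camera intrinsics from
                          [  0.0, 800.0, 288.0],   # calibration (also invented)
                          [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(targets_3d, targets_2d, camera_matrix, None)
rotation, _ = cv2.Rodrigues(rvec)            # 3x3 orientation matrix
position = -rotation.T @ tvec                # camera position in studio coordinates
print(position.ravel())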

Performing the necessary image processing and computation in real-time was another matter entirely! The simulation software was all written in C and I had to work out a way to convert this to a combination of hardware and software which could be implemented on a dedicated unit that would be practical to manufacture and reliable to use. The result was the CD3/563 Camera Tracking Decoder; it uses a combination of programmable logic and Digital Signal Processing to perform the necessary functions. This is where, somewhat surprisingly, my earlier experience in programming the Motorola DSP56000 came in handy again. Although intended principally for audio processing, I realised that the 24-bit fixed-point architecture of the device had just the right precision for the required computations.
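For anyone unfamiliar with the DSP56000, its data words are 24-bit signed fractions in the range -1 to just under +1, giving a resolution of 2^-23 - roughly seven significant decimal digits, as this little sketch shows.

Q = 23                                    # fractional bits in a 24-bit signed word

def to_fixed(x):
    # Quantise a value in the range -1.0 .. +1.0 (exclusive) to 24-bit fractional form
    return max(-(1 << Q), min((1 << Q) - 1, round(x * (1 << Q))))

def to_float(n):
    return n / (1 << Q)

print(to_float(1))                        # smallest step: about 1.19e-7
print(to_float(to_fixed(0.123456789)))    # about 0.1234568 - seven good digits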

Still from the IBC 1997 live production 'Shrink'
The system was first shown in public at the 1997 International Broadcasting Convention in Amsterdam and attracted an enormous amount of interest, not least because it was demonstrated by means of an actual production performed live (several times a day) on the BBC stand, with actors (including Ronni Ancona), cameramen and the rest of the studio paraphernalia all transported to the exhibition hall!

The camera tracking system was licensed to Radamec Broadcast Systems as Free-d. It has been sold around the world and is still available (at the time of writing) from Vinten Radamec Broadcast Robotics. Free-d was responsible for the only globetrotting period of my career, when I went to Tokyo, Bergen (Norway), Las Vegas and New York to help with the demonstration or installation of the system.

The BBC's Virtual Studio system, with Free-d at its heart, won the Most Innovative Achievement in Production award at the Production Europe Technology Awards in 1998.


Software projects
As I alluded to earlier, the development of in-house hardware would now be sanctioned only when no commercial alternative was available, and when commercial partners could not be found to do the work. Therefore my role became far more software-orientated than it had been previously. Whilst software had been an important part of my job for a long time, it had mostly consisted of assembly-language code running on 8-bit embedded systems. Now I had to get more involved with programming in high-level languages and on general-purpose computing platforms such as PCs or Unix workstations.

Test Card J
It would be tedious to list all my software projects here, but I will highlight one in particular. A colleague at Television Centre had taken an interest in test cards, particularly the famous and iconic Test Card F. This had originally been produced from a (very expensive) slide, and later by an electronic generator with the image programmed into Read Only Memories. However even the electronic version was far from perfect, with the central picture having been derived from the old slide, and whilst tolerable for analogue (PAL) video it wasn't really of a high enough quality for digital video. It was also 4:3 rather than the 16:9 widescreen format increasingly used.

You can find the full story here but in brief it was decided to create a replacement for Test Card F with much improved quality. Whilst the detailed design of the image had been done at TV Centre I took on the task of implementing it as a digital still-frame. This involved developing special software to create all the patterns, with their anti-aliased edges, 'from first principles' so the results would be as immaculate as possible. I also had to scale and adjust the colorimetry and gamma of the central picture, which had been re-scanned from the original large-format film. The result was Test Cards J (4:3) and W (16:9) which are now used in preference to Test Card F. I have also produced High Definition versions, in both 720 and 1080 line formats.


PAL decoding
I wasn't sure whether I would ever have another major hardware design project, but I needn't have worried. In 2001 I was asked to implement a real-time version of the PAL decoding algorithm developed by Jim Easterbrook. Over the previous few months he had been beavering away, quietly, inventing and testing (in simulation) a revolutionary new way of decoding PAL using Fourier Transforms, which he modestly thought might be an improvement on the alternative methods. In fact it was dramatically better than any of them, but it was also far, far more complicated (see the patent)!

Transform PAL Decoders being soak tested
Fortunately, despite the near-moratorium on hardware development, there was a very important customer for such a technology within the BBC - Information and Archives. They needed to transfer their entire video archive from obsolete tape formats like D3 and C-format onto modern digital formats such as DigiBeta. In so doing they needed to preserve the very best possible quality, and ideally use a PAL decoder which was 'complementary' so as not to throw anything away irreversibly. The Transform PAL Decoder fitted the bill perfectly, so they sponsored the work.
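Jim's algorithm works on windowed three-dimensional (horizontal, vertical and temporal) spectra and cannot sensibly be reduced to a few lines, but a one-dimensional caricature shows what 'complementary' means: transform the composite samples, treat everything near the subcarrier as chroma, and define luma as whatever is left over, so the two always add back to exactly the original signal. Only the subcarrier frequency below is a real figure; the sampling rate and bandwidth are assumptions made for the example.

import numpy as np

FSC = 4.43361875e6       # PAL colour subcarrier frequency
FS = 4 * FSC             # assume composite video sampled at four times subcarrier

def separate_line(samples, bandwidth=1.3e6):
    # Toy 1-D separation: the real Transform PAL Decoder works on windowed 3-D
    # blocks, which is what lets it tell genuine chroma from fine luminance detail
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    chroma = np.fft.irfft(np.where(np.abs(freqs - FSC) < bandwidth, spectrum, 0),
                          n=len(samples))
    luma = samples - chroma          # complementary: luma + chroma == the input
    return luma, chroma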

When Jim first tried to explain his method to me it was completely incomprehensible, and I needed very rapidly to get up to speed with three-dimensional spectra, window functions, complex conjugates and the like! Anyway, after several months of theoretical work, schematic entry and programmable-logic design (and a miserable Christmas spent wondering why on earth I couldn't get the prototype to work) I eventually got the thing to go. I even managed to improve the algorithm a little compared to Jim's original version. Once the prototype had been demonstrated and seen to work well, I&A ordered an initial batch of six. Eventually a total of 19 Transform PAL Decoders (CD3/568) were built, most of which are, at the time of writing, in heavy use for transferring the BBC's archive. A number of BBC DVD releases have also benefited from the technology.


High Definition TV
BBC Research Department, and later Research & Development, had been heavily involved in High Definition Television since the 1980s, in particular under the aegis of the pan-European 'Eureka 95' project. This used what was originally expected to become the European standard for HDTV - 1152 active lines, interlaced, at 50 fields per second (exactly twice the number of lines as standard definition). A complete studio (initially with tube cameras, later with CCD), an editing suite and recording facilities for this format were set up at Kingswood Warren, and over a period of several years a large archive of HD content was amassed, including a lot of wonderful Outside Broadcast material.

A famous HD test picture - Kiel Harbour
Despite the commitment from broadcasters and manufacturers, this HD standard never moved out of the research environment and was ultimately overtaken by developments in digital television, and the evolution of the HD standards of today (principally 720p and 1080i). It would have been a disaster for the HD material shot by R&D and others to be lost, so consideration was given to how it might be converted from the Eureka 95 format to one or other of the modern digital standards; this is where I came in!

The material had initially been recorded in the Quadriga format, which involved demultiplexing it into four standard-definition digital bitstreams that were recorded, simultaneously, on four D1 digital recorders. For replay the four signals needed to be remultiplexed back to HD. This was an extremely expensive (in tape usage) and complicated method, needing four old D1 recorders to be kept in working order. Later a more practical system was devised, involving compressing the HD by a factor of four, storing it on a single D1, and decompressing it on replay. Fortunately, for compatibility with the earlier method, the digital interface to the compressor was four separate SDI bitstreams (no standardised HD digital interface existed at that time) which provided a convenient way to get at the HD without going back to analogue.

So in 2002 I was asked to develop a hardware unit (the CO6/515) which would take as its input four SDI bitstreams, in the Quadriga format, and convert the 1152i signal to either 720p or 1080i (on an HDSDI interface) as required. I was able to call upon my earlier experience in designing the 2D DVE to understand the filtering algorithms that would be required, particularly in respect of deinterlacing. Two units were manufactured and I believe most of Kingswood's HD archive has now been transferred to modern formats using them. This would turn out to be my last hardware design for the BBC.
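The CO6/515 naturally used properly designed polyphase filters and rather better deinterlacing than I can reproduce here, but the basic vertical resampling step looks something like the sketch below (linear interpolation only, and the 1440-sample line length is an assumption made purely for the example).

import numpy as np

def resample_vertical(frame, out_lines):
    # Crude vertical resampling by linear interpolation between adjacent lines
    in_lines = frame.shape[0]
    pos = np.linspace(0, in_lines - 1, out_lines)
    lower = np.floor(pos).astype(int)
    upper = np.minimum(lower + 1, in_lines - 1)
    frac = (pos - lower)[:, None]
    return frame[lower] * (1 - frac) + frame[upper] * frac

hd_frame = np.random.rand(1152, 1440)           # one frame of archive material (assumed size)
as_1080 = resample_vertical(hd_frame, 1080)     # in practice done field by field for 1080i
# For 720p the fields must first be deinterlaced to 50 full frames per second,
# then resampled to 720 lines (and horizontally to 1280 samples).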


The end of an era
In 2005, as part of another major reorganisation of the Corporation, it was announced that Kingswood Warren would be closing (probably some time before 2008) with what was left of Research & Development - once again to be renamed BBC Research - moving either to West London or to Manchester (or both). It had always been my intention to retire no later than age 55, since the BBC's pension discounting rules were quite generous with only five years to go before normal retirement (this was before the change of legislation), and the closure of KW made that all the more certain.

Indeed there was a prospect of being made redundant - even more financially attractive - because of the reduction in staff numbers and the discontinuation of the formal 'development' function. And so it transpired: I was offered redundancy, which I very readily accepted, at the beginning of 2006. I finally left the BBC on my 54th birthday, and haven't regretted it for one moment.

© R.T.Russell, 2007