Sergio Ammirata Ph.D., chief scientist, SipRadius
John Logie Baird, Philo T Farnsworth, Isaac Schoenberg, Leon Theremin: who you regard as the father of television probably says more about your nationality and culture than about any historical justification. The truth is that while these were remarkable pioneers, they created the concept of television rather than today’s reality.
World Television Day, on 21 November, coincides with the centenary of John Logie Baird demonstrating his “televisor”. It was hardly high resolution: 30 lines at 12.5 frames a second. Along the way Baird gave himself a 1,000-volt shock, which did little to enhance the concept’s reputation.
But the idea of live moving pictures transmitted by wireless took hold, and a lot of further research and investment went into developing a practical television system. The first regular television broadcast service was launched by the BBC in 1936.
Getting this far took a lot of work from a lot of very clever people, who pushed the state of the art in electronics to its very limit. Little of the technology television needed could be found elsewhere: it all had to be designed and built for this one purpose.
We still see some of the compromises from those early days in today’s broadcasting. Interlacing was invented because 1930s electronics simply could not operate fast enough to send complete images at the required frame rate; instead, each frame was sent as two half-resolution fields, preserving a flicker-free picture rate on half the bandwidth. That cheat is still used in broadcast television today, even though both cameras and displays are natively progressive.
This need for bespoke hardware continued for 80% of the history of television. The technology needed to create and distribute programs was useful only to the television industry. In turn, this made it expensive, because the companies developing and designing the kit had a limited market, so R&D costs were amortized over relatively few sales. It also limited innovation, because you had to be pretty certain of success before signing off that R&D budget.
You could only do what the technology allowed you to do. Creativity was limited by the engineering.
Moving content around, too, required specialist, application-specific technology. Regular venues like major sports stadiums had video lines direct to broadcast centers. Other venues used satellites or microwaves to deliver live programs.
This all contributed to tremendous inflexibility for broadcasters. They had to decide what they wanted to do, then build the case for capital investment in the necessary systems. Once built, those systems were pretty much fixed in place for seven to ten years, until you could justify another major capital investment.
There was one perhaps overlooked advantage in all this, though. It was pretty secure, because unless you were another broadcaster there was not much you could do with either the kit or any signals intercepted in transit. The high cost of the equipment needed to make use of the signals priced would-be pirates out of the market.
Meanwhile, the much younger IT industry was advancing at considerable speed. Companies like Apple were spending as much on R&D each year as the total market for broadcast equipment. Famously, in 1965 Gordon Moore wrote in Electronics magazine that the power of integrated circuits (strictly, the number of transistors on a chip) would double every year – a pace he later revised to every two years – and the prediction proved remarkably accurate.
Together, the growing power of integrated circuits and huge R&D investment meant first that personal computers became possible, then that they became powerful enough to process video in real time.
Oh, and a thing called the internet came along.
We often talk about transformative change, but this really was. It tipped the fundamentals of television on their head. No longer limited by the hardware infrastructure, we could come up with an idea and the software-centric technology platform would instantly adapt to make it happen.
The internet, too, became our friend. IP – the internet protocol – is on the face of it ill-suited to deterministic video streams, but clever people found ways to tunnel through the internet to deliver the high-availability, low-latency circuits that broadcasters expect. And now we could link from wherever there was a broadband port, not just where we could park our satellite uplink.
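The basic trick is simple enough to sketch. Transport protocols built for this job – RIST and SRT are the best known – number every packet, keep a short retransmission buffer at the sender, and let the receiver ask again for anything that goes missing, all within a bounded delay. The toy Python below illustrates only that idea: the ports, packet layout and simulated loss are invented for the example and bear no relation to any real protocol’s wire format.

```python
# Illustrative only: a toy "ask again for what you missed" loop over UDP,
# showing the idea behind retransmission-based video transport. The ports,
# packet layout and simulated loss are invented for this sketch; real
# protocols such as RIST and SRT define their own wire formats and timers.
import random
import socket
import struct
import threading
import time

DATA_PORT = 50000   # assumed free local ports, chosen for the demo only
NACK_PORT = 50001
HEADER = struct.Struct("!I")  # each packet starts with a 32-bit sequence number

def sender(payloads):
    data_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    nack_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    nack_sock.bind(("127.0.0.1", NACK_PORT))
    nack_sock.settimeout(0.5)
    sent = {}  # retransmission buffer: sequence number -> packet
    for seq, payload in enumerate(payloads):
        packet = HEADER.pack(seq) + payload
        sent[seq] = packet
        if random.random() > 0.2:  # simulate ~20% loss on the first attempt
            data_sock.sendto(packet, ("127.0.0.1", DATA_PORT))
    # Service retransmission requests until the receiver goes quiet.
    while True:
        try:
            nack, _ = nack_sock.recvfrom(HEADER.size)
        except socket.timeout:
            break
        (missing,) = HEADER.unpack(nack)
        data_sock.sendto(sent[missing], ("127.0.0.1", DATA_PORT))

def receiver(expected_count):
    data_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    data_sock.bind(("127.0.0.1", DATA_PORT))
    data_sock.settimeout(0.2)
    nack_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    received = {}
    while len(received) < expected_count:
        try:
            packet, _ = data_sock.recvfrom(2048)
            (seq,) = HEADER.unpack(packet[:HEADER.size])
            received[seq] = packet[HEADER.size:]
        except socket.timeout:
            pass
        # Ask the sender to resend every gap noticed so far.
        for missing in set(range(expected_count)) - received.keys():
            nack_sock.sendto(HEADER.pack(missing), ("127.0.0.1", NACK_PORT))
    return [received[i] for i in range(expected_count)]

if __name__ == "__main__":
    frames = [f"frame {i}".encode() for i in range(20)]
    recovered = []
    rx = threading.Thread(target=lambda: recovered.extend(receiver(len(frames))))
    rx.start()
    time.sleep(0.2)  # let the receiver bind before the sender starts
    tx = threading.Thread(target=sender, args=(frames,))
    tx.start()
    tx.join()
    rx.join()
    print(f"recovered {len(recovered)} of {len(frames)} packets, in order")
```

Real implementations add the pieces this sketch leaves out – jitter buffers, retransmission deadlines tied to the configured latency, encryption – but the principle of trading a small, fixed delay for reliability over plain IP is the same.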
This has proved almost universally good news for the world of television. But it has raised two challenges, which are still being worked through.
At first it was challenging for broadcast engineers to get their heads around the flexibility, scalability and agility of software infrastructures. The temptation was to build systems the way they always had, just using applications on a computer rather than bespoke hardware. Realizing the potential of virtualized environments that could instantly adapt to new demands took considerable adjustment.
And second, of course, we have lost the inherent security which bespoke hardware and connectivity gave us. There was no point in stealing a 2” video tape, because you would never be able to play it. But when a valuable program exists as a data file which can be played on the same computer you otherwise use to play Fortnite, cyber-security suddenly becomes a very important consideration.
In just 100 years television has moved from a slightly dangerous science experiment to the most important cultural influence on the planet. Along the way the technology has grown to meet ever-evolving demands.
As well as celebrating World Television Day, it is important to reflect on all those changes, and on the impact they have on the way we manage and protect our media operations.
