ALEXANDRIA, VA.—Now that broadcasters plan to deploy ATSC 3.0 soon, test equipment and procedures are needed to ensure accurate transmission. Manufacturers are working to develop gear that will measure the new parameters important to maintaining a clean ATSC 3.0 signal.
One ATSC 3.0 feature that pushes the limits of reliable measurement is layer-division multiplexing (LDM), one of several techniques the standard uses to pack in as much data as possible. Being a state-of-the-art standard, however, makes ATSC 3.0 a state-of-the-art measurement challenge.
“ATSC 3.0 is a very practical standard that gathers the most recent coding techniques and experiences of its predecessors,” said Vladimir Anishchenko, president and chief technology officer for Ontario, Canada-based Avateq Corp., manufacturer of the AVQ1022 RF signal analyzer. “That is why the challenge is not in a particular measurement but rather in a number of measurements creating a whole picture of the signal quality.”
Anishchenko noted the difficulty that ATSC 3.0’s LDM technology adds to the mix.
“Compared to MER/SNR measurements in the DVB/ATSC 1.0 standard not using the LDM technique, physical layer pipe [PLP] MER measurement is not a trivial task for ATSC 3.0,” he said. “In ATSC 3.0, actually two PLP MERs exist in a case of LDM: a) MER of PLP calculated over a superimposed core and enhanced PLPs, and b) MER calculated for LDM ‘decomposed’ PLPs, i.e., for each LDM super-positioned PLP.”
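The two measurements Anishchenko describes can be illustrated numerically. The sketch below is a toy model, not a conformant demodulator: the constellations, the -10 dB injection level and the residual-error values are invented for illustration. It shows why MER over the superimposed signal is capped by the injection level of the enhanced layer, while the decomposed MER is taken after the core layer has been cancelled:

```python
import math

def mer_db(reference, received):
    """Modulation error ratio: reference power over error power, in dB."""
    sig = sum(abs(r) ** 2 for r in reference)
    err = sum(abs(x - r) ** 2 for r, x in zip(reference, received))
    return 10 * math.log10(sig / err)

# Hypothetical core PLP (QPSK) with an enhanced PLP injected 10 dB below it.
core = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
enh_ideal = [1 + 1j, 1 + 1j, -1 - 1j, -1 + 1j]
g = 10 ** (-10 / 20)                      # assumed -10 dB injection level
superimposed = [c + g * e for c, e in zip(core, enh_ideal)]

# (a) MER over the superimposed signal, demapped against the core
# constellation: the enhanced layer acts like noise, so the result
# is capped at the injection ratio (10.0 dB in this toy case).
mer_super = mer_db(core, superimposed)

# (b) MER of the "decomposed" enhanced PLP: cancel the core layer,
# rescale by the injection level, then compare against the enhanced
# constellation (a fixed 0.02 residual is added so the toy MER stays finite).
decomposed = [(s - c) / g for s, c in zip(superimposed, core)]
mer_decomp = mer_db(enh_ideal, [d + 0.02 for d in decomposed])
```

In a real analyzer the core symbols are re-encoded and subtracted from the received waveform before the enhanced-layer measurement, which is why the two MER figures can differ so sharply for the same transmission.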
WHERE BROADCAST, BROADBAND MEET
Although digital broadcasting has existed worldwide for several decades, ATSC 3.0 is something of a departure from older digital standards. That has an effect on the way it is tested and measured.
“ATSC 3.0 is at the meeting point of broadcast and broadband, more tailored to viewers’ needs but much more challenging for the operators,” said Jean-Pierre Thomas, test and monitoring director for France-based Enensys, a manufacturer of ATSC 3.0 gateways and SFN synchronization systems. “Permitting both indoor and mobile reception for the same TV channel implies configuring an LDM-based transmission, meaning the combination of a basic and robust QPSK with a complex but high-quality NUC 256QAM. Another main topic, multi-PLP, is something DVB-T2 operators are already familiar with, but it is now adapted to ATSC 3.0.”
Broadcasters have seen a shift to IP for years, and ATSC 3.0 continues that trend. Components that used to be discrete blocks of hardware are disappearing in the IP world, which has implications for test and measurement.
“The move to an all-IP infrastructure is also a challenge, with software-based and virtualized major components such as a gateway (scheduler) or even a modulator,” Thomas said. “ATSC 3.0 only uses hardware parts when necessary, such as when dealing with analog signals.”
To accurately measure ATSC 3.0 signals, “first you need to know whether you are receiving a good-quality signal, therefore you need to measure the usual RF parameters,” Thomas said. “[That includes] the signal level of course, but also the MER and BER values, which should be measured for every PLP received, as they provide the quality of the modulation.
“If not, when a frequency interleaver is used, these values can be estimated based on the L1 (basic/detailed) signaling, as the pilots are spread over the spectrum,” Thomas added. “These measurements should be done in the field, as seen by a common receiver. The operators can then generate coverage quality maps, logging RF measurements with GPS location.”
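The field workflow Thomas outlines, logging level, MER and BER per PLP together with a GPS fix and then building a coverage map from the log, can be sketched as a minimal drive-test logger. All names, the sample values and the CSV layout here are invented for illustration; real analyzers export their own formats:

```python
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class RfSample:
    """One drive-test point: GPS fix plus per-PLP RF quality figures."""
    lat: float
    lon: float
    plp_id: int
    level_dbm: float
    mer_db: float
    ber: float

FIELDS = ["lat", "lon", "plp_id", "level_dbm", "mer_db", "ber"]

def write_coverage_log(samples, fp):
    """Dump samples as CSV, one row per (location, PLP) measurement."""
    writer = csv.DictWriter(fp, fieldnames=FIELDS)
    writer.writeheader()
    for s in samples:
        writer.writerow(asdict(s))

# Two hypothetical measurement points on a drive route near Alexandria.
samples = [
    RfSample(38.804, -77.047, plp_id=0, level_dbm=-62.5, mer_db=24.1, ber=1e-6),
    RfSample(38.806, -77.051, plp_id=1, level_dbm=-71.0, mer_db=15.3, ber=3e-4),
]
buf = io.StringIO()
write_coverage_log(samples, buf)
log_text = buf.getvalue()
```

A log in this shape, keyed by latitude and longitude, is what a mapping tool would consume to color a coverage quality map per PLP.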
ATSC 3.0 is a highly dynamic environment that places complex demands on test systems, said Johan Craeybeckx, business line director at Eurofins Digital Testing International in Belgium.
“In order to ensure the accuracy of a device under test, Eurofins Digital Testing has built an ATSC 3.0 broadcast environment allowing tests to be created in a real-world and reproducible way,” Craeybeckx said. “When there is a particular challenge to measure, Eurofins Digital Testing has designed specific hardware to make measurements, such as a hardware device that measures the accuracy of audio-to-video synchronization at frame-accurate levels. This reduces the time and costs associated with validation and conformance to ATSC 3.0 standards.”
An ongoing concern for test systems is their accuracy over time. With ATSC 3.0 needing novel and highly precise measurements, what is the probability that test equipment will remain calibrated and continue to produce accurate results?
“I am not sure if the reference is to the crystal in the RF card or clock on the analyzer,” said Ralph Bachofen, vice president for sales and marketing for Triveni Digital. “As for the clock on the analyzer, it can be configured as PTP or NTP on Linux, or NTP on Windows. The crystals on the RF receiver cards are highly accurate and don’t really need recalibration these days.”
Not all manufacturers suggest a hands-off approach. “Although our gear is thoroughly calibrated at the factory to provide reasonably accurate power and time-related measurements, the accuracy might degrade mainly due to the gear component aging and parameter variations,” said Avateq’s Anishchenko. “We would recommend recalibration each year.”
Software measurement tools depend on the timing of the devices they run on, which in turn are connected to networks that use precision timing systems.
“A lot of solutions will actually be software-based, or virtualized,” Thomas said. “The gear will then rely on standard or professional-grade servers provided by third parties. As for the RF measuring gear, after purchasing the equipment, it is standard to recalibrate the device once every other year, in order to prevent time- and weather-induced artifacts on the main components.”
The transition to 3.0 from 1.0 is moving slowly, with just a handful of stations committed to the new standard. Bachofen recommends simultaneous testing of both standards to keep the broadcast neighborhood clean.
“In the early deployment phases, it is important to analyze all demarcation points in the broadcast chain to minimize anomalies and issues that might arise,” he said. “We believe that during the simulcast timeframe, it is imperative to analyze both the ATSC 3.0 and ATSC 1.0 standards simultaneously, and possibly on a DMA-wide scale, as channel sharing is required in most scenarios.”
As Bachofen points out, ATSC 3.0 is about as different from ATSC 1.0 as ATSC 1.0 was from analog. “ATSC 1.0 is an MPEG transport stream over 8-VSB, so that is a completely different standard compared to ATSC 3.0,” he said. “DVB has some similarities as part of the physical layer based on COFDM. However, everything else in ATSC 3.0 is different and fortunately IP based.”
This means that there will be new testing gear and procedures, and it may take time for the best practices to sort themselves out. In the meantime, some familiar companies and some less-familiar ones are making the gear needed to ensure that ATSC 3.0 transmissions are delivering pristine signals to viewers.