Abstract

This article provides a survey and a general methodology for coexistence studies between digital terrestrial television (DTT) and mobile broadband (MBB) systems in the ultra high frequency (UHF) broadcasting band. The methodology includes characterization of relevant field measurement scenarios and gives a step-by-step guideline on how to obtain reliable field measurement results to be used in conjunction with link budget analyses, laboratory measurements, and simulations. A survey of potential European coexistence scenarios and regulatory status is given to determine feasible future use scenarios for the UHF television (TV) broadcasting band. The DTT reception system behavior and performance are also described as they greatly affect the amount of spectrum potentially available for MBB use and determine the relevant coexistence field measurement scenarios. Simulation methods used in determining broadcast protection criteria and in coexistence studies are briefly described to demonstrate how the information obtained from field measurements can be used to improve their accuracy. The presented field measurement guidelines can be applied to any DTT-MBB coexistence scenarios and to a wide range of spectrum sharing and cognitive radio system coexistence measurements.

1. Introduction

This article surveys coexistence between digital terrestrial television (DTT) and mobile broadband (MBB), which is relevant because DTT spectrum has already been reallocated for MBB use and further reallocations are under investigation. Thus, DTT and MBB transmissions already operate in adjacent frequency bands, and extensive investigations of operation within the same frequency band are being conducted.

The main original contributions in this article are (i) a description of how different methods to study DTT-MBB coexistence should be used in conjunction to overcome their shortcomings and to obtain realistic results and (ii) guidelines for conducting field measurements. The term field measurement is used to refer to radio signal measurements conducted outside of a controlled laboratory environment to obtain numerical results for DTT reception protection criteria.

The results of coexistence studies can be used in regulation and standardization to develop spectrum sharing frameworks and standards, to optimize the utilization of scarce spectrum resources, and to fulfill the needs of the increasing traffic volume of wireless communications [1–4]. When spectrum sharing is considered in a frequency band, it is essential to study and understand the characteristics of the existing spectrum users to both maximize the throughput for the shared spectrum users and minimize the interference towards the existing users. Ultimately, the aim is to remove coexistence issues between different services by developing a single ecosystem to provide different types of services, such as the MBB and DTT services considered in this article. Studies of wireless coexistence contribute to the development of more dynamic spectrum utilization methods needed to create such an ecosystem.

Simulations are widely used in coexistence studies as they offer an inexpensive method to perform an extensive number of studies. In practice, the realism of simulations is limited by the simplifying assumptions which need to be used in the modeling process [5]. Simulations often use parameters defined by the minimum requirements in regulation and standards, which might differ significantly from the actual transmitter or receiver performance. Simulations typically evaluate only one type of interference mechanism, but in practice the interference is cumulatively created by a combination of different interference mechanisms. Simulations and theoretical studies thus often give only directionally correct results. Measurements should be used to provide a more complete and realistic evaluation of coexistence [6].

This article extends the considerations on the role of measurements in coexistence studies presented in [6] to cover field measurements in DTT-MBB coexistence scenarios. Table 1 summarizes the main contributions of this article. The article categorizes the methods to study coexistence into four main classes: link budget analysis, simulations, laboratory measurements, and field measurements. The key strengths, key weaknesses, and role in coexistence studies for each method are described in Table 2.

Link budget analysis calculates the gains and losses in radio signal transmissions and can thus be used to determine the signal levels from DTT and interfering MBB transmissions at the DTT receiver input in different scenarios. For example, link budget analysis is used to determine worst-case scenarios in coexistence studies within the European Conference of Postal and Telecommunications Administrations (CEPT), which is the highest-level coordinating body for telecommunications in Europe.

Laboratory measurements are conducted in a controlled environment and used to determine the performance and behavior of a DTT receiver in the presence of interference. The DTT receiver behavior needs to be known before conducting field measurements to be able to fully understand and analyze the measurement results in an uncontrolled field environment. Laboratory measurements allow a large degree of automation and can thus cover a larger number of DTT receivers than is practical to measure in field conditions. Laboratory measurement results also provide simulation parameters which are based on actual equipment performance. Publications on laboratory measurements often conclude that their results should be assessed in real operating conditions for further validation [7, 8].

Field measurements are time-consuming and expensive to conduct as they require substantial human resources, test network infrastructure, professional level measurement devices, and radio licenses. As the diversity of conditions observed in the field is difficult to model comprehensively, there is a clear benefit in conducting field measurements to validate the results from simulations and laboratory measurements and to obtain knowledge about DTT reception in realistic operating conditions [9]. Field measurements can also reveal unexpected phenomena which affect the coexistence performance. For example, reflections in the signal multipath can result in a higher coupling gain and thus a higher level of interference at the DTT receiver than is theoretically predicted [10, 11].

The main goals of this article are as follows:
(i) To describe a step-by-step field measurement methodology which can be used to determine relevant scenarios and to obtain realistic and reliable results to be used in coexistence studies.
(ii) To analyze the interference mechanisms in DTT-MBB coexistence and consider how the DTT reception system coexistence performance could be further improved to enable more efficient shared use of spectrum.
(iii) To analyze simulation methods used to determine DTT broadcast protection criteria and coexistence compatibility and consider how field measurements could be used to improve their accuracy.

The rest of this article is organized as follows. Section 2 describes the history, current situation, and potential future developments in the use of ultra high frequency (UHF) television (TV) broadcasting spectrum in Europe. Section 3 continues by describing the DTT receiver protection criteria and the receiver characteristics affecting their coexistence performance. Section 4 briefly describes simulations typically used in determining DTT broadcasting protection criteria and coexistence compatibility and considers how field measurement results could be used to improve their accuracy. Section 5 describes a step-by-step guide for planning and conducting field measurements to study coexistence between DTT reception and MBB. Finally, Section 6 concludes the article.

2. The Utilization of UHF TV Broadcasting Spectrum in Europe

The frequencies between 470 and 862 megahertz (MHz) (UHF bands IV and V) have traditionally been used for broadcasting terrestrial TV in Europe. The transition from analog terrestrial TV under the Stockholm 1961 (ST61) agreement to spectrally more efficient DTT has been completed recently, and the available spectrum resulting from this efficiency gain is known as the digital dividend (DD). The Geneva 2006 frequency plan (GE06) agreement [12] revised the ST61 agreement [13] to allow DTT broadcasting in the UHF TV band in Europe through an extensive interference planning process [14]. The GE06 agreement defines binding agreements with respect to incoming and outgoing interference between allotment areas and countries. The agreement is technology-neutral and uses spectrum masks to constrain the out-of-band emissions.

The World Radiocommunication Conference 2007 (WRC-07) allocated the 800 MHz band (790–862 MHz) to MBB in what is known as DD1 [15], and technical conditions for MBB operation in the band were created in the European Union (EU) [16, 17]. The DTT transmissions were regrouped into the 470–790 MHz frequency range. The Electronic Communications Committee (ECC) then commenced studies on how to exploit the unused spectrum in UHF TV frequencies through nonprotected, noninterfering secondary spectrum access with the TV White Space (TVWS) approach [18–20]. The concept mainly relies on the fact that the DTT broadcasting topology uses high power high tower transmitters, which leaves local opportunities to reuse the spectrum with low power communication systems. The availability of TVWS spectrum has been widely studied through measurements [21–25] and propagation modeling [26–30].

The protection of DTT reception is implemented through geolocation databases regulating the power levels the TVWS users can use at a certain frequency, time, and location [31]. As the TVWS approach does not guarantee the availability of spectrum or any quality of service (QoS) and requires a somewhat complex framework [32], it has not gained much popularity in Europe. The European activities have been focused on the United Kingdom (UK), where extensive trials and experimentations have been performed [33, 34] and geolocation database operators are active.

A number of TVWS trials have been performed throughout the world [35, 36] and standardization of TVWS systems has been active [37–42], but the TVWS market is fragmented due to the lack of globally harmonised regulation and the overlapping standards developed in different parts of the world. For example, the European Telecommunications Standards Institute (ETSI) defines out-of-block (OOB) emission levels using the European 8 MHz channel raster, while the Federal Communications Commission (FCC) uses the American 6 MHz channel raster. This results in different spectrum emission mask requirements in different markets. In general, the interest in using TVWS communications is highest in the rural areas of developing countries, where more TVWS spectrum is available and providing Internet backhaul is difficult [43, 44]. In developed countries, the lack of TVWS spectrum and the complexity of the system hinder investments in TVWS networks.

In DD2 the 700 MHz band (694–790 MHz) was allocated to MBB at the World Radiocommunication Conference 2012 (WRC-12) [45], and technical conditions for operation were created in the EU [46]. Currently the DTT transmissions are being regrouped into the 470–694 MHz frequency range in Europe, with a common deadline for clearing the 700 MHz band for MBB in 2020. The amount of DTT broadcasting spectrum in Europe has thus decreased from 392 MHz to 224 MHz. Regrouping the DTT transmissions into less spectrum results in a smaller amount of TVWS and further hinders the feasibility of the TVWS approach. Figure 1 illustrates the spectrum utilization in 470–960 MHz after the 700 MHz band MBB allocation. The 470–694 MHz range is used by the DTT transmissions and Programme Making and Special Events (PMSE) wireless microphones, and the 694–862 MHz range by Long Term Evolution (LTE) MBB [47, 48]. The 738–758 MHz range is an optional unpaired frequency arrangement of up to four blocks of 5 MHz for Supplemental Downlink (SDL) [49]. The frequency range 862–960 MHz is used, for example, by Short Range Devices (SRDs) and Global System for Mobile Communications (GSM)/International Mobile Telecommunications (IMT) systems. A detailed table of European frequency allocations is available in [50].

A coprimary MBB allocation for the 470–694 MHz frequency band was considered at the World Radiocommunication Conference 2015 (WRC-15), but it was decided that the allocation would not be changed and DTT broadcasting would be safeguarded until 2023. It was also decided that a review of the spectrum use in the entire 470–960 MHz UHF band is to be made at the World Radiocommunication Conference 2023 (WRC-23). The coprimary allocation would have allowed countries to flexibly choose to use the band for DTT broadcasting or MBB, or a combination of both. Figure 2 illustrates a timeline of recent regulatory decisions regarding the use of the European UHF TV broadcasting band.

The recent rapid adoption of smartphones and especially video streaming has resulted in a significant increase in the volume of wireless broadband traffic. Approximately 70% of this traffic is offloaded to fixed networks over Wi-Fi, and the remainder is carried over MBB networks [51]. The amount of MBB traffic in Q1 2016 was over tenfold compared to Q1 2011 and 60% more than in Q1 2015 [52]. The increases in the amount of traffic are projected to continue [53]. The propagation characteristics of UHF TV broadcasting frequencies are very suitable for building MBB networks, and harmonised spectrum below 1 GHz will be needed to provide nationwide and indoor coverage for 5th generation mobile networks (5G) [54].

Simultaneously, recent trends in media consumption indicate that the use of linear content (such as broadcast TV programs) is decreasing and the use of personalized content is increasing, especially among the younger age groups [55]. This indicates that the significance of DTT broadcasting is decreasing. As the amount of MBB traffic is projected to increase, the EU continues to consider how the remaining 470–694 MHz DTT broadcasting spectrum could be utilized most efficiently. The ECC considers four different classes of broadcast spectrum utilization in Report 224: Long Term Vision for the UHF Broadcasting Band [56], of which three consider the use of spectrum interleaved between DTT transmissions and one considers a future communication technology:
(i) Class A: primary usage of the band by existing and future Digital Video Broadcasting (DVB) terrestrial networks.
(ii) Class B: hybrid usage of the band by DVB terrestrial networks and/or downlink LTE.
(iii) Class C: hybrid usage of the band by DVB terrestrial networks and/or LTE including uplink.
(iv) Class D: usage of the band by future communication technologies.

The Class A existing terrestrial broadcasting technologies used in Europe are Digital Video Broadcasting Terrestrial (DVB-T) [57, 58] and its successor Digital Video Broadcasting-Second Generation Terrestrial (DVB-T2) [59, 60]. PMSE wireless microphones also utilize the interleaved broadcast spectrum, and some national regulatory authorities have adopted secondary usage through the TVWS concept. The improvement of DTT transmission quality from Standard Definition (SD) to High Definition (HD) and eventually Ultra High Definition (UHD) could increase the need for DTT spectrum, but simultaneously the progress in audio and video compression is reducing the need for spectrum. It is still debatable whether the DTT broadcasting high power high tower topology is an optimal method to use the spectrum resources efficiently. An unstable regulatory environment adds uncertainty to long-term investments in DTT broadcasting.

In Class C hybrid use between DTT broadcasting and LTE including uplink, the DTT transmitters and LTE base stations (BSs) would require large geographical separation. Otherwise the DTT signals would interfere with the reception of the uplink signal at the LTE BS [61]. As described in Section 3.2, the LTE User Equipment (UE) uplink signals have large variability in time and frequency, which makes the interference towards DTT receivers more difficult to handle [9]. The mobility of LTE UEs also leads to a diversity of potential interference scenarios. The interference to DTT reception from the LTE UEs introduced to the 700 MHz band is mitigated through a 9 MHz guard band to the highest DTT channel and limitations on the emissions into the 470–694 MHz band [46]. Such guard bands inside the 470–694 MHz band would exceedingly limit the amount of spectrum available for LTE.

Class B hybrid use between DTT broadcasting and MBB downlink would remove the need to consider the challenging uplink interference scenarios. Also, the amount of downlink traffic is increasing faster than the amount of uplink traffic, and the ratio between downlink and uplink traffic is already on a scale of 10 : 1 [62]. Thus, spectrum for downlink traffic is critical. The European Commission (EC) decision proposal in 2016 states that the use of the terrestrial TV band for other than broadcasting should be limited to downlink-only [63], and the final report of the high level group on the future use of the UHF band, known as Lamy’s report [64], also proposed studies on the coexistence of downlink-only MBB and DTT broadcasting in the 470–694 MHz band. The LTE SDL concept [62] could be used to provide additional capacity to “traditional” LTE downlink and could be flexibly used for broadcasting with the evolved Multimedia Broadcast Multicast Service (eMBMS) [65]. However, the current revisions of eMBMS are in practice a tool to optimize the cell capacity rather than a real dedicated broadcasting channel [66]. The 3rd Generation Partnership Project (3GPP) aims to remove the limitations in eMBMS operation in 3GPP Release 14 [67–69] and to perform a gap analysis of eMBMS performance in 3GPP Release 14 to make further improvements in 3GPP Release 15. The LTE SDL is transmitted as a Secondary Component Carrier (SCC), while the normal LTE management, authentication, and uplink traffic are provided by a Primary Component Carrier (PCC) in the existing LTE networks. Potential use cases for LTE SDL are discussed in [70], but in general the main use would be to provide additional capacity for video streaming, with both linear and personalized content. If LTE SDL is used to deliver video content, it is in accordance with the EU objectives to prioritize the use of the 470–694 MHz band for Audiovisual Media Services (AVMS) [63], with GE06, which designates the use of the band for broadcasting AVMS, and with the technology neutrality supported in the EU to promote competition [71]. Even though the uplink interference can be omitted, the coexistence between MBB downlink and DTT broadcasting still needs to be studied to further evaluate its feasibility. Cross-border coordination between different countries is needed if the outgoing interference level exceeds the trigger level set in GE06.

The low-pass filters used to mitigate interference from LTE-700/800 [72] cannot be used against interference originating from LTE transmissions interleaved within the 470–694 MHz band in Class B or Class C scenarios. More complex filtering solutions, such as programmable and adjustable band-pass filters, would be required. Such filters are more expensive than the low-pass filters, but location-specific fixed filters to attenuate the relevant LTE transmissions at each location could offer an affordable solution.

Class D future communication technologies are studied, for example, in the ETSI Mobile and Broadcast convergence specification group. Basically, the goal is to study the integration of broadcasting and MBB into one technological solution. A converged solution would eliminate the need for separate DTT receivers and allow dynamically using the networks to deliver linear or personalized content. However, delivering linear content through solutions other than the high power high tower broadcast transmitters would impose huge requirements on the core network. As all the DTT transmitters and receivers would need to be replaced, adoption of a completely new technology is likely to happen in the 2030s at the earliest. One of the most interesting potential concepts for a future converged ecosystem is WiB [73], in which DTT is transmitted using a wideband signal (which would directly remove any TVWS spectrum) and in which MBB services could be provided in the same band using Layered Division Multiplexing (LDM) [74] and interference cancellation methods.

Regardless of future developments in the utilization of the UHF broadcasting band, further studies on coexistence between DTT and MBB are needed. This section has considered the potential developments in DTT in Europe, which belongs to International Telecommunication Union (ITU) Region 1. The current situation and potential developments in different ITU regions and countries differ from Europe; for example, in the United States (which belongs to ITU Region 2) DD1 was the 700 MHz band (698–806 MHz) and DD2 is being realized through a broadcast incentive auction in the 600 MHz band [99]. The auction was formally closed in April 2017. It resulted in reallocating 84 MHz of DTT spectrum and in the beginning of a 39-month transition period, during which some TV stations need to transition to their new transmission channel assignments [100]. Europe and the rest of the world could follow the United States in reallocating the 600 MHz band to MBB, and an incentive auction is a potential method to accomplish the reallocation. Thus, it has to be noted that different types of coexistence scenarios are currently relevant to different regions and countries.

3. Protection Criteria and Characteristics of DTT Receivers

3.1. Definition of Protection Ratio and Picture Quality Criterion

The protection ratio (PR) is defined as the minimum value of wanted-to-unwanted signal ratio at the DTT receiver input [101] and is usually expressed in dB. The wanted DTT signal power is measured over the channel of the DTT signal, while the unwanted interfering MBB signal is measured over its assigned channel. The power levels are root mean square (RMS) values of the emitted signal power within the respective channel bandwidth [101]. No data is communicated from the DTT receivers to the DTT transmitters in DTT broadcasting, and thus only the protection of DTT reception system needs to be considered. The aim is to determine the level of interference a DTT receiver can tolerate from MBB and still meet the chosen quality criterion for reception.

Figure 3 shows the DTT signal on the left; to illustrate both the cochannel and the adjacent channel PR in the same figure, the interferer is shown both cochannel with the DTT transmission and on an adjacent channel on the right. The interference level in this figure is equal to the maximum allowed level, and thus the PR is the ratio between the DTT and interfering signal powers. When the interference level is equal to or less than the maximum limit defined by the PR, the probability of errors is so small that the DTT reception quality criterion is fulfilled. The probability of errors increases with higher levels of interference, and the DTT reception quality criterion is no longer fulfilled.

Typical PR values in cochannel operation are of the order of 20 dB. This means that the DTT signal has to be 20 dB higher than the total interference from all sources of interference and background noise. Thus, cochannel operation is not desirable with small geographical separation distances, as it would limit the MBB transmission powers to very low levels and the MBB would suffer from high levels of interference from DTT transmissions. In adjacent channel operation, the PR values are negative (typically of the order of −30 to −60 dB), meaning that the interfering signals can have higher power levels than the DTT transmission. PR is defined on the 8 MHz channel raster used for the European DTT transmissions. The selection of DTT transmission modulation and coding scheme (MCS) is a trade-off between robustness against interference and transmission throughput.
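As a numerical illustration of how a PR value translates into a maximum tolerable interfering power at the receiver input, the short sketch below uses hypothetical signal levels and PR values (examples only, not measured figures):

```python
# Sketch: maximum tolerable interferer power for a given PR.
# All signal levels and PR values are hypothetical examples.

def max_interferer_power_dbm(dtt_power_dbm: float, pr_db: float) -> float:
    """PR = wanted - unwanted (dB), so the interferer may rise up to wanted - PR."""
    return dtt_power_dbm - pr_db

dtt_level = -60.0  # example received DTT signal power at the receiver input (dBm)

print(max_interferer_power_dbm(dtt_level, 20.0))   # cochannel PR of +20 dB  -> -80.0 dBm
print(max_interferer_power_dbm(dtt_level, -40.0))  # adjacent channel PR of -40 dB -> -20.0 dBm
```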

The DTT system PR studies were initially based on achieving a target bit error rate (BER) of 2 × 10⁻⁴ measured between the inner and outer codes in DVB-T [75], corresponding to quasi error-free (QEF) picture quality (“less than one uncorrected error event per hour” [102]). Commercial receivers often do not allow the measurement of BER, and the QEF criterion is not suitable for portable or mobile reception, where BER fluctuations are very large [75]. Thus, the subjective failure point (SFP) method [101] was created. The quality criterion in SFP is a just-error-free picture on the TV screen, which corresponds to a picture quality where a maximum of one error can be visible in the picture during an observation time of 20 s. The PR of the DTT signal to the interfering signal is measured at the receiver input at signal levels producing a just-error-free picture and rounded to the next higher integer. The SFP PRs are 1-2 dB lower than is needed to obtain the BER corresponding to QEF picture quality (exact delta values between the picture failure point and QEF reception range between 1.3 and 2 dB for DVB-T and are provided in [102]).

The main drawbacks of the SFP method are the long observation times it requires and the difficulty of automating it. In practice, SFP measurements require a person permanently monitoring the picture quality. Laboratory measurements can be automated more easily, but in field conditions the interplay between the several persons operating the measurement and transmission equipment cannot be fully automated. Even if automation could be used in field conditions, the measurement personnel would still be occupied during the long observation times.

Another very similar and commonly used criterion for PR is the Erroneous Second Ratio 5% (ESR5) [75]. The criterion is fulfilled if the ratio of seconds containing uncorrectable packet errors to all seconds in a 20-second interval does not exceed 5%, as the index in the name states. This means that, in a period of 20 seconds, there can be one second with uncorrectable packet errors. The uncorrectable packet errors in the MPEG-2 stream generate visible failures in the picture, and the Viterbi decoder signals them by setting a flag. Not all DTT receivers allow access to the Viterbi flags, and thus they cannot be used to automate the measurements with such receivers.

The ESR5 and SFP quality criteria are roughly equivalent to each other and also to the criterion used in the DTT receiver Harmonised Standard EN 303 340 [103], where the minimum time between successive errors in the video is 15 seconds. The PR laboratory measurement methodology and setup are described in [75]. Step attenuators are used to adjust the DTT and interfering signal levels, and the level of interference is increased, for example, in steps of 1 dB until the reception quality criterion is no longer fulfilled. The PR value is obtained from the previous level where the reception quality criterion is still fulfilled. This interference level is typically 1 dB lower than the level where the onset of picture degradation occurs.
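The step-attenuator search for the PR value can be outlined as follows; this is a minimal sketch in which check_quality_criterion is a hypothetical hook representing the chosen reception quality assessment (SFP, ESR5, or the EN 303 340 criterion):

```python
# Sketch of the step-attenuator PR search described above (1 dB steps).
# check_quality_criterion() is a hypothetical hook that returns True while the
# chosen quality criterion is still fulfilled.

def measure_pr(dtt_power_dbm, start_interferer_dbm, step_db, check_quality_criterion):
    """Raise the interferer level until the quality criterion fails and return
    the PR based on the last interference level where reception was acceptable."""
    interferer_dbm = start_interferer_dbm
    if not check_quality_criterion(dtt_power_dbm, interferer_dbm):
        raise ValueError("quality criterion already fails at the starting level")
    while check_quality_criterion(dtt_power_dbm, interferer_dbm + step_db):
        interferer_dbm += step_db
    # PR is the wanted-to-unwanted ratio at the last passing interference level.
    return dtt_power_dbm - interferer_dbm

# Example with a hypothetical receiver model that fails once the ratio drops below -40 dB:
model = lambda dtt, intf: dtt - intf >= -40.0
print(measure_pr(-60.0, -40.0, 1.0, model))  # -> -40.0 dB
```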

The obtained PR values can be used, for example,
(i) in geolocation database algorithms to determine the power levels the MBB can use without causing harmful interference to DTT reception,
(ii) to establish a baseline for DTT receiver performance. The effects of future innovation in both DTT receiver design and the MBB design/waveforms can be reviewed against this baseline,
(iii) to develop targets for coexistence to be included in future DTT receiver standards [76],
(iv) in network planning in hybrid spectrum use by MBB and DTT broadcasting.

The protection/separation distance concept [104] can also be used to assess the minimum geographical separation distance between the interfering transmission and the DTT transmission to guarantee acceptable quality for DTT reception [77].
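A simplified separation distance assessment can be sketched by inverting a free-space path loss model: given the interferer EIRP, the DTT signal level, the PR, and the reception antenna gain, the smallest distance keeping the received interference below the tolerable level is solved for. This is only a sketch under a free-space assumption; the actual assessments in [77, 104] rely on more detailed propagation modeling, and all numeric inputs below are hypothetical.

```python
import math

# Sketch: minimum separation distance under a free-space path loss assumption.
def min_separation_km(interferer_eirp_dbm: float,
                      dtt_power_dbm: float,
                      pr_db: float,
                      rx_gain_dbi: float,
                      freq_mhz: float) -> float:
    max_interference_dbm = dtt_power_dbm - pr_db            # tolerable level at the receiver input
    required_loss_db = interferer_eirp_dbm + rx_gain_dbi - max_interference_dbm
    # FSPL(dB) = 32.45 + 20log10(f_MHz) + 20log10(d_km), solved for d_km
    return 10 ** ((required_loss_db - 32.45 - 20 * math.log10(freq_mhz)) / 20)

# Hypothetical example values: 23 dBm EIRP, -60 dBm DTT level, -40 dB PR, 9.15 dBi antenna.
print(round(min_separation_km(23.0, -60.0, -40.0, 9.15, 650.0), 3))  # distance in km
```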

3.2. DTT Receiver and Interfering Transmission Characteristics Affecting Coexistence Performance

As stated in Section 3.1, cochannel operation between two different systems is not desirable. For optimal efficiency in spectrum utilization, the transmissions should use the channels as densely as possible, and thus the adjacent channel performances of the DTT receiver and the interfering MBB transmitters are of paramount importance.

There are two mechanisms in adjacent channel operation by which the interferer’s emissions can affect the DTT reception. The interferer’s emissions in its assigned channel can be received by the DTT receiver in its adjacent channel, or the interferer’s emissions in its adjacent channel can be received by the DTT receiver in its assigned channel. In the former case, the DTT receiver’s susceptibility to interference is defined by its adjacent channel selectivity (ACS), and the amount of interfering emissions in the latter case is defined by the interferer’s adjacent channel leakage ratio (ACLR).

The ACLR is a measure of the OOB performance of the interfering transmitter. In Figure 4, the ratio between the interfering transmitter in-block power and the interfering transmitter OOB power defines the ACLR on the DTT reception channel n, which has a frequency separation of k channels from the interferer. When the frequency separation from the DTT channel to the interferer’s assigned channel increases, the level of interference decreases. CEPT has developed a technology-neutral block edge mask (BEM) approach, which defines the required ACLR performance for the interfering transmissions. The BEM limits for power leaking to adjacent channels depend on the frequency offset from the interferer’s assigned channel [105, 106]. The larger the frequency offset is, the more stringent the limit is.

DTT receiver ACS defines the receiver’s ability to reject interference from an adjacent channel. In Figure 4, the receiver ACS performance on channel n determines the level by which the power from the assigned channel of the interfering transmission affects the DTT reception. This interfering power can be considered as additional noise which has the characteristics of the interfering transmission. The noise degrades the DTT reception signal-to-noise ratio (SNR) if the receiver ACS cannot sufficiently reject it.

The ACS is mainly defined by the receiver input filter performance if the interference is continuous, but it also depends on all the receiver components, especially in the case of bursty time-varying interference; the automatic gain control (AGC) implementation contributes largely to the ACS performance. The AGC circuits in modern DTT receivers often use fast attack times (1 ms) and slow recovery times (150 ms), and thus when a high-level interferer is present the DTT receivers rapidly reduce the gain to prevent overload. The slow recovery times can lead to an extended failure period after the interferer is removed. Error extensions of up to 1.5 s due to the AGC circuitry and the mechanisms in video compression have been observed [9].

The disruptive effect of time variance has been studied in laboratory measurements with a wide selection of DTT receivers and time-variant interfering signals in [75, 76, 78]. No single type of interfering signal can be determined to be the most disruptive, as different signals caused distortion in different receiver designs. In general, the receiver performance against an LTE interferer in idle mode is worse than against an LTE interferer in fully loaded mode [75]. Rapid variations in the time and/or frequency characteristics of the interfering signal induce different behavior in DTT receivers due to the diversity in AGC and RF front-end implementations. This results in large variations in the DTT receiver PRs against a given type of interference. Thus, averaging measurement results over several receivers and tuner types should be avoided. The receiver performance spread can be better illustrated by grouping the receivers into different percentile groups, such as the 10th, 50th, and 90th percentiles of all measured receivers. This method is typically used in ECC reports, for example, in [79].

Adjacent channel interference power ratio (ACIR) is the ratio of the total transmission power of the interferer to the total interference power affecting the DTT receiver. The ACIR takes into account the transmitter and receiver imperfections and depends solely on the interferer ACLR and DTT receiver ACS performance. In linear terms, ACIR is defined as

1/ACIR = 1/ACLR + 1/ACS.

High values in both interferer ACLR and DTT receiver ACS result in high ACIR values, which means that the interfering transmissions can use higher power levels without causing harmful interference to the DTT reception. Larger frequency offsets between the DTT and interfering transmission signals result in better PRs as the DTT receiver ACS and the interferer ACLR both increase.

The PR for an adjacent channel at offset k from the DTT channel can be calculated from the cochannel PR, the interferer ACLR at offset k, and the DTT receiver ACS at offset k using [94]

PR(k) = PR(cochannel) − ACIR(k),

where ACIR(k) is obtained from the corresponding ACLR and ACS values as above and all quantities are expressed in dB.
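Both relations can be evaluated directly in dB by converting ACLR and ACS to linear terms for the summation; the sketch below uses hypothetical example values rather than measured receiver or transmitter figures:

```python
import math

def acir_db(aclr_db: float, acs_db: float) -> float:
    """ACIR from ACLR and ACS: 1/ACIR = 1/ACLR + 1/ACS in linear terms."""
    acir_lin = 1.0 / (10 ** (-aclr_db / 10) + 10 ** (-acs_db / 10))
    return 10 * math.log10(acir_lin)

def adjacent_pr_db(cochannel_pr_db: float, aclr_db: float, acs_db: float) -> float:
    """Adjacent channel PR = cochannel PR reduced by the ACIR (all in dB)."""
    return cochannel_pr_db - acir_db(aclr_db, acs_db)

# Hypothetical example: 20 dB cochannel PR, 65 dB ACLR, 60 dB ACS.
print(round(acir_db(65.0, 60.0), 1))               # ~58.8 dB
print(round(adjacent_pr_db(20.0, 65.0, 60.0), 1))  # ~-38.8 dB
```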

DTT receiver overload is a nonlinear feature of the receiver, where the receiver starts to lose its ability to distinguish the received DTT signal from other signals at different frequencies when the signal level is at or over the overload threshold. The receiver behavior is linear until it reaches the overload threshold. At this threshold, the receiver ceases to behave linearly but does not necessarily fail immediately [107]. When the receiver is in an overload state, the PRs no longer apply and the receiver cannot display the DTT transmissions no matter how high the received DTT signal level is.

Overloading performance varies greatly between the different DTT receivers currently available in the European market [75, 101]. Overloading events typically occur in the vicinity of LTE700/800 BSs, where the total signal levels resulting from MBB and DTT transmissions are high. The DTT receiver Harmonised Standard [103] requires that the receivers should tolerate a signal level of −4 decibel-milliwatt (dBm) without going to an overloading state, but a majority of the DTT receivers released prior to the Harmonised Standard does not comply with this performance requirement [108].

The effect of transient interference on PRs is especially relevant in the case of LTE UE interferers. They can appear or disappear completely in an occasional fashion and have long gaps of no transmission activity, which results in a difficult situation for the DTT receiver AGC. Protection ratios even 10–12 dB higher have been measured when the interference is introduced after the DTT signal has already been acquired [108].
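A hedged sketch of how the overload threshold and the PR together determine the outcome of an interference event is given below; the PR and signal levels are hypothetical, and the −4 dBm default reflects the overload requirement cited above for receivers complying with the Harmonised Standard:

```python
# Sketch: classifying an interference event using the overload threshold and the PR.
# The PR value and signal levels are hypothetical examples.

def classify(dtt_dbm: float, interferer_dbm: float,
             pr_db: float, overload_threshold_dbm: float = -4.0) -> str:
    if interferer_dbm >= overload_threshold_dbm:
        return "overload: reception fails regardless of DTT signal level"
    if dtt_dbm - interferer_dbm < pr_db:
        return "PR exceeded: reception quality criterion not fulfilled"
    return "reception protected"

print(classify(-60.0, -25.0, pr_db=-40.0))  # 5 dB margin -> protected
print(classify(-60.0, -15.0, pr_db=-40.0))  # PR exceeded
print(classify(-60.0, -3.0,  pr_db=-40.0))  # overload
```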

3.3. Standardization in DTT Receivers and DTT Reception Installations

The Radio Equipment Directive (RED) [109] defined in 2014 that requirements for receiver performance need to be created to enable efficient use of spectrum in the EU. The RED came into force on 13 June 2016, and the Harmonised Standards for radio equipment need to be updated to meet the requirements of the RED. The final draft version of EN 303 340: Harmonised Standard for Digital Terrestrial TV Broadcast Receivers to Cover the Essential Requirements Defined in the RED [103] was released in March 2016 and defines requirements for DTT receiver performance against interference, particularly from LTE in the 700 and 800 MHz frequency bands. The Harmonised Standard [103] defines requirements for receiver sensitivity, receiver rejection performance against strong OOB interference, receiver overloading, and unwanted emissions in the spurious domain. Radio equipment which does not comply with its relevant Harmonised Standard cannot enter the EU internal market after the transition period ended on 13 June 2017.

Requirements for DTT receiver performance have previously been available from, for example, DTG [110] and NorDig [111], but they were nonbinding, and thus equipment with inferior performance could enter the EU market. The several different receiver requirements also fragmented the European market. The introduction of an ETSI DTT receiver Harmonised Standard has both clarified the testing process for the manufacturers and improved the coexistence performance of the receivers. However, it takes a long time to renew the customer DTT receiver base.

The requirements in EN 303 340 Harmonised Standard [103] were defined using a hybrid approach of laboratory and field measurements. Field measurements were used to determine coupling gains between an interfering terminal and a DTT receiver in [10], and their statistical data was used to determine the power level of the interfering LTE signal at the DTT receiver input in laboratory measurements for EN 303 340 [112]. To determine the performance criteria, the laboratory measurements used recordings of one UE signal and two BS signals in near idle modes at 700 and 800 MHz bands as interference towards the DTT receiver.

The requirements for the DTT reception installation system, including the antenna, feeder cable, and amplifiers, have not been addressed in EN 303 340. Having a Harmonised Standard for each of them would further improve the performance of DTT reception. According to observations from field measurements [10], a Harmonised Standard for amplifiers in particular could improve the coexistence performance of a DTT reception system. As no performance requirements or Harmonised Standards for amplifiers exist, devices with inferior performance are still available on the market. The nonlinear characteristics of an amplifier can generate intermodulation distortion and significantly increase the DTT receiver’s susceptibility to overloading.

Distribution amplifiers and domestic grade amplifiers are used in a large proportion of households in the UK [113]. Amplifiers are typically used either to improve DTT reception in areas of poor coverage or in households having multiple TV sets fed from a single antenna [114]. According to the field measurements in [10], an additional margin of up to 3 dB may be needed because of the degradations due to the use of a domestic amplifier. Installing an appropriate low-pass filter before the affected element can solve the problem with interference from LTE in 700 and 800 MHz bands [115], but interference originating from transmissions interleaved within 470–694 MHz band requires more complex filtering. Further measurements should be conducted to determine the level of degradation caused by the use of an amplifier to obtain more realistic parameters for coexistence studies.

4. Simulations to Determine Compatibility with DTT Broadcasting

As the broadcast planning covers large geographical areas, simulations are the only practical planning method. Broadcast planning is usually based on achieving a certain reception location probability inside a small area known as a pixel, typically 100 m × 100 m. This probability is defined as the percentage of locations where the DTT receiver would operate correctly for a given percentage of time. Different reception modes such as outdoor rooftop, outdoor mobile, and indoor mobile have their own location probability planning target levels. A location probability target of 95% is used at the edge of the coverage area for outdoor rooftop reception in GE06 [12].

International Telecommunication Union Radiocommunication Sector (ITU-R) BT.1895: Protection Criteria for Terrestrial Broadcasting Systems [116] recommends that compatibility studies should be made if the following interference values are exceeded:
(1) The total interference at the receiver from all radiations and emissions without a corresponding frequency allocation in the Radio Regulations (RR) exceeds 1% of the total receiving system noise power.
(2) The total interference at the receiver arising from all sources of radio-frequency emissions from radiocommunication services with a corresponding coprimary frequency allocation exceeds 10% of the total receiving system noise power.
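These thresholds correspond to interference-to-noise ratios of −20 dB and −10 dB, respectively. The sketch below illustrates the check, assuming a DVB-T noise bandwidth of roughly 7.6 MHz and a hypothetical receiver noise figure of 7 dB:

```python
import math

def noise_power_dbm(noise_figure_db: float, bandwidth_hz: float) -> float:
    """Thermal noise floor kTB (-174 dBm/Hz at 290 K) plus the receiver noise figure."""
    return -174.0 + 10 * math.log10(bandwidth_hz) + noise_figure_db

def bt1895_check(interference_dbm: float, noise_figure_db: float = 7.0,
                 bandwidth_hz: float = 7.6e6) -> str:
    # noise_figure_db and bandwidth_hz are illustrative assumptions
    n_dbm = noise_power_dbm(noise_figure_db, bandwidth_hz)
    i_over_n_db = interference_dbm - n_dbm
    if i_over_n_db > -10.0:
        return "exceeds 10% of noise power (coprimary allocation trigger)"
    if i_over_n_db > -20.0:
        return "exceeds 1% of noise power (no-allocation trigger)"
    return "below both trigger levels"

print(bt1895_check(-110.0))  # example interference level at the receiver input
```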

The relevant ITU-R documentation regarding frequency sharing and interference analysis of DTT broadcasting systems in the UHF TV band is listed in [95]. The protection criteria for broadcasting are based on local interference considerations, such as degradation of the reception location probability in the presence of additional interference, degradation of the carrier-to-noise ratio (C/N), and degradation of the carrier-to-noise-plus-interference ratio (C/(N+I)) [95]. A given level of degradation translates into an estimate of the size of the population the DTT broadcasting network can no longer serve due to the additional interference. These degradations can be translated into interfering field strengths using the methodologies presented in ITU-R Report BT.2265 [117], which also gives further methodologies to assess the interference when the limits are exceeded. For example, the methods to determine the availability of TVWS spectrum in the UK TVWS framework [118] are based on allowing the interferer to cause a target degradation in DTT broadcasting location probability. These calculations translate into allowed power levels for the white space devices (WSDs).

The method described in Annex 2 of [117] evaluates the degradation of the reception location probability due to additional interference and is based on the Monte Carlo simulation method typically used to evaluate interference probability in compatibility studies between wireless communication systems [89–93]. Multiplying the location probability degradation by the population in the related pixel gives a statistical estimate of the number of people who cannot receive the DTT transmissions due to the interference.
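A minimal Monte Carlo sketch of this idea is shown below, assuming lognormal (Gaussian-in-dB) shadowing for both the wanted and the interfering signals; all numeric values are illustrative assumptions and the model is far simpler than the full Annex 2 methodology:

```python
import random

def location_probability(dtt_median_dbm, interferer_median_dbm, pr_db,
                         sensitivity_dbm, sigma_db=5.5, trials=100_000):
    """Fraction of locations where the DTT signal is above sensitivity and the
    wanted-to-unwanted ratio meets the PR, with Gaussian-in-dB shadowing."""
    ok = 0
    for _ in range(trials):
        dtt = random.gauss(dtt_median_dbm, sigma_db)
        interferer = random.gauss(interferer_median_dbm, sigma_db)
        if dtt >= sensitivity_dbm and dtt - interferer >= pr_db:
            ok += 1
    return ok / trials

p_without = location_probability(-60.0, -200.0, -40.0, -75.0)  # negligible interference
p_with = location_probability(-60.0, -30.0, -40.0, -75.0)      # interferer present
pixel_population = 120                                          # hypothetical pixel population
print(round((p_without - p_with) * pixel_population, 1), "people affected (statistical estimate)")
```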

Annex 3 of [117] describes a method which allows using information from the actual deployments of DTT broadcasting networks and MBB networks. The accuracy of this method is limited by the accuracy of available terrain models and propagation models, but the method accepts the use of realistic DTT receiver sensitivity and PR parameters.

4.1. Simulations to Assess the Compatibility with GE06

This section describes a simulation methodology which can be used to determine compliance with GE06 [12] and allows using laboratory and field measurement results to improve the accuracy of its results. GE06 defines trigger levels for outgoing and incoming interference between allotment areas. Exceeding a trigger level means that more detailed calculations and coordination are needed in the relevant area; it does not directly result in an exclusion zone. If the trigger interference levels are exceeded in cross-border interference, bilateral or multilateral coordination negotiations between the affected countries are needed. If the trigger interference levels are not exceeded, installing a DTT transmitter or an LTE SDL BS designated for broadcasting AVMS does not require coordination measures between countries.

The maximum power an LTE SDL BS can use in the DTT coverage area while complying with GE06 is determined by the minimum median field strength for DTT reception (56 dBµV/m plus a frequency-dependent correction factor for fixed reception based on the channel center frequency in MHz), the DTT reception antenna discrimination (16 dB for an antenna complying with [119]), and a multiple interference margin (MI) accounting for the degradation caused by multiple sources of interference. The location-corrected PR for channel offset k between the DTT and LTE transmissions is derived from

PR'_k = PR_k + CL,

where PR_k is the DTT receiver protection ratio on channel offset k and CL is the combined location correction factor of the variation in the difference between the interfering LTE SDL signal and the wanted DTT signal,

CL = µ · √(σ_w² + σ_i²),

where the distribution factor µ depends on the wanted location correction and σ_w and σ_i are the standard deviations of location variation for the wanted and interfering signals, expressed in dB. The use of a 5.5 dB standard deviation for both the wanted and interfering signals and a location correction for 95% of locations is agreed in GE06, which results in a µ value of 1.64. The resulting combined location correction factor is 12.8 dB.
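The combined location correction arithmetic can be reproduced in a few lines; the sketch below only covers the CL term and its application to a hypothetical PR value, not the full GE06 compliance calculation used in [98]:

```python
import math

def combined_location_correction_db(mu: float, sigma_wanted_db: float,
                                    sigma_interferer_db: float) -> float:
    """CL = mu * sqrt(sigma_w^2 + sigma_i^2)."""
    return mu * math.sqrt(sigma_wanted_db ** 2 + sigma_interferer_db ** 2)

cl = combined_location_correction_db(mu=1.64, sigma_wanted_db=5.5, sigma_interferer_db=5.5)
print(round(cl, 1))            # ~12.8 dB, as agreed in GE06

pr_k = -40.0                   # hypothetical adjacent channel PR for offset k
print(round(pr_k + cl, 1))     # location-corrected PR used in the compliance calculation
```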

The compatibility of current DTT transmitters with the LTE SDL concept in the allotment areas of Finland and its neighboring allotment areas was simulated in [98] using the GE06 compliance calculation described above. This study used measured DVB-T PRs from [108]. Based on the simulations in [120], an MI of 10 dB was chosen. The simulations performed in the scenario of Figure 8 concluded that there would be broadcasting spectrum available for LTE SDL use complying with GE06 in Finland [98].

Other coexistence study methods can be used to further validate the simulation results and to provide more accurate parameters, which result in more realistic results. Results from laboratory measurements can be used to provide more realistic DTT receiver PRs corresponding to the improvements in receiver coexistence performance. This would further increase the amount of available spectrum for LTE SDL. However, statistical information about the customer DTT receiver base is difficult to obtain, and receivers with inferior coexistence performance will be in use for a long time even though a DTT receiver Harmonised Standard now exists.

Field measurements can be used to obtain observations from the simulation scenarios and to determine whether they actually are the worst-case scenarios. The simulation results can also be improved by adopting real household antenna gains from measurement campaigns, such as [121]. It is also debatable whether 5.5 dB is a realistic value for the standard deviations of the wanted and interfering signal location variations, as field measurements have shown lower values [122, 123]. Using smaller values for the standard deviation would result in less restrictive power limitations and a further increase in available spectrum for LTE SDL. Still, to study compliance with GE06 the agreed 5.5 dB value must be used.

5. Guidelines for Conducting Field Measurements

Field measurements require substantial human resources, investments in test network infrastructure, professional level measurement devices, and radio licenses. Thus, the time spent in the field should be minimized and the measurement scenario complexity should be limited to avoid excessive costs. Some interference scenarios are very difficult and technically challenging to study through measurements; for example, aggregate interference is more convenient to study through simulations [124] than measurements. A major problem with field measurements is that only a limited number of measurement campaigns can be made, and the limited statistical basis does not allow making strong conclusions. Field measurement results thus need to be further studied and verified through other coexistence study methods.

This section describes guidelines to conduct field measurements for DTT-MBB coexistence studies. Figure 5 illustrates the step-by-step procedure on a high level and the following subsections give a detailed description of each step.

5.1. Determining Field Measurement Scenarios for DTT-MBB Coexistence Studies

The scenarios for field measurements are usually chosen to represent the worst cases in terms of interference from the MBB to the DTT reception. The scenarios are built using link budget analysis and are known as reference geometries. They represent geometries where the antenna installation heights and the horizontal and vertical separation distances between the MBB and DTT antennas cause the maximum amount of interference to the DTT reception. If the DTT reception is protected in such a worst-case reference geometry, it can be assumed that it is also protected in all other possible scenarios. Reference geometries can be created both for interference originating from a mobile terminal and for interference originating from a mobile BS.

The ECC studies on coexistence between DTT and TVWS devices [31, 94, 96] provide an extensive number of different reference geometries used in determining the protection criteria for DTT reception. Simulations, theoretical analyses, laboratory measurements, and other existing research should be taken into account when determining the relevant measurement scenarios for different types of DTT-MBB coexistence. It is difficult to determine how probable or realistic a studied scenario is in practice. Analyzing a large number of real interference events rather than a limited set of field measurement results from the expected worst-case scenarios would be very beneficial; for example, data from the interference events resulting from the introduction of LTE to the 700 and 800 MHz bands would be very useful in determining the feasibility of the LTE SDL concept.

5.1.1. Interference from a Mobile Terminal

The interference from a mobile terminal is at its worst when the geographical separation between the terminal and the DTT reception antenna is small. The most challenging interference scenario for a DTT reception system is when a mobile terminal transmits at maximum power at the DTT coverage edge, where the DTT signal level is at the minimum level required for its reception. Figure 6 illustrates a reference geometry widely used for rooftop DTT reception in coexistence studies where interference originates from TVWS terminals [94, 96] or LTE uplink operating in 700 and 800 MHz bands [103, 106].

The coupling gain between the mobile interferer and the DTT receiver that causes the maximum amount of interference to arrive at the DTT receiver input is called the minimum coupling loss (MCL). The horizontal and vertical separations in the reference geometry are chosen to achieve the lowest possible MCL and thus the maximum possible interference towards the DTT receiver. The free-space path loss (FSPL) in the scenario is −56.15 dB at 650 MHz, the DTT reception antenna gain is +9.15 dBi with an ITU-R BT.419-3 Recommendation [119] compliant antenna, and the antenna angular discrimination is −0.45 dB. The MCL can be calculated with this information by summing the path gain, the reception antenna gain, and the angular discrimination:

MCL = FSPL + antenna gain + angular discrimination.

Thus, the MCL in this scenario is −56.15 dB + 9.15 dBi − 0.45 dB = −47.45 dB.
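The MCL arithmetic can be reproduced as below. The free-space path loss helper takes an assumed slant distance between the terminal and the rooftop antenna (a value not given in the text and therefore hypothetical), while the final sum simply uses the quoted figures:

```python
import math

def fspl_db(freq_mhz: float, distance_km: float) -> float:
    """Free-space path loss, FSPL(dB) = 32.45 + 20log10(f_MHz) + 20log10(d_km)."""
    return 32.45 + 20 * math.log10(freq_mhz) + 20 * math.log10(distance_km)

def mcl_db(path_gain_db: float, rx_antenna_gain_dbi: float,
           angular_discrimination_db: float) -> float:
    """MCL as the sum of the (negative) path gain, antenna gain, and discrimination."""
    return path_gain_db + rx_antenna_gain_dbi + angular_discrimination_db

# Values quoted for the reference geometry at 650 MHz:
print(mcl_db(-56.15, 9.15, -0.45))          # -> -47.45 dB
# For comparison, the path gain for an assumed ~23.5 m slant distance (hypothetical):
print(round(-fspl_db(650.0, 0.0235), 2))
```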

Lower MCL values than theoretical analyses predict have been reported in several field measurement campaigns [10, 11, 87]. This may be due to strong reflections in the signal multipath, which are omitted in the theoretical analyses of the worst-case scenarios. If lower MCLs than predicted occur, they lead to a higher amount of interference towards DTT reception than expected. The rooftop antennas are also often installed at a lower height than the 10 m used in the reference geometries and broadcast planning, and such installations are more prone to interference from a terminal [10]. The measurement campaign in [10] observed that the measured DTT signal levels were lower than the values predicted with the UK Planning Model (UKPM). The combination of a lower MCL than predicted and a lower DTT signal level than predicted led to negative PR margins in the UK TVWS framework [10].

The reference geometries to represent worst-case scenarios for indoor reception are diverse and depend greatly on the materials used in the walls and windows [10, 11]. DTT broadcasting is often planned to provide only outdoor coverage, and in such cases the protection of indoor DTT reception does not need to be considered. Indoor DTT reception is especially vulnerable to a mobile terminal in an adjacent room [10] or in the same room [87].

The hidden node margin (HNM) problem [22, 23] needs to be considered only when the mobile terminals sense spectrum [125, 126] and make decisions regarding which channel to use for their transmissions. The margin refers to the difference between the signal strength measured from a rooftop DTT reception antenna and the signal strength measured at a street level or indoors by the terminal, which might not detect the DTT service and incorrectly interprets the channel as unoccupied. HNM problem does not exist in the most likely DTT-MBB coexistence scenarios, where information from geolocation databases is used to select the transmission channel or the spectrum is allocated using GE06.

5.1.2. Interference from Mobile Network Base Stations

The interference from a BS affects only a portion of the users, namely those located within a certain distance from the BS. Figure 7 illustrates the area where the interference from an MBB BS affects the DTT reception within the DTT coverage area. The overloading effect occurs in the close vicinity of the MBB BS, while the interference from an MBB BS degrades the DTT reception SNR over a larger geographical area around the BS. As described in Section 3.3, the use of power amplifiers degrades the DTT reception system performance in the presence of interference and thus increases the size of the area around the MBB BS where the DTT reception SNR degradation and overloading occur.

Large geographical separation is needed for cochannel operation, but the initial field measurements and the experience from coexistence between DTT and LTE BSs in the 800 MHz band have shown that adjacent channel interference events rarely occur at distances larger than 1.3 km from a BS [127, 128]. Inside this area, either the combination of DTT and LTE BS signal strengths is high enough to cause overloading in the DTT receiver or the DTT receiver rejection performance against strong OOB interference might cause degradation in the DTT reception SNR.

The directional DTT reception antennas complying with [119] have a front-to-back ratio of 16 dB. Thus, if the reception antenna is pointed away from the LTE BS, the interference is mitigated by 16 dB compared to a situation where the DTT reception antenna is pointing towards the BS. The DTT reception system susceptibility to interference is thus largely determined by the antenna radiation patterns of LTE BS, DTT transmitter, and DTT receiver and their location in relation to each other.

Figure 8 illustrates a reference geometry which can be used to determine the interference between an LTE SDL BS and fixed rooftop DTT reception at a height of 10 m. A typical high power high tower DTT transmitter at a height of 300 m is used. The worst-case installation height of 60 m for LTE SDL BS represents a typical maximum installation height for a rural LTE SDL BS with a large coverage area. The terminal using LTE SDL does not contribute to the interference towards DTT transmission in the UHF TV frequency band as the PCC operating in a different frequency range carries the traffic in uplink direction. If the scenario is used to determine protection criteria against non-SDL LTE transmissions, the uplink transmission interference from the terminal also needs to be considered.

5.2. Obtaining Radio Licenses and Building Test Network Infrastructure

Interference to licensed commercial DTT users is not allowed in the field measurements under any circumstances, and thus test networks for both the DTT and MBB are needed. The test networks need radio licenses for their operation, and thus the first steps before conducting field measurements are to obtain the radio licenses from the national regulatory authority, install the test network infrastructure, and verify its operation.

When the source of interference is an MBB terminal, it can also be simulated with a signal generator and an antenna. A step attenuator can then be used to control the level of interference and an amplifier can be used if the interfering power level is not high enough to cause harmful interference. The simulated MBB terminal signal needs to comply with the relevant BEM requirements. If real MBB terminals are to be used in field measurements, they need to allow changing their operational parameters. Otherwise it is not possible to properly study the effect of different transmission modes and traffic loads.

5.3. Determining the Field Measurement Locations

Depending on the measured coexistence scenario and geometry, a location with specific signal levels for both DTT and MBB may be required for the field measurements. In a mobile terminal interference scenario, only the level of DTT signal is relevant, as the measurement reference geometry can be created and the level of interference controlled by moving the terminal.

Determining a measurement location is more difficult when the interference from an MBB BS to DTT reception is being measured, as both the DTT signal and the MBB BS signal need to be at a specific level where interference events can occur. Locations where overloading events occur are in the vicinity of the MBB BS, but finding locations where the relative difference between the DTT and the MBB BS signal levels is suitable for other types of interference events to occur is more difficult.

When measuring adjacent channel PR, the wanted DTT signal level could, for example, be of the order of −60 dBm. Depending on the frequency offset between the DTT and the interfering signals, the required interferer power level might need to be, for example, 30 to 60 dB higher than the DTT signal level before interference events occur. In the case of interference from a mobile terminal, only the DTT signal needs to be at a specific level.

Field strength predictions for both DTT and the interfering BS should be used to preselect possible measurement locations, after which their suitability needs to be confirmed by measuring the signal levels at each location. Typical path loss models are not accurate without extensive terrain data and building models [129]. Signal level field measurements can be used to calibrate their parameters or to create more accurate radio environment maps with interpolation methods [97].
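If the predictions prove inaccurate, the measured signal levels can be used to calibrate a simple empirical model. The following sketch fits the exponent and intercept of a log-distance path loss model to measured levels with a least-squares fit; it is a simplified stand-in for the interpolation-based radio environment maps of [97], and the sample values are purely illustrative.

```python
import numpy as np

def fit_log_distance_model(distances_m, rx_levels_dbm, tx_power_dbm):
    """Least-squares fit of PL(d) = PL0 + 10*n*log10(d) to measured levels.
    Returns (PL0, n): the intercept at 1 m and the path loss exponent."""
    distances_m = np.asarray(distances_m, dtype=float)
    measured_pl = tx_power_dbm - np.asarray(rx_levels_dbm, dtype=float)
    A = np.column_stack([np.ones_like(distances_m), 10 * np.log10(distances_m)])
    (pl0, n), *_ = np.linalg.lstsq(A, measured_pl, rcond=None)
    return pl0, n

def predict_level_dbm(distance_m, tx_power_dbm, pl0, n):
    """Predicted received level (dBm) from the calibrated model."""
    return tx_power_dbm - (pl0 + 10 * n * np.log10(distance_m))

# Illustrative measurement samples (distance in m, received level in dBm).
d = [200, 500, 1000, 2000, 5000]
rx = [-45.0, -58.0, -67.0, -78.0, -92.0]
pl0, n = fit_log_distance_model(d, rx, tx_power_dbm=40.0)
print(f"PL0 = {pl0:.1f} dB, exponent n = {n:.2f}")
print(f"Predicted level at 3 km: {predict_level_dbm(3000, 40.0, pl0, n):.1f} dBm")
```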

5.4. Building the Measurement Setup in Laboratory Environment

Before conducting field measurements, the measurement setup should be built and tested in laboratory conditions to verify its operation. The operation of the measurement equipment in specific scenarios can be practiced beforehand, and the properties of the measured devices, such as sensitivity, ACS, overloading performance, and the ACLR of the interfering transmission, need to be carefully measured to understand their behavior in field conditions. The EN 303 340 Harmonised Standard [103] defines measurement methodologies for determining whether a DTT receiver complies with the standard.
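The ACS of the DTT receiver and the ACLR of the interfering transmission measured in the laboratory can be combined into an adjacent channel interference ratio (ACIR), which indicates whether the receiver or the transmitter limits the adjacent channel performance. A minimal sketch of this standard power-sum combination:

```python
import math

def acir_db(aclr_db: float, acs_db: float) -> float:
    """Adjacent channel interference ratio from ACLR and ACS (all in dB)."""
    return -10 * math.log10(10 ** (-aclr_db / 10) + 10 ** (-acs_db / 10))

# Illustrative values showing the two limiting cases.
print(f"ACLR 60 dB, ACS 40 dB -> ACIR {acir_db(60, 40):.1f} dB")  # receiver-limited
print(f"ACLR 45 dB, ACS 70 dB -> ACIR {acir_db(45, 70):.1f} dB")  # transmitter-limited
```

The smaller of the two values dominates the ACIR, which helps to judge whether field measurement results are limited by the receiver selectivity or by the interferer's out-of-band emissions.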

The basic field measurement setup is very similar to the laboratory measurement setup described in ITU-R BT.2215: Measurements of Protection Ratios and Overload Thresholds for Broadcast TV Receivers [75]. The same measurement procedures can be applied and the same professional-level measurement devices need to be used, but field measurements naturally use the test networks and antennas to transmit the signals in real propagation environments [130] instead of the signal generators and cabling used in the laboratory environment.

The general measurement setup in Figure 9 illustrates how the DTT and interfering transmissions are transmitted and received with antennas. The signal received by the DTT reception antenna is split to a DTT receiver, which is used to observe the reception quality, and to a spectrum analyzer, which is used to measure the power levels and impulse responses.
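The levels at the two splitter outputs differ from the level at the antenna connector by the feeder, splitter, and branch cable losses, which should be accounted for when comparing readings. A minimal bookkeeping sketch with assumed loss values (the actual losses must be measured for the setup in use):

```python
def chain_levels_dbm(antenna_level_dbm: float, feeder_loss_db: float = 2.0,
                     splitter_loss_db: float = 3.5,
                     rx_branch_loss_db: float = 0.5,
                     sa_branch_loss_db: float = 1.0) -> dict:
    """Signal levels at the DTT receiver and spectrum analyzer inputs after
    the antenna feeder, the two-way splitter, and the branch cables."""
    after_splitter = antenna_level_dbm - feeder_loss_db - splitter_loss_db
    return {
        "dtt_receiver_dbm": after_splitter - rx_branch_loss_db,
        "spectrum_analyzer_dbm": after_splitter - sa_branch_loss_db,
    }

# Illustrative: a -60 dBm DTT signal at the antenna connector.
print(chain_levels_dbm(-60.0))
```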

As only a limited number of DTT receivers can be measured in the field, initial laboratory measurements should be conducted to determine the DTT receivers whose behavior could produce the most interesting results in field measurements. Even though classical superheterodyne DTT tuners are disappearing from the market, they are still very common in European households, and DTT receivers with this tuner type should also be measured. Results from previous laboratory measurement campaigns on DTT-MBB coexistence are widely available, for example, in [75, 76, 78–86].

5.5. Initial Measurements at the Field Measurement Locations

Before constructing the whole measurement scenario at the intended field measurement location, the signal levels of the DTT and MBB transmissions should be measured to initially determine the location's suitability for the measured scenario. The measured signal levels can also be used in studies on the accuracy of propagation prediction models [131], such as ITU-R P.1546 [132] used in the GE06 interference analyses.
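A simple way to use the measured levels for this purpose is to compute error statistics between predicted and measured values, as sketched below; the figures are illustrative and not actual P.1546 predictions.

```python
import statistics

def prediction_error_stats(predicted_dbm, measured_dbm):
    """Mean error and sample standard deviation (dB) of prediction minus measurement."""
    errors = [p - m for p, m in zip(predicted_dbm, measured_dbm)]
    return statistics.mean(errors), statistics.stdev(errors)

# Illustrative values: propagation model predictions vs. measured DTT levels (dBm).
predicted = [-55.0, -62.0, -70.0, -58.0, -66.0]
measured  = [-57.5, -60.0, -73.0, -61.0, -64.5]
mean_err, std_err = prediction_error_stats(predicted, measured)
print(f"Mean error {mean_err:.1f} dB, standard deviation {std_err:.1f} dB")
```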

Receiver sensitivity is defined as the minimum signal level needed to correctly receive the DTT transmission and is thus a measure of DTT receiver performance. The sensitivity of each DTT receiver without additional interference therefore needs to be measured at each measurement location to correctly assess the effect of the interference. Figure 10 illustrates the sensitivity levels of three different DTT receivers at seven different field measurement locations and in additive white Gaussian noise (AWGN) and Rician (multipath line-of-sight conditions representing rooftop DTT reception) channels in laboratory measurements. The sensitivity levels were observed when there was no interference from MBB.

The green line at −75 dBm corresponds to the DTT receiver sensitivity requirement set in the EN 303 340 Harmonised Standard [103]. The field measurements were conducted with a DVB-T2 signal on channel 60 in the Espoo test network in Finland, and the results are shown as points 1 to 7 on the x-axis. The AWGN and Rician laboratory measurement results are represented by points 8 and 9. The y-axis shows the measured DTT receiver sensitivity in dBm in each reception scenario. The receivers mostly comply with the EN 303 340 requirement, but receiver 1 fulfills the requirement only at some locations. This emphasizes that the behavior and performance of different DTT receivers in different reception conditions should be studied and analyzed before conducting the actual field measurements.
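Once location-specific sensitivities have been measured, their margin to the −75 dBm requirement can be tabulated to flag problematic receiver-location combinations. A minimal sketch with illustrative sensitivity values (not those of Figure 10):

```python
REQUIREMENT_DBM = -75.0  # EN 303 340 sensitivity requirement

def compliance_margin(sensitivities_dbm):
    """Margin (dB) to the requirement per location; positive means compliant."""
    return [REQUIREMENT_DBM - s for s in sensitivities_dbm]

# Illustrative measured sensitivities of one receiver at different locations (dBm).
measured = [-78.3, -77.1, -74.2, -79.0]
for loc, margin in enumerate(compliance_margin(measured), start=1):
    status = "OK" if margin >= 0 else "FAIL"
    print(f"Location {loc}: margin {margin:+.1f} dB ({status})")
```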

A receiver might perform well in most operating conditions, but it is typical that each receiver design has a worst-case reception condition in which it performs very poorly. The different DTT reception signal multipaths are caused, for example, by reflections from terrain or buildings and by ionospheric reflection and refraction. The reception conditions at different locations incorporate different types of phenomena, such as pre- and post-echoes with fluctuating power levels, absence of a direct transmission path, and phase shifting of the signal [9]. Impulse responses should be measured to allow analysis of the signal multipath.
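One way to obtain an impulse response is to transform a measured channel frequency response into the delay domain. The following sketch applies an inverse FFT to a synthetic two-path channel (a direct path plus an echo 5 µs later and 10 dB weaker); in an actual measurement the frequency response would come from the channel analyzer or the receiver's channel estimate.

```python
import numpy as np

def impulse_response(freq_response: np.ndarray, sample_rate_hz: float):
    """Return delay axis (µs) and normalized magnitude (dB) of the impulse response."""
    h = np.fft.ifft(freq_response)
    delays_us = np.arange(len(h)) / sample_rate_hz * 1e6
    mag_db = 20 * np.log10(np.abs(h) / np.max(np.abs(h)) + 1e-12)
    return delays_us, mag_db

# Synthetic example: direct path plus an echo 5 µs later, 10 dB weaker.
n, fs = 2048, 64e6 / 7          # FFT size; elementary rate of an 8 MHz DVB-T/T2 channel
f = np.fft.fftfreq(n, d=1 / fs)
echo_delay_s, echo_gain = 5e-6, 10 ** (-10 / 20)
H = 1.0 + echo_gain * np.exp(-2j * np.pi * f * echo_delay_s)
delays, mag = impulse_response(H, fs)
peak_idx = np.argsort(mag)[-2:]  # two strongest taps
print([f"{delays[i]:.2f} us @ {mag[i]:.1f} dB" for i in sorted(peak_idx)])
```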

5.6. Conducting the Field Measurements

Field measurements introduce variables that are impossible to reproduce exactly, as human activity and the signal propagation environment are never perfectly identical between measurements. Thus, it is important to record as much data as possible about the propagation environment and the radio signal transmissions during the measurement campaign.

To validate the results, field measurements should be repeated on consecutive days in very similar conditions. Special attention has to be paid to the repeatability of the time variance of the interfering signal; at minimum, the traffic loading should be similar between measurements to make the results comparable. In laboratory conditions a recorded signal can be replayed repeatedly, but in field measurements such repeatability is more difficult to achieve.

Recording the power spectral density (PSD) and the amplitude of the DTT and interfering signals as a function of time allows further analysis and comparisons between different measurement campaigns. The signal levels in the measurement frequency band and the adjacent frequency bands should also be recorded, as this data can be used to identify sources of interference other than the intended interferer, which might cause additional degradation of DTT reception. The spectrum data can also be used to validate the operation of the DTT and interfering transmissions during the measurement campaigns.
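As a sketch of such a recording, the PSD of each captured block of complex baseband samples can be estimated with Welch's method and the results stacked into a time-frequency waterfall; the block length, segment size, and sample rate below are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import welch

def psd_record(iq_samples: np.ndarray, sample_rate_hz: float, nperseg: int = 1024):
    """Welch PSD estimate of a block of complex baseband samples.
    Returns frequency bins (Hz, centred on 0) and relative PSD in dB/Hz."""
    f, pxx = welch(iq_samples, fs=sample_rate_hz, nperseg=nperseg,
                   return_onesided=False, scaling="density")
    order = np.argsort(f)
    return f[order], 10 * np.log10(pxx[order] + 1e-20)

# Illustrative use: append one PSD row per captured block to build a waterfall.
rng = np.random.default_rng(0)
fs = 10e6  # assumed capture sample rate
waterfall = [psd_record(rng.standard_normal(65536) + 1j * rng.standard_normal(65536), fs)[1]
             for _ in range(5)]
waterfall = np.vstack(waterfall)  # rows: time blocks, columns: frequency bins
print(waterfall.shape)
```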

Figure 11 illustrates a spectrum waterfall of signal levels recorded during a field measurement campaign to determine the protection of a DVB-T wireless camera link coexisting with LTE in the 2.3 GHz band in [88]. The colors in the picture represent signal levels in dBm, the x-axis the frequency in MHz, and the y-axis the time. The LTE transmission remained on the same channel throughout this measurement campaign, and the protection criteria for the DVB-T transmissions were studied on the adjacent channels and with a frequency separation of 9 channels from the LTE to study the receiver image channel performance [80]. The figure shows when the transmissions were operational, their power levels, and whether other signals or sources of interference were present within the measurement band. The selection of suitable parameters for recording such spectrum data is discussed in [97, 133].

5.7. Initial Analysis of the Measurement Results

Initial analysis of the measurement results should be made during the measurements to determine whether there are any unexpected phenomena that contradict the theoretical hypotheses or simulations. If there are, the measurement plan should be modified to include further measurements of these phenomena; if there are not, further measurements can be conducted according to the original measurement plan.

To allow modifications to the measurement plan, a quick analysis of the results should be made at least after each measurement location. Meetings involving all participants of the measurement campaign should be arranged after each measurement day so that the full expertise of the team is applied to the analysis of the results. During the measurements themselves, such joint analyses are not possible because the participants are not colocated; they are operating the measurement and transmission equipment and observing the DTT transmission quality at different locations.

6. Conclusions

This article has presented a survey on the use of the UHF TV broadcasting band in Europe, a survey on coexistence between DTT and MBB, and a description of how the different methods for studying DTT-MBB coexistence can be used together to provide realistic and accurate results. The article has also defined guidelines for conducting field measurements to study DTT-MBB coexistence and considered how observations from field measurements can be used to obtain more realistic results from simulations.

The coexistence studies presented in the article contribute to the current development phase of wireless communications systems, where the aim is to shift from static exclusive spectrum allocations to more dynamic coexistence of different systems and more efficient utilization of the scarce spectrum resources. The field measurement guidelines presented in this article can be applied to any DTT-MBB coexistence scenarios and to a wide range of spectrum sharing and cognitive radio system coexistence measurements. For example, the guidelines have been applied to study the coexistence between MBB and wireless camera links (as defined in [134]) operating in the 2.3–2.4 GHz band [88].

DTT broadcasting on the UHF TV frequencies in Europe has not been a stable operating environment in recent years, and changes to the use of the broadcasting spectrum are expected to continue in the future. DTT reception system coexistence performance has taken a step forward with the ETSI EN 303 340 Harmonised Standard for DTT receivers [103]. Further steps could be taken by standardizing the installation of DTT reception systems and power amplifiers.

In the near future, LTE SDL seems to be the most feasible MBB coexistence scenario in the UHF broadcasting band, both in terms of technical compatibility with DTT and in terms of compatibility with the GE06 agreement. However, further field measurements and coexistence studies are needed to validate its feasibility. In the long term, MBB and DTT are expected to converge into one ecosystem which delivers all types of content and autonomously chooses whether the optimal transmission mode is broadcast, multicast, or unicast. The aim in the development of 5G technologies is to create such a converged ecosystem [135].

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This study was conducted in the FUHF (The Future of UHF Frequency Band) project funded by Tekes, the Finnish Funding Agency for Technology and Innovation, in the 5th Gear programme. This article is based on the doctoral dissertation of Kalliovaara [136].