Abstract

We investigate the feasibility of Internet of Things (IoT) technology to monitor and improve the energy efficiency and spectrum usage efficiency of broadcasting networks in the Ultra-High Frequency (UHF) band. Traditional broadcasting networks are designed with a fixed radiated power to guarantee a certain service availability. However, excessive fading margins often lead to inefficient spectrum usage, higher interference, and unnecessary power consumption. We present an IoT-based management platform capable of dynamically adjusting the radiated power of the broadcasting network according to the current propagation conditions. We assess the performance of and benchmark two IoT solutions (LoRa and NB-IoT). By means of the IoT management platform, the broadcasting network with adaptive radiated power reduces the power consumption by 15% to 16.3% and increases the spectrum usage efficiency by 32% to 35% (depending on the IoT platform). The IoT feedback loop represents less than 2% of the system power consumption. In addition, the white space spectrum availability for secondary wireless telecommunications services is increased by 34% during 90% of the time.

1. Introduction

The traditional fixed spectrum allocation has led to a considerable inefficiency in spectrum usage, in both the spatial and temporal domains [1]. Several spectrum surveys demonstrate that the spatial and temporal usage of the spectrum is generally lower than 20% [2]. For developing countries, where most villages have a relatively low population density, the inefficient spectrum allocation has a negative impact on connectivity and broadband access equality.

The cognitive radio paradigm has been a key enabler of a more efficient use of the radio spectrum [1]. Cognitive radio achieves a higher spectrum usage efficiency via opportunistic access to spectrum that is not being used by a licensed primary service at a certain location and time [3]. Technologies based on cognitive radio can sense and monitor the surrounding radio spectrum in “real-time”, learn from the environment, and take intelligent decisions to establish communication using the best available spectrum resources, with minimal interference and optimal radiated power.

Several standards based on cognitive radio techniques have been developed. For instance, IEEE 802.22b and IEEE 802.11af implement cognitive radio techniques allowing dynamic access to the Very High Frequency (VHF) and Ultra-High Frequency (UHF) bands [4, 5]. Most of the spectrum resources in these bands are generally allocated to digital TV broadcasting as a primary licensed service. The TV band spectrum allocations not in use at a certain location and time frame are known as TV White Spaces (TVWS). Similarly, to overcome the licensed spectrum shortage and its inefficient usage, Long-Term Evolution networks in the unlicensed spectrum (LTE-U) have been proposed and are under consideration by the Third Generation Partnership Project (3GPP) [6]. In [6], the authors propose a novel wireless access point architecture (hyper-AP), which implements both Wi-Fi and LTE and jointly coordinates the spectrum allocation and interference management.

Because of interference and coexistence issues with broadcasting services, TVWS technologies have implemented the provision to access a white space database [7]. This database provides the Base Stations (BSs) with the white space availability, based on the geo-location of all network devices, the frequencies of the primary broadcasting transmitters, and their emission footprints [5, 8]. TVWS databases are designed and updated based on network modeling and measurement campaign data. They are defined based on different rules and optimization strategies to achieve a high level of protection, a large amount of available TVWS, or a certain trade-off between both [9]. However, this constraint may limit the potential spectrum usage efficiency of cognitive radio techniques. A collaborative approach between the primary licensed service (digital TV broadcast) and secondary unlicensed services (TVWS technologies) is required to optimize the spectrum usage efficiency.

Traditional broadcasting networks are designed based on a certain technology and environment-related link budget parameters. Environmental parameters such as the shadow margin (sm) and the fade margin (fm) are chosen according to the most appropriate propagation model for a certain scenario [10, 11]. These values are usually large enough to guarantee a high coverage percentage, for example, 95% area coverage and 99% network availability. However, the use of excessive implementation margins reduces the energy efficiency and spectrum usage efficiency of the radio communication system [12]. For instance, in the UHF band, the vehicular traffic density and other object movements have a considerable effect on the signal fading variation. Fade margins of 7.4 dB (99-percentile) for dense vehicular traffic and 3.6 dB (99-percentile) for low vehicular traffic (a 3.8 dB difference) are reported in a measurement campaign in Bilbao and Madrid, Spain (urban scenarios) [13].

The detection of TVWS interference to the broadcasting network may not always be reliable, even among systems with the same radio access technology, due to hidden nodes [14]. In addition, typical TVWS BSs compute the allocation of spectrum resources based on the surrounding user devices’ information and self-sensing information. This requires a database with spectrum usage constraints to avoid interference, because TVWS BSs get no information from other BSs, other TVWS networks, or broadcasting networks. Although the interfered primary system (broadcasting) may be aware of the interference, it has no mechanism to provide feedback to the TVWS system. A social sharing paradigm for channel status information exchange, allowing heterogeneous networks to cooperate regarding spectrum resource allocation, is assessed in [15]. A collaborative approach between both networks should help in reducing the interference and optimizing the spectrum usage. For this, a dedicated feedback network is required to optimize the broadcasting network and to provide “real-time” perceived interference data to the TVWS databases.

TVWS network deployments usually provide networking solutions in areas that are not fully serviced by traditional wired solutions. TVWS technology has proven to be a reliable solution for underserved areas (with low broadband connectivity and Internet access density). Several trials have been run successfully in suburban and rural areas of Africa, but also in the United States and the United Kingdom [16]. The monitoring solution must be ubiquitous (i.e., always and everywhere connected) in order to guarantee the same coverage as the traditional broadcast network. In addition, wired services (e.g., ADSL, Ethernet, and optical fiber) are not cost-effective to satisfy the ubiquitous network requirements in the intended scenarios (suburban and rural underserved areas).

For green network planning, the power consumption is a major constraint. Recent advances in standardization and industrialization of Internet of Things (IoT) technologies can contribute to solving the lack of optimization in the broadcasting network. In addition, they could solve the hidden node and availability issues in TVWS networks while satisfying minimum power consumption constraints. For instance, a typical IoT transceiver has a power consumption of 39 to 70 mW [17, 18], compared with Ethernet transceivers that consume 248 to 341 mW [19]. Notice that an Ethernet topology might additionally include several switching and routing devices between each user and the management server, leading to an even higher power consumption. Finally, the minimum Ethernet frame length (64 bytes) is not efficient for applications with low traffic density.

The novelty of this paper is the real-time optimization of the spectrum usage of broadcasting networks by means of LoRa and NB-IoT platforms, using an approach similar to the one presented in [20] for an indoor ZigBee network. The IoT network allows a collaborative approach that jointly coordinates the spectrum usage and manages the interference to improve the coexistence between the broadcasting and TVWS networks. To the authors’ knowledge, the optimization of spectrum usage efficiency through an IoT monitoring network has not been studied before. A novel spectrum usage efficiency metric is proposed to assess the performance of the broadcasting network with IoT feedback loop. For the first time, we optimize the power consumption of a broadcast network by monitoring the fading throughout a day. A model to optimize the IoT technology power consumption is proposed, taking into consideration the time-variant behavior during reception, transmission, and sleep mode. To this aim, the heuristic algorithm presented in [2, 21] was improved to reduce the IoT BS infrastructure and optimize the BS locations. Additional modifications are presented to support specific characteristics of the IoT physical layer and medium access.

To overcome the power consumption inefficiency of broadcasting networks, an adaptive power control is proposed in [12]. The adaptive broadcast system, emulated in laboratory conditions using the European Telecommunications Standards Institute (ETSI) channel model for DVB-T, reduces the net power consumption by lowering the radiated power when the propagation conditions allow it [12]. The performance of the measurement device (i.e., accuracy and standard deviation) is not considered. The end-devices’ power consumption is estimated for a nondedicated feedback network delivering one UDP (User Datagram Protocol) packet per second per device [12].

In [9], the authors analyzed the white space availability in a dynamic network able to switch between broadcast and broadband modes and to decide whether or not to pretransmit content, to optimize white space availability. In such a scenario, Dynamic Broadcast significantly increased the local TVWS availability [9]. However, the feasibility of the feedback network platform is not investigated. Further research is required to analyze the impact and feasibility of a time-variant radiated power in broadcasting networks on TVWS availability. Because of their low power consumption and large coverage area, IoT technologies could be a feasible solution.

IoT networks have grown quickly in recent years. Several standardization efforts have been conducted and commercial deployments of IoT Low-Power Wide Area Networks (LPWANs) are ongoing. LoRaWAN (Long-Range Wide Area Network) is one of the most widely adopted IoT standards [22]. LoRa technology is generally intended to operate in the unlicensed 433 MHz and 868 MHz bands with a minimum channel bandwidth of 125 kHz. The physical layer (PHY) implements a Chirp Spread Spectrum (CSS) modulation, which provides strong robustness against interference [22]. For instance, for the highest spreading factor (SF12) the required SNR is as low as -20 dB [17]. LoRa can provide data rates for machine-to-machine (M2M) communications of up to 5.46 kbps per channel with spreading factor 7 (SF7) [22].

LoRa defines three operational classes of devices. Class A devices use a medium access control based on ALOHA [22], a contention-based scheme in which devices transmit at a random time and/or on a random channel. After sending a message, a Class A device listens to the channel during two receive windows, waiting for an acknowledgment message. Class B is typically implemented in applications where additional downlink traffic is required. Class C devices receive all the time, except when a message has to be transmitted [22].

SigFox is an ultra-narrowband IoT technology requiring a channel bandwidth of only 100 Hz. Its PHY layer implements a proprietary ultra-narrowband modulation based on Binary Phase-Shift Keying (BPSK). The constraints of at most 14 packets per day and a maximum packet size of 12 bytes have limited the range of applications for this technology. These restrictions, together with a business model in which SigFox is the network owner, have shifted the attention of IoT operators towards LoRa [22].

Several standardization efforts have recently been made by the 3GPP to support LPWANs for IoT applications. 3GPP Release 13 included relevant improvements for a better support of IoT applications. Narrowband-IoT (NB-IoT) allows flexibly introducing IoT solutions by using a small portion of the spectrum of traditional Long-Term Evolution (LTE) networks [23]. The total bandwidth requirement is 180 kHz, allowing a wide range of joint deployments and the (partial) reuse of already existing GSM and/or LTE infrastructure. For instance, NB-IoT can be deployed jointly with LTE by using a single LTE Physical Resource Block (PRB), in the LTE guard-band, or “standalone” (reusing GSM spectrum) [23–25]. The in-band and guard-band modes allow reusing the BS front-end (radiofrequency signal processing) and the LTE numerology, i.e., subcarrier spacing, frame structure, and sampling factor, and can be implemented in LTE BSs by a software upgrade [25].

The remainder of this paper is organized as follows. Section 3 presents the collaborative dynamic broadcasting network design with IoT feedback loop and investigates the required feedback parameters on typical terrestrial television receivers. Section 4 presents the network optimization methodology for both the broadcasting and the IoT feedback network, the power consumption models, and the spectrum usage efficiency (SUE) metric. A real broadcasting scenario is considered, including the experimentally determined fade margin variability throughout the day, for a realistic modeling of the dynamic broadcasting network and IoT feedback loop. Section 5 presents the optimization results for the network designs and the feedback loop for different IoT platforms, the power consumption, and the spectrum usage efficiency, and benchmarks the performance of the two IoT technologies in a real scenario. Conclusions are presented in Section 6.

3. Collaborative Network Design

For the implementation of a collaborative dynamic broadcasting network, we designed a network architecture able to deliver the required feedback data to the transmitter station and the TVWS database manager in real time through an IoT feedback loop. Figure 1 shows a block diagram of the proposed network architecture.

By means of the IoT platform, the Dynamic Broadcast Manager in Figure 1 collects and analyzes Quality of Service (QoS) data retrieved from the Set-Top Boxes (STBs) to adapt the radiated power to the actual propagation conditions in the covered area. The QoS data also allows detecting interference from secondary services (e.g., the TVWS network) and taking further actions. These actions depend on the interference level and range from limiting the maximum transmitted power of the TVWS devices to blocking the channel in the TVWS manager. In this way, we implement a collaborative approach between the broadcasting and TVWS networks. The IoT network has to be designed and optimized to overlap with the broadcasting network coverage.

STBs include a System on Chip (SoC) with enough computational performance to implement complex measurement tasks [26]. A software measurement device implemented in a commercial DTMB (Digital Terrestrial Multimedia Broadcast) STB reported a dispersion of 2 dB for signal level measurements (absolute value) and Signal-to-Noise Ratio (SNR) measurements (relative value) [26]. Later optimization of the measurement algorithm and calibration improved the relative measurement accuracy to 1 dB. Note that the temporal fading variations reported for urban and suburban scenarios are higher than 2 dB [12, 13]. The measurement capability can run in the background in the middleware. Hence, the STB can retrieve several parameters related to the broadcasting signal quality and interference at the receiver location.

Table 1 lists the parameters required to implement the broadcasting network optimization and to provide feedback about interference issues to the TVWS database management system of Figure 1. The bit size of each parameter is hardware dependent, i.e., it depends on the tuner and demodulator integrated circuits.

For simulations and modeling, we use estimated audience percentages from the local content providers. However, for a real implementation, and to take optimal decisions on whether to block a certain channel in the TVWS database manager, real-time audiences per program are required. This information can be obtained through the current Program ID (PID) field. The PID size corresponds with the MPEG-2 Transport Stream specification. The frequency offset is required to correct the local frequency reference and achieve a higher measurement accuracy. A 7-byte packet is sent to the broadcasting and TVWS network managers. An IoT transceiver collects the data packets from the STB and transmits them to the optimization servers every 5 minutes. The retrieved data is based on the 99-percentile of the previous 5 minutes. The management servers make decisions based on the worst-case retrieved data. For LoRa packets of 10 bytes, the time on air with SF7 is around 40 ms [22]. Hence, the latency of the IoT network is not a critical constraint for this application.
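As an illustration, the sketch below packs one such report into a single 7-byte payload. The field layout and resolutions are our own assumptions for illustration; the actual bit sizes (Table 1) are hardware dependent.

```python
import struct

def pack_feedback(signal_dbm, snr_db, pid, freq_offset_hz, flags, seq):
    """Pack one STB report into 7 bytes (hypothetical layout; the actual
    field sizes of Table 1 depend on the tuner/demodulator hardware)."""
    lvl = int(round(signal_dbm))               # signed byte, 1 dB resolution
    snr = int(round(snr_db * 2))               # signed byte, 0.5 dB resolution
    pid_flags = ((flags & 0x7) << 13) | (pid & 0x1FFF)  # 13-bit PID + 3 flag bits
    return struct.pack(">bbHhB", lvl, snr, pid_flags, freq_offset_hz, seq & 0xFF)

pkt = pack_feedback(-72.0, 18.5, pid=0x101, freq_offset_hz=-250, flags=0, seq=42)
assert len(pkt) == 7
```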

4. Method

4.1. Configuration and Scenario

To model the dynamic broadcasting network performance in terms of power consumption and spectrum usage efficiency, we consider a realistic suburban scenario in Havana, Cuba, and the currently deployed DTMB broadcasting network. We consider a broadcast transmitter in Lawton (a neighborhood in Havana) covering an area of 47 km². Figure 2 shows the coverage area of the broadcast transmitter (dotted line).

The number of devices served by the broadcasting transmitter in the covered area is approximately 145,000 (based on household densities). We assume the highest peak audience rating per channel is 0.6. Hence, during peak times up to 87,000 devices will send a 7-byte packet every 5 minutes.
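A quick arithmetic check of the resulting feedback load, using the numbers above:

```python
# Aggregate uplink load at peak audience (values from the text).
devices = 145_000
peak_audience = 0.6
report_interval_s = 5 * 60                       # one report every 5 minutes
payload_bytes = 7

active = int(devices * peak_audience)            # 87,000 reporting STBs
packets_per_s = active / report_interval_s       # ~290 packets/s network-wide
payload_bps = packets_per_s * payload_bytes * 8  # ~16.2 kbps of payload
print(f"{active} STBs -> {packets_per_s:.0f} pkt/s, {payload_bps / 1e3:.1f} kbps")
```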

To evaluate the required resources (i.e., infrastructure, network power consumption, and spectrum usage) for the IoT feedback loop, we design, optimize, and compare two IoT networking solutions in the proposed scenario: LoRa and NB-IoT. Notice that the SigFox constraint on the maximum number of packets delivered per day does not fit this application. In addition, ownership of the network by SigFox is not in line with the local operators’ business model and the regulatory authorities’ policies.

First, we define the proper link budgets for each technology, to perform the dynamic broadcasting and the IoT network design and optimization. We consider the IoT transceiver to be integrated in the digital TV receiver and share the antenna system (over the roof outdoor configuration). Table 2 lists the most relevant link budget parameters for DTMB, LoRa, and NB-IoT.

A link budget accounts for all the gains, losses, and implementation margins in the transmitter, the receiver, and the propagation channel. Based on the link budget it is possible to calculate the maximum allowable path loss (PLmax) for each technology in a certain scenario [10].
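For illustration, the link budget can be folded into a single helper that returns PLmax. The structure mirrors Table 2, but the numeric example below (a LoRa-like uplink) uses assumed values, not the actual table entries:

```python
def max_allowable_path_loss(eirp_dbm, rx_gain_dbi, rx_losses_db,
                            sensitivity_dbm, fade_margin_db,
                            shadow_margin_db, interference_margin_db=0.0):
    """PLmax [dB]: transmit-side EIRP plus receive-side gains, minus the
    receiver sensitivity and the implementation margins."""
    return (eirp_dbm + rx_gain_dbi - rx_losses_db - sensitivity_dbm
            - fade_margin_db - shadow_margin_db - interference_margin_db)

# Assumed example: 12 dBm EIRP, 10 dBi receive antenna, 2 dB feeder losses,
# -137 dBm sensitivity (SF12), 4.4 dB fade margin, 5.5 dB shadow margin.
pl_max_lora = max_allowable_path_loss(12, 10, 2, -137, 4.4, 5.5)  # 147.1 dB
```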

The OFDM parameters (including the frequency sampling factor) and the bitrate of the broadcast transmitter were retrieved from the DTMB standard specifications [27]. Other specifications of the broadcasting network, such as radiated power, radiation efficiency, frequency, bandwidth, antenna parameters, and receiver parameters, depend on the setup, network planning, and technology in use by the service provider (the currently deployed network). These parameters were retrieved from private interviews with the service provider. Notice that the transmitter efficiency takes into consideration both the high-power amplifier and the radiation system efficiency. The shadowing standard deviation was retrieved from the Regulation for Digital Television Broadcast by the local regulatory authorities.

The fade margin accounts for the temporal fading in the transmission channel [10]. In the UHF band, the highest fading variations are caused by human activity (e.g., vehicular traffic) [13]. The shadow margin accounts for the signal variations caused by the topography and obstacles in the propagation path from the transmitter to the receiver. Shadowing is modeled by a lognormal distribution with a certain standard deviation. The fade margin was obtained from a measurement campaign in the considered suburban scenario. The fading in the evaluated scenario exceeds 7.4 dB during only 1% of the time (i.e., 99% signal availability in a period of 24 hours).

For this application, we assume that the IoT end-device transceiver is integrated into the same digital terrestrial television receiver. The end-users’ reception system (e.g., antennas, splitters, and feeder) matches the technical specifications of both IoT technologies under evaluation. Hence, the receivers’ antenna height, gain, and feeder losses correspond with the typical terrestrial television infrastructure in the evaluated scenario. Notice that the IoT transceivers should compensate any additional loss due to the required diplexing solution (in this application the constraint is in the uplink channel).

For LoRa, we consider Class A end-devices. LoRa devices can radiate a signal level higher than 12 dBm, but due to the regulation of the maximum allowable radiated power in the 317 MHz and 433 MHz unlicensed bands, the maximum EIRP is limited to 12 dBm. The NB-IoT end-devices have a maximum EIRP of 23 dBm [18].

LoRa BSs can emulate up to 49 virtual channels [28], based on different orthogonal spread spectrum chirps and bandwidths. Here, we consider only the maximum number of available physical channels (eight) [28], because there are no previous reports on the emulation performance of LoRa BSs.

The LoRa PHY layer implements a wide range of modulation schemes, allowing bit rates from 0.25 kbps to 5.46 kbps (for a single channel) [29]. The required SNR ranges from -7.5 dB to -20 dB. The spread spectrum modulation encodes each bit of information into multiple chirps; the resulting processing gain allows receiving signal powers below the receiver noise floor. A drawback, however, is that a larger bandwidth is required [17].

For NB-IoT, we consider a joint deployment with the LTE BS infrastructure, in in-band mode. We assume the LTE band A5 will be released for mobile communications after the analog television switch-off. The occupied bandwidth per channel is 125 kHz for LoRa [22] and 180 kHz for NB-IoT, using a single LTE PRB [18]. The radiation efficiency of the NB-IoT power amplifier and radiation system is the same as for LTE, as is the sampling factor. The OFDM parameters correspond to a single LTE PRB [25].

NB-IoT uses QPSK modulation, allowing an uplink bit rate of 204.8 kbps [24] and requiring an SNR of 3 dB, based on a theoretical Maximum Coupling Loss (MCL) of 145 dB [30]. For the uplink, end-devices can use the coverage enhancement mode, which uses BPSK modulation and a 15 kHz single tone, allowing an SNR as low as -12.6 dB. Nevertheless, the data rate is then as low as 20 b/s [23], which does not satisfy the traffic demand of our application. Hence, the coverage enhancement mode (MCL = 168 dB) is not considered.

For NB-IoT, an additional 2 dB loss should be accounted for as a cell interference margin, because the LTE cell frequency distribution requires allowing for a permissible interference level among nearby cells [10].

4.2. Dynamic Broadcasting Network Optimization

Figure 3 shows a process chart for the optimization of the dynamic broadcasting network.

First, we account for the fading variation throughout the day in Figure 3 (the fading measurements are detailed in Section 4.4). In a realistic scenario, we assume this information is retrieved and delivered through the IoT network to the Dynamic Broadcasting Manager by all STBs matching the actual transmitter PIDs. The EIRP is adapted by the Dynamic Broadcasting Manager (Figure 3) according to the actual propagation environment to keep the intended area of coverage. In this way, the broadcasting network power consumption decreases when the propagation conditions allow it. The area of coverage is defined to guarantee 95% coverage and 99% time availability: the threshold of 95% of locations covered 99% of the time by the broadcasting transmission is defined for a mean signal level of -72 dBm at the cell edge (for the worst-case fading and considering a receiver sensitivity of -84 dBm) [31].

The required EIRP to guarantee the intended coverage area is calculated by means of a path loss model for a suburban city, considering the maximum allowable path loss determined by the link budget (Table 2) and the instantaneous fading retrieved from the STBs. Both the ITU-R P.1546 method for point-to-area radio propagation predictions for terrestrial services [32] and the Okumura-Hata model [33] yield similar results in our suburban scenario. Once the required EIRP of the broadcast transmitter is adjusted to the fading, the power consumption and the spectrum efficiency are calculated (see Figure 3 and Sections 4.5–4.7).
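A minimal sketch of this adaptation step: the required EIRP is found by adding the margins and the cell-edge path loss (Okumura-Hata with the suburban correction) to the receiver sensitivity. The edge distance, antenna heights, and frequency below are placeholders:

```python
import math

def okumura_hata_suburban(d_km, f_mhz, h_bs_m, h_ed_m):
    """Okumura-Hata suburban path loss [dB] (nominally valid for
    150-1500 MHz and 1-20 km)."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_ed_m
            - (1.56 * math.log10(f_mhz) - 0.8))          # small/medium city
    pl_urban = (69.55 + 26.16 * math.log10(f_mhz)
                - 13.82 * math.log10(h_bs_m) - a_hm
                + (44.9 - 6.55 * math.log10(h_bs_m)) * math.log10(d_km))
    return pl_urban - 2 * math.log10(f_mhz / 28.0) ** 2 - 5.4

def required_eirp_dbm(sens_dbm, edge_km, f_mhz, h_bs_m, h_ed_m,
                      fade_db, shadow_db):
    """EIRP [dBm] that closes the link at the cell edge for the current
    fade margin (the adaptation step of Figure 3)."""
    return (sens_dbm + okumura_hata_suburban(edge_km, f_mhz, h_bs_m, h_ed_m)
            + fade_db + shadow_db)

# Assumed values: -84 dBm sensitivity, 3.9 km edge, 600 MHz, 150 m mast.
eirp_now = required_eirp_dbm(-84, 3.9, 600, 150, 10, fade_db=2.4, shadow_db=5.5)
```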

The application of this method reduces the interference from the broadcasting network to TVWS networks. A lower interference level also leads to a reduction in power consumption [2]. Moreover, the main benefit is the increased spectrum reuse. With the proposed method, the broadcast transmitter reduces its radiated power when the propagation conditions allow it. The reduction of radiated power is equivalent to a reduction of the protected area of the transmitter, which means that the area where TVWS devices can reuse the channel of the broadcast transmitter is increased. By means of the spectrum efficiency metric, it is possible to account for the current TVWS channel availability and to update a TVWS database in real time (see Section 4.7).

The feedback data retrieved through the IoT platform also solves the hidden node problem in the TVWS network, minimizing the interference to the primary broadcasting service. When the adaptive broadcasting manager detects an SNR considerably lower than the average, it reports interference issues to the TVWS database, blocking the actual channel or limiting the maximum allowable power due to interference constraints.

4.3. IoT Feedback Network Optimization

To optimize the dynamic broadcasting network with adaptive radiated power, the power consumption of the dedicated feedback channel also has to be minimized. To this aim, LoRa and NB-IoT networks (Table 2) are designed, optimized, and benchmarked.

To account for the minimal required infrastructure and optimize the network power consumption, the heuristic algorithm presented in [2, 21] was improved to reduce the number of Base Stations required in the IoT physical layer. Additional modifications were performed to account for the medium access characteristics of LoRa (i.e., ALOHA) and the timing constraints of the frequency band to be used (i.e., a 1% duty cycle for end-devices and 10% for the BSs). Figure 4 shows the process chart for the design and optimization of the IoT feedback network for minimal infrastructure and power consumption. The design and optimization process is performed in two steps (two heuristic cycles of the algorithm). In the original algorithm presented in [2, 21], only one cycle is present (no optimal BS location selection is implemented). For the same conditions described in [2], the first stage of the developed algorithm reduces the required BS infrastructure by around 20%.

The network design tool is capacity-based: the traffic density and end-device density are input parameters. The software also receives as input the target area and an array of possible BS geo-locations, including the BS antenna heights. For each simulation, the end-devices are pseudorandomly distributed over the target area according to the receiver density [21]. In total, 40 simulations are performed to assess the mean power consumption of the whole network.

In the first step (Figure 4), the best set of BSs among the whole set of possible BS locations is chosen. In the evaluated scenario, a total of 25 possible BS locations were considered. The software connects each user to the active BS with the lowest path loss, if this BS still has enough capacity to support the user. A new BS is activated only if no already active BS can support the current end-device.

For LoRa, the maximum traffic and density of users per BS (capacity) are related not only to the bitrate, but also to the ALOHA medium access mechanism and the restrictions imposed on the assigned frequency band. The LoRa network is designed considering Class A user devices in ALOHA mode. An acknowledgment message confirms the proper reception by the LoRa BS of the transmitted STB information and perceived signal level. The ALOHA medium access control mechanism reduces the maximum throughput per channel to 36% [34], reducing the maximum capacity per BS. The algorithm checks the remaining capacity of the BS, keeping enough margin to guarantee at least 3 transmission attempts per device and the transmission of an acknowledgment message from the BS, and validates that neither the end-devices nor the BSs exceed the duty cycle constraints of the frequency band.
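A back-of-the-envelope version of that capacity check, with illustrative SF7 numbers (40 ms on air, 8 channels) and the 36% ALOHA throughput cited above; ACK overhead is ignored, so this is only a rough bound:

```python
def lora_bs_capacity(airtime_s=0.04, interval_s=300, channels=8,
                     aloha_eff=0.36, attempts=3, duty_cycle=0.01):
    """Rough upper bound on end-devices per LoRa BS: only ~36% of the
    channel time is usable under ALOHA, and each device budgets
    `attempts` transmissions per 5-minute reporting interval."""
    assert attempts * airtime_s <= duty_cycle * interval_s  # 1% device duty cycle
    usable_s = channels * interval_s * aloha_eff   # usable airtime per interval
    return int(usable_s / (attempts * airtime_s))  # ~7200 devices here

print(lora_bs_capacity())
```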

The best BS locations in terms of path loss are statistically chosen after 40 simulations (step 1). The maximum number of BSs chosen depends on the traffic demand and the effective coverage per BS that guarantee at least 95% of the end-devices actually covered by the network. The path loss PL_Ed-BS [dB] between the end-devices (Ed) and each Base Station (BS) is calculated as a function of the distance d [km], the frequency f [MHz], the BS antenna height h_BS [m], and the end-device antenna height h_Ed [m] by means of the following equation [10]:

$$PL_{Ed\text{-}BS} = F(d, f, h_{BS}, h_{Ed}) \qquad (1)$$

The function F depends on the propagation path loss model used to account for the propagation environment losses [10]. For our scenario, we use the Okumura-Hata path loss model [33], which fits well with the scenario topology and the related technology parameters (i.e., frequency, maximum range, and effective heights). A user is considered covered by a certain BS when PL_Ed-BS ≤ PL_max. In the second step of the algorithm (Figure 4), the power consumption is optimized by connecting users to the active BSs with the lowest path loss and by reducing the EIRP while PL_Ed-BS ≤ PL_max [21]. Furthermore, when a new BS is activated, the algorithm checks whether already connected users can be switched to balance the network load. A certain user is switched if the new active BS satisfies the condition PL_Ed-BS ≤ PL_max.
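The sketch below illustrates step 1 under these rules: greedy activation with a capacity check, preferring the active BS with the lowest path loss. The data structures are simplified (capacity as a plain device count), and path_loss[ed][bs] can be precomputed, e.g., with okumura_hata_suburban() from the sketch in Section 4.2:

```python
def greedy_bs_selection(devices, candidates, path_loss, pl_max, capacity):
    """Step 1 of the two-step heuristic (Figure 4), simplified: connect each
    device to the active BS with the lowest path loss that has spare
    capacity; activate a new BS only when no active one can take it."""
    active, load, assignment = set(), {}, {}
    for ed in devices:
        reachable = sorted((b for b in candidates if path_loss[ed][b] <= pl_max),
                           key=lambda b: path_loss[ed][b])
        for b in reachable:                      # prefer active BSs, best PL first
            if b in active and load[b] < capacity:
                assignment[ed], load[b] = b, load[b] + 1
                break
        else:                                    # no active BS can serve this device
            for b in reachable:
                if b not in active:
                    active.add(b)
                    assignment[ed], load[b] = b, 1
                    break
    return active, assignment
```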

4.4. Fading Variability

We considered fading variations over 24 hours and performed fading measurements following the procedure described in [13]. Note that the goal of these measurements is not to account for the fade margin for a certain percentage of availability, but for the signal fading over time at a certain location.

To account for the worst-case fading, we performed a measurement campaign at the 20 locations with the highest vehicular traffic at peak time. The fading was measured during 5 minutes at four different times of the day. The experimental system was designed to perform field strength measurements at a height of 2.9 m above ground level. The receiver is equipped with an omnidirectional antenna for UHF television signals [13]. The sampling rate is 26 Hz [13]. First, we measured the fading at all locations at peak time and at null traffic time. The maximum fading (not exceeded during more than 1% of the time throughout 5 minutes) was then measured and recorded for the worst location throughout 24 hours. Hence, the radiated power at each time stamp t (i.e., every 5 minutes) can be dynamically adjusted taking into consideration the difference between the current fading at the worst location, fm(t), and the maximum reported fading, fm_max (for which the traditional broadcasting network is planned, to guarantee a certain percentage of time availability). The instantaneous radiated power level is given by the following equation:

$$P_{rad}(t) = P_{rad}^{max} - \left( fm_{max} - fm(t) \right) \qquad (2)$$

where P_rad^max [dBm] is the maximum radiated power of the broadcast transmitter. Notice that when the instantaneous fading equals the maximum fading, the radiated power corresponds to that of the traditional broadcast network design: the radiated power cannot be reduced, because the propagation conditions correspond to the worst case considered in the network planning. Once the radiated power is dynamically adjusted according to the fading variability, it is possible to account for the power consumption of the dynamic broadcasting network (Section 4.5).
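In code, (2) is essentially a one-liner; the 1 dB quantization below anticipates the step size used in Section 5.2:

```python
def adaptive_radiated_power_dbm(p_max_dbm, fade_now_db,
                                fade_max_db=7.4, step_db=1.0):
    """Equation (2): back the radiated power off by the unused part of the
    fade margin, quantized to 1 dB steps, never exceeding the design power."""
    backoff = max(0.0, fade_max_db - fade_now_db)
    backoff = step_db * int(backoff / step_db)   # round down, i.e., radiate safely
    return p_max_dbm - backoff
```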

4.5. Broadcasting Network Power Consumption Model

To account for the power consumption of the broadcasting station, we consider four major power consumption components: the optical transceiver, the modulator, the high-power amplifier (HPA), and the cooling. We assume that the HPA has a near-linear relation between the radiated power and its power consumption [10, 35]. The instantaneous broadcasting power consumption is calculated as follows:

$$P_{BC}(t) = P_{opt} + P_{mod} + P_{cool} + \frac{P_{rad}(t)}{\eta} \qquad (3)$$

where P_BC(t) is the broadcast station total power consumption, P_opt is the optical transceiver power consumption, P_mod is the modulator power consumption, P_cool is the cooling power consumption, P_rad(t) is the radiated power, η is the power amplifier and radiation system equivalent efficiency factor, and t is the corresponding time stamp.

The energy consumption over a period of 24 hours is calculated as follows:

$$E_{24h} = \sum_{t=1}^{N} P_{BC}(t) \cdot \Delta t \qquad (4)$$

The energy consumption depends on the instantaneous power consumption at each time stamp t. For our scenario, we consider N = 288 time stamps, corresponding to a time interval Δt of 5 minutes and a total evaluated time of 24 hours.

Table 3 lists the average power consumption values from a local service provider. Note that P_rad(t) varies dynamically with the signal level data retrieved from the receivers through the IoT feedback loop (see (2)).
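The sketch below evaluates (3) and (4), reusing adaptive_radiated_power_dbm() from the sketch in Section 4.4; the fixed terms and the efficiency are placeholders, not the provider's Table 3 figures:

```python
def broadcast_power_w(p_rad_dbm, p_opt_w=50.0, p_mod_w=150.0,
                      p_cool_w=2000.0, eta=0.30):
    """Equation (3): instantaneous station power [W]; fixed terms assumed."""
    p_rad_w = 10 ** ((p_rad_dbm - 30.0) / 10.0)   # dBm -> W
    return p_opt_w + p_mod_w + p_cool_w + p_rad_w / eta

def daily_energy_kwh(fades_db, p_max_dbm):
    """Equation (4): sum over N = 288 five-minute time stamps (24 h)."""
    assert len(fades_db) == 288
    dt_h = 5.0 / 60.0
    return sum(broadcast_power_w(adaptive_radiated_power_dbm(p_max_dbm, f))
               for f in fades_db) * dt_h / 1000.0
```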

4.6. Power Consumption Models for IoT Networks

To account for the IoT network power consumption, a power consumption model has to be defined. Here, we account for both the end-devices' power consumption related to the IoT data transmission and the BSs' power consumption. Figure 5 shows the power consumption model for LoRa and for NB-IoT integrated into an LTE macrocell BS.

The power consumption of a transceiver depends on its operational mode throughout the evaluated period. The transceiver operational modes are (Figure 5): sleep, receiving, and transmitting [36]. LoRa transceivers have a power consumption in sleep mode of just 0.33 µW [37], and NB-IoT transceivers of 3 mW [18]. The power consumption in reception mode is approximately 39.6 mW for LoRa end-devices [17] and 70 mW for NB-IoT end-devices [18]. The end-device transceivers remain in receiving mode after each transmission during two receive windows and then switch to sleep mode. The power consumption of a LoRa BS transceiver in reception mode with 8 active paths is approximately 1.4 W (including the transceiver and the Digital Signal Processing Engine) [17, 28]. In transmitting mode, the transceiver power consumption is determined by the power amplifier and radiation system efficiency; for LoRa, the power amplifier efficiency is approximately 15% in the range of our application (based on power consumption per radiated signal level data) [17]. For NB-IoT, we assume the same efficiency as for state-of-the-art LTE implementations, approximately 14.6% [10].
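Combining these figures, the daily per-device transceiver energy can be sketched as below. The airtimes, the total receive-window duration, and the assumption that the transmit draw equals the radiated power divided by the PA efficiency are ours:

```python
def device_energy_j_per_day(tx_s, tx_mw, rx_s, rx_mw, sleep_mw,
                            reports_per_day=288):
    """Daily end-device transceiver energy [J] across the three modes of
    Figure 5 (sleep / receiving / transmitting)."""
    day_s = 24 * 3600
    active_s = reports_per_day * (tx_s + rx_s)
    mj = (reports_per_day * (tx_s * tx_mw + rx_s * rx_mw)
          + (day_s - active_s) * sleep_mw)            # mW * s = mJ
    return mj / 1000.0

# LoRa: ~40 ms on air, ~100 ms of rx windows, 0.33 uW sleep; 12 dBm EIRP
# at ~15% PA efficiency -> ~106 mW transmit draw (assumption).
lora_j = device_energy_j_per_day(0.04, 10 ** 1.2 / 0.15, 0.10, 39.6, 0.00033)
# NB-IoT: ~10 ms on air (assumed), 23 dBm at ~14.6% -> ~1.37 W draw,
# 70 mW rx, 3 mW sleep.
nbiot_j = device_energy_j_per_day(0.01, 10 ** 2.3 / 0.146, 0.10, 70.0, 3.0)
```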

Other power consuming components (Figure 5) in the LoRa BS are the optical backhaul and the rectifier, with a power consumption of 1.5 W [38] and 4 W [39], respectively. A macrocell LTE BS, in which we assume the NB-IoT solution is embedded, has further power consuming components, i.e., the Baseband Digital Signal Processing (100 W), the rectifier (100 W), and the optical backhaul (35 W); moreover, most macrocell BSs require a cooling system (225 W) [40]. The power consumption of the Baseband Digital Signal Processing scales with the traffic load by a factor α ∈ [0, 1], with α = 0 when there is no traffic and α = 1 for the maximum traffic supported by the BS [40].

NB-IoT uses a single PRB of the LTE macrocell BS. Hence, we assume that the IoT-related traffic represents a traffic load of α = 0.01, and the equivalent power consumption of the Baseband Digital Signal Processing is 1 W. For a fair comparison, we assume that the NB-IoT implementation does not increase the cooling, rectifier, and optical backhaul power consumption of the macrocell LTE BS. Hence, these components are not considered when accounting for the NB-IoT network power consumption.

Besides the transceiver, IoT end-devices have other power consuming components: the data acquisition system and the digital signal processing [36]. As the acquisition system and digital signal processing are already included in the digital TV receiver hardware as part of its normal functionality, we assume no additional power is required to retrieve and process the data in the STB.

4.7. Spectrum Usage Efficiency Metric and TVWS Availability

ITU-R SM.1046-2 (2006) [41] provides general criteria for the evaluation of the spectrum utilization factor and the spectrum efficiency. This recommendation defines the spectrum utilization factor as the product of the used bandwidth, time sharing, and space (i.e., the geometric/geographic space defined by the covered area, including the antenna pattern). The spectrum utilization efficiency of a certain radio communication system is defined as the useful effect obtained with the aid of the communication system, divided by the spectrum utilization factor [41]. The useful effect for a broadcast system is generally defined by the number of programs received by a certain ratio of users per total population [41]. The actual number of users per channel at a certain time stamp (the audience) is not considered. Moreover, the spectrum efficiency metric retrieves the same value even if no users are watching television (no useful effect). For instance, within a certain digital television channel, for the same effective throughput, a broadcast network with standard definition programs will be rated with a higher spectrum efficiency than a broadcast network with a single Full-High Definition (FHD) program. Therefore, to account for the spectrum utilization efficiency of the dynamic broadcast network, we propose the following metric:

$$SuE_{WL} = \frac{\alpha_A \sum_{i=1}^{t} B_i \cdot u_i}{BW \cdot t} \qquad (5)$$

where SuE_WL [bit/Hz] is the spectrum utilization efficiency. The spectrum-utilization-related parameters are the required bandwidth BW (per channel) and the total evaluated time interval t. The benefit-related parameters are the bitrate B_i provided at the instant of time i and the total audience ratio u_i of all programs broadcasted in a certain channel. Based on private interviews with a content provider, we assume a low audience ratio of 0.01 during 25% of the time, a medium audience of 0.3 during 50% of the time, and a high audience of 0.6 during 25% of the time. Although dynamic broadcast allows switching from broadcast mode to multicast mode depending on the audience rating, in this paper we investigate the broadcast mode only. The factor α_A is the coverage efficiency factor; it accounts for the covered area (A_c) and the protected area (A_p) where no other services can be deployed (e.g., TVWS), and is defined as

$$\alpha_A = \frac{A_c}{A_c + A_p} \qquad (6)$$

The coverage area of the broadcasting television can be considered part of the benefit in the spectrum efficiency metric: theoretically, an infinite number of users within the coverage area can benefit from the system. Nevertheless, A_p is considered a resource, because its SNR is not sufficient for the primary service (e.g., television) while other services (e.g., TVWS) cannot use the spectrum in this area due to interference constraints. Note that, ideally, if no protection area is needed, then A_p = 0 and α_A = 1.
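A direct reading of (5) and (6), evaluated per 5-minute time stamp; the bitrate, bandwidth, and protected area values in the example are placeholders:

```python
def sue_wl(bitrates_bps, audiences, bw_hz, a_c_km2, a_p_km2):
    """Equations (5)-(6): audience-weighted throughput per Hz, scaled by
    the coverage efficiency factor alpha_A = Ac / (Ac + Ap); one list
    entry per time stamp, equal time-stamp durations assumed."""
    alpha_a = a_c_km2 / (a_c_km2 + a_p_km2)
    benefit = sum(b * u for b, u in zip(bitrates_bps, audiences))
    return alpha_a * benefit / (bw_hz * len(bitrates_bps))

# The audience windows from the text: 0.01 (25%), 0.3 (50%), 0.6 (25%).
rates = [15e6] * 288                 # assumed DTMB payload bitrate [bps]
aud = [0.01] * 72 + [0.3] * 144 + [0.6] * 72
print(sue_wl(rates, aud, bw_hz=6e6, a_c_km2=47.0, a_p_km2=30.0))
```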

TVWS availability is usually accounted for as the number (or percentage) of channels that can be used by secondary wireless telecommunications services at a certain location (area) without harmful interference to the primary licensed service [42]. Most TVWS databases rely on 3D propagation and interference modeling and on spectrum surveys from measurement campaigns [43]. However, the major constraint is the lack of reliable real-time data to dynamically update the TVWS databases.

The benefit of the spectrum reuse for the TVWS network lies in the reduction of the protected area of the broadcast transmitter, which follows from the reduced transmitter power in the dynamic broadcasting network when the propagation conditions allow it. The reduction of radiated power is equivalent to a reduction of the protected area A_p of the transmitter. Thus, the area where the TVWS devices can reuse the channel of the broadcast transmitter increases as A_p decreases. In this paper, we determine the protected area of the broadcast transmitter based on a protection threshold of -95 dBm [43].
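How strongly A_p shrinks with the radiated power can be estimated from the distance slope of the path loss model alone (the mast height below is an assumption):

```python
import math

def ap_reduction(delta_eirp_db, h_bs_m=150.0):
    """Fractional reduction of the protected area Ap for an EIRP back-off
    of delta_eirp_db, using the Okumura-Hata distance slope [dB/decade];
    a relative-scaling sketch, independent of the absolute EIRP."""
    slope = 44.9 - 6.55 * math.log10(h_bs_m)       # ~30.7 dB/decade here
    radius_ratio = 10 ** (-delta_eirp_db / slope)  # r_new / r_design
    return 1.0 - radius_ratio ** 2                 # area scales with r^2

print(f"{ap_reduction(3.0):.0%}")  # ~36% for a 3 dB back-off, in line with
                                   # the 34% observed during 90% of the time
```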

5. Results and Discussion

This section presents the results of the network simulations and optimizations in the considered scenario.

5.1. IoT Feedback Loop Optimization

Figure 6 shows the resulting IoT feedback network coverage map for LoRa (Figure 6(a)) and NB-IoT (Figure 6(b)), using the method and scenario of Section 4. The BSs chosen to optimally satisfy the density of connected devices and the traffic are highlighted in a darker color.

The required number of BSs for the IoT network is 8 for LoRa and 9 for NB-IoT. For LoRa, the main constraint in our application is the capacity, given the large density of users and traffic (1851 end-devices/km² transmitting 7 bytes every 5 minutes) and the throughput cut-off caused by the LoRa medium access control mechanism. Theoretically, the area coverage requirement (47 km²) is satisfied with only 1 to 3 BSs (depending on the SF). Nevertheless, a larger number of BSs is required to satisfy the capacity demand for the worst-case traffic generated by the monitoring application. For this reason, the best network performance is achieved when all devices can connect with SF7 to the nearest BS (in terms of path loss), since SF7 achieves the highest bit rate and the lowest time on air. NB-IoT has a higher capacity (with the QPSK modulation scheme), but the maximum coverage per BS (~2 km) is lower than for LoRa SF7 (~3.1 km).

Notice that, both for LoRa and NB-IoT, more than 50% of the possible BS locations are never chosen. The algorithm is a heuristic: it does not guarantee the globally best set of BSs, but finds a set that satisfies the optimization objective of minimizing the BS infrastructure. On average, the LoRa and NB-IoT networks cover 99.2% and 98.7%, respectively, of the digital TV receivers providing feedback data. Both sets of BSs amply satisfy the design criterion of at least 95% coverage.

The power consumption of the IoT network accounts for the BSs' and the users' power consumption (related to the feedback data transmission) in each simulation. The power consumption of the feedback network implemented with LoRa technology is on average 71.2 W. The power consumption of NB-IoT is almost 3.7 times higher (266.9 W), because NB-IoT end-devices and BSs require a higher EIRP to communicate with each other, which results in a higher power amplifier consumption. Notice that LoRa transceivers have a power consumption 43% lower than NB-IoT transceivers. This has a major impact on the IoT feedback network power consumption due to the considerable number of end-devices in our scenario.

5.2. Dynamic Broadcasting Network EIRP and Power Consumption

First, we measured and recorded the fading over a day. Figure 7 shows the fading per 5-minute interval, not exceeded during more than 1% of the time (i.e., the 99-percentile), over 24 hours at the worst location.

The fading exceeded 7.4 dB only 1% of the time. However, it was lower than 4.4 dB during 90% of the time and lower than 2.4 dB during 50% of the time. The currently deployed broadcasting network was designed for a fade margin of 7.4 dB. As a consequence, the actual broadcasting network radiates 5 dB more than required during 50% of the time.

Therefore, the radiated power of the transmitter is adjusted in steps of 1 dB according to the actual fading at each time stamp, retrieved through the IoT feedback loop. The equivalent power consumption of the broadcasting network is calculated by means of (3). The total power consumption is obtained by adding the IoT network power consumption. Figure 8 shows the power consumption of the dynamic broadcasting network with adaptive radiated power throughout the day (including the power consumption of the IoT feedback network, for both the LoRa and NB-IoT solutions).

The total energy consumption of the traditional broadcasting network with fixed radiated power is approximately 360.3 kWh per day. The total energy consumption (during 24 h) of the dynamic broadcasting network with adaptive radiated power (see (4)) is approximately 301.6 kWh with the LoRa solution and 306.3 kWh with the NB-IoT solution. This represents a saving of approximately 15% to 16.3%. Notice that the LoRa feedback loop power consumption represents less than 1% of the total network power consumption; for NB-IoT, it is only 2% of the total power consumption.

For an actual deployment, the density of users reporting feedback data in the areas with the worst fading conditions must be large enough to keep the delay between the measurement, the data transmission through the IoT feedback loop, and the network adaptation decision at an acceptable value (i.e., at most 1% of the time without service). Otherwise, the STBs should report more frequently to avoid a coverage reduction of the broadcasting network.

5.3. Spectrum Usage and TVWS Availability

The reduction in radiated power has an additional benefit in the spectrum usage efficiency and in the additional availability of TVWS. Figure 9 shows the received signal level for the worst-case fading (99-percentile, Figure 9(a)) and for the 90-percentile fading (Figure 9(b)). Notice that Figure 9(a) also corresponds to the traditional broadcasting network design.

For a protection threshold of the broadcast transmitter of -95 dBm, the protected area decreases by 34% during 90% of the time (Figure 9), because the EIRP reduction shrinks A_p whenever the propagation conditions allow it. By default, TVWS technologies consider a channel occupied when sensing a signal level of -90 dBm or higher [44]. As a consequence, the A_p reduction is equivalent to an increase of TVWS availability.

Figure 10 shows the spectrum usage efficiency (SuE_WL) of the dynamic broadcasting network over the day for three audience rating windows. The higher the SuE_WL, the better the network performs regarding spectrum usage efficiency. The SuE_WL of the dynamic broadcasting network is compared for adaptive and fixed radiated power. We also account for the IoT feedback network spectrum usage as a bandwidth resource in the SuE_WL metric. For LoRa, 8 channels of 125 kHz are required, while the NB-IoT architecture allows reusing 4 channels of 180 kHz. Hence, a slight difference in spectrum usage efficiency between the LoRa and NB-IoT solutions is expected.

The spectrum usage efficiency of the dynamic broadcasting network with the LoRa feedback solution in Figure 10 is on average 32% higher than that of traditional broadcasting, and 35% higher with the NB-IoT solution. This difference is 5% larger at night (18:00 to 00:00) due to a slightly lower average fading and, hence, radiated power. The best propagation conditions occur in the early morning (00:00 to 06:00), when the average radiated power is 5 dB lower. However, the spectrum usage efficiency is then the worst of the three evaluated windows because of the low audience ratio (0.01). Hence, a dynamic switch to multicast would be needed to increase the spectrum usage efficiency in that period.

5.4. IoT Platform Deployment Cost Considerations

For the network deployment cost (CAPEX), we consider only the BS infrastructure required to deploy the feedback network; we do not account for the end-device cost. A single LoRa BS has a deployment cost of up to $1,000 USD and a single NB-IoT BS of $15,000 USD [24], considering the reuse of LTE infrastructure. The difference in infrastructure is not significant (one BS more for NB-IoT), but the deployment cost for NB-IoT is considerably higher: the total LoRa BS infrastructure cost is just $8,000 (8 BSs), whereas the total NB-IoT deployment cost is $135,000 (9 BSs, approximately 16.8 times higher).

6. Conclusions

In this paper, we investigated the feasibility of an IoT-based management platform that provides real-time feedback data to improve the spectrum usage efficiency and the power consumption of a broadcasting network with adaptive radiated power. A novel network architecture including an IoT feedback loop is presented. The IoT feedback network is designed and optimized for minimal power consumption and infrastructure. In addition, to assess the improvement in spectrum usage, a novel spectrum usage efficiency metric is proposed.

The radiated power of the broadcasting network is dynamically adjusted taking into consideration the instantaneous propagation conditions in a realistic scenario. The use of an IoT feedback network allowed reducing the broadcasting network power consumption by 15% to 16.3% and increasing the spectrum usage efficiency by 32% to 35%, depending on the IoT technology platform (LoRa or NB-IoT). In addition, the TVWS channel availability is increased by 34% during 90% of the time.

Future research will consist of the emulation of the dynamic broadcasting network with an IoT feedback loop, in a real scenario.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

R. Martinez Alonso is supported by LACETEL and a doctoral grant from the Special Research Fund (BOF) of Ghent University, Belgium.