Abstract

The whale optimization algorithm (WOA) is a popular swarm intelligence algorithm which simulates the hunting behavior of humpback whales. WOA has the deficiency of easily falling into local optimal solutions. In order to overcome this weakness, a modified variant of WOA called OCDWOA is proposed. Four main operators are introduced into OCDWOA to enhance the search performance of WOA: an opposition-based learning method, a nonlinear parameter design, a density peak clustering strategy, and differential evolution. The proposed algorithm is tested on 19 optimization benchmark functions and a seismic inversion problem. OCDWOA is compared with the classical WOA and three typical variants of WOA. The results demonstrate that OCDWOA outperforms the compared algorithms in terms of obtaining the global optimal solution.

1. Introduction

In recent years, nature-inspired algorithms have attracted much attention from researchers. Swarm intelligence (SI) algorithms are a main kind of nature-inspired algorithms; they are designed by simulating the behaviors of animals or plants. Widely used SI algorithms include particle swarm optimization (PSO) [1], ant colony optimization (ACO) [2], the artificial bee colony algorithm (ABC) [3], grey wolf optimization (GWO) [4], the wolf pack algorithm (WPA) [5], brain storm optimization [6], and the whale optimization algorithm (WOA) [7].

The whale optimization algorithm (WOA) was proposed in 2016. WOA searches for the global optimal solution by simulating the foraging behaviors of whales, including encircling prey, searching for prey, and attacking prey. Many studies about WOA have been presented. Generally, these studies can be classified into two categories. One is applying the WOA to solve practical issues. The other is improving the performance of WOA with some strategies. A modified WOA named chaotic WOA (CWOA) [8] was proposed by Kaur and Arora in 2017. The CWOA introduces chaos theory into WOA; the main parameter of WOA is tuned by a chaotic map to control exploration and exploitation. In [9], a Lévy flight trajectory-based WOA (LWOA) is presented. The Lévy flight trajectory is used to increase the diversity of the population and enhance the ability of jumping out of local optimal solutions. The balanced variant of WOA (BWOA) proposed in [10] introduced Lévy flight and chaotic local search into WOA to enhance the exploratory and neighborhood-informed capacities of the conventional technique. Chen [11] proposed a modified WOA based on adaptive convergence and Lévy features (IWOA). The adaptive convergence operator is used to balance the local and global optimization abilities, and the Lévy flight mechanism is introduced to improve the global searching ability. For solving large-scale global optimization problems, a modified self-adaptive WOA (MWOA) [12] was proposed. A nonlinear dynamic strategy is applied to update the control parameter and balance the exploration and exploitation abilities, the Lévy-flight strategy is utilized to enhance the search performance, and a quadratic interpolation method is adopted to improve the exploitation ability of the algorithm. For solving high-dimensional function optimization problems, a WOA based on Lamarckian learning (WOALam) [13] introduced the evolutionary theory of Lamarck to enhance the local search. Trivedi et al. [14] proposed an adaptive whale optimization algorithm (AWOA) with a randomization technique. The results show that the performance of AWOA is better than that of the standard WOA. Sun et al. proposed a whale optimization algorithm based on quadratic interpolation (QIWOA) [15]. A modified exploration process and quadratic interpolation are used to improve the exploitation ability and the search accuracy. A reinforced variant of WOA named RDWOA [16] was proposed in 2019. A strategy of random spare or random replacement is used to enhance the convergence speed. In addition, a strategy of double adaptive weight is introduced to improve the exploration ability in the early stage and the exploitation ability in the later stage. Trivedi et al. [17] proposed a hybrid PSO-WOA. The PSO operator is used in the exploitation phase, and the basic WOA operator is adopted to explore in uncertain environments. The experimental results reveal the effectiveness of PSO-WOA compared to the standard PSO and WOA. Wei et al. [18] proposed a modified WOA based on different searching paths and perceptual disturbance (PDWOA). Several spiral curves have been tested, and the results show that the equal-pitch Archimedean spiral curve is more capable than the other types of spiral curves. Mafarja and Mirjalili [19] proposed a hybrid SA-WOA algorithm. In order to enhance the exploitation of WOA, the simulated annealing (SA) algorithm is introduced to search the most promising regions located by the basic WOA.
An improved whale optimization algorithm (IWOA) [20] was proposed in 2016, in which an inertia weight is introduced as a new parameter to adjust the best solution in each iteration. Abdel-Basset et al. [21] proposed a memetic algorithm combining the WOA with Tabu Search for solving quadratic assignment problems.

In addition, WOA has been applied to many engineering problems. Oliva et al. [22] used an improved chaotic whale optimization algorithm to estimate the parameters of solar cells. The design problem involves the solution of complex nonlinear and multimodal objective functions, and the proposed algorithm solves it with chaotic maps and automatically adapted parameters. Zhang et al. [23] proposed an enhanced whale optimization algorithm (EWOA) to optimize the flow shop construction scheme. The enhanced WOA with Lévy flight is used to improve the robustness and efficiency of the existing construction stage and zone optimization approaches. Bentouati et al. [24] proposed a new variant of WOA combined with the pattern search (PS) algorithm. The results prove the effectiveness and the performance of the proposed algorithm for solving the optimal power flow problem. Abd Elaziz and Oliva [25] proposed an enhanced WOA based on opposition (OBWOA). The opposition-based learning is used to enhance the exploration of the search space. OBWOA shows good exploration ability on different benchmark optimization functions and has been applied to estimate three different diode models of solar cells. Aljarah et al. [26] applied WOA to neural network training. The proposed algorithm has been tested on 20 datasets, and the results showed that it can obtain the best solutions and outperform the other compared algorithms in convergence speed. Medani et al. [27] proposed a modified WOA to solve the optimal reactive power dispatch problem. The results show that the proposed algorithm is more desirable than the two compared algorithms.

Prestack seismic inversion is an important seismic inversion technology for solving the problems of reservoir lithology and fluid prediction in oil and gas exploration and development. Physical properties obtained by general prestack inversion methods provide many detailed subsurface characteristics, but the method is not yet mature and suffers from limited resolution, low speed, and poor stability. Currently, this technology is in the research stage, and some inevitable technical problems should be solved before its large-scale application [28, 29].

Many studies on solving prestack inversion problems with optimization algorithms have been carried out. Xia et al. [30] and Sen [31] improved the speed and stability of prestack seismic waveform inversion by using the step inversion method. Simulated annealing (SA) and the genetic algorithm (GA) have been successfully used in prestack seismic waveform inversion. Ingber and Rosen [32] conducted a comparative study on SA and GA. The experimental results proved that both algorithms could achieve global optimization and that very fast simulated annealing (VFSA) was the fastest. Liu et al. [33] proposed a new adaptive genetic algorithm for seismic inversion to overcome the slow convergence rate and the tendency to fall into local optimal solutions. Wu et al. [34] proposed a parameter inversion method based on an improved differential evolution algorithm. The proposed method was effective in solving parameter inversion problems of prestack seismic data.

WOA has a good convergence rate, but its performance in finding the global optimal solution of multimodal problems with many local optimal solutions is not ideal. In order to enhance the global search ability and solve the prestack seismic waveform inversion problem, a modified whale optimization algorithm named OCDWOA is proposed in this paper. Four operators are used in the proposed algorithm. A nonlinear parameter design is introduced to balance the ratio of the exploitation phase and the exploration phase. A density peak clustering strategy is adopted to decompose the population into a number of groups for enhancing the diversity of the population. The opposition-based learning method and differential evolution are used to improve the search performance. Experiments on 19 optimization benchmark functions and a seismic inversion problem indicate the effectiveness of the proposed algorithm. OCDWOA performs outstandingly in these experiments, and the results demonstrate that it can solve the seismic inversion problem successfully.

The main contributions are summarized as follows:
(1) The opposition-based learning method adopted by OCDWOA can prevent the solutions from being trapped in local optima, which improves the global search ability.
(2) The nonlinear design of the parameter a in OCDWOA achieves global exploration in the early stage and local exploitation in the late stage.
(3) The density peak clustering strategy is adopted to decompose the population into a number of groups, which enhances the diversity of the population.
(4) The proposed OCDWOA has an outstanding performance in solving the seismic inversion problem, which verifies its applicability to practical engineering problems.

The rest of this paper is organized as follows. Section 2 describes the WOA algorithm. Section 3 explains the proposed algorithm and the operators. The experimental results and comparisons are shown in Section 4. Finally, in Section 5, the conclusions are presented.

2. Whale Optimization Algorithm

WOA is a kind of population-based meta-heuristic algorithm. WOA simulates the hunting mode of humpback whales, which includes encircling prey, bubble-net attacking method, and search for prey.

2.1. Encircling Prey

WOA supposes that the optimal solution with the best fitness in the current population is the prey. All the other solutions encircle the prey and update themselves toward the optimal solution. The mathematical model of the encircling behavior is as follows:

D = |C · X*(t) − X(t)|, (1)

X(t + 1) = X*(t) − A · D, (2)

where t indicates the number of the current iteration, X is the current solution, X* is the best solution in the population, and A and C are the coefficient vectors calculated as follows:

A = 2a · r − a, (3)

C = 2r, (4)

where a is linearly decreased from 2 to 0 over the iterations and r is a random vector in [0, 1].

2.2. Bubble-Net Attacking Method

In order to simulate the bubble-net attacking behavior, a shrinking encircling mechanism and a spiral updating position are designed as follows.

2.2.1. Shrinking Encircling Mechanism

The mathematical model of the shrinking encircling mechanism is defined as equations (1) and (2). Because a is linearly decreased from 2 to 0, the value of A in this phase lies in the range [−1, 1]. The current solution can thus update itself toward the best solution with a random step.

2.2.2. Spiral Updating Position

The new position of the current solution, updated with the best solution along a spiral path, is obtained as follows:

X(t + 1) = D′ · e^(bl) · cos(2πl) + X*(t), (5)

D′ = |X*(t) − X(t)|, (6)

where b is a constant defining the shape of the logarithmic spiral and l is a randomly generated number in the range [−1, 1]. To balance the two models, a selection threshold of 0.5 on a parameter p is set, so there is a probability of 50% of choosing either the shrinking encircling mechanism or the spiral updating position model. The final mathematical model of bubble-net attacking is as follows:

X(t + 1) = X*(t) − A · D, if p < 0.5, (7)

X(t + 1) = D′ · e^(bl) · cos(2πl) + X*(t), if p ≥ 0.5,

where p is a random number in [0, 1].

2.3. Search for Prey

In nature, whales move away from each other and randomly search for new prey. In order to simulate this behavior, a random search operation is used in the exploration phase when |A| ≥ 1. The current solution is updated with a random solution in the current population instead of the best solution. The mathematical model is as follows:

D = |C · Xrand − X|, (8)

X(t + 1) = Xrand − A · D, (9)

where Xrand is a random solution in the current population.
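To make the update rules concrete, the following minimal Python sketch combines equations (1)–(9) into a single per-solution update. It is an illustrative implementation, not the authors' code; in particular, A and C are drawn as scalars here for simplicity, whereas the paper treats them as vectors.

import numpy as np

def woa_update(X, X_best, X_rand, a, b=1.0):
    """One WOA position update for a solution X (1-D NumPy array)."""
    A = 2.0 * a * np.random.rand() - a      # equation (3): A lies in [-a, a]
    C = 2.0 * np.random.rand()              # equation (4)
    p = np.random.rand()                    # selects encircling/search vs. spiral
    if p < 0.5:
        if abs(A) < 1:                      # exploitation: shrinking encircling, equations (1)-(2)
            D = np.abs(C * X_best - X)
            return X_best - A * D
        D = np.abs(C * X_rand - X)          # exploration: search for prey, equations (8)-(9)
        return X_rand - A * D
    l = np.random.uniform(-1.0, 1.0)        # spiral updating position, equations (5)-(6)
    D_prime = np.abs(X_best - X)
    return D_prime * np.exp(b * l) * np.cos(2.0 * np.pi * l) + X_best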

The pseudocode of the WOA is shown as Algorithm 1.

Input: Population size N
The halting criterion
Output: The best solution
Generate the initial population Xi (i = 1, 2, …, N)
(1)Calculate the fitness for each solution in the population
(2)Find the best solution with the best fitness
(3)while the halting criterion is not satisfied do
(4) Update a
(5)for i = 1 to N do
(6)  Update A, C, l and p
(7)  for j = 1 to m do
(8)   if (p < 0.5) then
(9)    if (|A| < 1) then
(10)     D = |C · X*j − Xi,j|
(11)     Xi,j = X*j − A · D
(12)    else if (|A| ≥ 1) then
(13)     Select a random solution Xrand
(14)     D = |C · Xrand,j − Xi,j|
(15)     Xi,j = Xrand,j − A · D
(16)    end if
(17)   else if (p ≥ 0.5) then
(18)    D′ = |X*j − Xi,j|
(19)    Xi,j = D′ · e^(bl) · cos(2πl) + X*j
(20)   end if
(21)  end for
(22)  Evaluate the fitness for Xi
(23)end for
(24) Find the best solution
(25)end while

3. The Proposed Approach

This section introduces a modified WOA named OCDWOA. Four strategies are introduced into the WOA. The opposition-based learning (OBL) method is used to initialize the population and to randomly update solutions in each iteration. The nonlinear parameter design is introduced to balance the ratio of the exploitation phase and the exploration phase. The density peak clustering strategy is adopted to decompose the population into a number of groups for enhancing the diversity of the population. The role of differential evolution (DE) is to exchange information between groups and improve the ability of jumping out of local optimal solutions.

3.1. OCDWOA

The pseudocode of the OCDWOA algorithm is shown as Algorithm 2.

Opposition-based learning is used to initialize the population (see lines 1–6 of Algorithm 2) and to generate and evaluate opposition solutions at the end of each iteration (see lines 39–46 of Algorithm 2). The fitness of each solution in the population is calculated and the best solution is found (see lines 7 and 8 of Algorithm 2). The parameters a, CR, A, C, l, and p are updated (see lines 10 and 13 of Algorithm 2). The population is decomposed into k groups by the density peak clustering algorithm (see line 11 of Algorithm 2). The mutation operator of DE is introduced into the proposed algorithm (see lines 20–23 of Algorithm 2). A solution Xi is updated by r1 and r2 under the mutation operator, where r1 is from the group including Xi and r2 is from the groups excluding Xi (see lines 14 and 15 of Algorithm 2). With the increase of iterations, the operation probability of DE declines. The three basic operators of WOA are used in lines 25-26, 29–31, and 34-35, respectively: the encircling prey operator in lines 25-26, the search for prey in lines 29–31, and the spiral updating position in lines 34-35. A minimal Python sketch of this main loop is given after Algorithm 2.

Input: Population size N
   Group size k
   The halting criterion
Output: The best solution
(1)Generate the initial population Xi (i = 1, 2, …, N)
(2)for i = 1 to N do
(3)if (rand < 0.6)
(4)  Generate new solution with OBL
(5)end if
(6)end for
(7)Calculate the fitness for each solution in the population
(8)Find the best solution with the best fitness
(9)while the halting criterion is not satisfied do
(10) Update a and CR
(11) Decompose population into k groups by density peak clustering algorithm
(12)for i = 1 to N do
(13)  Update A, C, l and p
(14)  R1 is the group with Xi and R2 are the groups without Xi
(15)  Select randomly r1 from R1 and r2 from R2
(16)  jrand = randint (1, m)
(17)  for j = 1 to m do
(18)   if (p < 0.5) then
(19)    if (|A| < 1) then
(20)     if (rand(0, 1) ≤ CR and j == jrand) then
(21)      Update F
(22)      Select randomly r1 from R1 and r2 from R2
(23)      Generate Xi,j with the DE mutation operator using Xr1 and Xr2
(24)     else
(25)      D = |C · X*j − Xi,j|
(26)      Xi,j = X*j − A · D
(27)     end if
(28)    else if (|A| ≥ 1) then
(29)     Select a random solution Xrand
(30)     D = |C · Xrand,j − Xi,j|
(31)     Xi,j = Xrand,j − A · D
(32)    end if
(33)   else if (p ≥ 0.5) then
(34)    D′ = |X*j − Xi,j|
(35)    Xi,j = D′ · e^(bl) · cos(2πl) + X*j
(36)   end if
(37)  end for
(38)end for
(39)for i = 1 to N do
(40)  if (rand < 0.6)
(41)   Generate new solution Oi with OBL
(42)   if Oi is better than Xithen
(43)    Xi = Oi
(44)   end if
(45)  end if
(46)end for
(47) Find the best solution with the best fitness
(48)end while
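The following self-contained Python sketch mirrors the flow of Algorithm 2. It is illustrative only: random grouping stands in for density peak clustering, the schedules of a and CR and the DE mutation form are simplified assumptions, and the function and parameter names are hypothetical.

import numpy as np

def ocdwoa_sketch(fitness, lb, ub, dim, n=30, k=4, max_iter=500):
    """Illustrative OCDWOA-style main loop; stand-ins replace several operators of the paper."""
    X = lb + np.random.rand(n, dim) * (ub - lb)
    for i in range(n):                              # opposition-based initialization (lines 1-6)
        if np.random.rand() < 0.6:
            X[i] = ub + lb - X[i]
    fit = np.array([fitness(x) for x in X])
    best = X[fit.argmin()].copy()
    for t in range(max_iter):
        a = 2.0 * np.exp(-3.0 * t / max_iter)       # assumed nonlinear schedule for a
        CR = 0.9 * np.exp(-3.0 * t / max_iter)      # assumed decaying DE crossover probability
        groups = np.random.randint(0, k, n)         # stand-in for density peak clustering (line 11)
        for i in range(n):
            A = 2.0 * a * np.random.rand() - a
            C = 2.0 * np.random.rand()
            p, l = np.random.rand(), np.random.uniform(-1.0, 1.0)
            if p < 0.5 and abs(A) < 1:
                same = np.setdiff1d(np.where(groups == groups[i])[0], [i])
                other = np.where(groups != groups[i])[0]
                if np.random.rand() <= CR and same.size and other.size:
                    r1, r2 = X[np.random.choice(same)], X[np.random.choice(other)]
                    X[i] = X[i] + 0.5 * (r1 - r2)   # assumed DE-style mutation between groups
                else:
                    X[i] = best - A * np.abs(C * best - X[i])                 # encircling prey
            elif p < 0.5:
                Xr = X[np.random.randint(n)]
                X[i] = Xr - A * np.abs(C * Xr - X[i])                         # search for prey
            else:
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2.0 * np.pi * l) + best  # spiral
            X[i] = np.clip(X[i], lb, ub)
        for i in range(n):                          # opposition-based refresh (lines 39-46)
            if np.random.rand() < 0.6:
                O = ub + lb - X[i]
                if fitness(O) < fitness(X[i]):
                    X[i] = O
        fit = np.array([fitness(x) for x in X])
        if fit.min() < fitness(best):
            best = X[fit.argmin()].copy()
    return best

# Example use: best = ocdwoa_sketch(lambda x: float(np.sum(x ** 2)), -100.0, 100.0, dim=30)
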
3.2. Opposition-Based Learning Method

It has been shown that the opposition-based learning (OBL) method proposed by Rahnamayan et al. [35] is helpful for avoiding being trapped in local optimal solutions.

A solution Xi in the d-dimensional search space is denoted as Xi = {Xi,1, Xi,2, Xi,3, …, Xi,d}. The opposition point Oi of Xi can be calculated as

Oi,j = Xub,j + Xlb,j − Xi,j, j = 1, 2, …, d,

where Xub,j and Xlb,j denote the upper and lower bounds of the j-th dimension.

OBL is used in the initialization step and in the solution update step at the end of each iteration. In the initialization step, OBL generates new solutions on the opposite side of the search space and enhances the diversity of the population. In the solution update step, OBL generates a new solution on the opposite side and keeps the better one between the new solution and the original solution. In this way, OBL helps WOA improve the search performance and jump out of local optimal solutions.
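A minimal Python sketch of the OBL operator, under the assumption that solutions are NumPy arrays and a lower fitness value is better:

import numpy as np

def opposition(X, lb, ub):
    """Element-wise opposition point O = ub + lb - X within the bounds [lb, ub]."""
    return ub + lb - X

def obl_refresh(X, lb, ub, fitness):
    """Keep the better of a solution and its opposition point (minimization assumed)."""
    O = opposition(X, lb, ub)
    return O if fitness(O) < fitness(X) else X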

3.3. Nonlinear Parameter Design

An exponential function is introduced into the modified WOA. The value of the parameter a controls the proportion of the exploitation phase and the exploration phase. In the nonlinear design [36], a decreases as an exponential function of the iteration number, where amin and amax denote the minimum and maximum of a, t is the number of the current iteration, and Max_iter is the maximum number of iterations.

A large value of a generates a large step size, which is helpful for the global exploration in the early stage, while a small value of a yields a small step size, which benefits the local exploitation in the later stage.
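Since the exact formula of [36] is not reproduced here, the following sketch only illustrates the intended behavior with an assumed exponential schedule (close to amax early, close to amin late):

import numpy as np

def nonlinear_a(t, max_iter, a_min=0.0, a_max=2.0, decay=5.0):
    """Illustrative exponential decay of a from a_max (exploration) toward a_min (exploitation)."""
    return a_min + (a_max - a_min) * np.exp(-decay * t / max_iter)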

3.4. Density Peak Clustering

A density-based clustering strategy named the density peak clustering algorithm [37] is used in the proposed WOA. The density peak clustering algorithm defines two important quantities for each solution i: the density ρi and the distance δi. A scatter graph called the decision diagram is constructed from these two quantities; it is used to select the clustering centers, and the remaining solutions are then assigned to the clustering centers.

The density ρi of the i-th solution is computed from the pairwise distances of the solutions in the current population, where dij is the Euclidean distance between the i-th and j-th solutions (i ≠ j) and dc is a preset cut-off distance. A moderate value of dc should make the average number of neighbors around 1% to 2% of the population size.

The distance δi denotes the minimum distance between the i-th solution and any j-th solution whose local density is higher than that of the i-th solution:

δi = min{dij : ρj > ρi}.

For the solution with the highest density, δi is set to the maximum distance from it to any other solution.

Figure 1(a) shows the distribution of 28 solutions in two dimensions, and Figure 1(b) is the decision diagram obtained from these 28 solutions. In Figure 1(b), the horizontal axis is ρ and the vertical axis is δ. The solutions with both large ρ and large δ are defined as the clustering centers. Accordingly, the 1st and 10th solutions, with both large density and large distance, are selected as the clustering centers. The 26th, 27th, and 28th solutions have large distance but low density, and the 7th and 8th solutions have large density but low distance, so these solutions cannot be selected as the clustering centers.

According to the rules for selecting the clustering centers, a modified measure of the quality of solutions, combining the density ρ and the distance δ, is designed to rank candidate centers.

In this paper, a modified model of the cut-off distance dc is introduced [38]. A transition distance Dis is first calculated for the current population.

The cut-off distance dc is then obtained from Dismax and Dismin, the maximum and minimum of the transition distances in the current iteration.
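The following Python sketch shows how such a grouping can be realized. It uses the standard cut-off kernel density, a fixed neighbor-fraction rule for dc, ρ·δ as the center-quality measure, and nearest-center assignment; these are assumptions standing in for the paper's modified quality measure and adaptive dc.

import numpy as np

def density_peak_groups(X, k, neighbor_frac=0.02):
    """Split a population X (n x d) into k groups in the spirit of density peak clustering."""
    n = X.shape[0]
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)       # pairwise Euclidean distances
    upper = np.sort(d[np.triu_indices(n, 1)])
    dc = upper[int(neighbor_frac * len(upper))]                     # ~1-2% of pairs closer than dc
    rho = (d < dc).sum(axis=1) - 1                                  # cut-off kernel density
    delta = np.empty(n)
    for i in range(n):
        higher = np.where(rho > rho[i])[0]                          # solutions with higher density
        delta[i] = d[i, higher].min() if higher.size else d[i].max()
    centers = np.argsort(rho * delta)[-k:]                          # k solutions with largest rho*delta
    labels = np.array([centers[np.argmin(d[i, centers])] for i in range(n)])
    return labels, centers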

3.5. Differential Evolution

To enhance the abilities of searching and jumping out of local optimal solutions, the DE operator is introduced in this paper. In the crossover and mutation operators of differential evolution (DE) [39], Xr1,j and Xr2,j denote the j-th dimensions of the r1-th and r2-th solutions, respectively, U is the offspring, and F is the zoom factor. The crossover probability CR is updated as an exponentially decreasing function of the iteration number.
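The exact mutation and CR formulas of OCDWOA are not reproduced above, so the sketch below uses common DE building blocks as assumptions: a current-plus-difference mutant built from Xr1 and Xr2, binomial crossover with a guaranteed dimension jrand, and an assumed exponential decay for CR.

import numpy as np

def de_trial(x_i, x_r1, x_r2, F, CR):
    """DE-style offspring U: binomial crossover between x_i and an assumed mutant."""
    d = x_i.size
    mutant = x_i + F * (x_r1 - x_r2)                 # assumed mutation form
    j_rand = np.random.randint(d)                    # at least one dimension comes from the mutant
    mask = np.random.rand(d) <= CR
    mask[j_rand] = True
    return np.where(mask, mutant, x_i)

def decaying_cr(t, max_iter, cr_max=0.9, cr_min=0.1, decay=3.0):
    """Illustrative exponential decay of the crossover probability CR over the iterations."""
    return cr_min + (cr_max - cr_min) * np.exp(-decay * t / max_iter)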

4. Experiment Results and Discussion

4.1. Parameter Settings and Benchmark Functions

In this section, in order to analyze the performance of the proposed algorithm, 19 benchmark functions [40], F1–F19, have been used. The expressions of all the functions are given in Table 1. F1–F7 are unimodal functions with only one global optimal solution. F8–F12 are multimodal functions with many local optimal solutions; these functions test the ability of jumping out of local optimal solutions. F13–F19 are fixed-dimension multimodal functions, which can evaluate the convergence property of an algorithm.

The proposed OCDWOA is compared with the classical WOA and its typical variants OWOA [41], OEWOA [42], and IWOA [43]. The parameter b is set to 1 for all the algorithms. The parameter a decreases linearly from 2 to 0 for WOA, OWOA, and IWOA, while for OEWOA it decreases nonlinearly from 2 to 0. For IWOA, the crossover probability CR is set to 0.9, and the zoom factor F is random within (0.2, 0.8).

For all the algorithms, population size is 30 and maximum iteration is 500. The experiment is repeated 30 times for each benchmark function.

4.2. Experiment Results

The statistics of the experimental results are shown in Table 2, which gives the best fitness, mean fitness, and standard deviation of OCDWOA and each compared algorithm.

For F6, F9, F11, F12, F13, F15, F17, and F18, the performance of OCDWOA is clearly the best. For F14, the results obtained by all the variants of WOA are comparable. For F1, F2, F3, F4, F5, F10, and F16, OCDWOA obtains the second rank. For the simple functions F1 and F2, without any local optimal solution, OCDWOA is worse than OEWOA: the DE operator used in OCDWOA enhances the search performance, but it reduces the convergence speed even though the operation probability of DE declines over the iterations. For the nonseparable unimodal functions F3, F4, and F5, OCDWOA is only worse than IWOA. For the multimodal function F16, the result reveals the positive influence of DE.

For F7 and F8, the performance of OCDWOA is similar to that of OWOA and OEWOA. For F10, OCDWOA gets the second rank. For F8 and F10, IWOA can obtain the optimal solution; however, IWOA fails in solving F7, and its mean value and standard deviation are also not ideal. This shows that the DE operator is helpful for jumping out of local optimal solutions, but the stability may be impaired on multimodal functions with few or simple local optimal solutions. In OCDWOA, the clustering operator and the decreasing operation probability of DE reduce this influence of DE. The DE operator improves the ability of jumping out of local optimal solutions, the DE operator and OBL enhance the search performance of the algorithm, the dynamic adjustment of a avoids premature convergence, and the clustering operator enhances the information exchange between solutions.

The results analyzed by the Wilcoxon rank-sum test are shown in Table 3. In the Wilcoxon rank-sum test, the significance level is set to 0.05. In Table 3, “+,” “−,” and “=” indicate that the result of the proposed algorithm OCDWOA is significantly better than, significantly worse than, and statistically similar to that of the compared algorithm, respectively. For most test functions, the performance of OCDWOA is significantly better than or statistically similar to that of the other variants of WOA.
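For reference, such a pairwise comparison can be carried out with SciPy; the data below are random placeholders, not the paper's results.

import numpy as np
from scipy.stats import ranksums

runs_a = np.random.rand(30) * 1e-3          # e.g., 30 best-fitness values of OCDWOA
runs_b = np.random.rand(30) * 1e-2          # e.g., 30 best-fitness values of a compared algorithm
stat, p_value = ranksums(runs_a, runs_b)
if p_value >= 0.05:
    mark = "="                              # statistically similar
elif runs_a.mean() < runs_b.mean():
    mark = "+"                              # OCDWOA significantly better (minimization)
else:
    mark = "-"                              # OCDWOA significantly worse
print(p_value, mark)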

The convergence curves and boxplots are shown in Figures 2 and 3, respectively.

In Figure 2, for most functions, the convergence curves of OCDWOA are better than or statistically similar to those of the other variants of WOA. For F3 and F4, the convergence curves of OCDWOA are more stable than those of IWOA, but the convergence rates of OCDWOA are slower. In OCDWOA, the solutions selected in the crossover and mutation operators of DE are based on the clustering; the clustering operator reduces the randomness of the DE operator, so the stability of OCDWOA is improved.

In Figure 3, for most functions, the stability of OCDWOA is better than or statistically similar to that of the other variants of WOA. Only for F3 and F4 is the stability of OCDWOA worse than that of IWOA.

4.3. Application of Seismic Inversion Problems
4.3.1. Seismic Inversion Problem

Seismic inversion is a key technology of quantitative reservoir prediction, and prestack waveform inversion [44] is a newly developed seismic inversion technique. In prestack waveform inversion, the complex wave propagation effect is considered and the complete waveform information is fully used. The method assumes that the layered structure of the underground medium at each CMP/CDP point is locally one-dimensional (horizontally layered) and forward-models the prestack data with the reflectivity method [45]. It then searches for the underground model whose synthetic data best fit the actual prestack seismic data through the genetic algorithm, simulated annealing, and other global optimization algorithms [46]. At present, many problems in prestack waveform inversion technology need to be further studied, and its resolution needs to be improved. The optimization algorithms currently used in prestack waveform inversion are mainly traditional intelligent optimization algorithms, which are prone to premature convergence and local optimal solutions when dealing with the large data volume, highly nonlinear, multiparameter, and multiextremum prestack waveform inversion problem.

In this paper, a theoretical geological model with 11 layers is established [47]. Each layer in the model contains 3 parameters: velocity of longitudinal wave (VP), velocity of shear wave (VS), and density (ρ). The values of model are shown in Table 4.

In this paper, the reflection coefficients of the model are calculated by the Zoeppritz equation, in which VP1, VS1, and ρ1 are the parameters of the upper layer, VP2, VS2, and ρ2 are the parameters of the lower layer, θ1 and θ2 are the incident angle and the refraction angle of the P-wave, respectively, and Φ1 and Φ2 are the incident angle and the refraction angle of the S-wave, respectively.

Φ1, θ2, and Φ2 are obtained from Snell's law as follows:

Φ1 = arcsin(VS1 · sin θ1/VP1),

θ2 = arcsin(VP2 · sin θ1/VP1),

Φ2 = arcsin(VS2 · sin θ1/VP1).
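The full Zoeppritz matrix is not reproduced in this text. As an illustrative substitute, the sketch below computes an approximate P-P reflection coefficient with the widely used Aki–Richards linearization; the paper itself uses the exact Zoeppritz equations.

import numpy as np

def refl_coeff_aki_richards(vp1, vs1, rho1, vp2, vs2, rho2, theta1_deg):
    """Approximate P-P reflection coefficient (Aki-Richards), not the exact Zoeppritz solution."""
    theta1 = np.radians(theta1_deg)
    theta2 = np.arcsin(np.clip(vp2 * np.sin(theta1) / vp1, -1.0, 1.0))   # Snell's law
    theta = 0.5 * (theta1 + theta2)                                      # average angle
    vp, vs, rho = 0.5 * (vp1 + vp2), 0.5 * (vs1 + vs2), 0.5 * (rho1 + rho2)
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2 * np.sin(theta) ** 2
    return (0.5 * (1.0 - 4.0 * k) * drho / rho
            + 0.5 / np.cos(theta) ** 2 * dvp / vp
            - 4.0 * k * dvs / vs)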

The incident angles θ1 of the 50 reflection coefficients range from 1° to 50°. The 50 seismic records are obtained by convolving the reflection coefficients with a 35 Hz Ricker wavelet:

S = R ∗ W,

where S is the seismic record, R is the reflection coefficient series, W is the Ricker wavelet (35 Hz), and ∗ denotes convolution.
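A minimal Python sketch of this forward step; the wavelet length and sampling interval are illustrative assumptions.

import numpy as np

def ricker(freq=35.0, dt=0.001, length=0.128):
    """Ricker wavelet with peak frequency freq (Hz)."""
    t = np.arange(-length / 2.0, length / 2.0, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_record(reflectivity, freq=35.0, dt=0.001):
    """Seismic record S = R * W: convolve a reflectivity series with the Ricker wavelet."""
    return np.convolve(reflectivity, ricker(freq, dt), mode="same")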

The 50 seismic records are divided into three groups: a small incident angle group (Group 1), a medium incident angle group (Group 2), and a large incident angle group (Group 3). Group 1 consists of the seismic records whose incident angles are from 1° to 15°, Group 2 of those from 16° to 32°, and Group 3 of those from 33° to 50°. The superimposed seismic records S1, S2, and S3 are obtained by stacking the three groups of seismic records, respectively. The superimposed seismic records S1(X), S2(X), and S3(X) of a candidate solution X are obtained in the same way as those of the model.

The fitness of the solution X measures the misfit between the superimposed records S1(X), S2(X), and S3(X) of the solution and the records S1, S2, and S3 of the model; the smaller the fitness, the better the solution.
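The paper's exact fitness expression is not reproduced above; the sketch below assumes a simple least-squares misfit summed over the three angle groups.

import numpy as np

def inversion_fitness(model_records, candidate_records):
    """Assumed least-squares misfit between the model records (S1, S2, S3) and those of X."""
    return sum(float(np.sum((s_obs - s_syn) ** 2))
               for s_obs, s_syn in zip(model_records, candidate_records))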

4.3.2. Experiment Results

The experiment is repeated 10 times for each algorithm. Table 5 shows the best fitness, mean fitness, and standard deviation of the test results on the seismic inversion problem. The convergence curves and boxplots are shown in Figure 4. From the test results, the performance of OCDWOA on the seismic inversion problem is the best.

In Table 5, the best fitness, mean fitness, and standard deviation of OCDWOA are all obviously smaller than those of the comparative algorithms. In Figure 4(a), the convergence rate of OCDWOA is better than those of the comparative algorithms. The convergence curve of OCDWOA is stable. In Figure 4(b), the stability of OCDWOA is the best among the comparative algorithms.

5. Conclusions

In this paper, a modified WOA named OCDWOA is proposed. In order to balance the exploitation phase and the exploration phase, a nonlinear parameter design is used. OBL, the DE operator, and a density-based clustering strategy are introduced into the proposed algorithm to improve the search performance and the ability of jumping out of local optimal solutions. The proposed algorithm is tested on 19 optimization benchmark functions and a seismic inversion problem and compared with WOA and three other variants of WOA. Experimental results show that the proposed algorithm enhances the performance of WOA and outperforms the compared algorithms.

Our future work will focus on the following two points. First, we would like to study the relation between the parameters and the performance of OCDWOA. Second, we would like to apply OCDWOA to machine learning problems.

Data Availability

Some or all data, models, or codes that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by National Natural Science Foundation of China under Grants 41772123, 61772365, 61802280, and 61806143.