Computational Intelligence and Neuroscience


Research Article | Open Access

Volume 2020 |Article ID 6524919 | https://doi.org/10.1155/2020/6524919

Xiyu Liu, Lin Wang, Jianhua Qu, Ning Wang, "A Complex Chained P System Based on Evolutionary Mechanism for Image Segmentation", Computational Intelligence and Neuroscience, vol. 2020, Article ID 6524919, 19 pages, 2020. https://doi.org/10.1155/2020/6524919

A Complex Chained P System Based on Evolutionary Mechanism for Image Segmentation

Academic Editor: Rodolfo E. Haber
Received: 01 Dec 2019
Revised: 06 Feb 2020
Accepted: 25 Feb 2020
Published: 07 Aug 2020

Abstract

A new clustering membrane system using a complex chained P system (CCP) based on an evolutionary mechanism is designed, developed, implemented, and tested. The purpose of CCP is to solve clustering problems. In CCP, two kinds of evolution rules in different chained membranes are used to enhance the global search ability. The first kind, using traditional and modified particle swarm optimization (PSO) clustering techniques, evolves the objects. The second kind, based on differential evolution (DE), is introduced to further improve the global search ability. Communication rules are adopted to accelerate convergence and avoid prematurity. Under the control of the evolution-communication mechanism, CCP can effectively search for the optimal partitioning and improve the clustering performance with the help of the distributed parallel computing model. The proposed CCP is compared with four existing PSO clustering approaches on eight real-life datasets to verify its validity. The computational results on tested images also clearly show the effectiveness of CCP in solving image segmentation problems.

1. Introduction

Membrane computing, whose models are known as membrane systems or P systems, is a branch of bio-inspired computing initiated by Păun [1]. It seeks to discover novel biological computing models from the structure of biological cells as well as the cooperation of cells in tissues, organs, and populations of cells. Over the past years, three classes of P systems have been investigated: cell-like P systems, tissue-like P systems, and neural-like P systems, including spiking neural P systems.

P systems have several characteristics: nondeterminism, programmability, extensibility, and readability [2]. Research shows that some models of P systems have the same computing power as Turing machines and are, to some extent, more efficient [3]. Therefore, the analysis of the computing power and computational efficiency of P systems is one of the important basic studies [4, 5]. Other studies focus on variants of P systems for solving optimization problems, including variants of rules and structures [6, 7]. In addition, intelligence techniques, such as evolutionary computation and fuzzy theory, have also been introduced into variant P systems for specific optimization problems [8].

Because parallel computation in membrane systems can avoid the increase in time consumption as the number of data points grows, membrane systems are suitable for solving clustering problems [9]. There has been much interesting research on variant P systems for solving clustering problems. Liu and Xue [10] proposed a new cluster splitting technique based on Hopfield networks and P systems. Liu et al. [11] presented an improved Apriori algorithm, named ECTPPT-Apriori, based on an evolution-communication tissue-like P system with promoters and inhibitors. Peng et al. [12] developed an extended P system with active membranes, in which a modified differential evolution mechanism is used to find the optimal cluster centers in clustering problems. Peng et al. [13] introduced a multiobjective clustering framework using a tissue-like P system to solve fuzzy clustering problems. Wang et al. [14] proposed a new cell-like P system that uses a modified genetic algorithm to evolve the objects together with communication rules.

Image segmentation is an important part of image processing; it also has a critical impact on the final quality of image analysis and subsequent tasks [15]. In previous studies, segmentation techniques have been divided into region-based, edge-based, cluster-based, and threshold-based methods [16], where the threshold methods can be classified into bilevel and multilevel thresholding based on the number of clusters [17]. Region-based methods can obtain high segmentation quality but are sensitive to parameters. Edge-based methods achieve high segmentation quality between different regions or targets but are more sensitive to noise. Cluster-based methods are simple and easily implemented, but the clustering results rely on the number of clusters and the feature selection in the colour space. Threshold-based methods are simple to compute and require no prior knowledge, but the continuity of the regions is not guaranteed because spatial information is not used [18]. So far, image segmentation has been widely applied in machine vision, computer-aided diagnosis in medical imaging, feature extraction, and analysis [19]. In this paper, the experiments on test images are simple and easily implemented, and image segmentation serves only as a test application, so further details are omitted in the following.

Because the segmentation methods mentioned above have their respective advantages and limitations, much work has been done to find robust and optimal segmentation techniques [20]. The threshold technique, based on the gray levels of images, is one of the most popular segmentation techniques. It is simple and easy to implement, with low computational complexity [21]. Li et al. [22] presented a novel threshold extraction method based on variational mode decomposition (VMD). Zhao et al. [23] introduced a gradient-based adaptive particle swarm optimization (PSO) combined with improved extremal optimization (EO) to avoid being trapped in local optima in high-dimensional tasks. Wang et al. [24] designed a new P system with related interaction rules, in which the PSO mechanism is used to maximize the entropy threshold. Tan et al. [25] proposed a hybrid clustering model using ensemble deep neural networks for skin lesion segmentation.

P systems are a class of distributed parallel computing models that can be used to improve the global search ability of PSO [26]. The communication rules of chained P systems can be used to accelerate the convergence of PSO. In contrast to the crossover and mutation operations of DE, partitioning information can only be exploited in the velocity updating of particles [27, 28]. Besides, although neural networks attain high quality on optimization problems, their space and time consumption is high, and the parallel computing model of P systems is not executed on neural networks. Therefore, a new variant of P systems based on the PSO mechanism is proposed, named the complex chained P (CCP) system; the concepts of membranes, objects, and rules based on a special chained structure were introduced in the literature [29]. In this CCP system, two kinds of evolution rules in different chained membranes are introduced to enhance the global search ability. One kind uses the traditional and modified PSO mechanisms to evolve the objects, with partitioning information introduced as an environmental factor to improve the clustering performance. The other is based on differential evolution (DE) and evolves the global chained objects in order to enhance the global search ability. Compared with the genetic algorithm (GA) [30], DE is simple and easily implemented, with fewer predefined parameters and a faster convergence speed. Communication rules for global objects between chained membranes are used to accelerate the convergence speed and avoid prematurity [31, 32]. Finally, the CCP system is evaluated on eight benchmark clustering problems and eight test images against competing clustering and image segmentation techniques; the experimental results verify the validity and performance of the proposed CCP.

The rest of this paper is organized as follows: the framework of chained P systems is described in Section 2. Section 3 gives more details about the complex chained P system for clustering problems, including its evolution rules and communication rules. In order to verify the validity of CCP, experiments on benchmark clustering problems are conducted in Section 4.2. Furthermore, test images are used to evaluate the competitive performance of CCP in Section 4.3. Section 5 provides conclusions and outlines future research directions.

2. Chained P Systems

Some concepts can be briefly defined as follows. A special membrane called a chained membrane contains many objects based on a chained structure; these objects are called chained objects. The chained P system can be formally described as the following tuple [29]:

Π = (O, μ, M_1, …, M_q, R_1, …, R_q, i_in, i_out),

where
(1) O is a finite nonempty alphabet, whose symbols are called chained objects.
(2) μ is the membrane structure consisting of q membranes, which is composed of two parts: the structure of the chained membranes and the structure of the whole chained P system.
(3) M_1, …, M_q are the initial finite sets of chained objects; M_i represents the chained objects in membrane i, for 1 ≤ i ≤ q.
(4) R_1, …, R_q are the finite sets of chained rules; R_i represents a finite set of chained rules in membrane i, for 1 ≤ i ≤ q, which consists of subrules that are executed in a specific order. There are many chained rules on objects, for example, object addition rules, object subtraction rules, object crossover rules, and object variation rules.
(5) i_in is the input region or membrane of the chained P system, which contains the initial objects of the system.
(6) i_out is the output region or membrane of the chained P system. If a certain chained rule cannot be executed in the chained P system, the computation process stops and the computation results or objects are transported to the output region or membrane.
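The tuple above can be pictured as a small data structure. The sketch below is illustrative only: the field names and the rule-as-callable convention are our own assumptions, not the paper's notation; it captures the ordered execution of subrules and the halting condition (a rule that cannot be executed stops the system).

```python
from dataclasses import dataclass, field

@dataclass
class ChainedMembrane:
    label: int
    objects: list = field(default_factory=list)   # chained objects (multiset)
    rules: list = field(default_factory=list)     # subrules, executed in a fixed order

@dataclass
class ChainedPSystem:
    membranes: list
    input_label: int
    output_label: int

    def step(self):
        """Apply each membrane's subrules in order; halt (return False)
        as soon as some rule cannot be executed."""
        for m in self.membranes:
            for rule in m.rules:
                if not rule(m):
                    return False
        return True
```

A rule here is any callable taking a membrane and returning whether it could be executed, which mirrors the halting description in item (6).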

3. Proposed Complex Chained P System for Clustering

The proposed chained P system with a complex chained structure, based on an evolutionary mechanism, for clustering problems is presented in this section. First, the general framework of this complex chained P system (CCP) is described, and the basic membrane structure is given. Next, the evolution and communication mechanisms of the CCP system are introduced. Finally, the computational process of the CCP system is described.

3.1. General Framework of CCP System

The general framework of the CCP is similar to that of the chained P systems, but the main differences are the membrane structure of the whole P system and the evolution rules for chained objects. In the complex chained P system, there are two kinds of chained membranes, chained membranes 1 to q and membrane q + 1, which contain different evolution rules. These chained membranes are labelled from 1 to q + 1. The CCP system can be formally described as follows:

Π = (O, μ, M_1, …, M_{q+1}, R_1, …, R_q, R′_1, …, R′_q, R_{q+1}, i_in, i_out),

where
(1) O is a finite alphabet, which includes all chained objects or strings in the CCP system.
(2) μ is the membrane structure of the CCP system, consisting of q + 1 membranes.
(3) M_1, …, M_{q+1} are the multisets of the initial chained objects in the membranes, and their elements represent the chained objects.
(4) R_1, …, R_q represent finite sets of evolution rules in chained membranes 1 to q; R_i represents a set of evolution subrules in chained membrane i. The evolution subrules are of the form a → b, with a, b ∈ O, which means that chained object a evolves into chained object b.
(5) R′_1, …, R′_q represent the communication rules in chained membranes 1 to q. The communication rules are of the form a → (b, in_{q+1}), with a, b ∈ O, for 1 ≤ i ≤ q, which means the chained object a in membrane i is changed into object b and transported into membrane q + 1.
(6) R_{q+1} represents the finite set of evolution rules in chained membrane q + 1. For two arbitrary chained objects a and b, a, b ∈ O, two kinds of evolution rules can be executed in this membrane: crossover rules and variation rules.
(7) i_in is the input region or membrane of the CCP system, which contains the initial objects of the whole system.
(8) i_out is the output region or membrane of the CCP system. Once the computation is completed, the computation results or objects are transported to the output region or membrane. Figure 1 gives the membrane structure of the proposed CCP system.

In this proposed CCP, the chained objects are considered to be feasible solutions in the search space, and the partition obtained by the clustering technique is represented by C = {C_1, …, C_K}. Thus, the i-th chained object can be defined as X_i = (z_i1, z_i2, …, z_iK), where z_ik represents the k-th cluster center, K is the number of clusters, and the dimension of a cluster center is denoted by d. At last, the best chained object, which represents the best partitioning result found by the system, is sent to the output membrane when the computation is completed.
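This encoding can be sketched concretely. The flat-vector layout below is our own assumption (the paper only states that an object holds K cluster centers of dimension d); the function names are hypothetical.

```python
import numpy as np

def make_object(K, d, low, high, rng=None):
    """Randomly initialise a chained object (K centres of dimension d,
    flattened into one vector) inside the search space [low, high]."""
    rng = rng or np.random.default_rng(0)
    return rng.uniform(low, high, size=K * d)

def decode_object(obj, K, d):
    """Recover the K x d matrix of cluster centres z_1, ..., z_K."""
    return np.asarray(obj).reshape(K, d)
```

The flat layout keeps the PSO/DE updates purely vectorial; decoding is only needed when evaluating the fitness of a partitioning.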

3.2. Evolution Rules on Chained Objects

There are two kinds of evolution rules on chained objects in the chained membranes: the evolution rules in chained membranes 1 to q and the evolution rules in chained membrane q + 1.

3.2.1. Evolution Rules in Chained Membranes 1 to q

There are two kinds of evolution subrules in chained membranes 1 to q, which are based on different velocity-updating strategies for chained objects. First, the particle swarm optimization (PSO) mechanism is used to evolve the chained objects, and the traditional velocity model of PSO is used to update the velocity of a chained object. At time t, the velocity of the i-th object in the j-th chained membrane is determined by equation (3):

v_i(t + 1) = w·v_i(t) + c1·r1·(pBest_i(t) − x_i(t)) + c2·r2·(gBest_j(t) − x_i(t)),  (3)

where the inertia weight is denoted by w, and c1 and c2 represent the learning factors, which are usually restricted to 2. The iteration counter is denoted by t. r1 and r2 are two independent uniform random numbers. The local best of object x_i in history is denoted by pBest_i; in particular, the global best in membrane j is denoted by gBest_j.

Another velocity-updating strategy for objects is based on environmental factors. At time t, the modified velocity formula of EPSO is determined by equation (4):

v_i(t + 1) = w·v_i(t) + c1·r1·(pBest_i(t) − x_i(t)) + c2·r2·(gBest_j(t) − x_i(t)) + c3·r3·(E_i(t) − x_i(t)),  (4)

where c3 represents a positive constant, r3 is a uniform random number, and E_i represents the environmental factor around the object x_i, which is based on the information of the clusters. The partitioning information of each cluster changes dynamically through the evolution of objects, and the geometric center of the data points belonging to the corresponding cluster in the object is used as the environmental factor E_i = (e_i1, …, e_iK), which is given at time t by equation (5):

e_ik(t) = (1/n_k) Σ_{x_j ∈ C_k} x_j,  (5)

where e_ik represents the k-th cluster centroid of the environmental factor E_i, x_j represents the j-th data point in the dataset, for 1 ≤ j ≤ n, n is the number of data points in the dataset, and the number of data points belonging to the corresponding cluster C_k is denoted by n_k.
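A minimal sketch of the two velocity updates and the environmental factor, assuming NumPy arrays for objects. The symbols follow equations (3) to (5) as described in the text, while the function names and default parameter values are our own assumptions.

```python
import numpy as np

def pso_velocity(v, x, pbest, gbest, w=0.7, c1=2.0, c2=2.0, rng=None):
    """Traditional PSO velocity update in the style of eq. (3)."""
    rng = rng or np.random.default_rng(0)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

def environment_factor(data, labels, k):
    """Eq. (5) style: geometric centre of the points assigned to each cluster."""
    return np.array([data[labels == j].mean(axis=0) for j in range(k)])

def epso_velocity(v, x, pbest, gbest, env, w=0.7, c1=2.0, c2=2.0, c3=1.5, rng=None):
    """Modified (EPSO) velocity update with the environmental term, eq. (4) style."""
    rng = rng or np.random.default_rng(0)
    r1, r2, r3 = (rng.random(x.shape) for _ in range(3))
    return (w * v + c1 * r1 * (pbest - x)
            + c2 * r2 * (gbest - x) + c3 * r3 * (env - x))
```

The extra c3 term pulls an object toward the current geometric centers of its own partitioning, which is how the clustering information enters the search.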

At time t, the position of x_i is determined by equation (6):

x_i(t + 1) = x_i(t) + v_i(t + 1),  (6)

and the local best of x_i at time t is updated according to equation (7):

pBest_i(t + 1) = x_i(t + 1) if f(x_i(t + 1)) < f(pBest_i(t)); otherwise pBest_i(t),  (7)

where f(·) represents the fitness function for clustering problems, which is defined by equation (8):

f(X_i) = Σ_{k=1}^{K} Σ_{x_j ∈ C_k} ||x_j − z_ik||.  (8)

The purpose of clustering is to find a partitioning that minimizes the value of the fitness function in equation (8). The global best at time t is updated according to equation (9):

gBest_j(t + 1) = arg min {f(pBest_1(t + 1)), …, f(pBest_m(t + 1)), f(gBest_j(t))}.  (9)

At time t, a success indicator for each chained object is defined by equation (10):

s_i(t) = 1 if f(pBest_i(t)) < f(pBest_i(t − 1)); otherwise 0,  (10)

where s_i(t) indicates whether object x_i improved at the last computation. The success rate in the chained membrane j, for 1 ≤ j ≤ q, is computed by equation (11):

sr_j(t) = (1/m) Σ_{i=1}^{m} s_i(t),  (11)

where m represents the number of chained objects in the chained membrane. In this study, a linearly increasing strategy is introduced to adjust the value of the inertia weight dynamically; its range is based on a linear mapping of the success rate, given by equation (12):

w_j(t) = w_min + (w_max − w_min)·sr_j(t),  (12)

where w_min and w_max represent the minimum and maximum of the inertia weight.
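The fitness function and the success-rate-driven inertia adaptation can be sketched as follows; the linear mapping matches equation (12) as described above, while the default bounds and function names are our own assumptions.

```python
import numpy as np

def clustering_fitness(centers, data):
    """Eq. (8) style: sum of Euclidean distances from each data point to
    its nearest cluster centre; smaller is better."""
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).sum()

def adapt_inertia(success_rate, w_min=0.4, w_max=1.2):
    """Eq. (12) style: map the membrane's success rate in [0, 1] linearly
    onto the inertia-weight range [w_min, w_max]."""
    return w_min + (w_max - w_min) * success_rate
```

A membrane whose objects improved often in the last step thus explores with a larger inertia weight, while a stagnating membrane contracts its search.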

3.2.2. Evolution Rules in Chained Membrane q + 1

The evolution rules in chained membrane q + 1 contain object crossover rules and object variation rules; the differential evolution (DE) approach is used as the variant of evolution rules in membrane q + 1, and its mutation and crossover mechanisms are introduced to help the objects escape local optima. At time t, the i-th chained object in membrane q + 1 is denoted by x_i(t) = (x_i1, …, x_iD), where D represents the dimension of the objects, and x_r1 and x_r2 are two randomly chosen chained objects in membrane q + 1; the mutation operation is defined by equation (13):

u_i(t) = x_best(t) + F·(x_r1(t) − x_r2(t)),  (13)

where u_i represents the created donor, x_best represents the best object in membrane q + 1, and the scaling factor is denoted by F. The trial object comes from the created donor through the crossover mechanism, which is defined by equation (14):

y_id(t) = u_id(t) if r_d ≤ CR or d = d_rand; otherwise x_id(t),  (14)

where the crossover rate is denoted by CR, a predefined constant within the range from 0 to 1, r_d is an independent uniform random number, and d_rand is a random dimension from 1 to D. At time t, the position of x_i is determined by equation (15):

x_i(t + 1) = y_i(t) if f(y_i(t)) < f(x_i(t)); otherwise x_i(t).  (15)
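The DE mutation and crossover steps above can be sketched directly; function names and the default F and CR values are our own assumptions, and the forced dimension implements the "at least one coordinate from the donor" condition of equation (14).

```python
import numpy as np

def de_mutate(x_best, x_r1, x_r2, F=0.5):
    """Eq. (13) style: donor built from the best object and two random objects."""
    return x_best + F * (x_r1 - x_r2)

def de_crossover(x, donor, cr=0.5, rng=None):
    """Eq. (14) style binomial crossover; one random dimension (d_rand)
    is always taken from the donor."""
    rng = rng or np.random.default_rng(0)
    d = x.size
    mask = rng.random(d) < cr
    mask[rng.integers(d)] = True          # force d_rand from the donor
    return np.where(mask, donor, x)
```

Selection (equation (15)) then keeps the trial only if its fitness improves on the current object.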

3.3. Communication Rules in Chained Membranes

The communication rules of the chained P system are used to enhance the cooperation between the chained membranes, which provides a good foundation for exchanging and sharing information. There are two kinds of communication rules in the CCP system:
(1) gBest_j → (gBest_j, in_{q+1}), for 1 ≤ j ≤ q. At time t, the global best in membrane j is sent to the chained membrane q + 1 and becomes its j-th object. At each iterative computation, the chained membrane q + 1 only contains objects that come from the chained membranes 1 to q.
(2) x_best → (x_best, in_j), for 1 ≤ j ≤ q. At time t, the global best in membrane q + 1 is transported to the chained membranes 1 to q, where it is taken as the global best of each chained membrane. Meanwhile, the global best in membrane q + 1 is sent to the output membrane, where it is viewed as the best object, i.e., the final computation result of the CCP at time t.
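The two rule kinds amount to a copy-up and a broadcast-down of best objects. The sketch below is a simplification under our own assumptions: membranes are plain objects with a `gbest` attribute, and objects are lists.

```python
from types import SimpleNamespace

def communicate_up(pso_membranes, de_membrane):
    """Rule kind (1): copy each PSO membrane's global best into membrane q+1,
    replacing its previous contents."""
    de_membrane.objects = [list(m.gbest) for m in pso_membranes]

def communicate_down(de_membrane, pso_membranes, output):
    """Rule kind (2): broadcast membrane q+1's best back to membranes 1..q
    and forward a copy to the output membrane."""
    for m in pso_membranes:
        m.gbest = list(de_membrane.gbest)
    output.append(list(de_membrane.gbest))
```

Copying (rather than sharing references) matters here: each membrane must be free to evolve its own copy of the broadcast object afterwards.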

3.4. Computation Process of CCP
3.4.1. Initialization

The input membrane contains all initial objects of the P system. The position of each chained object is randomly initialized in the search space. After initialization, the chained objects in the input membrane are transported to the chained membranes 1 to q, and each chained membrane contains m objects.

3.4.2. Evolution in Chained Membranes 1 to q

The evolution rules on chained objects are used to complete the evolution process according to equations (3), (4), and (6) in each chained membrane, and the selection of the velocity formula is based on a random strategy. The local best and global best in each membrane are updated through equations (7) and (9).

3.4.3. Communication between Chained Membranes

The first kind of communication rules is used to transport the global best of each of the chained membranes 1 to q to the chained membrane q + 1.

3.4.4. Evolution in Chained Membrane q + 1

The evolution rules on chained objects are used to complete the evolution process according to equations (13) and (14) in chained membrane q + 1, and the global best in membrane q + 1 is updated by equation (15).

3.4.5. Communication between Chained Membranes

The second kind of communication rules is used to transport the global best in membrane q + 1 to each of the chained membranes 1 to q.

3.4.6. Halting and Output

The evolution and communication are executed repeatedly in an iterative form during the computation process. The termination criterion is reaching the maximum number of iterations. When the system halts, the output membrane sends the global best to the environment, and this object is regarded as the final computing result of the CCP system.
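The steps in Sections 3.4.1 to 3.4.6 can be sketched end to end on toy data. This is a compact illustration under simplifying assumptions, not the paper's implementation: q PSO membranes of m objects each, one DE membrane, and the two communication steps per iteration; all parameter defaults here are our own illustrative choices.

```python
import numpy as np

def fitness(obj, data, K):
    """Eq. (8) style: sum of distances from each point to its nearest centre."""
    centers = obj.reshape(K, -1)
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).sum()

def run_ccp(data, K=2, q=3, m=10, iters=60, w=0.6, c=1.5, F=0.5, cr=0.5, seed=1):
    rng = np.random.default_rng(seed)
    dim = K * data.shape[1]
    lo, hi = data.min(), data.max()
    X = rng.uniform(lo, hi, (q, m, dim))      # objects in membranes 1..q
    V = np.zeros((q, m, dim))
    P = X.copy()                              # local bests (pBest)
    pf = np.array([[fitness(x, data, K) for x in mem] for mem in X])
    i0 = np.unravel_index(pf.argmin(), pf.shape)
    gbest, gf = P[i0].copy(), pf[i0]
    for _ in range(iters):
        # 3.4.2: PSO evolution in membranes 1..q (eqs. (3) and (6))
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c * r1 * (P - X) + c * r2 * (gbest - X)
        X = X + V
        f = np.array([[fitness(x, data, K) for x in mem] for mem in X])
        better = f < pf
        P[better], pf[better] = X[better], f[better]
        # 3.4.3: communication (1) -- membrane bests into membrane q+1
        idx = pf.argmin(axis=1)
        pool = np.array([P[j, idx[j]] for j in range(q)])
        pool_f = np.array([fitness(o, data, K) for o in pool])
        # 3.4.4: DE evolution in membrane q+1 (eqs. (13)-(15))
        best = pool[pool_f.argmin()]
        for j in range(q):
            r = rng.choice(q, 2, replace=False)
            donor = best + F * (pool[r[0]] - pool[r[1]])
            mask = rng.random(dim) < cr
            mask[rng.integers(dim)] = True
            trial = np.where(mask, donor, pool[j])
            tf = fitness(trial, data, K)
            if tf < pool_f[j]:
                pool[j], pool_f[j] = trial, tf
        # 3.4.5: communication (2) -- broadcast the best object back
        if pool_f.min() < gf:
            gf = pool_f.min()
            gbest = pool[pool_f.argmin()].copy()
    # 3.4.6: halting and output
    return gbest.reshape(K, -1), gf
```

On well-separated clusters, the returned fitness drops close to zero, and the returned matrix holds one centre per cluster.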

4. Experimental Analysis of Clustering Problems

In this section, the feasibility and effectiveness of the proposed CCP are demonstrated through experimental analysis. The datasets of the experiments are introduced first; artificial datasets from previous studies [33] are used to tune the parameters of CCP, and eight real-life datasets from the UCI machine learning repository [34] are used to compare its performance with existing clustering approaches. All clustering approaches, including CCP, are implemented in MATLAB 2016b, and all experiments are conducted on a DELL desktop computer with an Intel 4.00 GHz i7-8550U processor and 8 GB of RAM in a Windows 10 environment.

4.1. Parameter Setting

The number of chained membranes has an important influence on the performance of CCP. Therefore, four artificial datasets [33], Data_5_2, Data_9_2, Size5, and Square4, are used to tune this parameter in order to ensure the fairness of the experiment. The details of the four artificial datasets are given in Table 1.


Datasets | Data | Feature | Class

Data_5_2 | 250 | 2 | 5
Data_9_2 | 900 | 2 | 9
Size_5 | 1000 | 2 | 4
Square4 | 1000 | 2 | 4

CCP systems with different degrees (numbers of chained membranes) [12] are used to evaluate the effect of this parameter. The maximum number of iterations is set to 200, the positive constant c3 is a random number distributed between 0.6 and 3, the lower and upper limits of the inertia weight are set as listed in Table 4, and the mutation probability is randomly generated from 0.2 to 0.8. Other parameters, which are not tested in this experiment, are kept at the same values for the fairness of the comparison experiments. The number of independent runs is set to 30 to eliminate the effect of random factors. The mean and SD values of the fitness function obtained by the CCP system with different degrees are reported in Table 2.


Datasets | Mean | SD | Mean | SD | Mean | SD
(each Mean/SD pair corresponds to one of the three tested degree settings)

Data_5_2 | 326.5071 | 0.0301 | 326.4696 | 0.0123 | 326.4641 | 0.0110
Data_9_2 | 590.8587 | 0.0624 | 590.7589 | 0.0463 | 590.7413 | 0.0287
Size_5 | 2493.1450 | 4.8048 | 2491.9680 | 0.0789 | 2491.9020 | 0.0565
Square4 | 2367.6443 | 0.0317 | 2367.6052 | 0.0260 | 2367.5917 | 0.0141

Table 2 reports the clustering results of the CCP system with different degrees on the four artificial datasets. These results show that the mean and SD values obtained by the CCP system with the third tested degree setting are the best among the compared settings.

4.2. Clustering Problems

The performance of CCP is compared with four clustering approaches reported in the literature to further evaluate its effectiveness: standard particle swarm optimization (PSO) [35], differential evolution (DE) [36], environment particle swarm optimization (EPSO) [37], and adaptive particle swarm optimization (APSO) [38]. The comparison experiments are conducted on eight real-life datasets from the UCI machine learning repository: Iris, Newthyroid, Seeds, Diabetes, Yeast, Glass, CMC, and Lung Cancer. More details about these datasets are presented in Table 3.


Datasets | Data | Feature | Class

Iris | 150 | 4 | 3
Newthyroid | 215 | 5 | 3
Seeds | 210 | 7 | 3
Diabetes | 768 | 8 | 2
Yeast | 1484 | 8 | 10
Glass | 214 | 9 | 6
CMC | 1473 | 9 | 3
Lung Cancer | 32 | 26 | 3

In these compared clustering approaches, the crossover probability of the DE approach is randomly generated from 0.2 to 0.8. An environmental factor based on the clustering information is embedded in the velocity-updating model of the EPSO approach, and its learning factor is randomly generated from 0.6 to 3. A nonlinear regression function based on population diversity is used in the inertia-weight adjustment formula of APSO, with two predefined constants. All adjustable parameters of these clustering approaches are set to the appropriate values reported in the respective publications, as listed in Table 4.


Parameters | PSO | DE | EPSO | APSO | CCP

Population (m) | 100 | 100 | 100 | 100 | 100
Maximum iterations | 200 | 200 | 200 | 200 | 200
c1, c2 | 2, 2 | N | 2, 2 | N | 2, 2
c3 | N | N | (0.6, 3) | N | (0.6, 3)
r1, r2 | (0, 1) | N | (0, 1) | N | (0, 1)
r3 | N | N | (0, 1) | N | N
(w_min, w_max) | 1 | N | (0.4, 0.6) | N | (0.4, 1.2)
CR | N | (0.2, 0.8) | N | N | (0.2, 0.8)
APSO constant 1 | N | N | N | 2.1 | N
APSO constant 2 | N | N | N | 2 | N
q (chained membranes) | N | N | N | N | 10

(N: not applicable; (a, b) denotes a value drawn from that interval.)

Each clustering approach, including CCP, was run 50 times on each dataset to eliminate the effects of random factors. Simple statistics, including the worst value (Worst), best value (Best), Mean, and SD of the fitness function according to equation (8), are used in the experiments as the evaluation criteria for the clustering results. The experimental environment is the same for all compared clustering approaches.

Figure 2 shows the convergence of these clustering approaches on the eight test datasets for typical runs. The fitness value obtained by CCP declines faster at the beginning of the evolution process and then converges well for each dataset. The fitness values of PSO and DE decrease slowly at the beginning of the evolution process, and these two approaches do not show better convergence performance than the others. Although EPSO and APSO perform better than PSO and DE, they are also easily trapped in local optima, as shown in Figures 2(e), 2(f), and 2(h). Therefore, CCP has a better convergence speed and higher clustering quality than the compared approaches on all these datasets, as shown in Figure 2.

Simple statistics of the fitness function values of these clustering approaches on these datasets are reported in Table 5. The results in Table 5 show that CCP has the best overall performance on the eight test datasets. Due to the characteristics of the test datasets, some clustering approaches performed better on some specific datasets with smaller SD, but the performance of CCP on all eight datasets is considered comparable. Table 6 provides the average computation time in seconds taken by each of the five clustering techniques over 50 runs on each dataset. It can be seen from Table 6 that the proposed CCP has a larger average computation time than PSO, DE, EPSO, and APSO. The evolution process in the chained membranes is time-consuming, so CCP takes more time than the other techniques do.


Datasets | Statistics | PSO | DE | EPSO | APSO | CCP

Iris | Worst | 153.8256 | 109.9969 | 96.8183 | 96.7113 | 96.6555
Iris | Best | 114.6361 | 101.9888 | 96.7063 | 96.6672 | 96.6555
Iris | Mean | 133.4267 | 105.2555 | 96.7431 | 96.6863 | 96.6555
Iris | SD | 8.7150 | 2.3362 | 0.0374 | 0.0093 | 1.23E−13

Newthyroid | Worst | 2400.0152 | 1977.3729 | 1943.6091 | 1908.2618 | 1895.9965
Newthyroid | Best | 2040.0058 | 1924.2365 | 1887.7578 | 1885.4950 | 1866.5183
Newthyroid | Mean | 2271.0529 | 1950.8237 | 1903.7699 | 1897.2114 | 1888.1277
Newthyroid | SD | 91.1317 | 14.0217 | 11.1225 | 5.0863 | 7.5021

Seeds | Worst | 452.8039 | 353.1875 | 312.1602 | 311.9663 | 311.7978
Seeds | Best | 384.2174 | 333.9361 | 311.8824 | 311.8493 | 311.7978
Seeds | Mean | 422.1335 | 342.6432 | 312.0224 | 311.9148 | 311.7978
Seeds | SD | 23.0106 | 5.4078 | 0.0715 | 0.0257 | 4.27E−07

Diabetes | Worst | 59885.0835 | 47911.2714 | 47629.8621 | 47590.4655 | 47561.1362
Diabetes | Best | 51628.1083 | 47693.6403 | 47585.4012 | 47561.6124 | 47561.1262
Diabetes | Mean | 54535.4335 | 47779.1879 | 47607.6176 | 47573.1668 | 47561.1262
Diabetes | SD | 2147.4319 | 73.2248 | 11.8570 | 6.9363 | 3.79E−08

Yeast | Worst | 430.0923 | 378.2874 | 248.9934 | 287.4479 | 245.0999
Yeast | Best | 391.0852 | 348.2327 | 236.9439 | 256.7699 | 235.3785
Yeast | Mean | 407.6461 | 367.3743 | 242.7818 | 271.6039 | 239.6706
Yeast | SD | 10.9776 | 7.2050 | 3.3082 | 8.0848 | 2.4396

Glass | Worst | 451.0029 | 334.2308 | 226.0186 | 253.7243 | 213.4793
Glass | Best | 328.2842 | 292.1006 | 212.6117 | 210.8631 | 212.1629
Glass | Mean | 418.7840 | 314.2182 | 216.2731 | 230.8009 | 212.9308
Glass | SD | 31.2743 | 8.9158 | 3.5290 | 14.8939 | 0.3110

CMC | Worst | 6852.7697 | 5720.0002 | 5537.8142 | 5535.5675 | 5532.3096
CMC | Best | 6335.6713 | 5628.6990 | 5535.5853 | 5533.1403 | 5532.1857
CMC | Mean | 6551.6558 | 5683.0978 | 5536.8242 | 5534.1784 | 5532.1913
CMC | SD | 174.0443 | 28.1480 | 0.5543 | 0.5962 | 0.0278

Lung Cancer | Worst | 170.2152 | 153.9227 | 127.4346 | 125.6464 | 125.6485
Lung Cancer | Best | 184.9632 | 159.9095 | 138.4711 | 126.3331 | 125.6685
Lung Cancer | Mean | 178.8476 | 157.0802 | 133.1676 | 125.7308 | 125.6609
Lung Cancer | SD | 4.1542 | 1.6051 | 3.3466 | 0.1595 | 0.0052


Datasets | PSO | DE | EPSO | APSO | CCP

Iris | 2.4630 | 2.7315 | 3.0608 | 2.6708 | 3.3833
Newthyroid | 2.4785 | 2.8251 | 3.3350 | 2.8360 | 3.6576
Seeds | 2.4263 | 2.7072 | 3.2817 | 2.8446 | 3.5799
Diabetes | 2.7341 | 3.1116 | 4.9156 | 3.2315 | 5.2833
Yeast | 5.1674 | 5.2355 | 9.8255 | 5.6831 | 10.3316
Glass | 2.6526 | 3.0069 | 3.6378 | 2.9946 | 4.0859
CMC | 3.7550 | 4.0391 | 8.0583 | 3.9940 | 8.1726
Lung Cancer | 2.3451 | 2.7382 | 2.8267 | 2.7655 | 3.2006

In order to evaluate the clustering performance of the compared approaches, clustering accuracy (CA) is used to evaluate the quality of the clustering results; the overall accuracy of a partitioning result is defined by equation (16):

CA = (1/n) Σ_{j=1}^{K} max_i n_ij,  (16)

where n_ij represents the number of data points belonging to both actual cluster i and partitioning cluster j, for 1 ≤ i, j ≤ K. The simple statistics of the clustering approaches on the eight datasets are reported in Table 7. It can be seen that CCP has a better overall performance on these datasets. Although some approaches show better performance on some specific datasets, the performance of CCP on those datasets is also considered comparable, as shown in Table 7.


Datasets | Statistics | PSO | DE | EPSO | APSO | CCP

Iris | Worst | 0.6667 | 0.8933 | 0.8933 | 0.9000 | 0.8667
Iris | Best | 0.9267 | 0.9000 | 0.9000 | 0.9000 | 0.9600
Iris | Mean | 0.8077 | 0.8983 | 0.8997 | 0.9000 | 0.9013
Iris | SD | 0.0904 | 0.0027 | 0.0015 | 1.03E−17 | 0.0248

Newthyroid | Worst | 0.6977 | 0.7488 | 0.7721 | 0.7721 | 0.7767
Newthyroid | Best | 0.8372 | 0.8093 | 0.8047 | 0.8047 | 0.8047
Newthyroid | Mean | 0.7391 | 0.7777 | 0.8000 | 0.8000 | 0.8014
Newthyroid | SD | 0.0337 | 0.0187 | 0.0114 | 0.0114 | 0.0080

Seeds | Worst | 0.6476 | 0.8477 | 0.8952 | 0.8952 | 0.8952
Seeds | Best | 0.8953 | 0.9143 | 0.8952 | 0.8952 | 0.8953
Seeds | Mean | 0.8367 | 0.8814 | 0.8952 | 0.8952 | 0.8952
Seeds | SD | 0.0850 | 0.0207 | 1.14E−16 | 1.14E−16 | 1.14E−16

Diabetes | Worst | 0.6510 | 0.6510 | 0.6510 | 0.6510 | 0.6510
Diabetes | Best | 0.6510 | 0.6510 | 0.6510 | 0.6510 | 0.6654
Diabetes | Mean | 0.6510 | 0.6510 | 0.6510 | 0.6510 | 0.6518
Diabetes | SD | 1.14E−16 | 1.14E−16 | 1.14E−16 | 1.14E−16 | 0.0032

Yeast | Worst | 0.3120 | 0.3309 | 0.4508 | 0.3942 | 0.4683
Yeast | Best | 0.3376 | 0.4016 | 0.5370 | 0.4569 | 0.5418
Yeast | Mean | 0.3171 | 0.3640 | 0.4965 | 0.4377 | 0.5200
Yeast | SD | 0.0057 | 0.0248 | 0.0275 | 0.0134 | 0.0152

Glass | Worst | 0.3551 | 0.4720 | 0.5607 | 0.5093 | 0.5841
Glass | Best | 0.4813 | 0.5234 | 0.5981 | 0.5841 | 0.5841
Glass | Mean | 0.4140 | 0.5021 | 0.5834 | 0.5486 | 0.5841
Glass | SD | 0.0444 | 0.0144 | 0.0122 | 0.0267 | 1.14E−16

CMC | Worst | 0.4332 | 0.4358 | 0.4515 | 0.4562 | 0.4515
CMC | Best | 0.4630 | 0.4582 | 0.4569 | 0.4562 | 0.4569
CMC | Mean | 0.4510 | 0.4492 | 0.4556 | 0.4562 | 0.4562
CMC | SD | 0.0087 | 0.0074 | 0.0016 | 5.66E−17 | 0.0012

Lung Cancer | Worst | 0.4062 | 0.4063 | 0.5313 | 0.4063 | 0.5625
Lung Cancer | Best | 0.5625 | 0.5000 | 0.5625 | 0.6250 | 0.5625
Lung Cancer | Mean | 0.4641 | 0.4281 | 0.5578 | 0.5188 | 0.5625
Lung Cancer | SD | 0.0433 | 0.0250 | 0.0114 | 0.0644 | 1.28E−16
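The clustering-accuracy measure of equation (16) can be computed as in the sketch below; the majority-vote matching of predicted clusters to actual classes follows the definition of n_ij above, while the function name is our own.

```python
import numpy as np

def clustering_accuracy(true_labels, pred_labels):
    """CA, eq. (16) style: credit each predicted cluster with its majority
    true class; return the fraction of points so matched."""
    true_labels = np.asarray(true_labels)
    pred_labels = np.asarray(pred_labels)
    total = 0
    for c in np.unique(pred_labels):
        members = true_labels[pred_labels == c]   # true classes inside cluster c
        total += np.bincount(members).max()       # max_i n_ic
    return total / true_labels.size
```

Note that CA is invariant to a relabelling of the predicted clusters, which is why it is preferred over raw label agreement when comparing clustering techniques.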

4.3. Proposed CCP for Image Segmentation

In this section, typical experiments and analyses on test images are presented to evaluate the segmentation performance of the proposed CCP. These test images have been used in previous studies and are provided by the Berkeley segmentation dataset and benchmark [39]. The size of each test image is 481 × 321; Figure 3 gives the original test images.

Otsu's method [40] is one of the most popular segmentation methods, and it is used here to determine whether the optimal threshold method can give satisfactory segmentation results. Its discriminant criterion can be described as follows. An image contains gray levels from 0 to L − 1, where L is usually set to 256, and needs to be segmented into K classes; thus, K − 1 thresholds are needed to divide the original image. The number of pixels at the i-th gray level is denoted by h_i, and p_i = h_i/n represents the probability of the i-th gray level in the image, where n = Σ_{i=0}^{L−1} h_i. The optimal thresholds are determined by maximizing the between-class variance:

(t_1*, …, t_{K−1}*) = arg max Σ_{k=1}^{K} ω_k (μ_k − μ_T)²,  (17)

where ω_k and μ_k are the cumulative probability and mean gray level of class k, for 1 ≤ k ≤ K, and μ_T is the mean gray level of the whole image.
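The between-class variance objective can be sketched as follows. This is an illustrative implementation in the style of equation (17) under our own conventions (a threshold t starts the next class at gray level t); the function name is hypothetical.

```python
import numpy as np

def otsu_objective(hist, thresholds):
    """Between-class variance for a given set of thresholds:
    sum over classes of w_k * (mu_k - mu_T)^2; larger is better."""
    p = hist / hist.sum()                 # gray-level probabilities p_i
    levels = np.arange(p.size)
    mu_total = (p * levels).sum()         # mean gray level of the whole image
    bounds = [0] + sorted(thresholds) + [p.size]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()                # class probability w_k
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var
```

A multilevel thresholding search (PSO, DE, or CCP) would then maximize this objective over candidate threshold vectors.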

The comparison experiments are performed on the test images with different numbers of thresholds, K − 1 = 3, 5, and 8 [41], to evaluate the performance of CCP in both low- and high-dimensional multilevel thresholding problems. The proposed CCP is compared with the PSO, DE, EPSO, and APSO approaches mentioned above. The purpose of image segmentation here is to find a set of thresholds that maximizes the value of Otsu's function according to equation (17). Figures 4–7 give the segmentation results on the test images with different numbers of thresholds.

Figures 4–7 provide the segmented church, starfish, surfer II, and elephant images obtained by the compared techniques. It can be observed that the segmentation quality improves as the number of thresholds increases, and CCP performs better than the others on these test images. In particular, the segmented images obtained by APSO and CCP achieve better consistency than those obtained by PSO, DE, and EPSO as the number of thresholds grows. The gray-level histogram is often regarded as a kind of distribution used to determine the thresholds for image segmentation [42], and the peak values of the histogram are among the important factors that affect segmentation accuracy. Therefore, the thresholds found by the compared approaches on the church and elephant images are shown on the gray-level histograms as follows.

Figures 8 and 9 show the segmented results of church and elephant at the 8-threshold level obtained by CCP and the compared approaches. The optimal threshold values of CCP, {40, 79, 114, 130, 158, 191, 213, 235}, are similar to those of PSO, {40, 77, 114, 130, 157, 190, 213, 235}, on the church image, and the same holds for the elephant image. From these figures, it can be seen that the optimal threshold values found by DE and EPSO differ considerably from those found by PSO, APSO, and CCP in most cases. Because the segmentation result depends on the class information determined by the threshold levels, the optimal thresholds of EPSO differ statistically from the others. Furthermore, it is not difficult to see that the selected optimal thresholds depend heavily on the chosen objective function. Apart from EPSO, the threshold selection approaches, including CCP, segment the test images reasonably, as shown in Figures 4–7. Table 8 reports the mean and SD of Otsu's function with 3, 5, and 8 thresholds achieved by all compared approaches over 50 runs.
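Once a set of thresholds is chosen, assigning pixels to classes is a simple binning step. The sketch below applies the church-image thresholds quoted above to a few hypothetical gray values; `numpy.digitize` returns, for each pixel, the index of the class its gray level falls into.

```python
import numpy as np

# Thresholds reported for the church image in the text.
thresholds = [40, 79, 114, 130, 158, 191, 213, 235]

# Hypothetical gray values standing in for an image.
gray = np.array([[12, 85, 200],
                 [130, 131, 240]], dtype=np.uint8)

# Class index 0..len(thresholds) for each pixel; for display, each
# pixel would then be replaced by a representative level of its class.
labels = np.digitize(gray, thresholds)
```

With K − 1 = 8 thresholds this yields K = 9 classes, matching the 8-threshold level discussed above.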


Table 8: Mean and SD of Otsu's function with 3, 5, and 8 thresholds achieved by the compared approaches over 50 runs.

| Image | Th | Stat | PSO | DE | EPSO | APSO | CCP |
| Church | 3 | Mean | 3276.8367 | 3276.837 | 3276.7685 | 3276.8367 | 3276.8367 |
| | | SD | 1.87E-12 | 1.87E-12 | 0.155831 | 1.87E-12 | 0.00E+00 |
| | 5 | Mean | 3381.9107 | 3381.4923 | 3380.4929 | 3381.9236 | 3381.9246 |
| | | SD | 0.0171 | 0.2245 | 1.1240 | 0.0046 | 9.25E-13 |
| | 8 | Mean | 3422.5621 | 3420.9578 | 3416.2082 | 3420.7834 | 3423.2006 |
| | | SD | 2.6579 | 0.9181 | 3.5574 | 2.8672 | 1.2323 |
| Train | 3 | Mean | 2606.5951 | 2606.5951 | 2606.5623 | 2606.5951 | 2606.5951 |
| | | SD | 4.67E-13 | 4.67E-13 | 0.0500 | 4.67E-13 | 4.67E-13 |
| | 5 | Mean | 2736.2257 | 2735.3181 | 2735.2676 | 2735.6328 | 2736.2331 |
| | | SD | 0.0135 | 0.5241 | 0.6743 | 2.6787 | 9.33E-13 |
| | 8 | Mean | 2790.3435 | 2788.0338 | 2786.9829 | 2790.4068 | 2790.4461 |
| | | SD | 0.0913 | 0.5724 | 2.5511 | 0.0897 | 0.0389 |
| Roman | 3 | Mean | 2142.3440 | 2142.3440 | 2142.2752 | 2142.3440 | 2142.3440 |
| | | SD | 9.33E-13 | 9.33E-13 | 0.1073 | 9.33E-13 | 1.39E-13 |
| | 5 | Mean | 2221.3026 | 2220.8465 | 2220.5998 | 2221.3051 | 2221.3056 |
| | | SD | 0.0047 | 0.2476 | 0.4636 | 0.0012 | 1.67E-13 |
| | 8 | Mean | 2251.7218 | 2250.5766 | 2249.7117 | 2251.7258 | 2251.7325 |
| | | SD | 0.0592 | 0.4211 | 1.2147 | 0.0673 | 0.0573 |
| Starfish | 3 | Mean | 2784.2272 | 2784.2272 | 2784.0795 | 2784.2272 | 2784.2272 |
| | | SD | 1.4E-12 | 1.4E-12 | 0.1267 | 1.4E-12 | 9.25E-13 |
| | 5 | Mean | 2916.2557 | 2915.6466 | 2915.5862 | 2916.2726 | 2916.2730 |
| | | SD | 0.0240 | 0.3693 | 0.6716 | 0.0010 | 0.0009 |
| | 8 | Mean | 2973.9976 | 2971.8186 | 2971.2207 | 2974.0391 | 2974.0829 |
| | | SD | 0.0404 | 0.7603 | 1.8786 | 0.0942 | 0.0151 |
| Surfer II | 3 | Mean | 7953.4181 | 7953.4181 | 7953.3489 | 7953.4181 | 7953.4181 |
| | | SD | 2.80E-12 | 2.80E-12 | 0.0718 | 2.78E-12 | 2.80E-12 |
| | 5 | Mean | 8025.5880 | 8025.1056 | 8024.6160 | 8025.6045 | 8025.6055 |
| | | SD | 0.0147 | 0.2639 | 1.2232 | 0.0031 | 0.0019 |
| | 8 | Mean | 8055.9754 | 8054.3387 | 8052.1159 | 8056.0443 | 8056.0530 |
| | | SD | 0.0505 | 0.5497 | 2.2741 | 0.0361 | 0.0189 |
| Cow | 3 | Mean | 3858.2202 | 3858.2202 | 3858.1872 | 3858.2200 | 3858.2202 |
| | | SD | 0.00E+00 | 0.00E+00 | 0.0886 | 0.0009 | 0.00E+00 |
| | 5 | Mean | 3965.2806 | 3964.3722 | 3964.0770 | 3965.2925 | 3965.2932 |
| | | SD | 0.0196 | 0.4531 | 1.2480 | 0.0032 | 4.67E-13 |
| | 8 | Mean | 4028.8447 | 4026.0490 | 4024.4213 | 4028.9299 | 4028.9450 |
| | | SD | 0.0758 | 0.8635 | 2.6447 | 0.0548 | 0.0107 |
| Crocodile | 3 | Mean | 3155.4359 | 3155.4343 | 3155.3468 | 3155.4359 | 3155.4359 |
| | | SD | 9.33E-13 | 0.0069 | 0.0969 | 9.33E-13 | 9.25E-13 |
| | 5 | Mean | 3291.4296 | 3290.6065 | 3290.4743 | 3291.4263 | 3291.4407 |
| | | SD | 0.0163 | 0.4943 | 1.0594 | 0.0323 | 0.00E+00 |
| | 8 | Mean | 3348.6314 | 3346.4809 | 3345.8010 | 3348.6425 | 3348.7305 |
| | | SD | 0.0719 | 0.6380 | 2.0889 | 0.1033 | 0.0168 |
| Elephants | 3 | Mean | 1626.7205 | 1626.7205 | 1626.6710 | 1626.7198 | 1626.7205 |
| | | SD | 0.00E+00 | 0.00E+00 | 0.1220 | 0.0035 | 0.00E+00 |
| | 5 | Mean | 1695.8663 | 1695.5298 | 1695.2335 | 1695.8663 | 1695.8676 |
| | | SD | 0.0039 | 0.1415 | 0.8755 | 0.0023 | 0.0005 |
| | 8 | Mean | 1728.1706 | 1726.9276 | 1726.1534 | 1728.0408 | 1728.2503 |
| | | SD | 0.0592 | 0.3931 | 1.5228 | 0.6764 | 0.0232 |

Simple statistics of the compared approaches on the test images are reported in Table 8. The best mean and SD values for each image are highlighted, and it is not hard to see that the CCP system is able to find the best values. Because PSO, DE, EPSO, and APSO are not specialized image segmentation techniques, several multilevel-threshold segmentation approaches, namely the whale optimization algorithm (WOA) [43], the gray wolf optimizer (GWO) [44], the whale optimization algorithm based on thresholding heuristic (WOA-TH) [41], and the gray wolf optimizer based on thresholding heuristic (GWO-TH) [41], are used to further evaluate the clustering effectiveness of CCP. The maximum number of iterations is set to 2200, 3000, and 3600 for 3, 5, and 8 thresholds, respectively. The mean and SD of Otsu's function are obtained over 100 runs to reduce the effects of random factors.
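The mean/SD reporting protocol amounts to repeating each stochastic optimizer independently and aggregating the best scores. The harness below is a hypothetical sketch of that protocol, not the paper's experimental code; the function name and seeding scheme are assumptions made here.

```python
import random
import statistics

def run_trials(optimizer, runs=50, seed0=0):
    """Repeat a stochastic optimizer and report the mean and SD of
    its best Otsu score, as done for Tables 8 and 9."""
    scores = []
    for r in range(runs):
        random.seed(seed0 + r)      # independent, reproducible runs
        scores.append(optimizer())  # optimizer returns its best score
    return statistics.mean(scores), statistics.stdev(scores)
```

An SD near machine precision (e.g., 9.33E-13) over 50 or 100 runs indicates that an approach reaches essentially the same optimum every run.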

Table 9 provides the mean and SD values obtained by the different segmentation approaches on the test images. Obviously, the traditional approaches are easily trapped in local optima, and WOA and GWO perform worse than the others. The thresholding heuristic fine-tunes the best thresholds and thereby enhances the global search ability of WOA and GWO. It can also be observed that CCP performs better and more stably than the compared segmentation approaches on the test images.


Table 9: Mean and SD of Otsu's function obtained by WOA, GWO, WOA-TH, GWO-TH, and CCP over 100 runs.

| Image | Th | Stat | WOA | GWO | WOA-TH | GWO-TH | CCP |
| Church | 3 | Mean | 3271.4427 | 3271.4383 | 3271.4427 | 3271.4445 | 3276.8367 |
| | | SD | 1.24E-02 | 9.93E-03 | 9.14E-13 | 8.27E-05 | 1.87E-12 |
| | 5 | Mean | 3375.8721 | 3374.2197 | 3375.9118 | 3371.9813 | 3381.9246 |
| | | SD | 2.65E+00 | 6.74E+00 | 2.62E+00 | 9.65E+00 | 9.33E-13 |
| | 8 | Mean | 3416.6815 | 3415.4415 | 3417.0876 | 3415.6558 | 3423.5841 |
| | | SD | 2.40E+00 | 1.74E+00 | 1.78E+00 | 3.51E+00 | 2.72E-02 |
| Train | 3 | Mean | 2606.5951 | 2611.4987 | 2611.5081 | 2611.5081 | 2606.5951 |
| | | SD | 2.26E-02 | 2.53E-02 | 4.57E-12 | 4.57E-12 | 9.52E-13 |
| | 5 | Mean | 2740.6749 | 2740.5534 | 2740.6800 | 2740.6784 | 2736.2331 |
| | | SD | 1.57E-02 | 1.16E-01 | 5.77E-03 | 6.18E-03 | 2.32E-03 |
| | 8 | Mean | 2794.6931 | 2793.9232 | 2794.7170 | 2794.2733 | 2790.4901 |
| | | SD | 5.57E-02 | 1.19E+00 | 1.90E-02 | 5.32E-01 | 1.12E-02 |
| Roman | 3 | Mean | 2138.7893 | 2138.7837 | 2138.8057 | 2138.7992 | 2142.3440 |
| | | SD | 2.52E-02 | 2.62E-02 | 0.00E+00 | 1.69E-02 | 9.33E-13 |
| | 5 | Mean | 2218.6635 | 2218.7936 | 2218.9622 | 2218.9687 | 2221.3056 |
| | | SD | 2.53E+00 | 7.93E-01 | 1.11E-02 | 2.35E-04 | 9.25E-13 |
| | 8 | Mean | 2249.3768 | 2248.6408 | 2249.6457 | 2249.6080 | 2251.7798 |
| | | SD | 1.28E+00 | 1.17E+00 | 1.37E-01 | 2.00E-01 | 3.77E-02 |
| Starfish | 3 | Mean | 2779.9252 | 2779.9167 | 2779.9252 | 2779.9214 | 2784.2272 |
| | | SD | 3.20E-12 | 6.64E-03 | 3.20E-12 | 5.91E-03 | 1.40E-12 |
| | 5 | Mean | 2912.8532 | 2912.7058 | 2912.8562 | 2912.8371 | 2916.2730 |
| | | SD | 1.18E-02 | 1.66E-01 | 5.48E-03 | 4.62E-02 | 9.34E-04 |
| | 8 | Mean | 2972.2218 | 2971.3725 | 2972.3479 | 2972.2159 | 2974.0989 |
| | | SD | 1.21E+00 | 1.19E+00 | 7.36E-03 | 1.99E-01 | 2.23E-03 |
| Surfer II | 3 | Mean | 7953.4167 | 7953.4063 | 7953.4167 | 7953.4135 | 7953.4181 |
| | | SD | 5.07E-03 | 1.89E-02 | 4.82E-03 | 7.80E-03 | 6.48E-12 |
| | 5 | Mean | 8025.0703 | 8025.4959 | 8025.6049 | 8025.6025 | 8025.6055 |
| | | SD | 7.73E-03 | 5.56E-02 | 7.11E-03 | 3.20E-03 | 2.80E-12 |
| | 8 | Mean | 8055.4215 | 8054.4135 | 8056.0617 | 8055.8463 | 8056.0646 |
| | | SD | 8.27E-03 | 7.36E-01 | 2.05E-03 | 1.25E-01 | 2.01E-03 |
| Cow | 3 | Mean | 3858.2202 | 3858.2188 | 3858.2202 | 3858.2202 | 3858.2202 |
| | | SD | 5.94E-12 | 1.03E-02 | 5.94E-12 | 5.94E-12 | 1.85E-12 |
| | 5 | Mean | 3965.2795 | 3965.1739 | 3965.2932 | 3964.6096 | 3965.2932 |
| | | SD | 2.01E-02 | 9.72E-02 | 6.40E-12 | 2.93E+00 | 1.39E-12 |
| | 8 | Mean | 4028.9410 | 4027.7892 | 4028.8474 | 4027.7210 | 4028.9605 |
| | | SD | 1.83E-02 | 9.47E-01 | 6.11E-01 | 1.69E+00 | 1.50E-02 |
| Crocodile | 3 | Mean | 3155.4359 | 3155.4337 | 3155.4359 | 3155.4359 | 3155.4359 |
| | | SD | 9.14E-13 | 1.39E-02 | 9.14E-13 | 9.14E-13 | 9.33E-13 |
| | 5 | Mean | 3291.4340 | 3291.2945 | 3291.4366 | 3291.4287 | 3291.4407 |
| | | SD | 1.02E-02 | 1.22E-01 | 5.38E-03 | 1.35E-02 | 4.67E-13 |
| | 8 | Mean | 3348.0384 | 3347.4528 | 3348.7453 | 3348.6504 | 3348.7454 |
| | | SD | 2.79E+00 | 1.75E+00 | 1.94E-03 | 7.60E-02 | 3.27E-03 |
| Elephants | 3 | Mean | 1626.7183 | 1626.7192 | 1626.7205 | 1626.7196 | 1626.7205 |
| | | SD | 3.53E-03 | 9.10E-03 | 1.37E-12 | 9.25E-03 | 9.25E-13 |
| | 5 | Mean | 1695.0482 | 1695.3208 | 1695.8664 | 1695.8231 | 1695.8676 |
| | | SD | 3.99E+00 | 2.84E+00 | 1.05E-03 | 8.42E-02 | 7.12E-02 |
| | 8 | Mean | 1727.1725 | 1726.5379 | 1728.2596 | 1728.1452 | 1728.2676 |
| | | SD | 2.86E+00 | 1.34E+00 | 2.55E-02 | 1.35E-01 | 1.20E-02 |

5. Conclusions

A complex chained P system (CCP) is proposed for solving clustering problems; it combines the chained P system structure with evolution mechanisms, including PSO and DE. Two kinds of evolution rules for objects in different chained membranes are introduced to enhance the global search ability of PSO. The first kind contains two subevolution rules, based on the traditional and a modified PSO technique, and partitioning information, as an environmental factor, is introduced to improve the clustering performance of PSO. The second kind is based on the DE mechanism and evolves the global chained objects in the chained membrane to further enhance the global search ability. In addition, two kinds of communication rules in the chained P system are defined to enhance the cooperation between chained membranes and avoid prematurity. To verify the validity and performance of CCP, the proposed system is evaluated on eight benchmark clustering problems from the UCI machine learning repository against four developed clustering approaches. Furthermore, eight test images from the Berkeley segmentation dataset BSDS300 are used to further evaluate the performance of CCP against four existing segmentation techniques. The experimental results verify the validity and performance of the proposed CCP.

P systems, as parallel computing models, are highly effective and efficient in solving optimization problems with linear or polynomial complexity, and such models based on evolution mechanisms provide new ways of solving clustering problems. The extended clustering chained P system uses the chained P system as its computation structure, and the communication rules between chained membranes are unidirectional. Although unidirectional communication rules are simple and easy to implement, bidirectional communication rules may be introduced in future studies to further accelerate convergence and improve population diversity, and more complicated communication structures between membranes may also improve the performance of the approach. Furthermore, the experiments only used small datasets from the artificial datasets and the UCI Machine Learning Repository, so the proposed approach may have limitations on high-dimensional and large datasets; future studies may test the effectiveness of CCP on large datasets. Balancing the local and global search abilities also remains a hard problem for future work. Future studies may further focus on extended P systems based on tissue-like P systems and other bioinspired computing models, and more work is needed to apply these extended membrane systems to automatic and multiobjective clustering problems.

Data Availability

The two artificial datasets, which were manually generated and are often used in the existing literature, are available at https://www.isical.ac.in/content/databases (accessed June 2018). The eight real-life datasets, often used in the existing literature, are from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2018). The eight test images are from the Berkeley computer vision group's segmentation dataset and benchmark (BSDS300), available at https://www2.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/segbench/ (accessed October 2018).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61876101, 61802234, and 61806114), the National Natural Science Foundation of Shandong Province, China (ZR2019QF007), the Ministry of Education Humanities and Social Science Research Youth Foundation, China (19YJCZH244), the Social Science Fund Project of Shandong Province, China (16BGLJ06 and 11CGLJ22), the Special Postdoctoral Project of China (2019T120607), and Postdoctoral Project of China (2017M612339 and 2018M642695).

References

  1. G. H. Păun, “Membrane computing: an introduction,” Theoretical Computer Science, vol. 287, no. 1, pp. 73–100, 2002. View at: Google Scholar
  2. X. Song, J. Wang, H. Peng et al., “Spiking neural P systems with multiple channels and anti-spikes,” BioSystems, vol. 170, pp. 13–19, 2018. View at: Publisher Site | Google Scholar
  3. A. Leporati, L. Manzoni, G. Mauri et al., “The counting power of P systems with antimatter,” Theoretical Computer Science, vol. 701, no. 21, pp. 161–173, 2017. View at: Publisher Site | Google Scholar
  4. A. A. Mahmood, A. Maroosi, and R. C. Muniyandi, “Membrane computing to enhance time efficiency of minimum dominating set,” Mathematics in Computer Science, vol. 10, no. 2, pp. 249–261, 2016. View at: Publisher Site | Google Scholar
  5. A. Leporati, L. Manzoni, and G. Mauri, “Characterising the complexity of tissue P systems with fission rules,” Journal of Computer and System Sciences, vol. 90, pp. 115–128, 2017. View at: Publisher Site | Google Scholar
  6. L. Q. Pan, G. H. Păun, and B. Song, “Flat maximal parallelism in P systems with promoters,” Theoretical Computer Science, vol. 623, pp. 83–91, 2016. View at: Publisher Site | Google Scholar
  7. A. Alhazov, R. Freund, and S. Ivanov, “Extended spiking neural P systems with white hole rules and their red-green variants,” Natural Computing, vol. 17, no. 2, pp. 297–310, 2017. View at: Publisher Site | Google Scholar
  8. T. Pan, X. Shi, and Z. Zhang, “A small universal spiking neural P system with communication on request,” Neurocomputing, vol. 275, pp. 1622–1628, 2018. View at: Publisher Site | Google Scholar
  9. L. Wang, X. Liu, and M. Sun, “An Extended clustering membrane system based on particle swarm optimization and cell-like P system with active membranes,” Mathematical Problems in Engineering, vol. 2020, Article ID 5097589, 18 pages, 2020. View at: Publisher Site | Google Scholar
  10. X. Y. Liu and J. Xue, “A Cluster splitting technique by hopfield networks and P systems on simplices,” Neural Processing Letters, vol. 46, no. 1, pp. 171–194, 2017. View at: Publisher Site | Google Scholar
  11. X. Liu, Y. Zhao, and M. Sun, “An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors,” Discrete Dynamics in Nature and Society, vol. 2017, no. 4, pp. 1–11, 2017. View at: Publisher Site | Google Scholar
  12. H. Peng, J. Wang, and P. Shi, “An extended membrane system with active membranes to solve automatic fuzzy clustering problems,” International Journal of Neural Systems, vol. 26, no. 3, Article ID 1650004, 2016. View at: Publisher Site | Google Scholar
  13. H. Peng, P. Shi, and J. Wang, “Multiobjective fuzzy clustering approach based on tissue-like membrane systems,” Knowledge-Based Systems, vol. 125, pp. 74–82, 2017. View at: Publisher Site | Google Scholar
  14. Y. H. Wang, X. Y. Liu, and L. S. Xiang, “GA-based membrane evolutionary algorithm for ensemble clustering,” Computational Intelligence Neuroscience, vol. 2017, Article ID 4367342, 11 pages, 2017. View at: Publisher Site | Google Scholar
  15. M. Maitra and C. Amitava, “A hybrid cooperative-comprehensive learning based PSO algorithm for image segmentation using multilevel thresholding,” Expert Systems with Applications, vol. 34, no. 2, pp. 1341–1350, 2008. View at: Publisher Site | Google Scholar
  16. A. Mur Angel, R. Dormido, and N. Duro, “Determination of the optimal number of clusters using a spectral clustering optimization,” Expert Systems with Applications, C, vol. 65, pp. 304–314, 2016. View at: Publisher Site | Google Scholar
  17. S. Shirly and K. Ramesh, “Review on 2D and 3D MRI image segmentation techniques,” Current Medical Imaging Formerly Current Medical Imaging Reviews, vol. 15, no. 2, pp. 150–160, 2019. View at: Publisher Site | Google Scholar
  18. M. Zhang, L. Zhang, and H. D. Cheng, “A neutrosophic approach to image segmentation based on watershed method,” Signal Processing, vol. 90, no. 5, pp. 1510–1517, 2010. View at: Publisher Site | Google Scholar
  19. T. P. Xuan, P. Siarry, and H. Oulhadj, “Integrating fuzzy entropy clustering with an improved PSO for MRI brain image segmentation,” Applied Soft Computing, vol. 65, pp. 230–242, 2018. View at: Publisher Site | Google Scholar
  20. R. Rahmat and D. B. Harris-Birtill, “Comparison of level set models in image segmentation,” IET Image Processing, vol. 12, no. 12, pp. 2212–2221, 2018. View at: Publisher Site | Google Scholar
  21. J. L. Pei, L. Zhao, and X. J. Dong, “Effective algorithm for determining the number of clusters and its application in image segmentation,” Cluster Computing, vol. 20, no. 4, pp. 2845–2854, 2017. View at: Publisher Site | Google Scholar
  22. J. F. Li, W. Y. Tang, and J. Wang, “Multilevel thresholding selection based on variational mode decomposition for image segmentation,” Signal Processing, vol. 147, pp. 80–91, 2018. View at: Publisher Site | Google Scholar
  23. X. L. Zao, J.-H. Hwang, and Z. J. Fang, “Gradient-based adaptive particle swarm optimizer with improved extremal optimization,” Applied Intelligence, vol. 48, no. 12, pp. 4646–4659, 2018. View at: Publisher Site | Google Scholar
  24. B. Wang, L. L. Chen, and J. Cheng, “New result on maximum entropy threshold image segmentation based on P System,” Optik, vol. 163, pp. 81–85, 2018. View at: Publisher Site | Google Scholar
  25. T. Y. Tan, L. Zhang, and C. P. Lim, “Evolving ensemble models for image segmentation using enhanced particle swarm optimization,” IEEE Access, vol. 7, pp. 34004–34019, 2019. View at: Publisher Site | Google Scholar
  26. R. Eberhart and Y. Shi, “Particle swarm optimization: developments, applications and resources,” in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 81–86, Seoul, South Korea, May 2001. View at: Google Scholar
  27. R. Storn and K. Price, “Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997. View at: Publisher Site | Google Scholar
  28. T. F. Araújo and U. Waded, “Performance assessment of PSO, DE and hybrid PSO-DE algorithms when applied to the dispatch of generation and demand,” International Journal of Electrical Power & Energy Systems, vol. 47, pp. 205–217, 2013. View at: Publisher Site | Google Scholar
  29. Y. Z. Zhao, X. Y. Liu, and W. X. Sun, “The chained P system with application in graph clustering,” in Proceedings of the Sixth Asian Conference on Membrane Computing, pp. 46–57, Chengdu, China, September 2017. View at: Google Scholar
  30. D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, 1989. View at: Google Scholar
  31. A. H. Cao, “A passive location algorithm based on differential evolution and genetic algorithm using the Doppler frequency,” Signal Processing, vol. 25, no. 10, pp. 1644–1648, 2009. View at: Google Scholar
  32. S. H. Liao, C. C. Chiu, and M. H. Ho, “Comparison of dynamic differential evolution and genetic algorithm for MIMO-WLAN transmitter antenna location in indoor environment,” Wireless Personal Communications, vol. 71, no. 4, pp. 2677–2691, 2009. View at: Google Scholar
  33. Datasets, http://www.isical.ac.in/content/database.
  34. H. C. Blake, “UCI repository of machine learning databases,” 1998. View at: Google Scholar
  35. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Western Australia, November 1995. View at: Google Scholar
  36. R. Storn and P. Kenneth, “Differential evolution–A simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997. View at: Publisher Site | Google Scholar
  37. W. Song, W. Ma, and Y. G. Qiao, “Particle swarm optimization algorithm with environmental factors for clustering analysis,” Soft Computing, vol. 21, no. 2, pp. 283–293, 2017. View at: Publisher Site | Google Scholar
  38. H.-G. Han, W. Lu, and Y. Hou, “An adaptive-PSO-based self-organizing RBF neural network,” IEEE Transactions on Neural Networks and Learning Systems, vol. 29, no. 1, pp. 104–117, 2018. View at: Publisher Site | Google Scholar
  39. “The berkeley segmentation dataset and benchmark,” https://www2.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/segbench/. View at: Google Scholar
  40. N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 9, no. 1, pp. 62–66, 1979. View at: Publisher Site | Google Scholar
  41. B. V. Kumar and K. V. Arya, “A new heuristic for multilevel thresholding of images,” Expert Systems with Applications, vol. 117, pp. 176–203, 2019. View at: Google Scholar
  42. D. P. Panda and A. Rosenfeld, “Image segmentation by pixel classification in (gray level, edge value) space,” IEEE Transactions on Computers, vol. C-27, no. 9, pp. 875–879, 1978. View at: Publisher Site | Google Scholar
  43. S. Mirjalili and A. Lewis, “The whale optimization algorithm,” Advances in Engineering Software, vol. 95, pp. 51–67, 2016. View at: Publisher Site | Google Scholar
  44. S. Mirjalili, M. Seyed, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, no. 3, pp. 46–61, 2014. View at: Publisher Site | Google Scholar

Copyright © 2020 Xiyu Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

