Abstract
Delaunay refinement is a technique for generating unstructured triangular meshes in sensor network configuration engineering practice. A new method for solving the Delaunay triangulation problem, called the endpoint triangle’s circumcircle model (ETCM), is proposed in this paper. Compared with the original fractional node refinement algorithms, the proposed algorithm achieves good refinement stability with the least time cost. Simulations are performed under five aspects: refinement stability, the number of additional nodes, time cost, mesh quality after introducing additional nodes, and the aspect ratio improved by a single additional node. All experimental results show the advantages of the proposed algorithm over the existing algorithms and fully confirm the algorithm analysis.
1. Introduction
Recently, the concept of intelligent network systems has become very popular worldwide. In practice, however, how can we deploy and optimize the sensors? This remains a difficult issue for scientists, since it affects both cost and detection capability and requires consideration of both coverage and connectivity. A sensor node may perform the dual function of sensing the environment and acting as a relay node. In a real sensor network system, all sensor nodes are distributed as a discrete data set, which forms a mesh network to provide monitoring of the environment. The term mesh network will be used throughout this paper to describe a sensor network configuration [1].
Delaunay triangulation (DT) is an effective method for partitioning a discrete data region and is especially widely used in the sensor network configuration engineering field [2–5]. In most cases, there are constraining relationships among the discrete data: the data may comprise vector lines and closed polygons, which must be included in the result of the partition. In general, however, the Delaunay triangulation will not contain all edges of the graph. So far there are two types of DT algorithm: constrained Delaunay triangulation [6, 7] and conforming Delaunay triangulation [8, 9]. The former is the best approximation of the Delaunay triangulation given that it must contain all features of the graph. Generally, the DT property cannot be preserved and the quality of the mesh declines in constrained Delaunay triangulation, which influences the stability and convergence of finite element numerical calculation. Meanwhile, the conforming DT method can be considered a degenerate Delaunay triangulation whose relationship to the constrained graph is that each vertex of the graph is a vertex of the triangulation, each edge of the graph is a union of edges of the triangulation, and the DT property is satisfied. However, every newly introduced node changes the original graph, so the method should be used with restraint depending on the actual situation.
Usually, constructing conforming DT is more difficult than constructing constrained DT, as it requires a number of points to achieve conformity. The core technique of the conforming DT is to subdivide the constraints. This paper presents a novel node refinement algorithm, which has better triangulation quality and fewer additional nodes than other algorithms.
The rest of this paper is organized as follows. In Section 2, the basic Delaunay triangulation problem is briefly introduced, and the main idea of the endpoint triangle’s circumcircle model (ETCM) is described in detail in Section 3. Then the convergence and complexity of ETCM are analyzed in Sections 4 and 5. Section 6 gives the results of simulation experiments comparing the new ETCM with other methods. Finally, a conclusion is drawn in Section 7.
2. Problem Description
Suppose G = (C0, C1, …, Ck, P) is a planar straight-line graph, where C0 is the exterior feature constraint, C1, …, Ck are the interior feature constraints, and P is the collection of all discrete points and endpoints of feature lines.
If Ci is a closed constraint, let D(Ci) denote the single connected domain bounded by Ci; G should satisfy the following conditions.
All regions defined by the interior constraints lie inside the region defined by the exterior constraint; namely, D(Ci) ⊆ D(C0) for i = 1, …, k.
The mutual parts of the feature constraints are finite point sets in the collection P; namely, Ci ∩ Cj ⊆ P, where i ≠ j.
For any G satisfying the above conditions, after inserting some additional points on the features, how can we obtain a DT graph that is geometrically equivalent to the former graph using the whole discrete data set together with the additional points? Two types of refinement algorithms for conforming DT are considered for solving this problem. One type refines the feature lines first and then executes DT, as in the literature of [10–12]. The other type takes exactly the opposite approach: it executes DT first and then refines the feature lines; [10, 13–17] are representative algorithms.
Inspired by the algorithms of [14, 16, 17], an improved feature refinement algorithm named the endpoint triangle’s circumcircle model (ETCM) is proposed in this paper.
3. Endpoint Triangle’s Circumcircle Model
3.1. Basic Idea of ETCM
Definition. An endpoint’s triangle of a feature is a triangle that takes one of the feature’s endpoints as a vertex and, simultaneously, has an edge that intersects the feature.
Suppose ab is a feature that is not contained in the DT mesh, Ta is the endpoint’s triangle of a, and Tb is the endpoint’s triangle of b; Ca and Cb are the circumcircles of Ta and Tb, respectively. The basic idea of ETCM is as follows.
Let p = Ca ∩ ab and q = Cb ∩ ab. If ap ∩ qb = ∅, choose the longer of ap and qb as the segment to be inserted; the corresponding intersection point (p or q) is taken as the additional point. Then let the shorter part be the remainder feature line and apply the approach above to it. Execution stops when ap ∩ qb ≠ ∅, and at that time the midpoint of the remaining feature segment is taken as the additional point. In particular, if the feature line being treated influences a feature line that has already been inserted, the influenced feature segment should be processed again.
3.2. Description of ETCM
For designing the ETCM algorithm, we must first define the data structures. Four structures are mainly considered, as shown in Algorithm 1.

The structure Triangle records the information of a triangle, including its index number, the indices of its three vertices, the intersection tag, the pointers to its three neighboring triangles, and the pointers to the previous and next elements in the linked list. The structure TriIndex is a node of a doubly linked list of triangles. The structure FeatureSegment records the information of a constraint feature, including its index number, the indices of its start point and endpoint, and the pointers to the previous and next elements in the linked list. The structure Vertex records the information of a vertex, including its index number, the vertex coordinates, and the head pointer of its neighboring-triangles list. The data structure definitions are shown in Algorithm 1.
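The four structures described above might be rendered in C++ as follows. This is an illustrative reconstruction from the prose; the field names are our assumptions, not the paper’s actual definitions.

```cpp
struct Triangle;                 // forward declaration for mutual references

struct TriIndex {                // node of a doubly linked list of triangles
    Triangle* tri;
    TriIndex* prev;
    TriIndex* next;
};

struct Triangle {
    int       id;                // index number of the triangle
    int       v[3];              // indices of the three vertices
    bool      crossed;           // tag: is the triangle intersected by a feature?
    Triangle* neighbor[3];       // pointers to the three neighboring triangles
    Triangle* prev;              // previous triangle in the linked list
    Triangle* next;              // next triangle in the linked list
};

struct FeatureSegment {          // one constraint feature line
    int id;                      // index number of the feature
    int start, end;              // indices of the start point and endpoint
    FeatureSegment* prev;
    FeatureSegment* next;
};

struct Vertex {
    int       id;                // index number of the vertex
    double    x, y;              // vertex coordinates
    TriIndex* triList;           // head of the neighboring-triangles list
};
```

The neighbor pointers let the algorithm walk from an endpoint’s triangle across shared edges to collect the triangles crossed by a feature line, which is what makes the influence-polygon search local.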
The overview of the ETCM algorithm is shown in Algorithm 2.

The function InsertNewNode(crosstri, ) is described in Algorithm 3.

The ETCM algorithm ensures that the distribution of additional points is unique. Because each additional point lies in the influence polygon of its feature segment, we can search for additional points within the influence polygons instead of the whole domain, so ETCM has high efficiency compared with the other algorithms.
4. Convergence of the Number of Additional Points
In the process of dealing with a feature line, the added point corresponds to the longer of the two intercepted segments generated on the feature line by the two endpoints’ circumcircles. This is the point that maximizes the ratio between the intercepted segment and the whole feature line, and it constructs two triangles that share the intercepted segment and satisfy the DT property. The remaining feature line is treated by the same operation until the intersection of the two intercepted segments is nonempty, so there is no doubt that the number of additional points for each single feature line is the least compared with any other method. For the whole planar straight-line graph, however, this is not true in fact: a triangle taking an already treated feature segment as its edge may become part of the influence polygon of the feature segment currently being processed. The influenced feature segment must then be treated again, which produces some new additional points.
Assume that, in a planar straight-line graph with n points and m edges, ab is a feature line, Ca is the largest circle around a comprising no other points and features except ab, and Cb has the same definition for b. If Ca ∩ Cb ≠ ∅, stop refining ab. If Ca ∩ Cb = ∅, let the length of ab be L and let d be the smallest distance between ab and other points or edges; divide ab into ⌈L/d⌉ segments, each of length no longer than d. Then the circle that takes each segment as its diameter contains no other point inside. Including a and b, at most ⌈L/d⌉ + 1 points lie on ab. Because the number of additional points for one feature line does not influence the already embedded feature lines, the total number of additional points is bounded by this count summed over all feature lines.
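The per-feature bound can be computed directly. A small sketch follows, writing L for the length of one feature line and d for the smallest distance between the feature and any other point or edge; the ceiling-based count is our reading of the subdivision rule above, not code from the paper.

```cpp
#include <cmath>

// Worst-case number of points lying on one feature line of length L when
// every subsegment must be no longer than d, so that the circle taking each
// subsegment as its diameter contains no other point. Including both
// endpoints, ceil(L/d) + 1 points lie on the line; all but the two original
// endpoints are new additional points.
int pointsOnFeature(double L, double d) {
    return static_cast<int>(std::ceil(L / d)) + 1;
}
```

For example, a feature of length 10 with smallest clearance 3 carries at most ⌈10/3⌉ + 1 = 5 points, of which 3 are new. ETCM normally adds far fewer, since it subdivides only where circumcircle intercepts demand it.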
Actually, the refinement produced by the method mentioned above is redundant. The ETCM algorithm deals only with the influenced polygons, so the average radius of the obtained circumcircles is larger and the number of additional points is smaller. Thus the number of additional points produced by the ETCM algorithm is convergent.
5. Time Complexity of ETCM
Suppose that t1(i) is the time cost for calculating the position of the i-th additional point and t2(i) is the time cost for inserting the i-th additional point. It was pointed out in the literature [19] that the average number of triangles neighboring a point is about six in a DT mesh. So for finding an endpoint’s triangle, an average of six intersection judgments is needed, and the second triangle intersected with the feature line is confirmed at the same time. If a feature line crosses k triangles, an average of O(k) intersection judgments is required to determine the influence range of the feature line, and the time cost for calculating the intersection point between a circumcircle and a feature line is a constant. So, for finding the triangle or quadrangle that contains an additional point, O(k) judgments of whether a point is included in a triangle are needed, and the time cost of each judgment is also a constant; hence t1(i) is bounded by a constant when k is bounded on average. In the literature [9], it was shown that insertion is a linear process, so t2(i) is bounded by a constant as well. With N additional points in total, the time complexity can be calculated as T = T1 + T2 = Σ t1(i) + Σ t2(i) = O(N), where T1 is the time cost for calculating the positions of all additional points and T2 is the time cost for inserting all additional points.
6. Experiments Results and Analysis
6.1. Assessment Criteria
There are no criteria in the existing references for evaluating the performance of refinement embedding algorithms. So we give five assessment criteria as follows:
(1) stability of refinement;
(2) number of additional nodes, which means that, by using as small a number of additional points as possible, we change the original data set as little as possible;
(3) time cost;
(4) the quality of the mesh after introducing additional nodes: when the aspect ratio is approximately 0.5, it indicates a result of fine quality;
(5) the average aspect ratio (AAR) improved by a single additional point. The model is given by ΔAAR = (AAR_r − AAR_c) / N, where AAR is the average of all aspect ratios of the mesh triangles, AAR_r is the AAR after refinement, AAR_c is the AAR after constrained DT, and N is the number of additional points.
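Criteria (4) and (5) can be made concrete with a short sketch. The paper does not fix the aspect-ratio formula, so we assume here the common inradius-to-circumradius ratio r/R, which is exactly 0.5 for an equilateral triangle (matching the "approximately 0.5 indicates fine quality" remark) and approaches 0 for degenerate triangles; the function names are ours.

```cpp
#include <cmath>

// Aspect ratio of a triangle with side lengths a, b, c, assumed here to be
// the inradius-to-circumradius ratio r/R (0.5 for equilateral, -> 0 when flat).
double aspectRatio(double a, double b, double c) {
    double s = 0.5 * (a + b + c);                              // semiperimeter
    double area = std::sqrt(s * (s - a) * (s - b) * (s - c));  // Heron's formula
    double r = area / s;                                       // inradius
    double R = (a * b * c) / (4.0 * area);                     // circumradius
    return r / R;
}

// Criterion (5): average-aspect-ratio improvement contributed by each
// additional point, (AAR after refinement - AAR after constrained DT) / N.
double aarPerPoint(double aarRefined, double aarCDT, int nAdded) {
    return (aarRefined - aarCDT) / nAdded;
}
```

For instance, a 3-4-5 right triangle scores 0.4 under this measure, noticeably below the equilateral optimum of 0.5.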
6.2. Performance Test
The performance tests were performed for all refinement algorithms previously described including Tsai [15], Lu et al. [13], Sapidis and Perucchio [14], Wei et al. [17], and ETCM.
Take 15 planar straight-line graphs as testing data, each comprising a group of uniformly distributed random points. The coordinates of the points all lie in the same fixed range, and the number of feature lines is 15% of the number of points. The testing environment is a computer with a P4 2.8 GHz CPU, 2 × 256 MB of RAM, Windows XP, and Visual C++ 6.0, and the time-cost testing tool is CodeTest 4.0.
Figure 1 shows a group of resulting meshes for a planar straight-line graph, where the thick dashed lines are feature lines and the square points are additional points. Figures 2 to 5 show the performance evaluation results. For testing the stability of refinement, we exchange the two endpoints of each feature line and then judge whether the DT results are conformable.
(a) DT
(b) CDT
(c) Algorithm Tsai
(d) Algorithm Lu
(e) Algorithm Sapidis
(f) Algorithm Mei (1)
(g) Algorithm Mei (2)
(h) ETCM
In the refinement stability test shown in Figure 1, all algorithms produce a unique result except algorithm Mei; that is, all algorithms other than Mei are stable. Figure 2 shows that Tsai’s method requires the largest number of additional nodes, the methods of Lu and Sapidis take second place, and the methods of Mei and ETCM need nearly the same, smallest number. From Figure 3 we can conclude that Tsai’s method has the largest time cost and ETCM costs the least time. In particular, ETCM’s time-cost curve has the least undulation and shows a nearly linear relationship with the number of points, which indicates that the time cost of ETCM is mainly affected by the number of points and hardly affected by their distribution; the robustness of ETCM is the best. The AAR results are shown in Figure 4: all refinement algorithms can restore the AAR of the meshes to, or above, the AAR of the meshes before inserting feature lines. Lu’s method gives the most obvious improvement, Tsai’s effect is the worst, and the other three methods are almost at the same level. Figure 5 shows the AAR improvement per single additional point: Tsai’s method has the worst capability, Lu’s is a little better, and the others are much the same.
7. Conclusions
Based on the performance analysis of the DT algorithm, a new refinement algorithm named the endpoint triangle’s circumcircle model (ETCM) is proposed in this paper. The performance analysis is given and confirmed by simulations. Simulation results show that ETCM performs as well as the best existing methods in refinement stability, number of additional points, and mesh quality. In particular, the time cost of ETCM is minimal and is least influenced by the data distribution. Hence, it can be concluded that the proposed refinement technique ETCM can be used to solve practical problems with fast-response requirements while maintaining good solution quality.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (no. 51379049), the Fundamental Research Funds for the Central Universities of China (no. HEUCFX41302), and the Scientific Research Foundation for the Returned Overseas Chinese Scholars, Heilongjiang Province (no. LC2013C21).