Abstract

Star sensors determine spacecraft attitude by recognizing star images and exploiting the astronomical information they contain. For low-cost star sensors with small fields of view, observed images from multiple fields of view are fused, and a novel recognition algorithm based on path optimization by a randomly distributed ant colony is proposed. Guided by pheromone intensity, the ant colony autonomously finds a closed optimal path without a designated starting or ending point, instead of having to fix a starting point first. Feature patterns extracted from the optimal paths in the guiding template and in the fused observed image are compared to perform star recognition. With the proposed algorithm, the starting point of the path optimization has no influence on the extracted feature pattern, so the higher stability of the pattern improves the star recognition rate. Simulations indicate that the algorithm improves recognition accuracy and robustness against noise for sensors with multiple fields of view.

1. Introduction

Astronomical attitude determination is the process of obtaining celestial body information through celestial sensors and calculating the spacecraft attitude from it. Among the various celestial sensors applied to spacecraft attitude determination, star sensors provide the best accuracy. A star sensor works in two modes: star tracking and all-sky star image recognition. In the latter mode, the star sensor works independently of any external information, which requires fast star recognition and spacecraft attitude reconstruction when the attitude is lost or changes abruptly.

Autonomous guidance with small-FOV star sensors is usually restricted by an insufficient number of guide stars. To overcome this restriction, celestial guidance technology for multi-FOV star sensors has been researched and improved.

In the field of recognition for multi-FOV star sensors, Ho et al. [1] propose an identification algorithm that combines recognition using extended images with recognition using combined images. In their work, the stars in the images from the two trackers are first identified separately; the remaining stars, or the stars in the extended images, are then identified with the help of the previously identified stars. Li et al. [2] propose a novel recognition algorithm for double-FOV star sensors. First, the star coordinates in the image system of each FOV are obtained separately. Second, the star coordinates of the first FOV are transformed into the image space system of the second FOV. Then, in the image space system of the second FOV, all the observed stars are sorted and their pattern characteristics are extracted for recognition.

As for improvements on star recognition algorithms, Hernandez et al. [3] introduce a recognition algorithm based on polygons with similarity invariants. They create polygons using neighbouring stars as vertices and map each polygon to a complex number, which is used as the index for matching. Quan et al. [4] develop a recognition method based on the adaptive ant colony (AAC) algorithm. This method draws circles whose centres are bright star points and whose radius is a specified angular distance. For each resulting star point set, the angular distance between every pair of star points is calculated and the star point closest to the centre of the set is taken as the starting point. Then the optimal path of this star point set beginning from the starting point is retrieved with the AAC, and the feature extracted from the optimal path is used for recognition. Li et al. [5] propose an improved triangle algorithm for all-sky recognition. They select guide stars according to celestial subblocks, optimize the selection of observation triangles, and use angular distances between stars for matching identification. In addition, other novel recognition algorithms and database formation methods [6–18] have been developed.

We propose an all-sky autonomous recognition algorithm for multi-FOV star sensors based on the features of an optimal path. We first fuse the information from the images observed by the different small FOVs. We then obtain an optimal traversal path in the fused image that visits all the chosen stars, with the angular distances between stars as edge weights, and use the path features of each image as the index to complete recognition.

2. Star Image Fusion

2.1. Theory

We consider a double-FOV star sensor equipped with FOVs A and B. Each FOV is 8° × 8° in size and produces an observed image of 512 × 512 pixels. The visual axes of A and B are set at fixed positions relative to each other. Because a single FOV is small, both FOVs are used for observation in order to capture enough guide stars to guarantee the recognition rate. For recognition with the double-FOV sensor, the two images observed in FOVs A and B must first be fused. As the first step of the fusion, we identify the stars that appear in both images simultaneously.

We assume that, at any moment, the line of sight of FOV A points to (right ascension, declination) $=(\alpha_A,\delta_A)$ and the line of sight of FOV B points to (right ascension, declination) $=(\alpha_B,\delta_B)$. $S$ is a star observed by both FOVs, and its coordinate in the celestial sphere coordinate system is the direction vector $V=(\cos\delta\cos\alpha,\ \cos\delta\sin\alpha,\ \sin\delta)^{T}$, where $(\alpha,\delta)$ are its right ascension and declination. Its coordinate can be transformed to the coordinate system of FOV A through the transform matrix
$$M_A=\begin{bmatrix}-\sin\alpha_A & \cos\alpha_A & 0\\ -\sin\delta_A\cos\alpha_A & -\sin\delta_A\sin\alpha_A & \cos\delta_A\\ \cos\delta_A\cos\alpha_A & \cos\delta_A\sin\alpha_A & \sin\delta_A\end{bmatrix}. \tag{1}$$
Then $S$'s coordinate in the coordinate system of FOV A is
$$V_A=M_A V. \tag{2}$$
If $M_{AB}$ is the transform matrix from the coordinate system of FOV A to that of FOV B, then
$$V_B=M_{AB}V_A. \tag{3}$$
According to (2) and (3),
$$V_B=M_{AB}M_A V. \tag{4}$$
Since, analogously to (2), $V_B=M_B V$, where $M_B$ is obtained from (1) by replacing $(\alpha_A,\delta_A)$ with $(\alpha_B,\delta_B)$, we have the following:
$$M_{AB}M_A=M_B, \tag{5}$$
$$M_{AB}=M_B M_A^{-1}=M_B M_A^{T}. \tag{6}$$
Concluded from (6), if the directions of A's and B's visual axes keep the same relation to each other (i.e., $\alpha_B-\alpha_A$ and $\delta_B-\delta_A$ are fixed) but $(\alpha_A,\delta_A)$ and $(\alpha_B,\delta_B)$ vary from time to time, $M_{AB}$ changes. That means the relation between $S$'s coordinate in the FOV A image plane and its coordinate in the FOV B image plane will also change. Therefore, the fixed relation between A's and B's visual axes alone is not sufficient to identify the stars appearing in both images.
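For illustration, a minimal numerical sketch of (1)–(6): it builds the zero-roll transform matrix assumed in (1), forms M_AB as in (6), and checks that M_AB is not constant when both boresights move while keeping the same right ascension and declination offsets. The function names (attitude_matrix, m_ab) and the example pointing values are ours, not part of the original method.

```python
import numpy as np

def attitude_matrix(alpha0, delta0):
    """Transform matrix from the celestial frame to a FOV frame whose line of
    sight points to (right ascension alpha0, declination delta0), with zero
    roll -- the form assumed in (1). Angles in radians."""
    sa, ca = np.sin(alpha0), np.cos(alpha0)
    sd, cd = np.sin(delta0), np.cos(delta0)
    return np.array([
        [-sa,       ca,      0.0],
        [-sd * ca, -sd * sa, cd ],
        [ cd * ca,  cd * sa, sd ],
    ])

def m_ab(alpha_a, delta_a, alpha_b, delta_b):
    """Transform matrix from the FOV A frame to the FOV B frame, as in (6)."""
    M_a = attitude_matrix(alpha_a, delta_a)
    M_b = attitude_matrix(alpha_b, delta_b)
    return M_b @ M_a.T              # M_A is orthogonal, so its inverse is its transpose

# Keep the relative pointing fixed (5 deg in right ascension, 1 deg in declination)
# but move both boresights to a different part of the sky.
d_alpha, d_delta = np.radians(5.0), np.radians(1.0)
M1 = m_ab(np.radians(10.0), np.radians(20.0), np.radians(10.0) + d_alpha, np.radians(20.0) + d_delta)
M2 = m_ab(np.radians(40.0), np.radians(60.0), np.radians(40.0) + d_alpha, np.radians(60.0) + d_delta)
print(np.allclose(M1, M2))          # False: M_AB depends on the absolute pointing direction
```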

2.2. Fusion

According to the conclusion above, we treat the two images from A and B as independent images when identifying the stars common to both. We build triangles locally in each of the images and match the triangles from A's image with those from B's.

For example, we fix the declination distance between the visual axes of A and B at 1° and the right ascension distance at 5°. Figures 1(a) and 1(b) illustrate the star images observed in A and B.

In each of the two images, the observed stars are divided into groups of 3 according to the angular distance between every 2 stars. We set an angular distance threshold of 8.5543° in advance; in each group, the 3 pairwise distances are all within this threshold. By comparing the angular distances of every 2 groups from A and B, we can locate the same stars observed in both A and B. The circles in Figures 1(a) and 1(b) mark the 5 common stars. Afterwards, we can fuse the two images according to the relation between these stars' coordinates in the FOV A image plane coordinate system and their coordinates in the FOV B image plane coordinate system. Taking the common stars as the benchmark, we can also add the stars observed in only one of the two FOVs into the fused image. Figure 1(c) illustrates the image fused from Figures 1(a) and 1(b).
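A sketch of the triangle matching described above, under the simplifying assumption that angular distances can be approximated from pixel separations with a constant plate scale (8°/512 pixels); the helper names and the 0.01° matching tolerance are illustrative choices, not values given in the paper.

```python
import numpy as np
from itertools import combinations

PIX_SCALE = 8.0 / 512.0          # degrees per pixel for an 8 deg x 8 deg, 512 x 512 image

def pairwise_angular(stars):
    """Approximate angular distance (deg) between every pair of star centroids,
    given as (row, col) pixel coordinates, using a constant plate scale."""
    n = len(stars)
    d = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        d[i, j] = d[j, i] = np.hypot(*(np.array(stars[i]) - np.array(stars[j]))) * PIX_SCALE
    return d

def triangles(stars, max_side=8.5543):
    """All 3-star groups whose three pairwise angular distances are within max_side,
    each keyed by its sorted distance triple."""
    d = pairwise_angular(stars)
    groups = []
    for i, j, k in combinations(range(len(stars)), 3):
        sides = sorted((d[i, j], d[j, k], d[i, k]))
        if sides[-1] <= max_side:
            groups.append(((i, j, k), tuple(sides)))
    return groups

def match_triangles(stars_a, stars_b, tol=0.01):
    """Pairs of triangles from FOV A and FOV B whose side triples agree within tol (deg)."""
    matches = []
    for ia, sides_a in triangles(stars_a):
        for ib, sides_b in triangles(stars_b):
            if all(abs(x - y) < tol for x, y in zip(sides_a, sides_b)):
                matches.append((ia, ib))
    return matches
```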

3. Recognition Based on Ant Colony Path Optimization

3.1. Methodology

We propose a recognition algorithm based on path optimization. In the fused image from the double FOVs, we use a virtual ant colony to find the shortest traversal path linking the chosen stars. We extract a feature pattern from the optimal path and then match the pattern of the observed image against the simulated templates generated from the guiding star catalogue.

Ant colony traversal optimization is an iterative process that simulates ants leaving pheromones on paths in the natural world. In each round of the iteration, each ant is assumed to pass every destination once and only once and to return to its initial position at the end. The probability with which the ant colony chooses paths is determined by distance and by a positive feedback principle. Through the iteration, the ants converge on the shortest traversal path linking all of their destinations.

Star images without noise can be generated by simulation once the direction of the star sensor's line of sight is set, according to
$$
x=\frac{N}{2}\left(1+\frac{\cos\delta\sin(\alpha-\alpha_{0})}{\tan(\mathrm{FOV}/2)\,\big[\sin\delta\sin\delta_{0}+\cos\delta\cos\delta_{0}\cos(\alpha-\alpha_{0})\big]}\right),\qquad
y=\frac{N}{2}\left(1+\frac{\sin\delta\cos\delta_{0}-\cos\delta\sin\delta_{0}\cos(\alpha-\alpha_{0})}{\tan(\mathrm{FOV}/2)\,\big[\sin\delta\sin\delta_{0}+\cos\delta\cos\delta_{0}\cos(\alpha-\alpha_{0})\big]}\right). \tag{7}
$$
Here $(x, y)$ is the coordinate of a star in the simulated image coordinate system, $\delta$ is its declination, $\alpha$ is its right ascension, $N$ is the size of the star sensor's CCD image plane array, $\mathrm{FOV}$ is the field-of-view size (8° here), and $\delta_{0}$ and $\alpha_{0}$ are the declination and right ascension that express the direction of the line of sight. In the simulation, the declination and right ascension of all the stars in the guiding star catalogue are input to (7) and their coordinates in the image plane system are output. The stars whose coordinates fall within the image plane are thus projected into the simulated image.
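A minimal sketch of simulated-image generation following the imaging model in (7); the catalogue is assumed to be a dictionary mapping a star identifier to its (right ascension, declination) in radians, and the helper names (project, simulate_image) are ours.

```python
import numpy as np

N = 512                              # CCD image plane array size (pixels)
HALF_FOV = np.radians(4.0)           # half of the 8 deg field of view

def project(alpha, delta, alpha0, delta0):
    """Project a star (alpha, delta) onto the image plane of a sensor whose line of
    sight points to (alpha0, delta0), following the model in (7). Angles in radians.
    Returns (x, y) in pixels, or None if the star falls outside the image plane."""
    denom = (np.sin(delta) * np.sin(delta0)
             + np.cos(delta) * np.cos(delta0) * np.cos(alpha - alpha0))
    if denom <= 0:                   # star is behind the image plane
        return None
    xt = np.cos(delta) * np.sin(alpha - alpha0) / denom
    yt = (np.sin(delta) * np.cos(delta0)
          - np.cos(delta) * np.sin(delta0) * np.cos(alpha - alpha0)) / denom
    x = N / 2 * (1 + xt / np.tan(HALF_FOV))
    y = N / 2 * (1 + yt / np.tan(HALF_FOV))
    return (x, y) if (0 <= x < N and 0 <= y < N) else None

def simulate_image(catalogue, alpha0, delta0):
    """Noise-free simulated image: every catalogue star whose projection lands
    inside the image plane, as used for the guiding templates."""
    image = {}
    for star_id, (alpha, delta) in catalogue.items():
        p = project(alpha, delta, alpha0, delta0)
        if p is not None:
            image[star_id] = p
    return image
```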

Each time we take the declination and right ascension of one star in the guiding star catalogue as the direction of the FOV and generate one simulated image with the same number of pixels as the observed image. Each of these stars therefore corresponds to one simulated image, and these simulated star images are called guiding templates. In each guiding template, the star it corresponds to is located at its centre, and the stars in a defined neighbourhood of the centre star are taken as the destinations for the ant colony traversal. In the initial state, the ant colony is assumed to be distributed randomly over these destinations. Each ant departs from its initial position and moves to its next destination according to a function formed by the pheromone on the path and the distances between the ant's position and the other destinations. The length of the optimal traversal path is taken as a feature of the guiding template. In addition, the angular distances between the centre star and its two neighbouring stars on the optimal path are calculated to complete the feature pattern.

We assume that there are $m$ ants and $n$ destinations, and $d_{ij}$ is the distance between destinations $i$ and $j$, calculated as the angular distance between the two stars. $\tau_{ij}(t)$ is the remaining pheromone intensity on the path between $i$ and $j$. At the initial moment, the pheromone intensity on all the paths is a constant. At moment $t$, for ant $k$, its transferring probability from $i$ to $j$ is
$$
p_{ij}^{k}(t)=\begin{cases}\dfrac{[\tau_{ij}(t)]^{\alpha}[\eta_{ij}(t)]^{\beta}}{\sum_{s\in \mathrm{allowed}_k}[\tau_{is}(t)]^{\alpha}[\eta_{is}(t)]^{\beta}}, & j\in \mathrm{allowed}_k,\\[2mm] 0, & \text{otherwise}.\end{cases}\tag{8}
$$
In (8), $\mathrm{allowed}_k$ is the set of destinations which ant $k$ has not yet passed. The heuristic factor $\eta_{ij}$ is the transferring expectation from $i$ to $j$, usually $\eta_{ij}=1/d_{ij}$. $\alpha$ is the importance of the pheromone and $\beta$ is the importance of the heuristic factor. After every iteration, the pheromone on all the paths is updated:
$$
\tau_{ij}(t+1)=(1-\rho)\tau_{ij}(t)+\Delta\tau_{ij},\qquad
\Delta\tau_{ij}=\sum_{k=1}^{m}\Delta\tau_{ij}^{k},\qquad
\Delta\tau_{ij}^{k}=\begin{cases}\dfrac{Q}{L_k}, & \text{if ant }k\text{ passes the path between }i\text{ and }j,\\[1mm] 0, & \text{otherwise}.\end{cases}\tag{9}
$$
In (9), $\rho$ is the volatilization coefficient of the pheromone on the path between $i$ and $j$, and $\Delta\tau_{ij}$ is the change of pheromone on that path; at the initial moment, $\Delta\tau_{ij}=0$. $\Delta\tau_{ij}^{k}$ is the pheromone which ant $k$ leaves on the path between $i$ and $j$, $Q$ is the total pheromone which an ant leaves, and $L_k$ is the length of the path which ant $k$ passes. If ant $k$ does not pass this path, $\Delta\tau_{ij}^{k}=0$ [19].
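A compact sketch of the path optimization with randomly distributed ants, implementing the transition rule (8) and the pheromone update (9); the default parameter values mirror those listed in Section 3.2, but the code structure itself is an illustrative implementation rather than the authors' original one.

```python
import numpy as np

rng = np.random.default_rng(0)

def aco_closed_path(dist, n_ants=None, alpha=1.0, beta=1.0, rho=0.5, Q=100.0, n_iter=100):
    """Shortest closed traversal path over all destinations, following (8) and (9).
    dist: symmetric numpy matrix of angular distances between destinations.
    Ants are scattered over the destinations at random, so no starting point
    has to be designated."""
    n = len(dist)
    if n_ants is None:
        n_ants = n                                   # colony size equals the number of stars
    eta = 1.0 / (dist + np.eye(n))                   # heuristic factor, eta_ij = 1/d_ij
    np.fill_diagonal(eta, 0.0)
    tau = np.ones((n, n))                            # constant initial pheromone
    best_tour, best_len = None, np.inf

    for _ in range(n_iter):
        tours, lengths = [], []
        for _ in range(n_ants):
            start = int(rng.integers(n))             # randomly distributed ants
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                allowed = [j for j in range(n) if j not in visited]
                w = np.array([tau[i, j]**alpha * eta[i, j]**beta for j in allowed])
                j = allowed[rng.choice(len(allowed), p=w / w.sum())]   # rule (8)
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append(tour)
            lengths.append(length)
            if length < best_len:
                best_tour, best_len = tour, length
        delta_tau = np.zeros((n, n))                 # pheromone update (9)
        for tour, length in zip(tours, lengths):
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                delta_tau[i, j] += Q / length
                delta_tau[j, i] += Q / length
        tau = (1 - rho) * tau + delta_tau
    return best_tour, best_len
```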

3.2. Feature Extraction

In the observed image shown in Figure 1(c), we define the star with the least distance to the image centre as the main star and define the main star's neighbourhood within a distance range. All the stars in this neighbourhood are destinations for the ant colony traversal. As Figure 2 shows, when $\alpha=1$, $\beta=1$, $\rho=0.5$, $Q=100$, the maximum number of iterations is 100, and the size of the ant colony equals the number of destination stars, we obtain the optimal traversal path in the observed image after fusion. For any observed image, such a path is a closed curve. For the observed image, the feature pattern consists of the optimal path length and the angular distances between the main star and its two neighbouring stars on the optimal path.

Similarly, we take the centre star as the main star of each guiding template and use a randomly distributed ant colony to find the optimal path in it. We then record the optimal path length projected onto the image plane and the angular distances between the main star and its two neighbours on the path as the feature pattern of the guiding template.
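A sketch of the feature-pattern extraction described above: the main star is the star closest to the image centre (or the centre star of a template), and the pattern is the closed-path length plus the two angular distances from the main star to its neighbours on the path. Sorting the two neighbour distances is our own choice to keep the pattern independent of the traversal direction; the function name is hypothetical.

```python
import numpy as np

def feature_pattern(star_positions, angular_dist, optimal_tour):
    """Feature pattern: (closed optimal path length, the two angular distances
    between the main star and its neighbours on the path).
    star_positions: (row, col) pixel coordinates of the destination stars;
    angular_dist: numpy matrix used for the traversal;
    optimal_tour: ordered list of destination indices from the ant colony."""
    n = len(optimal_tour)
    centre = np.array([256.0, 256.0])               # image centre of a 512 x 512 plane
    main = min(range(len(star_positions)),
               key=lambda i: np.linalg.norm(np.array(star_positions[i]) - centre))
    path_length = sum(angular_dist[optimal_tour[k], optimal_tour[(k + 1) % n]]
                      for k in range(n))
    pos = optimal_tour.index(main)
    prev_star = optimal_tour[pos - 1]                # wraps around: the path is closed
    next_star = optimal_tour[(pos + 1) % n]
    # Sort the two neighbour distances so the pattern does not depend on the
    # direction in which the closed path is traversed.
    d1, d2 = sorted((angular_dist[main, prev_star], angular_dist[main, next_star]))
    return path_length, d1, d2
```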

3.3. Pattern Recognition

By comparing the features of the optimal paths in the guiding templates and in the observed image, we can identify the guiding template which matches the observed image. Figure 3 shows the optimal path in the guiding template that successfully matches the fused image in Figure 2.
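A sketch of the matching step, under the assumption that a template matches when its path length and the two neighbour distances agree with the observed pattern within fixed tolerances; the tolerance values and the strategy of returning the first acceptable match are placeholders, not values reported in the paper.

```python
def match_template(observed_pattern, template_patterns, tol_len=0.05, tol_d=0.02):
    """Guiding template whose feature pattern matches the observed pattern within
    the given tolerances (degrees). template_patterns maps a catalogue star id to
    its (path length, d1, d2) pattern. Returns the star id, or None if no match."""
    L, d1, d2 = observed_pattern
    for star_id, (Lt, d1t, d2t) in template_patterns.items():
        if abs(L - Lt) < tol_len and abs(d1 - d1t) < tol_d and abs(d2 - d2t) < tol_d:
            return star_id
    return None
```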

4. Simulation

All of our simulations are performed on a 3.20 GHz Pentium(R) desktop PC in a Matlab environment. We generate 1000 pairs of observed star images by the Monte Carlo method to validate the feasibility of the proposed algorithm. Each pair consists of the two images observed by the double-FOV sensor's two FOVs. To generate these 1000 pairs of star images by (7), we randomly generate 1000 pairs of directions of the two FOVs' lines of sight, under the constraint that the declination distance between the two lines, $\delta_{B}-\delta_{A}$, is always 1° and the right ascension distance, $\alpha_{B}-\alpha_{A}$, is always 5°. Here $\delta_{A}$ and $\delta_{B}$ are the declinations to which the two lines of sight point, and $\alpha_{A}$ and $\alpha_{B}$ are the corresponding right ascensions.
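One possible way to draw the random pointing pairs with the fixed 5°/1° offsets; sampling the declination uniformly over the sphere and clipping near the pole are our own assumptions, not details given in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_boresight_pairs(n_pairs=1000, d_alpha=5.0, d_delta=1.0):
    """Random (alpha_A, delta_A, alpha_B, delta_B) pointing pairs in degrees with
    alpha_B - alpha_A = 5 deg and delta_B - delta_A = 1 deg, as in the simulation."""
    alpha_a = rng.uniform(0.0, 360.0, n_pairs)
    # Uniform on the sphere; keep delta_A away from the pole so delta_B stays valid.
    delta_a = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n_pairs)))
    delta_a = np.clip(delta_a, -89.0, 88.0)
    return np.column_stack([alpha_a, delta_a,
                            (alpha_a + d_alpha) % 360.0, delta_a + d_delta])
```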

The single FOV size is 8° × 8° and each observed image is 512 × 512 pixels. The basic star database consists of 9098 stars from the Catalogue of Bright Stars, with variable and double stars excluded. After fusion, recognition by the proposed algorithm, the adaptive ant colony algorithm, and the improved triangle algorithm is performed on these simulated images, and the performance of the proposed algorithm is compared with that of the other two to assess its efficiency.

4.1. Recognition Rate under Noise

We add false star noise, missing star noise, position noise, and magnitude noise to the 1000 simulated images after fusion and record the total recognition rates as the image noise increases.

4.1.1. Recognition Rate under Position Noise

We add position noise to the observed images to assess the robustness of the proposed algorithm. The position noise is modelled as follows: an affected star point moves to any one of its 8 neighbouring pixels, each with probability 1/8, and the probability of the noise is the ratio of such star points to all the star points in the image. If the probability of the noise is 0, none of the stars in the image moves; if it is 1, every star moves to one of its neighbouring pixels. Figure 4 shows the recognition rates of the three algorithms as the probability of the position noise increases from 0 to 0.7.
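A sketch of the position-noise model as defined above: with the given noise probability, a star centroid moves to one of its 8 neighbouring pixels, each neighbour being equally likely; the function name is ours.

```python
import numpy as np

rng = np.random.default_rng(2)

# The eight neighbouring pixel offsets, each chosen with probability 1/8.
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def add_position_noise(stars, p, size=512):
    """Each star centroid (row, col) moves to one of its 8 neighbouring pixels
    with probability p, the noise probability defined in the text."""
    noisy = []
    for (r, c) in stars:
        if rng.random() < p:
            dr, dc = NEIGHBOURS[rng.integers(8)]
            r, c = int(np.clip(r + dr, 0, size - 1)), int(np.clip(c + dc, 0, size - 1))
        noisy.append((r, c))
    return noisy
```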

4.1.2. Recognition Rate under Magnitude Noise

Gaussian magnitude noise is added to the observed images to assess the robustness of the proposed algorithm. Star magnitude is expressed by magnitude level; star brightness increases by a factor of about 2.512 when the magnitude level decreases by 1. We assume that a star with greyscale 255 has magnitude level 5, so each star's magnitude level $m$ and its greyscale $g$ can be transformed into each other according to $g = 255\times 2.512^{\,5-m}$ [20]. The magnitude noise is expressed by the standard deviation of the magnitude level.
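A sketch of the magnitude-greyscale relation and the Gaussian magnitude noise described above, under the stated assumptions (greyscale 255 at magnitude level 5, brightness ratio 2.512 per level); the helper names are ours.

```python
import numpy as np

rng = np.random.default_rng(3)

def magnitude_to_grey(m):
    """Greyscale from magnitude level, assuming level 5 maps to greyscale 255 and
    brightness changes by a factor of 2.512 per level; clipped to the 8-bit range."""
    return np.clip(255.0 * 2.512 ** (5.0 - np.asarray(m, dtype=float)), 0.0, 255.0)

def add_magnitude_noise(magnitudes, sigma):
    """Gaussian magnitude noise with the given standard deviation of magnitude level."""
    magnitudes = np.asarray(magnitudes, dtype=float)
    return magnitudes + rng.normal(0.0, sigma, size=magnitudes.shape)
```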

The recognition rates by the three algorithms are shown in Figure 5 when the probability of the position noise is kept constant at 0.4 and the standard deviation of magnitude level varies from 0.5 to 2.5.

4.1.3. Impact of False Stars and Missing Stars

We add false stars, that is, bright pixels created by observation error, to the fused image of each pair to assess the performance of the proposed algorithm in this situation. From 1 to 5 false stars are added to the sensor image, and their positions can be any pixel of the observed image. All the false stars have the same magnitude level, namely level 5. Recognition rates on observed images with false stars are shown in Figure 6.

Missing stars refer to stars that should have been captured but fail to appear in the FOV. We remove 1 to 5 stars at random from the observed star images to test the robustness of the proposed algorithm. Recognition rates on observed images with missing stars are shown in Figure 7.
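A sketch of the false-star and missing-star perturbations described in this subsection; representing stars as (row, column, magnitude level) triples and the function names are our own conventions.

```python
import numpy as np

rng = np.random.default_rng(4)

def add_false_stars(stars, k, size=512, level=5.0):
    """Append k false stars at random pixels, all at magnitude level 5,
    to a list of (row, col, magnitude level) star triples."""
    extra = [(int(rng.integers(size)), int(rng.integers(size)), level) for _ in range(k)]
    return stars + extra

def drop_missing_stars(stars, k):
    """Remove k randomly chosen stars to emulate missing-star noise."""
    keep = rng.permutation(len(stars))[:max(len(stars) - k, 0)]
    return [stars[i] for i in sorted(keep)]
```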

4.2. Recognition Time

To avoid excessive CPU usage from testing the complete dataset at once, we divide the 1000 pairs of noise-free observed images equally into 10 groups, complete image fusion for each pair, and perform recognition on one group at a time. For each group of fused images, we compute the average time for one recognition. The recognition time for each group is shown in Table 1.

Table 1 shows that the proposed algorithm is faster than the adaptive ant colony algorithm, because the proposed algorithm requires neither star point sets nor a designated starting point for path optimization. The recognition time of the improved triangle algorithm is slightly shorter than that of the other two algorithms because it improves the procedure of looking up the star characteristic catalogue: the angular distances are arranged in ascending order in the guide star characteristic catalogue, which reduces the look-up time.

4.3. Comparison with Single FOV Recognition

We simulate a single-FOV sensor whose FOV is 8° × 8° and whose plane array is 512 × 512 pixels. Then 500 random directions of the line of sight are generated by simulation for both the double-FOV sensor described above and the single-FOV sensor. We add the same position noise to the observed image of the single-FOV sensor and to the fused image of the double-FOV sensor, with the probability of the position noise set to 0.2.

Obviously, the number of observed stars changes as the sensor's line of sight points to different declination zones. We compute the average star number in each FOV of the double-FOV sensor and the star number in the FOV of the single-FOV sensor. Recognition is then performed for both sensors with the proposed algorithm. The recognition results are shown in Table 2.

Table 2 shows that when the line of sight points to declination zones with fewer observed stars, recognition becomes more difficult. However, the double-FOV sensor still achieves higher recognition rates than the single-FOV sensor when observed stars are lacking for observational reasons.

4.4. Analysis

The proposed algorithm performs better under noise than the other two algorithms because the ants of the colony are distributed randomly over the stars at the initial moment, rather than the path optimization starting from a designated point chosen according to star position and magnitude. The feature pattern is therefore no longer influenced by the choice of a starting point. For methods in which path optimization starts from a designated point, image noise can cause the optimization to start from different designated points in the guiding template and in the observed image; the feature patterns of the observed image and the guiding template then differ, which leads to a wrong match.

In the proposed algorithm, the feature pattern consists of three distances, which means the magnitude of the star pixels is not used in the calculation of the feature pattern. The stability of the algorithm against magnitude noise is therefore enhanced.

As seen from Table 2, sensors with multiple FOVs can improve recognition performance when observed stars are lacking.

5. Conclusions

We propose a full-sky star recognition algorithm for low-cost star sensors with small FOVs and discuss recognition across the different FOVs of a multi-FOV sensor. Simulations under noise indicate that the algorithm, based on path optimization by a randomly distributed ant colony, overcomes the shortcoming caused by a designated starting point for path optimization and effectively improves the recognition rate.

Data Availability

The Bright Star Catalogue can be found at http://tdc-www.harvard.edu/catalogs/bsc5.html.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work is funded by an advanced research project of navy equipment (the grant number is withheld for reasons of military security).