Kurs NEWS

Bench testing of the Kurs One machine vision system was successfully conducted

Several successful experiments were conducted to bench-test the operability of the developed solutions on different tasks.
The first task was to detect the image of an uncooperative spacecraft (USC) in a still frame and determine the region of interest for further analysis.
To solve this task, a dedicated image inspection function was developed to detect any compact enclosed area with sharp edges. The source data was a complete grayscale still frame.
The result is shown in Figure 1.1.

Figure 1.1 Detecting the region of interest in the still frame.

The top left image is the source still frame. Below it are the gradient image and the 16-times-compressed image with the detected area. On the right is the image fragment to be sent for further analysis. Results for other still frames are shown in Figure 1.2.

Figure 1.2 Detecting the region of interest in other still frames.
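
As an illustration only, the region-of-interest search described above could be sketched as follows with OpenCV. The function name, the Sobel gradient operator, the compactness criterion and all threshold values are assumptions for this sketch, not details of the actual Kurs One software.

```python
import cv2
import numpy as np

def find_region_of_interest(frame_gray, grad_thresh=40.0, min_area=100.0, pad=16):
    """Find a compact enclosed area with sharp edges in a grayscale frame
    and return the padded image fragment around it (or None)."""
    # Brightness gradient magnitude (Sobel derivatives in x and y).
    gx = cv2.Sobel(frame_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(frame_gray, cv2.CV_32F, 0, 1, ksize=3)
    grad = cv2.magnitude(gx, gy)

    # Keep only pixels with a sharp brightness gradient.
    mask = (grad > grad_thresh).astype(np.uint8) * 255
    # Close small gaps so the object outline forms one enclosed contour.
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))

    # Choose the largest sufficiently compact contour.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for c in contours:
        area = cv2.contourArea(c)
        perim = cv2.arcLength(c, True)
        if area < min_area or perim == 0.0:
            continue
        compactness = 4.0 * np.pi * area / perim ** 2  # 1.0 for a perfect circle
        if compactness > 0.2 and area > best_area:
            best, best_area = c, area
    if best is None:
        return None

    # Return the padded bounding box as the region of interest.
    x, y, w, h = cv2.boundingRect(best)
    rows, cols = frame_gray.shape[:2]
    y0, y1 = max(0, y - pad), min(rows, y + h + pad)
    x0, x1 = max(0, x - pad), min(cols, x + w + pad)
    return frame_gray[y0:y1, x0:x1]
```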

The second task was to determine the USC mockup's position and orientation relative to the camera using the method described above. The corner points of the docking surface were used as fiducial points; they were detected as the intersection points of the straight lines formed by the docking surface edges.

Figure 1.3 shows the result of detecting the visible USC edges, obtained by solving the problem described above. Using the computed gradient and a threshold value, the pixels in the region of interest were binarized: only those pixels whose brightness gradient exceeded the specified threshold were retained. These pixels are shown in white in Figure 1.3 on the right.

Figure 1.3 Image brightness gradient binarization.
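
A minimal sketch of this binarization step is given below, again using Sobel derivatives as a stand-in for whatever gradient operator the system actually uses; the threshold value is illustrative.

```python
import cv2
import numpy as np

def binarize_gradient(roi_gray, grad_thresh=60.0):
    """Return a binary image: white where the brightness gradient exceeds the threshold."""
    gx = cv2.Sobel(roi_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(roi_gray, cv2.CV_32F, 0, 1, ksize=3)
    grad = cv2.magnitude(gx, gy)
    # Pixels above the gradient threshold become 255 (white), all others 0 (black).
    return np.where(grad > grad_thresh, 255, 0).astype(np.uint8)
```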

The resulting binary image was used to detect straight lines with the method described above.
This yielded the equations of straight lines whose intersections allow the positions of the docking surface corner points to be determined with high accuracy in each image. The procedure results are shown in Figure 1.4.

Figure 1.4 Detection of the docking surface corner point positions.
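
The text does not specify the line-fitting method, so the sketch below substitutes a standard Hough transform; the corner points are then obtained by intersecting each pair of sufficiently non-parallel lines, each line written as x·cos(θ) + y·sin(θ) = ρ. Function names and thresholds are illustrative.

```python
import cv2
import numpy as np

def corner_points_from_lines(binary_img, hough_thresh=80, min_angle_deg=20.0):
    """Detect straight lines in a binary edge image and return their pairwise intersections."""
    lines = cv2.HoughLines(binary_img, 1, np.pi / 180, hough_thresh)
    if lines is None:
        return []
    lines = lines[:, 0, :]  # each row is (rho, theta)

    corners = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            rho1, th1 = lines[i]
            rho2, th2 = lines[j]
            # Skip near-parallel lines: their intersection point is numerically unstable.
            if abs(np.sin(th1 - th2)) < np.sin(np.deg2rad(min_angle_deg)):
                continue
            # Each line satisfies x*cos(theta) + y*sin(theta) = rho; solve the 2x2 system.
            a = np.array([[np.cos(th1), np.sin(th1)],
                          [np.cos(th2), np.sin(th2)]])
            b = np.array([rho1, rho2])
            x, y = np.linalg.solve(a, b)
            corners.append((float(x), float(y)))
    return corners
```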

Since estimates of the USC's relative position and orientation are available at any given time due to tracking, comparing them with the previously calculated points is straightforward. The respective pairs of points are shown in Figure 1.5. The screen on the left shows the synthesized image, for which the positions of the fiducial points are known. The screen on the right shows the fiducial points detected in the image obtained from the captured still frame.

Figure 1.5 Fiducial points on the mockup and the image. 
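
How the predicted and detected points are actually paired is not detailed in the text; a simple nearest-neighbour gate, as sketched below, would be one way to do it. The distance gate and function name are assumptions.

```python
import numpy as np

def match_fiducials(predicted_pts, detected_pts, max_dist_px=15.0):
    """Pair each predicted fiducial point with the nearest detected point, if close enough."""
    predicted = np.asarray(predicted_pts, dtype=float)
    detected = np.asarray(detected_pts, dtype=float)
    pairs = []
    if detected.size == 0:
        return pairs
    for k, p in enumerate(predicted):
        dists = np.linalg.norm(detected - p, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= max_dist_px:
            pairs.append((k, j))  # index in predicted -> index in detected
    return pairs
```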

To bench-test the accuracy of the method, the USC mockup's orientation and position relative to the CVS camera were subsequently recalculated into the bench mockup's rotation angles. An example of the algorithm's operation is shown in Figure 1.6.

Figure 1.6 Detecting the position and orientation from fiducial points (the USC mockup's docking surface corner points).
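
To make the geometry concrete, the sketch below recovers the pose from matched corner points with a generic PnP solver and converts the rotation matrix to angles. The ZYX (yaw-pitch-roll) convention is an assumption, and the further conversion to the bench mockup's own rotation axes is not covered here.

```python
import cv2
import numpy as np

def estimate_pose(corners_3d, corners_2d, camera_matrix, dist_coeffs=None):
    """Estimate position (tvec) and orientation (roll, pitch, yaw in degrees)
    of the mockup relative to the camera from matched corner points."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(corners_3d, dtype=np.float64).reshape(-1, 3),
        np.asarray(corners_2d, dtype=np.float64).reshape(-1, 2),
        camera_matrix, dist_coeffs)
    if not ok:
        return None

    rot, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix, model frame -> camera frame
    # ZYX Euler decomposition: yaw about Z, pitch about Y, roll about X (assumed convention).
    pitch = -np.arcsin(np.clip(rot[2, 0], -1.0, 1.0))
    roll = np.arctan2(rot[2, 1], rot[2, 2])
    yaw = np.arctan2(rot[1, 0], rot[0, 0])
    return tvec.ravel(), np.degrees([roll, pitch, yaw])
```

At least four corner points (for example, the four corners of the docking surface) and a calibrated camera matrix are needed for this kind of solver to return a unique pose.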