Research Article

Korean J. Remote Sens. 2024; 40(1): 103-114

Published online: February 28, 2024

https://doi.org/10.7780/kjrs.2024.40.1.10

© Korean Society of Remote Sensing

Matching Performance Analysis of Upsampled Satellite Image and GCP Chip for Establishing Automatic Precision Sensor Orientation for High-Resolution Satellite Images

Hyeon-Gyeong Choi1, Sung-Joo Yoon2, Sunghyeon Kim3, Taejung Kim4*

1Master Student, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
2PhD Candidate, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
3Undergraduate Student, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
4Professor, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea

Correspondence to: Taejung Kim
E-mail: tezid@inha.ac.kr

Received: February 11, 2024; Revised: February 17, 2024; Accepted: February 25, 2024

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The escalating demands for high-resolution satellite imagery necessitate the dissemination of geospatial data with superior accuracy. Achieving precise positioning is imperative for mitigating geometric distortions inherent in high-resolution satellite imagery. However, maintaining sub-pixel level accuracy poses significant challenges within the current technological landscape. This research introduces an approach wherein upsampling is employed on both the satellite image and ground control point (GCP) chips, facilitating the establishment of a high-resolution satellite image precision sensor orientation. The ensuing analysis entails a comprehensive comparison of matching performance. To evaluate the proposed methodology, the Compact Advanced Satellite 500-1 (CAS500-1), with a resolution of 0.5 m, serves as the high-resolution satellite image. Correspondingly, GCP chips with resolutions of 0.25 m and 0.5 m are utilized for the South Korean and North Korean regions, respectively. Results from the experiment reveal that concurrent upsampling of satellite imagery and GCP chips enhances matching performance by up to 50% in comparison to the original resolution. Furthermore, the position error improved only up to 2x upsampling; with 3x upsampling, the position error tended to increase. This study affirms that meticulous upsampling of high-resolution satellite imagery and GCP chips can yield sub-pixel-level positioning accuracy, thereby advancing the state of the art in the field.

Keywords: High-resolution satellite, Sensor orientation, Geometric correction, GCP matching, CAS500-1

1. Introduction

Recent advancements in satellite image utilization technology have led to growing demands for high-resolution satellite imagery. In response, Korea has developed and launched a Korean Multi-Purpose Satellite (KOMPSAT) with a resolution of 0.55 m. Furthermore, Compact Advanced Satellite 500 (CAS500) with a resolution of 0.5 m has been developed (Lee et al., 2017; Kim, 2020). Korea is currently participating in the development of comprehensive land management technology using satellite image information big data to improve land management technology (Kim, 2023).

The correction of geometric distortions caused by errors in satellite global positioning system (GPS) receivers and attitude control sensors is necessary when using satellite imagery. This correction involves aligning image coordinates with ground coordinates using ground control points (GCPs). The establishment of a precise sensor orientation through this process is crucial in improving the accuracy of positioning in high-resolution satellite imagery. In this context, even minor errors can have a significant impact, limiting the achievable positioning accuracy to a range of 1–1.5 pixels (Choi et al., 2023).

Traditionally, establishing precise sensor orientation requires manually acquiring GCPs, which incurs significant time and economic costs. To address this challenge, previous studies have introduced GCP chips: small image fragments with precise ground coordinates. They have developed a technology that automatically matches GCP chips and satellite images to establish an automatic precision sensor orientation (Park et al., 2020; Yoon, 2019).

The use of GCP chips to automate the establishment of precise sensor orientation provides a cost-effective means of producing and distributing calibrated satellite images. As a result, researchers have actively pursued studies aimed at achieving higher positioning accuracy through the establishment of automatic precise sensor orientation. Oh et al. (2022) investigated the feasibility of automating the establishment of high-resolution satellite image precision sensor orientation using GCP chips.

Meanwhile, Shin et al. (2018) conducted an experiment in GCP chip matching using pan-sharpened images to establish an automatic precision sensor orientation tailored for high-resolution satellite imagery. The findings indicated a significant improvement in matching performance when using pan-sharpened images compared to original multispectral images. Lee and Kim (2021) investigated the effectiveness of using high-resolution GCP chips to establish a precise sensor orientation for medium-resolution satellite imagery. They used satellite image upsampling instead of pan-sharpening to improve the matching performance of medium-resolution satellite images without pan bands.

The study’s results indicated that upsampling the imagery and matching at a finer resolution was more effective in terms of geometric accuracy than matching at the original resolution. Choi and Kim (2022a) conducted a follow-up study in which they adjusted the number of chips used to establish the precision sensor orientation during satellite image upsampling. This was done to reduce the time required to establish a medium-resolution automatic precision sensor orientation and to improve GCP chip matching performance. These previous studies have shown that GCP chip matching performance is affected by the spatial resolution of both the GCP chip and the satellite imagery.

The objective of this study is to develop an improved method for establishing automatic precision sensor orientation for high-resolution satellite images. To achieve this, we conducted chip matching by upsampling both the satellite images and the GCP chip images. We then performed a comprehensive comparative analysis of the results using the established precision sensor orientation. The results show that upsampling both the high-resolution satellite images and the GCP chips can improve matching performance and the accuracy of the resulting sensor orientation.

2. Materials and Methods

2.1. Materials

This study utilized the pre-existing spatial information data of the Korean Peninsula, as described in a previous study (Park et al., 2020). The GCP chips for the South Korean region were created using precise aerial orthoimages with a spatial resolution of 0.25 m. These chips include three types of ground coordinate information: unified control points, triangulation points, and image control points (Yoon et al., 2018). The Korea National Geographic Information Institute generates and manages the unified control points and triangulation points, while the image control points are used in the production of the national basic map and aerial orthoimages.

In the case of North Korea, it is difficult to acquire and continuously manage GCPs because the region is inaccessible. Therefore, the GCP chips for the North Korean region were created using satellite orthoimages with a spatial resolution of 0.5 m. These chips use the plane coordinates of the orthoimagery and ellipsoidal heights from a digital elevation model (DEM). Consequently, the GCP chips for the South Korean region are color images extracted from aerial orthoimagery, whereas those for the North Korean region are single-band images extracted from satellite orthoimagery generated from panchromatic images. Table 1 provides detailed specifications of the GCP chips used. An example of the GCP chips is shown in Fig. 1: Fig. 1(a) shows GCP chips in South Korea, while Fig. 1(b) shows GCP chips in North Korea.

Fig. 1. Example of GCP chip.

Table 1. Specifications for GCP chip used

Area                         | South Korea                                                                       | North Korea
Ground coordinates reference | Unified control point (UCP), Image control point (ICP), Triangulation point (TP) | Plane coordinates of the orthoimage, DEM ellipsoidal height
Source image                 | Aerial orthoimage                                                                 | Satellite orthoimage
Chip size                    | 1,027 x 1,027 pixels                                                              | 513 x 513 pixels
GSD                          | 0.25 m                                                                            | 0.50 m
Band                         | Red, Green, Blue                                                                  | Gray

GSD: ground sample distance.



The CAS500-1 image, a high-resolution satellite image with a spatial resolution of 0.5 meters, was utilized in this study. The specifications of the satellite images are shown in Table 2. The study area includes Sejong, Andong, and Pyongyang. Table 3 presents a list of the satellite images used in this study. Additionally, the study area’s geographic distribution is displayed in Figs. 2 to 4.

Fig. 2. Distribution of study area (Sejong).

Fig. 3. Distribution of study area (Andong).

Fig. 4. Distribution of study area (Pyongyang).

Table 2. Specifications for satellite images used

Satellite image        | CAS500-1
Product level          | Level 1R
Spectral resolution    | Panchromatic: 450–900 nm; Multispectral: 450–900 nm (Blue, Green, Red, NIR)
GSD                    | Panchromatic: 0.5 m; Multispectral: 2.0 m
Orbit                  | Circular sun-synchronous (500 km)
Swath                  | ≥ 12 km
Radiometric resolution | 12 bits


Table 3. List of used satellite images

Data        | Acquisition date | Upper left coordinate (Lat, Lon) | Bottom right coordinate (Lat, Lon) | Quantity of GCP chips
Sejong 1    | 2021.12.07 | 36.509042465, 127.266395653 | 36.426378705, 127.427498911 | 56
Sejong 2    | 2021.12.07 | 36.612209951, 127.238683259 | 36.529558877, 127.400014661 | 40
Sejong 3    | 2022.02.27 | 36.625661023, 127.201747702 | 36.553286018, 127.406668861 | 68
Sejong 4    | 2022.03.03 | 36.533186804, 127.227716611 | 36.450030787, 127.390117310 | 55
Sejong 5    | 2023.03.16 | 36.506973164, 127.244234044 | 36.427195422, 127.414961162 | 58
Sejong 6    | 2023.03.16 | 36.609898079, 127.214833673 | 36.530137740, 127.385835309 | 40
Andong 1    | 2021.11.11 | 36.724915722, 128.875059965 | 36.642481614, 129.038039928 | 39
Andong 2    | 2022.06.12 | 36.619879282, 128.688020424 | 36.537777343, 128.850648011 | 36
Andong 3    | 2022.12.02 | 36.449640721, 128.893645792 | 36.368206234, 129.058636036 | 43
Pyongyang 1 | 2021.10.19 | 39.156498918, 125.645494344 | 39.073884439, 125.839485291 | 92
Pyongyang 2 | 2023.01.10 | 39.004960711, 125.710489853 | 38.922023330, 125.888358084 | 74


2.2. Methods

Fig. 5 presents a schematic of the procedure for establishing automatic precision sensor orientation using GCP chip matching, as applied in this paper.

Fig. 5. Flowchart of this study.

Based on the findings of Shin et al. (2018), we conducted image fusion, also known as pan-sharpening, to enhance the matching accuracy between high-resolution satellite images and GCP chips. Pan-sharpening is a technique that combines a high-resolution panchromatic band image with low-resolution multispectral band images to reconstruct a high-resolution color image. This study utilized an algorithm based on the component-substitution (CS) fusion technique, as described in previous works (Park et al., 2020; Vivone et al., 2015).
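For readers unfamiliar with CS fusion, the sketch below illustrates the general idea with a simple intensity-substitution scheme in Python. The function name, the equal-weight intensity synthesis, and the mean/standard-deviation matching are illustrative assumptions; this is not the exact algorithm evaluated by Vivone et al. (2015) or implemented in the CAS500 processing system.

```python
import numpy as np

def cs_pansharpen(pan, ms, weights=None):
    """Minimal component-substitution pan-sharpening sketch.

    pan: (H, W) panchromatic band at full resolution.
    ms:  (H, W, B) multispectral bands already resampled to the pan grid.
    weights: per-band weights used to synthesize the intensity component.
    """
    pan = pan.astype(np.float64)
    ms = ms.astype(np.float64)
    bands = ms.shape[2]
    if weights is None:
        weights = np.full(bands, 1.0 / bands)
    # Intensity component synthesized from the multispectral bands
    intensity = np.tensordot(ms, weights, axes=([2], [0]))
    # Roughly match the pan band to the intensity statistics (mean/std matching)
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12) * intensity.std() + intensity.mean()
    # Inject the spatial detail of the pan band into every multispectral band
    detail = pan_matched - intensity
    return ms + detail[..., None]
```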

We first established the initial sensor orientation. The sensor orientation was established using the rational function model (RFM). The RFM is a mathematical sensor model mainly used for satellite imagery; it expresses the relationship between image coordinates and ground coordinates without physical sensor information. It can be established by applying the rational polynomial coefficients (RPCs) provided with the original satellite image to the formula. Eqs. (1) and (2) delineate the RFM formulas corresponding to the columns and rows of the image.

$$c_n = \frac{P_1(X_n, Y_n, Z_n)}{P_2(X_n, Y_n, Z_n)} = \frac{\sum_{i=0}^{3}\sum_{j=0}^{3}\sum_{k=0}^{3} a_{ijk}\, X_n^i\, Y_n^j\, Z_n^k}{\sum_{i=0}^{3}\sum_{j=0}^{3}\sum_{k=0}^{3} b_{ijk}\, X_n^i\, Y_n^j\, Z_n^k} \tag{1}$$

$$r_n = \frac{P_3(X_n, Y_n, Z_n)}{P_4(X_n, Y_n, Z_n)} = \frac{\sum_{i=0}^{3}\sum_{j=0}^{3}\sum_{k=0}^{3} c_{ijk}\, X_n^i\, Y_n^j\, Z_n^k}{\sum_{i=0}^{3}\sum_{j=0}^{3}\sum_{k=0}^{3} d_{ijk}\, X_n^i\, Y_n^j\, Z_n^k} \tag{2}$$

The formulas use c_n and r_n to represent the normalized image coordinates, and X_n, Y_n, and Z_n to denote the normalized ground coordinates. In the given equations, a_ijk, b_ijk, c_ijk, and d_ijk are the coefficients of the rational polynomials P_1, P_2, P_3, and P_4, respectively. These coefficients constitute the rational function expression defining the relationship between the image's column, row, and ground coordinates. We set the values of the RPCs for i + j + k ≥ 4 to zero to formulate the RFM expression using only coefficients of third order or lower in the ground coordinates. Additionally, we set b_000 and d_000 to 1 to fix the proportionality constant of the rational function expression. The remaining 78 coefficients for ground coordinate terms with i + j + k from 1 to 3 were provided as RPC files with the satellite images.
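As an illustration of how the RFM of Eqs. (1) and (2) is evaluated, the following Python sketch computes the normalized column and row from RPCs stored as (i, j, k)-indexed coefficients. The dictionary-based coefficient layout is an assumption chosen for readability; actual RPC files use a fixed, provider-defined ordering.

```python
def rfm_project(a, b, c, d, Xn, Yn, Zn):
    """Evaluate the RFM of Eqs. (1)-(2) for normalized ground coordinates.

    a, b, c, d: dicts mapping (i, j, k) with i + j + k <= 3 to RPC coefficients,
    with b[(0, 0, 0)] = d[(0, 0, 0)] = 1. Returns the normalized (cn, rn).
    """
    def poly(coeffs):
        total = 0.0
        for (i, j, k), value in coeffs.items():
            total += value * (Xn ** i) * (Yn ** j) * (Zn ** k)
        return total

    cn = poly(a) / poly(b)
    rn = poly(c) / poly(d)
    return cn, rn

# Trivial coefficient set for illustration: cn = Xn and rn = Yn
a = {(1, 0, 0): 1.0}; b = {(0, 0, 0): 1.0}
c = {(0, 1, 0): 1.0}; d = {(0, 0, 0): 1.0}
print(rfm_project(a, b, c, d, 0.2, -0.1, 0.05))  # -> (0.2, -0.1)
```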

Using the established initial sensor orientation, we conducted a comprehensive search for all GCP chips within the entire coverage of the satellite image. The image corner coordinates, derived from the initial sensor orientation, were used to calculate the minimum rectangular area encompassing the image region. To account for potential errors in the initial sensor orientation, a margin was incorporated into the image area. The margin was set at 250 meters in both the x and y directions of the satellite image. High-resolution satellites have a limited field of view and are affected by the area's topography. Therefore, the margin size was determined as a trade-off between searching for GCP chips stably and overestimating the search area, so that an accurate sensor orientation could be established. The search for GCP chips, referencing the GCP database, was confined to the margin-expanded area.
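A minimal sketch of this candidate search is given below, assuming the image corner coordinates and the GCP database coordinates are expressed in a common metric map projection; the gcp_db record layout ('id', 'x', 'y') is hypothetical.

```python
def select_candidate_chips(corner_xy, gcp_db, margin=250.0):
    """Select GCP chips inside the image footprint expanded by a margin.

    corner_xy: list of (x, y) ground coordinates of the four image corners,
               obtained from the initial sensor orientation.
    gcp_db:    iterable of dicts with at least 'id', 'x', 'y' keys.
    margin:    margin in metres applied in both x and y (250 m in this study).
    """
    xs = [x for x, _ in corner_xy]
    ys = [y for _, y in corner_xy]
    x_min, x_max = min(xs) - margin, max(xs) + margin
    y_min, y_max = min(ys) - margin, max(ys) + margin
    return [chip for chip in gcp_db
            if x_min <= chip["x"] <= x_max and y_min <= chip["y"] <= y_max]
```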

Subsequently, we defined the region surrounding each GCP chip and executed upsampling of the satellite image to the desired scale. Previous research has shown that bilinear interpolation is advantageous for upsampling satellite images in terms of both performance and time efficiency (Choi and Kim, 2022).

Therefore, we adopted bilinear interpolation for the upsampling process. In this study, we applied upsampling factors of 1 to 4 to each set of experimental data, resulting in spatial resolutions of 0.5 m, 0.25 m, 0.167 m, and 0.125 m. The corresponding upsampling factors are shown in Fig. 6, which illustrates the original satellite image transformed according to the upsampling factor. It is evident that a larger upsampling factor leads to a smoother appearance of the image. To save processing time and memory, the upsampling was applied only to the search area around each GCP, rather than to the entire original satellite image.

Fig. 6. Different satellite image patches and a GCP chip according to upsampling factors.
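A minimal sketch of the bilinear upsampling step is shown below, using scipy.ndimage.zoom with order=1 (bilinear interpolation); the patch size and random data are placeholders for a real 0.5 m GSD satellite image patch around a GCP.

```python
import numpy as np
from scipy.ndimage import zoom

def upsample_patch(patch, factor):
    """Bilinearly upsample a satellite image patch by an integer factor.

    order=1 selects bilinear interpolation, reported in previous work as a
    good trade-off between matching performance and processing time.
    """
    return zoom(patch.astype(np.float64), factor, order=1)

patch = np.random.rand(256, 256)          # stand-in for a 0.5 m GSD patch
for factor, gsd in [(1, 0.5), (2, 0.25), (3, 0.167), (4, 0.125)]:
    up = upsample_patch(patch, factor)
    print(f"factor {factor}: GSD {gsd} m, patch shape {up.shape}")
```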

Afterward, we upsampled and remapped the GCP chip image to match the geometry of the satellite image and its newly achieved spatial resolution. To ensure automatic matching using an area-based image-matching algorithm, we deformed the GCP chip to synchronize its resolution and geometry with that of the satellite image. Fig. 7 presents a depiction of the resampling process for the GCP chip, considering the geometry and resolution of the satellite image. This procedure was executed for all identified GCP chips.

Fig. 7. Concept of resampling GCP chip.
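The following sketch shows one way such a resampling could be implemented, assuming the local chip-to-patch mapping is well approximated by an affine transform estimated from a few corresponding points (for example, the chip corners projected into the satellite patch through the initial sensor orientation). The inputs and the affine approximation are illustrative assumptions, not the exact procedure of the processing system.

```python
import numpy as np
from scipy.ndimage import affine_transform

def resample_chip_to_patch(chip, chip_pts, patch_pts, patch_shape):
    """Warp a GCP chip into the geometry and resolution of an upsampled satellite patch.

    chip_pts, patch_pts: corresponding (row, col) points (at least 3, not collinear),
    e.g. the chip corners and their predicted locations in the patch.
    The chip-to-patch mapping is approximated by a single affine transform.
    """
    patch_pts = np.asarray(patch_pts, dtype=float)
    chip_pts = np.asarray(chip_pts, dtype=float)
    # Solve chip = [patch_row, patch_col, 1] @ params for each axis (least squares)
    design = np.column_stack([patch_pts, np.ones(len(patch_pts))])
    params, *_ = np.linalg.lstsq(design, chip_pts, rcond=None)   # shape (3, 2)
    matrix = params[:2].T    # 2x2 linear part mapping patch coords -> chip coords
    offset = params[2]       # translation part
    # affine_transform samples the input (chip) at matrix @ output_coord + offset
    return affine_transform(chip, matrix, offset=offset,
                            output_shape=patch_shape, order=1)
```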

Next, we created an image pyramid to improve the processing efficiency of image matching. Four pyramid levels were constructed for each satellite image patch. The matching process began at the top (coarsest) layer of the pyramid, using the zero-mean normalized cross-correlation (ZNCC) algorithm in all layers except the last, and the Census algorithm in the last (full-resolution) layer. A previous study (Yoon, 2019) showed that the ZNCC algorithm is robust to image size variations, but its accuracy in extracting matching points is somewhat lower. In contrast, the Census algorithm extracts matching points with relatively good accuracy but is sensitive to variations in image size.

Therefore, we used the Census algorithm for the original image and the ZNCC algorithm for the other reduced images. Despite our efforts, there were still some mismatch points among the automatically matched GCPs. To address this issue, we utilized the random sample consensus (RANSAC) technique (Fischler and Bolles, 1981) to automatically remove them. The detailed process of GCP chip matching and mismatch point removal is omitted here for brevity and to avoid redundancy, as it has been covered in previous research papers (Yoon et al., 2018; Son et al., 2021).
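To make the area-based similarity search concrete, the sketch below implements a plain exhaustive ZNCC template search of a resampled GCP chip within a satellite image patch; it deliberately omits the coarse-to-fine pyramid scheme, the Census refinement at full resolution, and the RANSAC filtering described above.

```python
import numpy as np

def zncc(window, template):
    """Zero-mean normalized cross-correlation between two equal-size arrays."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return (w * t).sum() / denom if denom > 0 else 0.0

def match_template_zncc(image, template):
    """Exhaustive ZNCC search; returns the best upper-left (row, col) and its score."""
    th, tw = template.shape
    best_score, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            score = zncc(image[r:r + th, c:c + tw], template)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```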

As the final step, we established a precise sensor orientation based on the auto-matching results with the mismatches removed. We estimated the coefficients of the error correction equations, Eqs. (3) and (4), to rectify any errors in the initial sensor orientation.

$$c = a_{11} c_0 + a_{12} r_0 + a_{13} \tag{3}$$

$$r = a_{21} c_0 + a_{22} r_0 + a_{23} \tag{4}$$

In the equations above, (c_0, r_0) and (c, r) represent the image coordinates before and after correction, respectively, and a_11 to a_23 are the coefficients of the error correction equations. After estimating these coefficients, the RPCs were recalculated using the correction to establish the precise sensor orientation. The precision sensor orientation was established for each upsampling factor and then converted back to the original resolution so that its accuracy could be compared across upsampling factors (Lee and Kim, 2021).
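A sketch of estimating the coefficients of Eqs. (3) and (4) by linear least squares is shown below; the synthetic shift at the end is only a usage example, not data from this study.

```python
import numpy as np

def estimate_error_correction(c0, r0, c, r):
    """Estimate the affine error-correction coefficients of Eqs. (3)-(4).

    c0, r0: image coordinates predicted by the initial sensor orientation.
    c, r:   image coordinates observed by GCP chip matching (outliers removed).
    Returns (a11, a12, a13) and (a21, a22, a23) from a linear least-squares fit.
    """
    A = np.column_stack([c0, r0, np.ones_like(c0)])
    col_coef, *_ = np.linalg.lstsq(A, c, rcond=None)
    row_coef, *_ = np.linalg.lstsq(A, r, rcond=None)
    return col_coef, row_coef

# Synthetic example: a 1.2-pixel column shift and a 0.8-pixel row shift
rng = np.random.default_rng(0)
c0 = rng.uniform(0, 24000, 50)
r0 = rng.uniform(0, 24000, 50)
col_coef, row_coef = estimate_error_correction(c0, r0, c0 + 1.2, r0 + 0.8)
print(np.round(col_coef, 3), np.round(row_coef, 3))
```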

3. Results

In this study, model error and check error were employed as performance evaluation metrics. Model error represents the relative root mean square error (rRMSE) measured for the GCPs used to refine the initial sensor model. Check error, on the other hand, is expressed as the rRMSE measured for checkpoints manually extracted from reference points not included in the model points. Eq. (5) details the calculation of rRMSE, where RMSE_col represents the RMSE in the column direction of the image, and RMSE_row represents the RMSE in the row direction of the image. Accuracy analyses were conducted based on the original image with a spatial resolution of 0.5 m.

$$\mathrm{rRMSE} = \sqrt{(\mathrm{RMSE}_{col})^2 + (\mathrm{RMSE}_{row})^2} \tag{5}$$
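For completeness, Eq. (5) can be computed directly from the column and row residuals at the model or check points, as in the short sketch below.

```python
import numpy as np

def rrmse(d_col, d_row):
    """rRMSE of Eq. (5) from column and row residuals in pixels."""
    rmse_col = np.sqrt(np.mean(np.square(d_col)))
    rmse_row = np.sqrt(np.mean(np.square(d_row)))
    return np.hypot(rmse_col, rmse_row)

print(rrmse([0.3, -0.5, 0.1], [0.2, 0.4, -0.3]))
```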

Table 4 shows, for the 11 CAS500-1 satellite images used in the experiment, the number of GCP chips within each acquisition area and the number of manually acquired checkpoints used for verification.

Table 4. Quantity of GCP chips per experimental case

Data        | Quantity of GCP chips | Quantity of checkpoints
Sejong 1    | 56 | 6
Sejong 2    | 40 | 6
Sejong 3    | 68 | 6
Sejong 4    | 55 | 6
Sejong 5    | 58 | 7
Sejong 6    | 40 | 6
Andong 1    | 39 | 5
Andong 2    | 36 | 4
Andong 3    | 43 | 5
Pyongyang 1 | 92 | 10
Pyongyang 2 | 74 | 8


The experimental results are presented in Table 5, and the performance evaluation indicator graphs for each study area are shown in Figs. 8 to 18, illustrating the observed trends in the experimental outcomes. In each figure, (a) and (b) represent the model error and check error, respectively, as functions of the number of GCPs and the upsampling scale.

Fig. 8. Result of Sejong1.

Fig. 9. Result of Sejong2.

Fig. 10. Result of Sejong3.

Fig. 11. Result of Sejong4.

Fig. 12. Result of Sejong5.

Fig. 13. Result of Sejong6.

Fig. 14. Result of Andong1.

Fig. 15. Result of Andong2.

Fig. 16. Result of Andong3.

Fig. 17. Result of Pyongyang1.

Fig. 18. Result of Pyongyang2.

Table 5. Matching results

Data        | GSD (m) | Model error (pixel) | Check error (pixel) | Matching rate
Sejong 1    | 0.500 | 1.285 | 1.546 | 45%
Sejong 1    | 0.250 | 0.626 | 1.036 | 36%
Sejong 1    | 0.167 | 0.403 | 0.738 | 34%
Sejong 1    | 0.125 | 0.277 | 1.807 | 36%
Sejong 2    | 0.500 | 1.198 | 1.125 | 50%
Sejong 2    | 0.250 | 0.624 | 0.963 | 35%
Sejong 2    | 0.167 | 0.423 | 1.043 | 38%
Sejong 2    | 0.125 | 0.346 | 1.081 | 45%
Sejong 3    | 0.500 | 1.367 | 2.039 | 34%
Sejong 3    | 0.250 | 0.739 | 1.419 | 25%
Sejong 3    | 0.167 | 0.502 | 4.363 | 28%
Sejong 3    | 0.125 | 0.352 | 3.590 | 22%
Sejong 4    | 0.500 | 1.230 | 0.936 | 45%
Sejong 4    | 0.250 | 0.605 | 0.632 | 36%
Sejong 4    | 0.167 | 0.477 | 2.651 | 40%
Sejong 4    | 0.125 | 0.324 | 0.881 | 33%
Sejong 5    | 0.500 | 1.433 | 1.573 | 41%
Sejong 5    | 0.250 | 0.661 | 0.714 | 34%
Sejong 5    | 0.167 | 0.465 | 2.233 | 33%
Sejong 5    | 0.125 | 0.294 | 1.809 | 34%
Sejong 6    | 0.500 | 1.312 | 2.225 | 55%
Sejong 6    | 0.250 | 0.559 | 1.665 | 35%
Sejong 6    | 0.167 | 0.386 | 2.387 | 43%
Sejong 6    | 0.125 | 0.335 | 2.639 | 40%
Andong 1    | 0.500 | 1.174 | 1.866 | 41%
Andong 1    | 0.250 | 0.666 | 1.476 | 36%
Andong 1    | 0.167 | 0.446 | 2.054 | 31%
Andong 1    | 0.125 | 0.328 | 1.773 | 44%
Andong 2    | 0.500 | 1.034 | 1.396 | 47%
Andong 2    | 0.250 | 0.600 | 1.211 | 39%
Andong 2    | 0.167 | 0.435 | 1.002 | 33%
Andong 2    | 0.125 | 0.325 | 1.549 | 31%
Andong 3    | 0.500 | 1.292 | 1.453 | 42%
Andong 3    | 0.250 | 0.654 | 1.340 | 37%
Andong 3    | 0.167 | 0.418 | 1.315 | 42%
Andong 3    | 0.125 | 0.304 | 2.349 | 28%
Pyongyang 1 | 0.500 | 1.346 | 2.189 | 41%
Pyongyang 1 | 0.250 | 0.705 | 1.860 | 27%
Pyongyang 1 | 0.167 | 0.435 | 4.302 | 24%
Pyongyang 1 | 0.125 | 0.347 | 3.141 | 21%
Pyongyang 2 | 0.500 | 1.477 | 1.932 | 35%
Pyongyang 2 | 0.250 | 0.561 | 1.001 | 29%
Pyongyang 2 | 0.167 | 0.438 | 1.979 | 26%
Pyongyang 2 | 0.125 | 0.341 | 3.217 | 26%


Part (a) of each figure shows that the model error consistently decreased as the upsampling factor increased and the ground sample distance (GSD) became smaller. These results indicate that coefficient estimation was successfully applied in the error correction equation for the RPC update. This can be interpreted as follows: the higher the upsampling factor, the more closely the GCPs selected as model points agreed with the estimated error correction, and consequently, the smaller the GSD relative to the original resolution, the smaller the model estimation error.

Part (b) of each figure shows a decrease in check error when upsampling by 2x from the original resolution in all experiments. However, when upsampling by 3x, the check error generally increased again, and the 4x results were inconsistent across experiments. This observation is consistent with previous studies, such as Lee and Kim (2021), which have shown that excessive upsampling relative to the original data reduces the model estimation error but does not necessarily improve accuracy.

When establishing the precision sensor orientation at the original resolution, the check error exceeded 2 pixels in several cases, with an average check error of 1.7 pixels. When establishing the precision sensor orientation with an upsampling factor of 2, the check error was less than 2 pixels in all experiments, with an average check error of 1.2 pixels, an improvement of up to 50% over the original resolution. In several experiments where the original resolution resulted in check errors of over 1 pixel, performance improved considerably when the 2x upsampling factor was applied, yielding check errors of less than 1 pixel.

To further examine the experimental results, we visually compared the matching results for each resolution. Fig. 19 shows the GCP chip matching results. The matching results at the original resolution of 0.5 m were only roughly aligned to the resolution grid, whereas the matching results upsampled to 0.25 m were more precisely aligned. When upsampled to 0.167 m, however, the GCP chip was over-interpolated, resulting in a smoothed image that made it difficult to find the exact match point.

Fig. 19. Matching results for each GSD. (a) Original satellite image, (b) matching result of original GSD (0.5 m), (c) matching result of 0.25 m GSD, and (d) matching result of 0.167 m GSD.

4. Discussion

In this section, we discuss the distinctions between our findings and those of preceding studies that used medium-resolution satellite imagery (Lee and Kim, 2021; Choi and Kim, 2022a). Previous research using medium-resolution satellite imagery with a resolution of 5 m and high-resolution GCP chips with a resolution of 0.25 m identified an optimal upsampling ratio of 3 times, corresponding to a GSD of 1.67 m. In our present study, employing high-resolution satellite imagery at a resolution of 0.5 m along with GCP chips of the same specifications, the optimal upsampling ratio was determined to be 2 times, equating to a GSD of 0.25 m.

In the case of North Korea, both the satellite image and the GCP chip had a resolution of 0.5 m. However, the results still improved when upsampling to 0.25 m, which may be due to a limitation of the algorithm used for chip matching. The image-matching algorithm used in this study is a pixel-by-pixel similarity search. In other words, when the images were upsampled to 0.25 m, the similarity search interval corresponded to 0.5 pixels of the original satellite image, and the improvement is believed to result from this finer search interval.

During the study, a decrease in accuracy was observed when the upsampling ratio was tripled, which deviates from the findings for low- and medium-resolution images. This phenomenon may be attributed to differences in the sharpness of the original satellite image or to upsampling to a GSD finer than the chip resolution. Future research should investigate whether the clarity of the original image or the chip resolution has a greater impact on matching results. This is crucial for advancing our understanding and achieving higher accuracy for high-resolution satellite images.

5. Conclusions

In this study, we applied upsampling to both satellite images and pre-existing high-resolution GCP chips to establish a high-resolution satellite image precision sensor orientation. We subsequently conducted a comparative analysis of the matching performance. The experimental outcomes confirmed a notable enhancement in matching performance when upsampling was applied to both satellite images and GCP chips.

When establishing a precision sensor orientation with a 2x upsampling factor, an improvement of up to 50% was observed compared to the original resolution. In certain experimental outcomes, the check error approached or fell below 1 pixel. Due to technical constraints, achieving sub-pixel positioning accuracy when establishing precision sensor orientation on high-resolution satellite imagery at its original resolution was challenging. However, our results demonstrated that sub-pixel-level positioning accuracy can be achieved through meticulous upsampling of both satellite images and GCP chips.

Moreover, despite the different specifications of the GCP chips in the South and North Korean regions, the experimental results exhibited comparable trends. This confirms that, provided GCP chips can be built for inaccessible areas such as North Korea as well as for South Korea, the proposed method is applicable and an improved precision sensor orientation can be established.

In this study, we tested GCP chips with a GSD similar to that of the satellite images. If more sophisticated GCP chips are built and utilized, the position error could be reduced further with the method proposed in this study. Therefore, further research using high-resolution, high-accuracy GCP chips is needed for precise sensor modeling of high-resolution satellite images. It is also worth considering the use of ultra-high-resolution data such as drone imagery. We hope that this research will aid the processing and utilization of high-resolution satellite images.

Acknowledgments

This work was carried out with the support of the “Cooperative Research Program for Agriculture Science and Technology Development (Project No. PJ 016233)” of the Rural Development Administration, Republic of Korea. This work was also supported by a Korea Agency for Infrastructure Technology Advancement grant funded by the Ministry of Land, Infrastructure and Transport (Grant No. RS-2022-00155763).

Conflict of Interest

No potential conflict of interest relevant to this article was reported.

References

  1. Choi, H. G., and Kim, T., 2022a. Analysis of optimal resolution and number of GCP chips for precision sensor modeling efficiency in satellite images. Korean Journal of Remote Sensing, 38(6-1), 1445-1462. https://doi.org/10.7780/kjrs.2022.38.6.1.34
  2. Choi, H. G., and Kim, T., 2022b. Improving automated precision sensor modeling accuracy of mid-resolution satellite images by adjustment matching resolution. In Proceedings of the 2022 Korean Society of Remote Sensing Fall Conference, Busan, Republic of Korea, Nov. 7-9, pp. 89-92.
  3. Choi, S., Lee, D., and Seo, D., 2023. Review of feasibility of compliance with CARD4L geometric correction requirements for high resolution optical satellite images. In Proceedings of the 2023 Korean Society for Aeronautical and Space Sciences Conference, Hongcheon, Republic of Korea, Nov. 15-17, pp. 610-613.
  4. Fischler, M. A., and Bolles, R. C., 1981. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381-395. https://doi.org/10.1145/358669.358692
  5. Kim, T., 2020. Current research and development status for CAS 500-1/2 image processing and utilization technology. Korean Journal of Remote Sensing, 36(5-2), 861-866. https://doi.org/10.7780/kjrs.2020.36.5.2.1
  6. Kim, T., 2023. Introduction to development of comprehensive land management technology using satellite image information bigdata. Korean Journal of Remote Sensing, 39(5-4), 1069-1073. https://doi.org/10.7780/kjrs.2023.39.5.4.1
  7. Lee, K., Kim, Y., and Choi, H., 2017. KOMPSAT image processing and applications. Korean Journal of Remote Sensing, 33(6-3), 1171-1177. https://doi.org/10.7780/kjrs.2017.33.6.3.1
  8. Lee, Y., and Kim, T., 2021. Determination of spatial resolution to improve GCP chip matching performance for CAS-4. Korean Journal of Remote Sensing, 37(6-1), 1517-1526. https://doi.org/10.7780/kjrs.2021.37.6.1.3
  9. Oh, J., Seo, D., Lee, C., Seong, S., and Choi, J., 2022. Automated RPCs bias compensation for KOMPSAT imagery using orthoimage GCP chips in Korea. IEEE Access, 10, 118465-118474. https://doi.org/10.1109/ACCESS.2022.3217788
  10. Park, H., Son, J. H., Jung, H. S., Kweon, K. E., Lee, K. D., and Kim, T., 2020. Development of the precision image processing system for CAS-500. Korean Journal of Remote Sensing, 36(5-2), 881-891. https://doi.org/10.7780/kjrs.2020.36.5.2.3
  11. Shin, J. I., Kim, T., Yoon, W. S., and Park, H. J., 2018. Improving satellite-aerial image matching success rate by image fusion. In Proceedings of the 2018 European Conference on Electrical Engineering and Computer Science (EECS), Bern, Switzerland, Dec. 20-22, pp. 224-227. https://doi.org/10.1109/EECS.2018.00049
  12. Son, J. H., Yoon, W., Kim, T., and Rhee, S., 2021. Iterative precision geometric correction for high-resolution satellite images. Korean Journal of Remote Sensing, 37(3), 431-447. https://doi.org/10.7780/kjrs.2021.37.3.6
  13. Vivone, G., Alparone, L., Chanussot, J., Mura, M. D., Garzelli, A., and Licciardi, G. A., et al., 2015. A critical comparison among pansharpening algorithms. IEEE Transactions on Geoscience and Remote Sensing, 53(5), 2565-2586. https://doi.org/10.1109/TGRS.2014.2361734
  14. Yoon, W., 2019. A Study on development of automatic GCP matching technology for CAS-500 imagery. Master's thesis, Inha University, Incheon, Republic of Korea.
  15. Yoon, W., Park, H., and Kim, T., 2018. Feasibility analysis of precise sensor modelling for KOMPSAT-3A imagery using unified control points. Korean Journal of Remote Sensing, 34(6-1), 1089-1100. https://doi.org/10.7780/kjrs.2018.34.6.1.19

Research Article

Korean J. Remote Sens. 2024; 40(1): 103-114

Published online February 28, 2024 https://doi.org/10.7780/kjrs.2024.40.1.10

Copyright © Korean Society of Remote Sensing.

Matching Performance Analysis of Upsampled Satellite Image and GCP Chip for Establishing Automatic Precision Sensor Orientation for High-Resolution Satellite Images

Hyeon-Gyeong Choi1, Sung-Joo Yoon2, Sunghyeon Kim3, Taejung Kim4*

1Master Student, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
2PhD Candidate, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
3Undergraduate Student, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea
4Professor, Department of Geoinformatic Engineering, Inha University, Incheon, Republic of Korea

Correspondence to:Taejung Kim
E-mail: tezid@inha.ac.kr

Received: February 11, 2024; Revised: February 17, 2024; Accepted: February 25, 2024

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The escalating demands for high-resolution satellite imagery necessitate the dissemination of geospatial data with superior accuracy. Achieving precise positioning is imperative for mitigating geometric distortions inherent in high-resolution satellite imagery. However, maintaining sub-pixel level accuracy poses significant challenges within the current technological landscape. This research introduces an approach wherein upsampling is employed on both the satellite image and ground control points (GCPs) chip, facilitating the establishment of a high-resolution satellite image precision sensor orientation. The ensuing analysis entails a comprehensive comparison of matching performance. To evaluate the proposed methodology, the Compact Advanced Satellite 500-1 (CAS500-1), boasting a resolution of 0.5 m, serves as the high-resolution satellite image. Correspondingly, GCP chips with resolutions of 0.25 m and 0.5 m are utilized for the South Korean and North Korean regions, respectively. Results from the experiment reveal that concurrent upsampling of satellite imagery and GCP chips enhances matching performance by up to 50% in comparison to the original resolution. Furthermore, the position error only improved with 2x upsampling. However, with 3x upsampling, the position error tended to increase. This study affirms that meticulous upsampling of high-resolution satellite imagery and GCP chips can yield sub-pixel-level positioning accuracy, thereby advancing the state-of-the-art in the field.

Keywords: High-resolution satellite, Sensor orientation, Geometric correction, GCP matching, CAS500-1

1. Introduction

Recent advancements in satellite image utilization technology have led to growing demands for high-resolution satellite imagery. In response, Korea has developed and launched a Korean Multi-Purpose Satellite (KOMPSAT) with a resolution of 0.55 m. Furthermore, Compact Advanced Satellite 500 (CAS500) with a resolution of 0.5 m has been developed (Lee et al., 2017; Kim, 2020). Korea is currently participating in the development of comprehensive land management technology using satellite image information big data to improve land management technology (Kim, 2023).

The correction of geometric distortions caused by errors in satellite global positioning system (GPS) receivers and attitude control sensors is necessary when using satellite imagery. This correction involves aligning image coordinates with ground coordinates using ground control points (GCPs). The establishment of a precise sensor orientation through this process is crucial in improving the accuracy of positioning in high-resolution satellite imagery. In this context, even minor errors can have a significant impact, limiting the achievable positioning accuracy to a range of 1–1.5 pixels (Choi et al., 2023).

Traditionally, establishing precise sensor orientation requires manually acquiring GCPs, which incurs significant time and economic costs. To address this challenge, previous studies have introduced GCP chips - small image fragments with precise ground coordinates. They have developed a technology that can automatically match GCP chips and satellite images to establish an automatic precision sensor orientation (Park et al., 2020; Yoon, 2019).

The use of GCP chips for precise sensor orientation establishment automation provided a cost-effective means of producing and distributing calibrated satellite images. As a result, researchers have actively pursued studies aimed at achieving higher positioning accuracy through the establishment of automatic precise sensor orientation. Oh et al. (2022) investigated the feasibility of automating the establishment of high-resolution satellite image precision sensor orientation using GCP chips.

Meanwhile, Shin et al. (2018) conducted an experiment in GCP chip matching using pan-sharpened images to establish an automatic precision sensor orientation tailored for high-resolution satellite imagery. The findings indicated a significant improvement in matching performance when using pan-sharpened images compared to original multispectral images. Lee and Kim (2021) investigated the effectiveness of using high-resolution GCP chips to establish a precise sensor orientation for medium-resolution satellite imagery. They used satellite image upsampling instead of pan-sharpening to improve the matching performance of medium-resolution satellite images without pan bands.

The study’s results indicated that upsampling to match at the medium resolution was more effective in terms of geometric accuracy than matching at the original resolution. Choi and Kim (2022a) conducted a follow-up study in which they adjusted the number of chips used to establish the precision sensor orientation during satellite image upsampling. This was done to reduce the time required to establish a medium-resolution automatic precision sensor orientation and improve GCP chip matching performance. Previous studies have shown that the performance of GCP chip matching is affected by the spatial resolution of both the GCP chip and satellite imagery.

The objective of this study is to develop an improved method for establishing automatic precision sensor orientation in high-resolution satellite images. To achieve this, we conducted chip matching by upsampling both the satellite and GCP chip images. We then performed a comprehensive comparative analysis of the results using the established precision sensor orientation. The study concluded that upsampling both the high-resolution satellite images and the GCP chip images can improve the performance of the sensor orientation when matching them.

2. Materials and Methods

2.1. Materials

This study utilized the pre-existing spatial information data of the Korean Peninsula, as described in a previous study (Park et al., 2020). The GCP chips for the South Korean region were created using precise aerial orthoimages with a spatial resolution of 0.25 m. It includes three different types of ground coordinate information (Yoon et al., 2018). These three types of ground coordinate information include the unified control point, triangulation point, and image control point. The Korea National Geographic Information Institute generates and manages the unified control point and triangulation point. The image control point is used in the production of the national basic map and aerial orthoimage.

In the case of North Korea, it is difficult to acquire and continuously manage GCP because it is inaccessible. Therefore, the GCP chips for the North Korean region were created using satellite orthoimages with a spatial resolution of 0.5 m. It includes orthoimagery plane coordinates and digital elevation model (DEM) ellipsoid height. Therefore, The GCP chips in the South Korean region were created from color images extracted from aerial orthoimagery. In contrast, the GCP chips in the North Korean region were created from single-band images extracted from satellite orthoimagery generated from panchromatic images. Table 1 provides detailed specifications of the GCP chips used. An example of the GCP chips is shown in Fig. 1. Fig. 1(a) shows GCP chips in South Korea, while Fig. 1(b) shows GCP chips in North Korea.

Figure 1. Example of GCP chip.

Table 1 . Specifications for GCP chip used.

AreaSouth KoreaNorth Korea
Ground coordinates referenceUnified control point (UCP), Image control point (ICP), Triangulation point (TP)Plane coordinates of the orthogonal image, DEM ellipsoidal height
Source imageAerial orthoimageSatellite orthoimage
Chip size1,027 x 1,027 pixel513 x 513 pixel
GSD0.25 m0.50 m
BandRed, Green, BlueGray

GSD: ground sample distance..



The CAS500-1 image, a high-resolution satellite image with a spatial resolution of 0.5 meters, was utilized in this study. The specifications of the satellite images are shown in Table 2. The study area includes Sejong, Andong, and Pyongyang. Table 3 presents a list of the satellite images used in this study. Additionally, the study area’s geographic distribution is displayed in Figs. 2 to 4.

Figure 2. Distribution of study area (Sejong).

Figure 3. Distribution of study area (Andong).

Figure 4. Distribution of study area (Pyongyang).

Table 2 . Specifications for satellite images used.

Satellite imageCAS500-1
Product LevelLevel 1R
Spectral resolutionPanchromatic: 450–900 nm
Multispectral: 450–900 nm (Blue, Green, Red, NIR)
GSDPanchromatic: 0.5 m
Multispectral: 2.0 m
OrbitCircular sun-synchronous (500 km)
Swath≥ 12km
Radiometric resolution12 bits


Table 3 . List of used satellite images.

DataAcquisition dateUpper left coordinate (Lat, Lon)Bottom low coordinate (Lat, Lon)Quantity of GCP chip
Sejong 12021.12.0736.509042465, 127.26639565336.426378705, 127.42749891156
Sejong 22021.12.0736.612209951, 127.23868325936.529558877, 127.40001466140
Sejong 32022.02.2736.625661023, 127.20174770236.553286018, 127.40666886168
Sejong 42022.03.0336.533186804, 127.22771661136.450030787, 127.39011731055
Sejong 52023.03.1636.506973164, 127.24423404436.427195422, 127.41496116258
Sejong 62023.03.1636.609898079, 127.21483367336.530137740, 127.38583530940
Andong 12021.11.1136.724915722, 128.87505996536.642481614, 129.03803992839
Andong 22022.06.1236.619879282, 128.68802042436.537777343, 128.85064801136
Andong 32022.12.0236.449640721, 128.89364579236.368206234, 129.05863603643
Pyongyang 12021.10.1939.156498918, 125.64549434439.073884439, 125.83948529192
Pyongyang 22023.01.1039.004960711, 125.71048985338.922023330, 125.88835808474


2.2. Methods

Fig. 5 presents a schematic of the procedure for establishing automatic precision sensor orientation using GCP chip matching, as applied in this paper.

Figure 5. Flowchart in this study.

Based on the findings of Shin et al. (2018), we conducted image fusion, also known as pan-sharpening, to enhance the matching accuracy between high-resolution satellite images and GCP chips. Pan-sharpening is a technology that combines high-resolution panchromatic band images with low-resolution multi-spectral band images to reconstruct high-resolution color images. This study utilized an algorithm based on the component-substitution (CS) fusion technique, as described in previous works (Park et al., 2020; Vivine et al., 2015).

This paper proposed establishing the initial sensor orientation first. The sensor orientation was established using the rational function model (RFM) for this study. The RFM is a mathematical sensor model mainly used for satellite imagery. It is a model expression that mathematically expresses the relationship between image coordinates and ground coordinates, without physical sensor information. It can be established by applying the rational polynomial coefficients (RPCs) provided with the original satellite image to the formula. Eq. (1) and (2) delineate the formulas of the RFM corresponding to the columns and rows of the image.

cn=P1(Xn,Yn,Zn)P2(Xn,Yn,Zn)= i=03 j=03 k=03 aijk Xni Ynj Znk i=03 j=03 k=03 bijk Xni Ynj Znk
rn=P3(Xn,Yn,Zn)P4(Xn,Yn,Zn)= i=0 3 j=0 3 k=0 3 cijk Xni Ynj Znk i=0 3 j=0 3 k=0 3 dijk Xni Ynj Znk

The formula provided uses cn and rn to represent the normalized image coordinates, and Xn, Yn, and Zn to denote the normalized ground coordinates. In the given equation, aijk, bijk, cijk, and dijk are the coefficients of the rational polynomials P1, P2, P3, and P4, respectively. These coefficients constitute the rational function expression defining the relationship between the image’s column, row, and ground coordinates. We set the values of the RPCs for i + j + k ≥ 4 to zero to formulate the RFM model expression using only the coefficients of the third term or lower in the ground coordinates. Additionally, we set b000 and d000 to 1 to fix the proportionality constant of the rational function expression. The remaining 78 coefficients for ground coordinate terms with i + j + k from 1 to 3 were provided as RPC files with satellite images.

Using the established initial sensor orientation, we conducted a comprehensive search for all GCP chips within the entire coverage of the satellite image. The image corner coordinates, derived from the initial sensor orientation, were used to calculate the minimum rectangular area encompassing the image region. To account for potential errors in the initial sensor orientation, a margin was incorporated into the image area. The margin dimensions were set at 250 meters in both the x and y directions of the satellite image. High-resolution satellites have a limited field of view and are affected by the area’s topography. Therefore, the margin size was determined based on the search for stable GCP chips and the potential for overestimation to establish accurate sensor orientation. The search for GCP chips was confined to the area within which the margin was applied, referencing the GCP database.

Subsequently, we defined the region surrounding each GCP chip and executed upsampling of the satellite image to the desired scale. Previous research has shown that bilinear interpolation is advantageous for upsampling satellite images in terms of both performance and time efficiency (Choi and Kim, 2022).

Therefore, we adopted bilinear interpolation for the upsampling process. In this study, we applied upsampling factors ranging from 1 to 3 times for each set of experimental data, resulting in spatial resolutions of 0.5 m, 0.25 m, and 0.167 m. The corresponding upsampling factors are shown in Fig. 6. Fig. 6 illustrates the original satellite image transformed according to the upsampling factor. It was evident that a larger upsampling factor leads to a smoother appearance of the image. To save processing speed and memory, the upsampling was only applied to the navigation area relative to the GCP, rather than the entire original satellite image.

Figure 6. Different satellite image patches and a GCP chip according to upsampling factors.

Afterward, we upsampled and remapped the GCP chip image to match the geometry of the satellite image and its newly achieved spatial resolution. To ensure automatic matching using an area-based image-matching algorithm, we deformed the GCP chip to synchronize its resolution and geometry with that of the satellite image. Fig. 7 presents a depiction of the resampling process for the GCP chip, considering the geometry and resolution of the satellite image. This procedure was executed for all identified GCP chips.

Figure 7. Concept of resampling GCP chip.

Next, we created an image pyramid to improve processing efficiency for image matching. Four levels of image pyramids were constructed for each satellite image patch. The matching process began at the top layer of the pyramid and used the Census algorithm in the last layer and the zero-mean normalized cross-correlation (ZNCC) algorithm in the remaining layers, except for the last one. The previous study (Yoon, 2019) has shown that the ZNCC algorithm is robust to image size variations, but its accuracy in extracting matching points is somewhat lower. In contrast, the Census algorithm has relatively good accuracy in extracting matching points but is sensitive to variations in image size.

Therefore, we used the Census algorithm for the original image and the ZNCC algorithm for the other reduced images. Despite our efforts, there were still some mismatch points among the automatically matched GCPs. To address this issue, we utilized the random sample consensus (RANSAC) technique (Fischler and Bolles, 1981) to automatically remove them. The detailed process of GCP chip matching and mismatch point removal is omitted here for brevity and to avoid redundancy, as it has been covered in previous research papers (Yoon et al., 2018; Son et al., 2021).

As the final step, we established a precise sensor orientation based on the auto-matching results with removed mismatches. We estimated the coefficients of the following error correction equation to rectify any errors in the initial sensor orientation.

c=a11c0+a12r0+a13
r=a21r0+a22r0+a23

In the equation above, c0, r0, c, and r represent the image coordinates before and after correction, respectively. The coefficients of the error correction formula are represented by a11 to a23. After estimating the coefficients of the error correction equation, the RPC coefficients were recalculated using this equation to establish precise sensor orientation. The precision sensor orientation was established for each upsampling factor and then converted back to the original resolution for accuracy comparison (Lee and Kim, 2021). The accuracy of the precision sensor orientation was compared for each upsampling factor.

3. Results

In this study, model error and check error were employed as performance evaluation metrics. Model error represents the relative root mean square error (rRMSE) measured for the GCPs used to refine the initial sensor model. Check error, on the other hand, is expressed as the rRMSE measured for checkpoints manually extracted from reference points not included in the model points. Eq. (5) details the calculation of rRMSE, where RMSEcol represents the RMSE in the column direction of the image, and RMSErow represents the RMSE in the row direction of the image. Accuracy analyses were conducted based on the original image with a spatial resolution of 0.5 m.

rRMSE=(RMSEcol)2+(RMSErow)2

Table 4 shows the number of GCP chips within the acquisition area and the manually acquired number of GCP chips for verification across the 11 CAS500-1 satellite images used in the experiment.

Table 4 . Quantity of GCP chips per experimental case.

DataQuantity of GCP chipsQuantity of checkpoint
Sejong 1566
Sejong 2406
Sejong 3686
Sejong 4556
Sejong 5587
Sejong 6406
Andong 1395
Andong 2364
Andong 3435
Pyongyang 19210
Pyongyang 2748


The experimental results are presented in Table 5, and the performance evaluation indicator graphs for each study area are shown in Figs. 8 to 18, illustrating the observed trends in the experimental outcomes. In each figure, (a) and (b) represent the model error and check error, respectively, as functions of the number of GCPs and the upsampling scale.

Figure 8. Result of Sejong1.

Figure 9. Result of Sejong2.

Figure 10. Result of Sejong3.

Figure 11. Result of Sejong4.

Figure 12. Result of Sejong5.

Figure 13. Result of Sejong6.

Figure 14. Result of Andong1.

Figure 15. Result of Andong2.

Figure 16. Result of Andong3.

Figure 17. Result of Pyongyang1.

Figure 18. Result of Pyongyang2.

Table 5 . Matching result.

Sejong 1Sejong 2
GSD (m)Error (pixel)Matching rateGSD (m)Error (pixel)Matching rate
ModelCheckModelCheck
0.5001.2851.54645%0.5001.1981.12550%
0.2500.6261.03636%0.2500.6240.96335%
0.1670.4030.73834%0.1670.4231.04338%
0.1250.2771.80736%0.1250.3461.08145%
Sejong 3Sejong 4
GSD (m)Error (pixel)Matching rateGSD (m)Error (pixel)Matching rate
ModelCheckModelCheck
0.5001.3672.03934%0.5001.2300.93645%
0.2500.7391.41925%0.2500.6050.63236%
0.1670.5024.36328%0.1670.4772.65140%
0.1250.3523.59022%0.1250.3240.88133%
Sejong 5Sejong 6
GSD (m)Error (pixel)Matching rateGSD (m)Error (pixel)Matching rate
ModelCheckModelCheck
0.5001.4331.57341%0.5001.3122.22555%
0.2500.6610.71434%0.2500.5591.66535%
0.1670.4652.23333%0.1670.3862.38743%
0.1250.2941.80934%0.1250.3352.63940%
Andong 1Andong 2
GSD (m)Error (pixel)Matching rateGSD (m)Error (pixel)Matching rate
ModelCheckModelCheck
0.5001.1741.86641%0.5001.0341.39647%
0.2500.6661.47636%0.2500.6001.21139%
0.1670.4462.05431%0.1670.4351.00233%
0.1250.3281.77344%0.1250.3251.54931%
Andong 3Pyongyang 1
GSD (m)Error (pixel)Matching rateGSD (m)Error (pixel)Matching rate
ModelCheckModelCheck
0.5001.2921.45342%0.5001.3462.18941%
0.2500.6541.34037%0.2500.7051.86027%
0.1670.4181.31542%0.1670.4354.30224%
0.1250.3042.34928%0.1250.3473.14121%
Pyongyang 2
GSD (m)Error (pixel)Matching rate
ModelCheck
0.5001.4771.93235%
0.2500.5611.00129%
0.1670.4381.97926%
0.1250.3413.21726%


Part (a) of each experiment shows that the model error consistently decreased as the upsampling factor increased and the ground sample distance (GSD) became smaller. These results indicated that coefficient estimation was successfully applied in the error correction equation for the RPC update. It can be interpreted that the higher the upsampling factor was applied, the closer the GCPs were to the estimated error correction, the closer they were selected as model points, and consequently, the smaller the GSD compared to the original resolution, the smaller the model estimation error.

Part (b) of each experiment demonstrates a decrease in check error when upsampling by 2x from the original resolution for all experiments. However, when upsampling by 3x, the error rate generally increased again. The 4x results also showed inconsistent results for each experiment. This observation is consistent with previous studies, such as Lee and Kim (2021), which have shown that excessive upsampling scaling relative to the original data reduces model estimation error but does not necessarily result in improved accuracy.

When establishing the orientation of a precision sensor at the original resolution, the check error exceeded 2 pixels in many cases, with an average check error of 1.7 pixels. When establishing the orientation of a precision sensor using an upsampling factor of 2, the check error was less than 2 pixels in all experiments, with an average check error of 1.2 pixels. We achieved a 50% improvement over the original resolution. In several experiments, where the original resolution resulted in check errors of over 1 pixel, performance considerably improved when the 2x upsampling factor was applied, resulting in less than 1 pixel check error.

To validate the experimental results objectively, we compared the matching results for each resolution. Fig. 19 shows the results of the matching GCP chip. The matching results of the GCP chip at the original resolution of 0.5 m were roughly aligned to the resolution grid. However, the matching results of the GCP chip at 0.25 m upsampled were more precisely aligned. However, when upsampled to 0.167 meters, the GCP chip over-interpolates, resulting in a smoothed image that makes it difficult to find the exact match point.

Figure 19. Matching results for each GSD. (a) Original satellite image, (b) matching result of original GSD (0.5 m), (c) matching result of 0.25 m GSD, and (d) matching result of 0.167 m GSD.

4. Discussion

In this chapter, we delve into the distinctions between our findings and those of preceding studies utilizing medium-resolution satellite imagery (Lee and Kim, 2021; Choi and Kim, 2022a). Previous research utilizing medium-resolution satellite imagery, with a resolution of 5 m and high-resolution GCP chips featuring a resolution of 0.25 m, identified an optimal upsampling ratio of 3 times, corresponding to a GSD of 1.67 m. In our present study, employing high-resolution satellite imagery at a resolution of 0.5 m along with a GCP chip sharing the same specifications, the optimal upsampling ratio was determined to be 2 times, equating to a GSD of 0.25 m.

In the case of North Korea, both the satellite image and the GCP chip had a resolution of 0.5 m. However, the results were still improved by upsampling to 0.25 m, which may be due to a limitation of the algorithm used for chip matching. The image-matching algorithm used in this study was a pixel-by-pixel similarity search method. In summary, when upsampled to 0.25 m, the similarity search interval was applied to 0.5 pixels of the original satellite image. This improvement was believed to be a result of this adjustment.

During the study, a decrease in accuracy was observed when the upsampling ratio was tripled, which deviates from the findings in low- and medium-resolution images. This phenomenon may be attributed to disparities in the sharpness of the original satellite image or the act of upsampling to a GSD higher than the chip resolution. Future research should investigate whether the clarity of the original image or the chip resolution has a greater impact on matching results. This is crucial for advancing our understanding and achieving higher accuracy in the context of high-resolution satellite images.

5. Conclusions

In this study, we applied upsampling to both satellite images and pre-existing high-resolution GCP chips to establish a high-resolution satellite image precision sensor orientation, and then conducted a comparative analysis of the matching performance. The experimental outcomes confirmed a notable enhancement in matching performance when upsampling was applied to both the satellite images and the GCP chips.

When establishing a precision sensor orientation with a 2x upsampling factor, an improvement of up to 50% was observed compared to the original resolution, and in certain experiments the check error approached or fell below 1 pixel. Due to technical constraints, achieving sub-pixel positioning accuracy when establishing a precision sensor orientation on high-resolution satellite imagery at its original resolution is challenging. However, our results demonstrate that sub-pixel-level positioning accuracy can be achieved through careful upsampling of both satellite images and GCP chips.

Moreover, despite employing different specifications for the GCP chips in the South and North Korean regions, the experimental results exhibited comparable trends. This confirms that, if GCP chips can be built for inaccessible areas such as North Korea as they have been for South Korea, the proposed method is readily applicable and an improved precision sensor orientation can be established.

In this study, we tested GCP chips with a GSD similar to that of the satellite images. If more precise GCP chips were built and utilized, the position error could be reduced further through the method proposed in this study. Therefore, further research using higher-resolution, higher-accuracy GCP chips is needed for precise sensor modeling of high-resolution satellite images, and the use of ultra-high-resolution data such as drone imagery should also be considered. We hope that this research will assist in the processing and utilization of high-resolution satellite images.

Acknowledgments

This work was carried out with the support of the “Cooperative Research Program for Agriculture Science and Technology Development (Project No. PJ 016233)” of the Rural Development Administration, Republic of Korea, and was also supported by a Korea Agency for Infrastructure Technology Advancement grant funded by the Ministry of Land, Infrastructure and Transport (Grant No. RS-2022-00155763).

Conflict of Interest

No potential conflict of interest relevant to this article was reported.

Figure Captions

Figure 1. Example of GCP chip.
Figure 2. Distribution of study area (Sejong).
Figure 3. Distribution of study area (Andong).
Figure 4. Distribution of study area (Pyongyang).
Figure 5. Flowchart of this study.
Figure 6. Different satellite image patches and a GCP chip according to upsampling factors.
Figure 7. Concept of resampling the GCP chip.
Figure 8. Result of Sejong 1.
Figure 9. Result of Sejong 2.
Figure 10. Result of Sejong 3.
Figure 11. Result of Sejong 4.
Figure 12. Result of Sejong 5.
Figure 13. Result of Sejong 6.
Figure 14. Result of Andong 1.
Figure 15. Result of Andong 2.
Figure 16. Result of Andong 3.
Figure 17. Result of Pyongyang 1.
Figure 18. Result of Pyongyang 2.
Figure 19. Matching results for each GSD. (a) Original satellite image, (b) matching result of original GSD (0.5 m), (c) matching result of 0.25 m GSD, and (d) matching result of 0.167 m GSD.

Table 1. Specifications for GCP chips used.

                               South Korea                                 North Korea
Ground coordinates reference   Unified control point (UCP),                Plane coordinates of the orthoimage,
                               Image control point (ICP),                  DEM ellipsoidal height
                               Triangulation point (TP)
Source image                   Aerial orthoimage                           Satellite orthoimage
Chip size                      1,027 x 1,027 pixels                        513 x 513 pixels
GSD                            0.25 m                                      0.50 m
Band                           Red, Green, Blue                            Gray

GSD: ground sample distance.


Table 2. Specifications for satellite images used.

Satellite image          CAS500-1
Product level            Level 1R
Spectral resolution      Panchromatic: 450–900 nm
                         Multispectral: 450–900 nm (Blue, Green, Red, NIR)
GSD                      Panchromatic: 0.5 m
                         Multispectral: 2.0 m
Orbit                    Circular sun-synchronous (500 km)
Swath                    ≥ 12 km
Radiometric resolution   12 bits

Table 3. List of satellite images used.

Data          Acquisition date   Upper left coordinate (Lat, Lon)   Lower right coordinate (Lat, Lon)   Quantity of GCP chips
Sejong 1      2021.12.07         36.509042465, 127.266395653        36.426378705, 127.427498911         56
Sejong 2      2021.12.07         36.612209951, 127.238683259        36.529558877, 127.400014661         40
Sejong 3      2022.02.27         36.625661023, 127.201747702        36.553286018, 127.406668861         68
Sejong 4      2022.03.03         36.533186804, 127.227716611        36.450030787, 127.390117310         55
Sejong 5      2023.03.16         36.506973164, 127.244234044        36.427195422, 127.414961162         58
Sejong 6      2023.03.16         36.609898079, 127.214833673        36.530137740, 127.385835309         40
Andong 1      2021.11.11         36.724915722, 128.875059965        36.642481614, 129.038039928         39
Andong 2      2022.06.12         36.619879282, 128.688020424        36.537777343, 128.850648011         36
Andong 3      2022.12.02         36.449640721, 128.893645792        36.368206234, 129.058636036         43
Pyongyang 1   2021.10.19         39.156498918, 125.645494344        39.073884439, 125.839485291         92
Pyongyang 2   2023.01.10         39.004960711, 125.710489853        38.922023330, 125.888358084         74

Table 4. Quantity of GCP chips per experimental case.

Data          Quantity of GCP chips   Quantity of checkpoints
Sejong 1      56                      6
Sejong 2      40                      6
Sejong 3      68                      6
Sejong 4      55                      6
Sejong 5      58                      7
Sejong 6      40                      6
Andong 1      39                      5
Andong 2      36                      4
Andong 3      43                      5
Pyongyang 1   92                      10
Pyongyang 2   74                      8

Table 5. Matching results.

Data          GSD (m)   Model error (pixel)   Check error (pixel)   Matching rate
Sejong 1      0.500     1.285                 1.546                 45%
              0.250     0.626                 1.036                 36%
              0.167     0.403                 0.738                 34%
              0.125     0.277                 1.807                 36%
Sejong 2      0.500     1.198                 1.125                 50%
              0.250     0.624                 0.963                 35%
              0.167     0.423                 1.043                 38%
              0.125     0.346                 1.081                 45%
Sejong 3      0.500     1.367                 2.039                 34%
              0.250     0.739                 1.419                 25%
              0.167     0.502                 4.363                 28%
              0.125     0.352                 3.590                 22%
Sejong 4      0.500     1.230                 0.936                 45%
              0.250     0.605                 0.632                 36%
              0.167     0.477                 2.651                 40%
              0.125     0.324                 0.881                 33%
Sejong 5      0.500     1.433                 1.573                 41%
              0.250     0.661                 0.714                 34%
              0.167     0.465                 2.233                 33%
              0.125     0.294                 1.809                 34%
Sejong 6      0.500     1.312                 2.225                 55%
              0.250     0.559                 1.665                 35%
              0.167     0.386                 2.387                 43%
              0.125     0.335                 2.639                 40%
Andong 1      0.500     1.174                 1.866                 41%
              0.250     0.666                 1.476                 36%
              0.167     0.446                 2.054                 31%
              0.125     0.328                 1.773                 44%
Andong 2      0.500     1.034                 1.396                 47%
              0.250     0.600                 1.211                 39%
              0.167     0.435                 1.002                 33%
              0.125     0.325                 1.549                 31%
Andong 3      0.500     1.292                 1.453                 42%
              0.250     0.654                 1.340                 37%
              0.167     0.418                 1.315                 42%
              0.125     0.304                 2.349                 28%
Pyongyang 1   0.500     1.346                 2.189                 41%
              0.250     0.705                 1.860                 27%
              0.167     0.435                 4.302                 24%
              0.125     0.347                 3.141                 21%
Pyongyang 2   0.500     1.477                 1.932                 35%
              0.250     0.561                 1.001                 29%
              0.167     0.438                 1.979                 26%
              0.125     0.341                 3.217                 26%

References

  1. Choi, H. G., and Kim, T., 2022a. Analysis of optimal resolution and number of GCP chips for precision sensor modeling efficiency in satellite images. Korean Journal of Remote Sensing, 38(6-1), 1445-1462. https://doi.org/10.7780/kjrs.2022.38.6.1.34
  2. Choi, H. G., and Kim, T., 2022b. Improving automated precision sensor modeling accuracy of mid-resolution satellite images by adjustment matching resolution. In Proceedings of the 2022 Korean Society of Remote Sensing Fall Conference, Busan, Republic of Korea, Nov. 7-9, pp. 89-92.
  3. Choi, S., Lee, D., and Seo, D., 2023. Review of feasibility of compliance with CARD4L geometric correction requirements for high resolution optical satellite images. In Proceedings of the 2023 Korean Society for Aeronautical and Space Sciences Conference, Hongcheon, Republic of Korea, Nov. 15-17, pp. 610-613.
  4. Fischler, M. A., and Bolles, R. C., 1981. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381-395. https://doi.org/10.1145/358669.358692
  5. Kim, T., 2020. Current research and development status for CAS 500-1/2 image processing and utilization technology. Korean Journal of Remote Sensing, 36(5-2), 861-866. https://doi.org/10.7780/kjrs.2020.36.5.2.1
  6. Kim, T., 2023. Introduction to development of comprehensive land management technology using satellite image information bigdata. Korean Journal of Remote Sensing, 39(5-4), 1069-1073. https://doi.org/10.7780/kjrs.2023.39.5.4.1
  7. Lee, K., Kim, Y., and Choi, H., 2017. KOMPSAT image processing and applications. Korean Journal of Remote Sensing, 33(6-3), 1171-1177. https://doi.org/10.7780/kjrs.2017.33.6.3.1
  8. Lee, Y., and Kim, T., 2021. Determination of spatial resolution to improve GCP chip matching performance for CAS-4. Korean Journal of Remote Sensing, 37(6-1), 1517-1526. https://doi.org/10.7780/kjrs.2021.37.6.1.3
  9. Oh, J., Seo, D., Lee, C., Seong, S., and Choi, J., 2022. Automated RPCs bias compensation for KOMPSAT imagery using orthoimage GCP chips in Korea. IEEE Access, 10, 118465-118474. https://doi.org/10.1109/ACCESS.2022.3217788
  10. Park, H., Son, J. H., Jung, H. S., Kweon, K. E., Lee, K. D., and Kim, T., 2020. Development of the precision image processing system for CAS-500. Korean Journal of Remote Sensing, 36(5-2), 881-891. https://doi.org/10.7780/kjrs.2020.36.5.2.3
  11. Shin, J. I., Kim, T., Yoon, W. S., and Park, H. J., 2018. Improving satellite-aerial image matching success rate by image fusion. In Proceedings of the 2018 European Conference on Electrical Engineering and Computer Science (EECS), Bern, Switzerland, Dec. 20-22, pp. 224-227. https://doi.org/10.1109/EECS.2018.00049
  12. Son, J. H., Yoon, W., Kim, T., and Rhee, S., 2021. Iterative precision geometric correction for high-resolution satellite images. Korean Journal of Remote Sensing, 37(3), 431-447. https://doi.org/10.7780/kjrs.2021.37.3.6
  13. Vivone, G., Alparone, L., Chanussot, J., Mura, M. D., Garzelli, A., and Licciardi, G. A., et al., 2015. A critical comparison among pansharpening algorithms. IEEE Transactions on Geoscience and Remote Sensing, 53(5), 2565-2586. https://doi.org/10.1109/TGRS.2014.2361734
  14. Yoon, W., 2019. A Study on development of automatic GCP matching technology for CAS-500 imagery. Master's thesis, Inha University, Incheon, Republic of Korea.
  15. Yoon, W., Park, H., and Kim, T., 2018. Feasibility analysis of precise sensor modelling for KOMPSAT-3A imagery using unified control points. Korean Journal of Remote Sensing, 34(6-1), 1089-1100. https://doi.org/10.7780/kjrs.2018.34.6.1.19