Korean J. Remote Sens. 2024; 40(5): 657-673
Published online: October 31, 2024
https://doi.org/10.7780/kjrs.2024.40.5.1.19
© Korean Society of Remote Sensing
Correspondence to : Ho-yong Ahn
E-mail: hyahn85@korea.kr
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
Satellite data are used in precision agriculture to optimize crop management. Thus, the planting pattern (e.g., flat and ridge-furrow) and crop type should be accurately reflected in the data. The purpose of this study was to identify the spatial characteristics of errors in the surface reflectance (SR) and vegetation index (VI) obtained from the Sentinel-2 satellite. Drone data were used to evaluate the suitability of the Sentinel-2 satellite for precision agriculture applications in agricultural fields. Four VIs (normalized difference vegetation index, green normalized difference vegetation index, enhanced vegetation index, and normalized difference red edge index) were calculated. The rice paddy exhibited a homogeneous surface, whereas garlic/onion and soybean fields showed high surface heterogeneity because of the presence of ridges and furrows. The SR values of the rice paddy, measured at the near-infrared (NIR) wavelength using the Sentinel-2 satellite, were saturated. The VIs derived from both satellite and drone data exhibited a correlation above 0.811 and normalized root mean square error (NRMSE) below 11.1% after bias correction. The garlic and onion fields exhibited the worst results, with a bias-corrected NRMSE for VIs ranging between 12.9% and 13.8%. The soybean field, where the vegetation covered the surface almost completely, exhibited the best relationship between the Sentinel-2 and drone data. The correlation coefficient and bias-corrected NRMSE of VIs for the combination of the two devices were above 0.969 and below 6.4%, respectively. In addition, the SR at NIR had a correlation of 0.925 and a slope of 1.157, unlike in the rice paddy. These results indicate that crop structure has a greater effect than the planting pattern. The absolute difference between the VIs measured by the satellite and drone is influenced by the degree of surface heterogeneity. The errors are more pronounced at the farmland edges.
Our study contributes to a better understanding of the characteristics of Sentinel-2 data for use in agricultural fields.
Keywords UAV, Surface reflectance, Homogeneity, Crop
Precision agriculture incorporates remote-sensing techniques to optimize crop management (Bansod et al., 2017; Liaghat and Balasundram, 2010). A sensor-based monitoring system can non-destructively diagnose the condition of a crop (Ryu et al., 2020a). The multispectral camera onboard a drone has a ground sampling distance of a few centimeters at a height of 150 m above the ground. Thus, pesticide spraying, variable fertilization, and irrigation management can be conducted using drone data (Hafeez et al., 2023; Sishodia et al., 2020). However, regularly capturing wide areas using drones is labor-intensive and costly (Pla et al., 2019). Satellite data can be an alternative to overcome these disadvantages. Because satellites regularly capture images of crop-growing areas, information such as the planting time, harvesting time, and spatial variation in the crop's condition can be produced and provided to farmers (Ali et al., 2021).
Abundant satellite data are freely available for monitoring crop growth and development in the field (Labib and Harris, 2018). The Sentinel-2 satellite, which is a representative agricultural observation satellite, has various bands from visible to shortwave infrared, and the spatial image resolution ranges from 10 to 60 m. Moreover, it is possible to observe the same area every 5 days (Song et al., 2021). The growth parameters, such as the leaf area index, biomass, and yield, can be estimated using data from the Sentinel-2 satellite (Bansod et al., 2017; Dong et al., 2020; Mao et al., 2022), and crop types can be classified (Maponya et al., 2020; Sonobe et al., 2018). The improved spatiotemporal resolution and surface reflectance (SR) at various wavelengths facilitate precision agriculture using satellite data (Kong et al., 2023). However, there are drawbacks regarding the usability of the freely available satellite data for precision agriculture.
The temporal resolution of Sentinel-2 satellite data can be unsatisfactory because of cloud and cloud shadow effects (Bukowiecki et al., 2021; Caparros-Santiago et al., 2023). If information about a specific period is not obtained for a long time, using satellite data for precision agriculture may be challenging. To minimize these limitations, fused satellite data have been produced as a harmonized SR product of satellites such as Landsat-8/9 and Sentinel-2 (Dhillon et al., 2022; Kong et al., 2021). Furthermore, more than 130 microsatellites have been launched for the PlanetScope constellation (Frazier et al., 2021), which aims to observe specific areas every day (Roy et al., 2021). However, this constellation cannot escape the influence of clouds and cloud shadows, which are inherent limitations of satellite data.
The crop conditions can be monitored using the multispectral camera onboard a drone during specific growth periods for which satellite images have not been acquired (Martinez et al., 2021). Drone images can provide spatially detailed information about crops compared to satellite images (Messina et al., 2020). In addition, satellite and drone data can be jointly used after converting the spatial resolution of the drone image to match that of the satellite image (Jiang et al., 2022; Zhang et al., 2023). However, the information obtained from satellite data can be biased because of the low spatial resolution associated with surface heterogeneity.
Crops are cultivated using various planting patterns (e.g., flat and ridge-furrow), depending on the crop type, soil type, slope of the field, and climate. The flat planting pattern has advantages in managing water and nutrients, and the crops are easy to grow and harvest. The ridge-furrow planting pattern is used in areas with heavy precipitation or poor drainage. The height difference between the ridges and furrows prevents the crops from being submerged and avoids damage due to excess moisture (Verma et al., 2020). These planting patterns affect surface heterogeneity. Satellite data can provide inaccurate information about crop conditions when crops are planted in rows (Mazzia et al., 2020).
However, the degree of soil coverage can vary depending on the leaf structure of the crops, even when the ridge-furrow planting pattern is used. Therefore, before drones and satellites are used together, the satellite and drone data should be compared to determine whether the information about crop conditions is accurate (Zhang et al., 2023). Most previous studies have focused on evaluating the normalized difference vegetation index (NDVI) measured by satellites (Verma et al., 2020). However, a multispectral imager includes not only red and near-infrared (NIR) bands but also blue, green, and red-edge bands. Thus, it is necessary to evaluate the SR in various wavelength bands and to calculate vegetation indices (VIs) from the SR.
The main purpose of this study is to identify the characteristics of errors in the SR and VIs measured by the Sentinel-2 satellite depending on the crop type, planting pattern, and surface heterogeneity in agricultural fields. Drone-based images, which have a higher spatial resolution than satellite images, were used as a reference to assess the Sentinel-2 satellite data. Moreover, the suitability and limitations of the Sentinel-2 satellite data for precision agriculture were investigated for rice paddy, garlic, onion, and soybean crops.
This study was conducted in three counties located in South Korea (Fig. 1a). One measurement site was a rice paddy located in Jeollanam-do Agricultural Research & Extension Services (Naju County, latitude of 35.0275°N, longitude of 126.8209°E). Rice was cultivated in 2020 and 2021. The rice was transplanted in early June, and the heading stage was observed in mid to late August. The rice was harvested from late September to early October. The total target study area (blue polygon in Fig. 1b), which comprised five paddies, was approximately 17,000 m2. Given that the size of each paddy is approximately 30 × 95 m, the pixels located in the center of each rice paddy field have a fairly homogeneous footprint (Fig. 2a).
The second measurement site was an upland field located in the National Institute of Crop Science (Muan County, latitude of 34.9671°N, longitude of 126.4528°E). Garlic and onion were cultivated in this area during the 2019–2020 and 2020–2021 growing seasons. The growing period for garlic and onion was from October to late March/early June. The upland field featured numerous furrows and ridges, which resulted in a fairly heterogeneous footprint (Fig. 2b). In addition, bare soil and a concrete road were present at the edge of the study area. The total area was approximately 8,400 m2 (red polygon in Fig. 1c).
The third measurement site was a field located in Gimje County (latitude of 35.7517°N, longitude of 126.8136°E) for cultivating soybeans. This area consisted of multiple furrows and ridges to mitigate damage from excess moisture because of poor drainage (Fig. 2c). Soybean was cultivated in this field from June 3, 2022, to November 2, 2022. Unlike the garlic and onion fields, the leaves of the soybean plants covered the entire field in August and September, which made the soil barely visible. The target area (orange polygon in Fig. 1d) encompassed approximately 27,600 m2.
Satellite images from Sentinel-2, one of the most prominent satellites used for crop data monitoring, were collected using the Sentinel Hub interface. The coordinate system and projection were set to WGS 1984 UTM Zone 52N and Transverse Mercator, respectively, to match the drone’s raster data. We downloaded the Sentinel-2 Level-2A orthoimage bottom-of-atmosphere-corrected reflectance data, which eliminated atmospheric effects. Subsequently, contaminated pixels caused by clouds and cloud shadows were filtered using scene classification and cloud information. Pixels with a cloud probability value above 0 were masked (Fig. 3).
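The cloud-screening step described above (masking every pixel whose cloud probability exceeds 0) can be sketched with NumPy. This is an illustrative sketch, not the authors' code; the function name and the 3 × 3 example values are hypothetical:

```python
import numpy as np

def mask_cloudy_pixels(reflectance, cloud_probability):
    """Set pixels with any cloud probability (> 0) to NaN.

    reflectance: 2-D array of bottom-of-atmosphere reflectance.
    cloud_probability: 2-D array (%) from the Sentinel-2 cloud product.
    """
    masked = reflectance.astype(float).copy()
    masked[cloud_probability > 0] = np.nan
    return masked

# Example: a 3x3 tile with one cloud-contaminated pixel (values hypothetical)
sr = np.full((3, 3), 0.25)
cloud = np.zeros((3, 3))
cloud[1, 1] = 40.0
out = mask_cloudy_pixels(sr, cloud)
# The contaminated center pixel becomes NaN; the rest are unchanged.
```

Downstream statistics then simply skip NaN pixels, which matches how the masked pixels are excluded from the matchup analysis.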
The SR of the crops (rice, garlic, onion, and soybean) was measured using a multispectral camera mounted on a drone (Fig. 3). The multispectral camera (RedEdge-MX Dual: MicaSense, Inc., Seattle, WA, USA) had 10 bands (Table 1). Drone measurements were conducted around solar noon on clear days. The flight altitude was set to 30 m or 50 m, resulting in a ground sample distance of 2.08 cm or 3.47 cm, respectively. The measured data were processed using the Pix4Dmapper software (version 4.3.31), and the information from seven ground control points, obtained using a GRX2 GNSS Receiver (SOKKIA Corporation, Olathe, KS, USA), was used for geometric correction. A calibration reflectance panel provided by the manufacturer of the multispectral camera was used to convert the digital numbers of the drone raster data to reflectance in Pix4Dmapper. Moreover, to improve the accuracy of the measured values, a radiometric correction was conducted on the rice paddy and the garlic and onion field using four homogeneous calibration targets with reflectance values of 3%, 21%, 32%, and 51%. The processed spectral reflectance drone images showed the crop growth status, planting pattern, and artificial structures in detail owing to their high spatial resolution.
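A target-based radiometric correction of this kind is commonly implemented as a per-band empirical line fit between the known target reflectances (3%, 21%, 32%, and 51%) and the values observed over each target. The sketch below assumes that approach; the observed image values are hypothetical, and this is not the authors' implementation:

```python
import numpy as np

# Known reflectances of the four calibration targets
target_reflectance = np.array([0.03, 0.21, 0.32, 0.51])
# Hypothetical image values extracted over each target for one band
observed_values = np.array([0.087, 0.249, 0.348, 0.519])

# Fit a per-band linear (empirical line) model: reflectance = a * value + b
a, b = np.polyfit(observed_values, target_reflectance, deg=1)

def correct_band(band):
    """Apply the fitted empirical line correction to a whole band image."""
    return a * band + b

# Correcting the target pixels themselves recovers the known reflectances
corrected = correct_band(np.array([0.087, 0.519]))
```

In practice the fit is repeated independently for each of the 10 bands, since sensor response and illumination effects are wavelength dependent.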
Table 1 Information from the multispectral sensors on board the drone and Sentinel-2 satellite
| Band | Sentinel-2/MSI band no. | Central wavelength/Bandwidth (nm) | Spatial resolution (m) | Drone/RedEdge-MX Dual band no. | Central wavelength/Bandwidth (nm) |
|---|---|---|---|---|---|
| Blue/aerosol | 1 | 443/36 | 60 | 1 | 444/28 |
| Blue | 2 | 490/96 | 10 | 2 | 475/32 |
| Green | - | - | - | 3 | 531/14 |
| Green | 3 | 560/45 | 10 | 4 | 560/27 |
| Red | - | - | - | 5 | 650/16 |
| Red | 4 | 665/39 | 10 | 6 | 668/14 |
| Red edge | 5 | 705/20 | 20 | 7 | 705/10 |
| Red edge | - | - | - | 8 | 717/12 |
| Red edge | 6 | 740/18 | 20 | 9 | 740/18 |
| Red edge | 7 | 783/28 | 20 | - | - |
| NIR | 8 | 842/141 | 10 | 10 | 842/57 |
| NIR | 8A | 865/33 | 20 | - | - |
| WV | 9 | 945/27 | 60 | - | - |
| SWIR/Cirrus | 10 | 1375/76 | 60 | - | - |
| SWIR | 11 | 1610/142 | 20 | - | - |
| SWIR | 12 | 2190/240 | 20 | - | - |
MSI: multispectral imager, NIR: near-infrared, WV: water vapor, SWIR: shortwave-infrared.
The SR and VIs were compared to identify the characteristics of the data measured by the satellite and drone, using the following steps. Six bands with central wavelengths of 490, 560, 665, 705, 740, and 842 nm (based on Sentinel-2) were selected to compare the imagery obtained from the two devices. Despite the different multispectral sensors, these bands have similar central wavelengths (Table 1). Four VIs (NDVI, green normalized difference vegetation index (GNDVI), enhanced vegetation index (EVI), and normalized difference red-edge index (NDRE)) were calculated using the SR measured by the Sentinel-2 satellite and drone (Table 2). These VIs use five bands, centered at 490, 560, 665, 705, and 842 nm for the Sentinel-2 satellite. The four VIs theoretically range from –1 to 1, but over soil or vegetation, the values are typically between 0 and 1; values closer to 0 indicate sparse vegetation, while values near 1 signify dense and healthy vegetation. Next, the spatial resolution of the SR and VIs measured by the drone was converted to 10 m or 20 m to match the spatial resolution of the Sentinel-2 data. For Sentinel-2, the SR at the blue, green, red, and NIR bands has a spatial resolution of 10 m, whereas the red-edge bands have a spatial resolution of 20 m. When upscaling the drone images to the scale of the Sentinel-2 data, the resampling method calculated the average (Assmann et al., 2020), excluding no-data values. The upscaled drone and Sentinel-2 images were clipped using a region-of-interest shapefile (Fig. 1b–d). Finally, the VIs measured by the satellite and drone are referred to as VISatellite and VIDrone, respectively.
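The upscaling step can be sketched as a block average, assuming the drone raster is already aligned to the Sentinel-2 grid and no-data pixels are stored as NaN (the array values below are illustrative, not from the study):

```python
import numpy as np

def upscale_mean(fine, factor):
    """Average `factor` x `factor` blocks of a fine-resolution array,
    ignoring NaN (no-data) pixels, to mimic resampling a drone image
    to the Sentinel-2 grid."""
    h, w = fine.shape
    blocks = fine[: h - h % factor, : w - w % factor]
    blocks = blocks.reshape(h // factor, factor, w // factor, factor)
    return np.nanmean(blocks, axis=(1, 3))

# 4x4 drone pixels -> 2x2 coarse grid (factor 2), with one no-data pixel
fine = np.array([[0.2, 0.4, 0.6, 0.6],
                 [0.2, 0.4, 0.6, np.nan],
                 [0.1, 0.1, 0.5, 0.5],
                 [0.1, 0.1, 0.5, 0.5]])
coarse = upscale_mean(fine, 2)
# Block means: 0.3 and 0.6 (top row), 0.1 and 0.5 (bottom row);
# the NaN pixel is simply excluded from its block's average.
```

Using `nanmean` rather than `mean` implements the "except for no data values" rule: a masked drone pixel does not drag down the average of its coarse cell.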
Table 2 Equations of the vegetation indices calculated from the Sentinel-2 satellite and drone bands
VI | Equation | Sentinel-2 band | Drone band |
---|---|---|---|
NDVI | (RNIR – RRed) / (RNIR + RRed) | B4, B8 | B6, B10 |
GNDVI | (RNIR – RGreen) / (RNIR + RGreen) | B3, B8 | B4, B10 |
EVI | 2.5 * (RNIR – RRed) / (RNIR + 6 * RRed – 7.5 * RBlue + 1) | B2, B4, B8 | B2, B6, B10 |
NDRE | (RNIR – RRededge) / (RNIR + RRededge) | B5, B8 | B7, B10 |
VI: vegetation index, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index, R: reflectance, NIR: near-infrared, B: band.
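The Table 2 equations translate directly into code. Below is a minimal sketch (not the authors' implementation) with hypothetical reflectance values for a single well-vegetated pixel:

```python
import numpy as np

def vegetation_indices(blue, green, red, rededge, nir):
    """Compute the four VIs of Table 2 from reflectance values/arrays."""
    ndvi = (nir - red) / (nir + red)
    gndvi = (nir - green) / (nir + green)
    evi = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)
    ndre = (nir - rededge) / (nir + rededge)
    return ndvi, gndvi, evi, ndre

# Hypothetical reflectances: blue, green, red, rededge@705 nm, NIR
ndvi, gndvi, evi, ndre = vegetation_indices(0.04, 0.08, 0.05, 0.20, 0.45)
# All four indices land between 0 and 1 for this vegetated pixel.
```

Because the operations are element-wise, the same function works unchanged on whole NumPy band arrays, which is how the per-pixel VI maps would be produced.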
Matchup data for the six bands and four VIs were established between the imagery obtained from the satellite and that of the drone. Although more drone measurements were conducted, only six images each from the rice paddy and the garlic and onion field could be matched because of weather conditions, and four images were matched from the soybean field. Seven image pairs were measured on the same date; in addition, three pairs with a one-day difference and six pairs with a two-day difference between the measurement dates of the Sentinel-2 satellite and the drone were used (Table 3).
Table 3 Measurement information for the Sentinel-2 satellite and drone
| No. | Site | Satellite date | Drone date | Time difference | Satellite air temp. (°C) | Drone air temp. (°C) | Satellite RH (%) | Drone RH (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Naju | Aug. 15, 2020 | Aug. 14, 2020 | 1 day | 31.5 | 32.0 | 66.1 | 73.0 |
| 2 | Naju | Sep. 4, 2020 | Sep. 4, 2020 | 0 days | 27.9 | 28.9 | 60.2 | 56.3 |
| 3 | Naju | Sep. 24, 2020 | Sep. 24, 2020 | 0 days | 24.0 | 25.5 | 58.2 | 54.1 |
| 4 | Naju | Jul. 21, 2021 | Jul. 21, 2021 | 0 days | 31.8 | 31.6 | 57.4 | 59.0 |
| 5 | Naju | Aug. 10, 2021 | Aug. 10, 2021 | 0 days | 30.0 | 29.7 | 66.2 | 65.4 |
| 6 | Naju | Aug. 20, 2021 | Aug. 19, 2021 | 1 day | 28.4 | 28.8 | 61.6 | 57.0 |
| 7 | Muan | Mar. 18, 2020 | Mar. 20, 2020 | 2 days | 15.6 | 13.9 | 34.4 | 33.4 |
| 8 | Muan | Apr. 27, 2020 | Apr. 29, 2020 | 2 days | 17.0 | 21.0 | 32.0 | 34.4 |
| 9 | Muan | May 22, 2020 | May 20, 2020 | 2 days | 23.6 | 19.0 | 47.7 | 60.6 |
| 10 | Muan | Mar. 18, 2021 | Mar. 18, 2021 | 0 days | 17.4 | 15.6 | 51.1 | 58.0 |
| 11 | Muan | May 12, 2021 | May 14, 2021 | 2 days | 25.3 | 26.0 | 67.5 | 65.6 |
| 12 | Muan | May 22, 2021 | May 24, 2021 | 2 days | 22.1 | 21.3 | 70.6 | 71.5 |
| 13 | Gimje | Jul. 1, 2022 | Jun. 30, 2022 | 1 day | 30.9 | 29.4 | 70.7 | 81.5 |
| 14 | Gimje | Sep. 9, 2022 | Sep. 7, 2022 | 2 days | 25.6 | 24.3 | 79.9 | 90.1 |
| 15 | Gimje | Oct. 14, 2022 | Oct. 14, 2022 | 0 days | 19.8 | 19.7 | 79.6 | 80.8 |
| 16 | Gimje | Oct. 19, 2022 | Oct. 19, 2022 | 0 days | 14.6 | 15.2 | 51.3 | 51.4 |
Values of air temperature and relative humidity are at the measurement times of the Sentinel-2 satellite and drone.
The correlation coefficient, absolute bias, root mean square error (RMSE), normalized RMSE (NRMSE), and bias-corrected NRMSE were computed to compare the satellite data with the upscaled drone data. The correlation coefficient was calculated using the SciPy library (version 1.7.0) for Python (version 3.9.16). The absolute bias, which describes the degree of spread between satellite and drone pixels, was calculated as the average of the absolute differences between satellite and drone pixels (Eq. 1). The RMSE quantifies the average error between the drone and satellite data (Eq. 2).

Absolute bias = (1/n) Σ |Si – Di| (1)

RMSE = √[(1/n) Σ (Si – Di)²] (2)

where Si, Di, and n indicate the satellite pixel, the upscaled drone pixel, and the total number of data points, respectively. The NRMSE was calculated by dividing the RMSE by the difference between the maximum and minimum values of the satellite data (y-axis); it represents the RMSE as a percentage of the variable's range (Eq. 3).

NRMSE = RMSE / (Smaximum – Sminimum) × 100 (3)

where Smaximum and Sminimum are the maximum and minimum values of the satellite data, respectively. To analyze the errors after removing the bias between the satellite and drone data, a bias correction was performed using linear regression to minimize the bias between the two datasets, and the resulting score was defined as the bias-corrected NRMSE.
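These statistics can be sketched as follows. The paper used SciPy; plain NumPy is used here for brevity, the absolute bias is assumed to be the mean absolute difference, the bias correction is assumed to regress the drone values onto the satellite values, and the sample arrays are hypothetical:

```python
import numpy as np

def compare(satellite, drone):
    """Statistics comparing satellite (S_i) and upscaled drone (D_i) pixels."""
    r = np.corrcoef(satellite, drone)[0, 1]          # correlation coefficient
    abs_bias = np.mean(np.abs(satellite - drone))    # Eq. 1 (assumed mean form)
    rmse = np.sqrt(np.mean((satellite - drone) ** 2))  # Eq. 2
    span = satellite.max() - satellite.min()
    nrmse = 100 * rmse / span                        # Eq. 3
    # Bias correction: linear regression of drone onto satellite values,
    # then NRMSE recomputed on the corrected drone data
    slope, intercept = np.polyfit(drone, satellite, deg=1)
    corrected = slope * drone + intercept
    nrmse_bc = 100 * np.sqrt(np.mean((satellite - corrected) ** 2)) / span
    return r, abs_bias, rmse, nrmse, nrmse_bc

# Hypothetical matchup: the satellite under-ranges the drone but is
# perfectly linear in it, so the bias-corrected NRMSE collapses to ~0
drone = np.array([0.10, 0.20, 0.30, 0.40])
satellite = np.array([0.13, 0.21, 0.29, 0.37])
r, abs_bias, rmse, nrmse, nrmse_bc = compare(satellite, drone)
```

The toy example mirrors the pattern reported for the rice paddy: a strong correlation with a slope below 1, so the raw NRMSE is dominated by bias that the linear correction removes.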
The SRSatellite was evaluated using the SRDrone in the rice paddy (Fig. 4). The correlation coefficient ranged from 0.477 to 0.859 across the six bands (blue, green, red, red edge at 705 nm (hereafter, rededge@705 nm), red edge at 740 nm (hereafter, rededge@740 nm), and NIR). In the rice paddy, the correlation coefficient was higher in the visible wavelength range than in the NIR wavelength range, consistent with the other statistical scores. The NRMSE ranged from 10.5% to 12.2% in the visible to rededge@705 nm range, whereas it was 35.8% at rededge@740 nm and 30.3% at NIR. The SRSatellite at the rededge@740 nm and NIR wavelengths was underestimated compared with the SRDrone. In particular, the SR values at NIR spanned a narrower range in the satellite data (0.271 to 0.538) than in the drone data (0.204 to 0.678) (Fig. 4f). In other words, the NIR reflectance measured by the satellite was less sensitive than that measured by the drone in the rice paddy.
The SRSatellite was evaluated using the data measured by the drone in the garlic and onion field (Fig. 5). The correlation coefficients between the SR values measured by the Sentinel-2 satellite and the drone ranged from 0.418 to 0.649, lower than those in the rice paddy (0.477 to 0.859). The furrows and ridges in the garlic and onion field caused a heterogeneous footprint. In particular, the bias-corrected NRMSE in the visible to rededge@705 nm range was higher in the garlic and onion field than in the rice paddy (Table 4). On the other hand, the bias-corrected NRMSE at the rededge@740 nm and NIR wavelengths in the garlic and onion field was comparable to or lower than that in the rice paddy because of the underestimation of the SRSatellite in the rice paddy.
The best statistical scores for the SR at the red-edge and NIR wavelengths measured by the Sentinel-2 satellite and drone were obtained in the soybean field (Fig. 6). Unlike in the rice paddy and the garlic and onion field, the correlation coefficient was above 0.806 and the absolute bias was below 0.043. Although the NRMSE and bias-corrected NRMSE at visible wavelengths in the soybean field were higher than those of the rice paddy (Figs. 6a–c), they were adequate at the rededge@740 nm and NIR wavelengths (Figs. 6e, f). In addition, the slope between the Sentinel-2 satellite and drone data was close to 1. These values indicate that the relationship between the Sentinel-2 satellite and drone data was strongest in the soybean field. All statistical results for the spectral reflectance comparison between Sentinel-2 and the drone are summarized in Table 4.
Table 4 Statistical results for the surface reflectance and vegetation indices measured by the drone and satellite in the rice paddy, garlic and onion, and soybean fields
| Crop | Index | R | Slope | Absolute bias | RMSE | NRMSE (%) | Bias-corrected NRMSE (%) |
|---|---|---|---|---|---|---|---|
| Rice paddy | Blue | 0.903 | 0.859 | 0.019 | 0.021 | 11.6 | 6.1 |
| | Green | 0.877 | 0.773 | 0.018 | 0.023 | 11.5 | 8.5 |
| | Red | 0.908 | 0.759 | 0.017 | 0.024 | 10.5 | 8.3 |
| | Rededge@705 nm | 0.913 | 0.792 | 0.022 | 0.029 | 12.2 | 9.7 |
| | Rededge@740 nm | 0.569 | 0.303 | 0.025 | 0.072 | 35.8 | 17.9 |
| | NIR | 0.477 | 0.273 | 0.065 | 0.081 | 30.3 | 16.9 |
| | NDVI | 0.932 | 0.801 | 0.062 | 0.078 | 10.3 | 7.2 |
| | GNDVI | 0.927 | 0.805 | 0.057 | 0.070 | 11.3 | 7.1 |
| | EVI | 0.811 | 0.603 | 0.085 | 0.105 | 15.2 | 11.1 |
| | NDRE | 0.939 | 0.780 | 0.061 | 0.077 | 13.3 | 8.5 |
| Garlic and onion | Blue | 0.522 | 0.452 | 0.033 | 0.041 | 21.1 | 11.3 |
| | Green | 0.566 | 0.436 | 0.033 | 0.041 | 21.7 | 13.7 |
| | Red | 0.647 | 0.432 | 0.041 | 0.051 | 22.5 | 14.1 |
| | Rededge@705 nm | 0.418 | 0.316 | 0.036 | 0.046 | 28.2 | 18.4 |
| | Rededge@740 nm | 0.550 | 0.599 | 0.034 | 0.042 | 20.8 | 18.4 |
| | NIR | 0.649 | 0.599 | 0.039 | 0.049 | 18.5 | 15.5 |
| | NDVI | 0.707 | 0.416 | 0.119 | 0.148 | 26.2 | 13.8 |
| | GNDVI | 0.649 | 0.415 | 0.095 | 0.116 | 24.4 | 13.2 |
| | EVI | 0.745 | 0.474 | 0.079 | 0.100 | 20.9 | 13.1 |
| | NDRE | 0.624 | 0.378 | 0.072 | 0.094 | 23.3 | 12.9 |
| Soybean | Blue | 0.850 | 1.053 | 0.018 | 0.023 | 14.6 | 10.2 |
| | Green | 0.881 | 0.993 | 0.017 | 0.022 | 11.6 | 9.0 |
| | Red | 0.862 | 0.822 | 0.017 | 0.024 | 10.5 | 9.6 |
| | Rededge@705 nm | 0.883 | 0.783 | 0.024 | 0.031 | 14.3 | 12.0 |
| | Rededge@740 nm | 0.806 | 1.092 | 0.043 | 0.052 | 18.7 | 16.9 |
| | NIR | 0.925 | 1.157 | 0.039 | 0.051 | 12.6 | 11.1 |
| | NDVI | 0.969 | 0.999 | 0.038 | 0.055 | 6.6 | 6.4 |
| | GNDVI | 0.975 | 1.058 | 0.049 | 0.059 | 8.5 | 5.6 |
| | EVI | 0.977 | 1.105 | 1.105 | 0.071 | 8.3 | 6.0 |
| | NDRE | 0.988 | 0.982 | 0.982 | 0.039 | 5.3 | 4.6 |
R: coefficient of correlation, NIR: Near infrared, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red edge index.
Five bands (blue, green, red, rededge@705 nm, and NIR) were used to calculate the four vegetation indices (NDVI, GNDVI, EVI, and NDRE). The differences between the VIs obtained using Sentinel-2 satellite and drone data were evaluated for the rice paddy, garlic and onion, and soybean fields.
In the rice paddy, the NRMSE of the four VIs from the combination of drone and Sentinel-2 satellite data ranged from 10.3% to 15.2%. NDVI and GNDVI showed similar statistical performance (Figs. 7a, b). After bias correction, the NRMSE of NDVI and GNDVI decreased to 7.2% and 7.1%, respectively. Similar characteristics were observed for NDRE (Fig. 7c). However, the statistical scores for EVI were poorer than those for NDVI, GNDVI, and NDRE: the bias-corrected NRMSE of EVI was 11.1%, and the slope between EVIDrone and EVISatellite was 0.603 (Fig. 7d).
All statistical parameters comparing the VIs measured by the Sentinel-2 satellite and the drone in the garlic and onion field were poorer than those in the rice paddy (Fig. 8). In particular, the RMSE and NRMSE values for the four VIs were higher in the garlic and onion field because of the bias between the two measurements. VISatellite was overestimated relative to VIDrone at low values but underestimated at high values. After bias correction of VISatellite, the NRMSE ranged from 12.9% to 13.8%. The distribution of each VI was clustered by measurement date in the rice paddy (Fig. 7), whereas the values were mixed in the garlic and onion field (Fig. 8).
The four VIs in the soybean field exhibited the best statistical scores among the three sites. In particular, the slope of NDVI was 0.999, and the correlation coefficient and RMSE were 0.969 and 0.055, respectively (Fig. 9a). Even without bias correction between NDVIDrone and NDVISatellite, the NRMSE was only 6.6%, similar to the bias-corrected NRMSE (6.4%), as a result of the low absolute bias (0.038). These high scores were also observed for GNDVI, EVI, and NDRE: the NRMSE for the three VIs ranged from 5.3% to 8.5% and, after bias correction, decreased to between 4.6% and 6.0%. All statistical results for the vegetation index comparison between Sentinel-2 and the drone are summarized in Table 4.
The distribution of absolute differences (Sentinel-2 satellite data minus bias-corrected drone data) for the four VIs was examined in the three areas. The bias correction of VIDrone was conducted regionally. The polygons in Fig. 10 represent the cropland, including the road and facilities at its edge. In the rice paddy, the absolute difference of the normalization-based VIs (NDVI, GNDVI, and NDRE) was low in the middle of each field, whereas pixels on farm roads showed high absolute differences (Figs. 10a–d). In the garlic and onion field, the spatial distribution of absolute differences was complex because of the influence of furrows and ridges (Figs. 10e–h). The absolute difference at the center of the soybean field was the lowest. Unlike in the rice paddy, the spatial pattern of EVI in the soybean field was similar to that of the other VIs (Figs. 10i–l).
The difference between VISatellite and VIDrone was analyzed with respect to surface heterogeneity. VIDrone was bias-corrected against VISatellite, and the standard deviation of the VIDrone pixels located within one pixel of the Sentinel-2 satellite data was then calculated. For NDVI, the difference between the maximum and minimum values in each bin increased with the standard deviation (Fig. 11a); the same trend was observed for GNDVI (Fig. 11b). NDVI and GNDVI reflect the structural characteristics of vegetation as the crop grows and develops. Thus, the error trends with respect to the standard deviation of each VIDrone appeared conspicuously. For EVI and NDRE, the bins with standard deviations between 0 and 0.04 exhibited the smallest difference between the maximum and minimum values in the box plots (Figs. 11c, d). Nonetheless, the difference was large when the standard deviation of VIDrone was above 0.12.
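Under the same assumptions as the upscaling step (NaN no-data, drone raster aligned to the Sentinel-2 grid), the per-pixel heterogeneity measure used here can be sketched as a blockwise standard deviation; the example array is illustrative:

```python
import numpy as np

def blockwise_std(fine, factor):
    """Standard deviation of the drone pixels falling inside each
    Sentinel-2 pixel (a `factor` x `factor` block), used as a simple
    measure of surface heterogeneity."""
    h, w = fine.shape
    blocks = fine[: h - h % factor, : w - w % factor]
    blocks = blocks.reshape(h // factor, factor, w // factor, factor)
    return np.nanstd(blocks, axis=(1, 3))

# Left block: homogeneous canopy; right block: mixed ridge-furrow values
fine = np.array([[0.8, 0.8, 0.9, 0.3],
                 [0.8, 0.8, 0.3, 0.9]])
het = blockwise_std(fine, 2)
# std is 0.0 for the homogeneous block and 0.3 for the mixed one
```

Binning the VISatellite minus VIDrone differences by this standard deviation reproduces the kind of analysis shown in Fig. 11.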
In previous studies, the VIs measured by the Sentinel-2 satellite have been evaluated using drone images. The NDVISatellite measured by Sentinel-2 exhibited different correlations with NDVIDrone because of surface heterogeneity (Assmann et al., 2020; Di Gennaro et al., 2019). Naethe et al. (2023) reported that the NDVI produced by Sentinel-2 had an R2 of 0.890 and an RMSE of 0.123 relative to NDVIDrone. Ryu et al. (2020b) compared the Sentinel-2 NDVI and NDVIDrone and obtained an R2 of 0.817 and an RMSE of 0.086 (n=135). Our results were similar to those of previous studies: the RMSE values of NDVI were 0.078 (n=790), 0.148 (n=480), and 0.055 (n=1,044) for the rice paddy, garlic and onion, and soybean fields, respectively.
The slopes between VISatellite and VIDrone were below 1.0 for the rice paddy and the garlic and onion field. VISatellite was overestimated at low values and underestimated at high values (Figs. 7 and 8). These error characteristics are consistent with previous studies: the Sentinel-2 NDVI was overestimated in open terrain and grasslands (low NDVI values) but underestimated in forests (high NDVI values) (Isaev et al., 2023). The NDVIDrone values were higher than the NDVISatellite values in the upland field, which also agrees with earlier work. The differences between VISatellite and VIDrone may be caused by the spatial resolution (Bollas et al., 2021). Riihimäki et al. (2019) reported that the spatial resolution of satellite data affects the data distribution. In the same area, the range between the minimum and maximum NDVI values for satellite images with low spatial resolution is narrower than that for satellite images with high spatial resolution (Park et al., 2019). The VIDrone values had a wider range under heterogeneous surface conditions (Figs. 7 and 8). The SR was therefore analyzed to identify the cause of these error characteristics of the VIs.
The SR at the rededge@740 nm and NIR wavelengths measured by the Sentinel-2 satellite has a theoretical range of 0 to 1 (0% to 100%). Here, the maximum SR at the NIR wavelength was 0.538 in the rice paddy, whereas it reached 0.593 in the soybean field. Consequently, the maximum VISatellite values in the soybean field were higher than those in the rice paddy. This difference can be attributed to the fact that crops like soybean completely cover the soil. Soybean plants grow to cover the furrows, so 10% or less of the solar irradiance is transmitted to the soil between the full-pod and beginning-seed stages. Moreover, the saturation of the SR at the NIR wavelength in the rice paddy may be influenced by water because rice is submerged during the cultivation period. Given that rice is an upright crop, soil and water are partially exposed depending on the planting interval, and these can influence the VIs (Khaliq et al., 2019; Sozzi et al., 2020). The SR at the NIR wavelength is highly sensitive to water and may show low values in its presence.
The agreement between VISatellite and VIDrone varied depending on the type of crop. The leaf structure affected the fraction of vegetation cover, which was related to the maximum SR value at the NIR wavelength in this study. We also checked the NIR values in other areas. For example, in large-scale cabbage fields, which are characterized by broad leaves and dense planting (coordinates: 128.741256°E, 37.615712°N), Sentinel-2 pixels with NIR values above 0.6 were identified. Even higher NIR values (above 0.7) have been reported in previous studies (Flood, 2017; Li et al., 2018). Therefore, the type of crop should be considered when interpreting the quality of Sentinel-2 satellite data over agricultural fields.
In addition to the characteristics of the crop, geometric errors can affect the difference between the satellite and drone data. Jiang et al. (2022) showed that position errors in drone or satellite images caused differences in the SR. Therefore, heterogeneous pixels, which encompass various surface conditions, resulted in large differences in the NDVI and GNDVI values. In this study, the pixels at the edges of the fields exhibited large differences because they included roads and structures (Fig. 10). Given that the geometric correction of the drone images was conducted using ground control points, the geometric error of the satellite images may be even larger.
In the rice paddy and soybean fields, the RMSE of EVI was higher than that of the other VIs (NDVI, GNDVI, and NDRE). The statistical scores of EVI were likely lower than those of the other VIs because of the way the VIs are calculated. NDVI, GNDVI, and NDRE are computed by normalizing the SR at two different wavelengths, whereas the calculation of EVI includes a constant. Thus, errors in the SR measured by the satellite and drone can affect the EVI values (Ryu et al., 2021). If the solar irradiance changes dramatically because of clouds during a drone flight, the SR includes noise because it is calculated from measurements of a calibrated reflectance panel taken before the flight. Although the downwelling light was considered in the operation of the multispectral camera, the reflectance values of the mosaicked drone images were affected by the light intensity. In addition, border noise occurs when mosaicking drone images. Therefore, errors in the SR measured by the drone may have a larger effect on unnormalized VIs such as EVI.
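This sensitivity can be illustrated with a minimal numerical sketch (the reflectance values below are hypothetical, not measurements from this study): a common multiplicative calibration error cancels in a normalized ratio such as NDVI, but not in EVI, whose denominator contains an additive constant.

```python
# Hypothetical canopy reflectances (illustrative only, not from this study)
blue, red, nir = 0.03, 0.05, 0.45

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    return 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)

# A common 10% gain error in all bands (e.g., a panel-calibration error
# under changing irradiance) cancels in the normalized ratio but not in
# EVI, because of the additive constant +1 in its denominator.
g = 1.10
d_ndvi = abs(ndvi(g * nir, g * red) - ndvi(nir, red))
d_evi = abs(evi(g * nir, g * red, g * blue) - evi(nir, red, blue))
print(f"NDVI shift: {d_ndvi:.6f}")  # effectively zero
print(f"EVI shift:  {d_evi:.6f}")   # a noticeable shift
```

The same gain applied to all bands leaves NDVI unchanged while shifting EVI by several hundredths, consistent with the larger RMSE observed for EVI.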
The performance of satellite data should be evaluated before applying a satellite-based crop monitoring system in precision agriculture. The goal of this study was to evaluate the error characteristics of the VISatellite, using drone images, depending on the crop type and surface heterogeneity. The conclusions of this study are summarized below:
The effect of crop structure was larger than that of the planting pattern (e.g., flat and ridge-furrow). Despite the presence of ridges and furrows in the soybean field, this area exhibited the best statistical scores for the SR and VIs obtained by combining the Sentinel-2 satellite and drone data. The SRSatellite values at the NIR wavelength were similar to the SRDrone values because the broad leaves of soybean completely covered the soil.
Compared to VIDrone, VISatellite was overestimated for low values but underestimated for high values in the rice paddy and the garlic and onion fields. SRSatellite was underestimated at the rededge@740 nm and NIR wavelengths but overestimated at visible wavelengths.
The range of SR and VI measured by the drone was wider than that measured by the Sentinel-2 satellite. This is due to the higher spatial resolution of the drone image compared to the satellite image.
The degree of surface heterogeneity affected the absolute difference between VISatellite and VIDrone. The difference was low in the center of the soybean and rice fields but high in the mixed pixels (including the road and structure). Thus, the outskirt pixels of the field should be excluded from the analysis.
Given the spatial resolution of the Sentinel-2 satellite, when the canopy does not cover the soil in crops grown with a ridge-furrow planting pattern, as in the garlic and onion fields, the retrieved information contains relatively large errors. Thus, it is recommended to utilize high-resolution drone data to diagnose the condition of such crops.
The potential for precision farming of the Sentinel-2 satellite data varies depending on the type of crop. Sentinel-2 satellite data can be used to diagnose the growth of crops such as soybean and paddy rice, but difficulties are expected in the case of crops such as garlic and onion at the pixel level. Higher spatial resolution data are required to determine the conditions of garlic and onion, and analyses at the agricultural parcel level should be conducted. Additional studies are needed in large garlic and onion fields to confirm this. Our results can help users understand the error characteristics of the SR and VI measured by the Sentinel-2 satellite.
This research was funded by the Rural Development Administration (grant no. PJ016768).
No potential conflict of interest relevant to this article was reported.
Jae-Hyun Ryu1, Hyun-Dong Moon2,3, Kyung-Do Lee4, Jaeil Cho5,6, Ho-yong Ahn1*
1Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
2PhD Student, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
3PhD Student, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea
4Senior Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
5Professor, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
6Professor, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea
Our study contributes to a better understanding of the characteristics of Sentinel-2 data for use in agricultural fields.
Keywords: UAV, Surface reflectance, Homogeneity, Crop
Precision agriculture incorporates remote-sensing techniques to optimize crop management (Bansod et al., 2017; Liaghat and Balasundram, 2010). A sensor-based monitoring system can non-destructively diagnose the conditions of a crop (Ryu et al., 2020a). The multispectral camera onboard a drone has a ground sampling distance of several centimeters at a height of 150 m above the ground. Thus, pesticide spraying, variable fertilization, and irrigation management can be conducted using drone data (Hafeez et al., 2023; Sishodia et al., 2020). However, regularly capturing wide areas using drones is labor-intensive and costly (Pla et al., 2019). Satellite data can be an alternative to overcome these disadvantages. Because satellites regularly capture images of crop-growing areas, information such as the planting time, harvesting time, and spatial variation in the crop's conditions can be produced and provided to the farmer (Ali et al., 2021).
Abundant satellite data are freely available for monitoring crop growth and development in the field (Labib and Harris, 2018). The Sentinel-2 satellite, which is a representative agricultural observation satellite, has various bands from visible to shortwave infrared, and the spatial image resolution ranges from 10 to 60 m. Moreover, it is possible to observe the same area every 5 days (Song et al., 2021). The growth parameters, such as the leaf area index, biomass, and yield, can be estimated using data from the Sentinel-2 satellite (Bansod et al., 2017; Dong et al., 2020; Mao et al., 2022), and crop types can be classified (Maponya et al., 2020; Sonobe et al., 2018). The improved spatiotemporal resolution and surface reflectance (SR) at various wavelengths facilitate precision agriculture using satellite data (Kong et al., 2023). However, there are drawbacks regarding the usability of the freely available satellite data for precision agriculture.
The temporal resolution of Sentinel-2 satellite data can be unsatisfactory because of cloud and cloud shadow effects (Bukowiecki et al., 2021; Caparros-Santiago et al., 2023). If information about a specific period is not obtained for a long time, using satellite data for precision agriculture may be challenging. To minimize these limitations, fused satellite data have been produced as a harmonized SR product of satellites such as Landsat-8/9 and Sentinel-2 (Dhillon et al., 2022; Kong et al., 2021). Furthermore, more than 130 microsatellites have been launched for the PlanetScope constellation (Frazier et al., 2021), which makes an effort to observe specific areas every day (Roy et al., 2021). However, this constellation cannot escape the influence of clouds and cloud shadows, which are inherent limitations of satellite data.
The crop conditions can be monitored using the multispectral camera onboard a drone during specific growth periods for which satellite images have not been acquired (Martinez et al., 2021). Drone images can provide spatially detailed information about crops compared to satellite images (Messina et al., 2020). In addition, satellite and drone data can be jointly used after converting the spatial resolution of the drone image to match that of the satellite image (Jiang et al., 2022; Zhang et al., 2023). However, the information obtained from satellite data can be biased because of the low spatial resolution associated with surface heterogeneity.
Crops are cultivated using various planting patterns (e.g., flat and ridge-furrow), depending on the crop type, soil type, slope of the field, and climate. The flat planting pattern has advantages in managing water and nutrients, and the crops are easy to grow and harvest. The ridge-furrow planting pattern is used in areas with heavy precipitation or poor drainage. The height difference between the ridges and furrows prevents the crops from being submerged and avoids damage due to excess moisture (Verma et al., 2020). These planting patterns affect surface heterogeneity. Satellite data can provide inaccurate information about crop conditions when crops are planted in rows (Mazzia et al., 2020).
However, the degree of soil coverage can vary depending on the leaf structure of the crops, even if the ridge-furrow planting pattern is used. Therefore, prior to using drones and satellites together, the satellite and drone data should be compared to determine whether the information about crop conditions is accurate (Zhang et al., 2023). Most previous studies have focused on evaluating the normalized difference vegetation index (NDVI) measured by satellites (Verma et al., 2020). However, the multispectral imager includes not only red and near-infrared (NIR) bands but also blue, green, and red-edge bands. Thus, evaluating the SR in various wavelength bands is necessary and vegetation indices (VIs) should be calculated using the SR.
The main purpose of this study was to identify the characteristics of errors in the SR and VIs measured by the Sentinel-2 satellite depending on the crop type, planting pattern, and surface heterogeneity in agricultural fields. Drone-based images, which have a higher spatial resolution than satellite images, were used as a reference to assess the Sentinel-2 satellite data. Moreover, the suitability and limitations of the Sentinel-2 satellite data for precision agriculture were investigated for rice paddy, garlic, onion, and soybean crops.
This study was conducted in three counties located in South Korea (Fig. 1a). One measurement site was a rice paddy located in Jeollanam-do Agricultural Research & Extension Services (Naju County, latitude of 35.0275°N, longitude of 126.8209°E). Rice was cultivated in 2020 and 2021. The rice was transplanted in early June, and the heading stage was observed in mid to late August. The rice was harvested from late September to early October. The total target study area (blue polygon in Fig. 1b), which comprised five paddies, was approximately 17,000 m2. Given that the size of each paddy is approximately 30 × 95 m, the pixels located in the center of each rice paddy field have a fairly homogeneous footprint (Fig. 2a).
The second measurement site was an upland field located in the National Institute of Crop Science (Muan County, latitude of 34.9671°N, longitude of 126.4528°E). Garlic and onion were cultivated in this area during the 2019–2020 and 2020–2021 growing seasons. The growing period for garlic and onion was from October to late March/early June. The upland field featured numerous furrows and ridges, which resulted in a fairly heterogeneous footprint (Fig. 2b). In addition, bare soil and a concrete road were present at the edge of the study area. The total area was approximately 8,400 m2 (red polygon in Fig. 1c).
The third measurement site was a field located in Gimje County (latitude of 35.7517°N, longitude of 126.8136°E) for cultivating soybeans. This area consisted of multiple furrows and ridges to mitigate damage from excess moisture because of poor drainage (Fig. 2c). Soybean was cultivated in this field from June 3, 2022, to November 2, 2022. Unlike the garlic and onion fields, the leaves of the soybean plants covered the entire field in August and September, which made the soil barely visible. The target area (orange polygon in Fig. 1d) encompassed approximately 27,600 m2.
Satellite images from Sentinel-2, one of the most prominent satellites used for crop data monitoring, were collected using the Sentinel Hub interface. The coordinate system and projection were set to WGS 1984 UTM Zone 52N and Transverse Mercator, respectively, to match the drone’s raster data. We downloaded the Sentinel-2 Level-2A orthoimage bottom-of-atmosphere-corrected reflectance data, which eliminated atmospheric effects. Subsequently, contaminated pixels caused by clouds and cloud shadows were filtered using scene classification and cloud information. Pixels with a cloud probability value above 0 were masked (Fig. 3).
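The masking step above can be sketched as follows (a minimal sketch, assuming the Level-2A scene classification (SCL) and cloud probability layers are loaded as NumPy arrays; the function and variable names are illustrative):

```python
import numpy as np

def mask_clouds(bands, scl, cloud_prob):
    """Mask cloud- and shadow-contaminated pixels, following the rule in
    the text (any pixel with a cloud probability above 0 is masked).
    bands: (n_bands, H, W) reflectance array; scl: Level-2A scene
    classification layer; cloud_prob: cloud probability layer (%).
    SCL codes 3 (cloud shadow) and 8-10 (cloud/cirrus) follow the
    Sentinel-2 Level-2A product definition."""
    contaminated = np.isin(scl, [3, 8, 9, 10]) | (cloud_prob > 0)
    out = bands.astype(float).copy()
    out[:, contaminated] = np.nan  # mask every band at contaminated pixels
    return out
```

Masked pixels are set to NaN so they are automatically excluded from later averaging and statistics.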
The SR of the crops (rice, garlic, onion, and soybean) was measured using a multispectral camera mounted on a drone (Fig. 3). The multispectral camera (RedEdge-MX Dual; MicaSense, Inc., Seattle, WA, USA) had 10 bands (Table 1). Drone measurements were conducted around solar noon on clear days. The flight altitude was set to 30 m or 50 m, resulting in a ground sample distance of 2.08 cm or 3.47 cm, respectively. The measured data were processed using the Pix4Dmapper software (version 4.3.31), and the information from seven ground control points, obtained using a GRX2 GNSS receiver (SOKKIA Corporation, Olathe, KS, USA), was used for geometric correction. A calibration reflectance panel provided by the camera manufacturer was used in Pix4Dmapper to convert the digital numbers of the drone raster data to reflectance. Moreover, to improve the accuracy of the measured values, a radiometric correction was conducted for the rice paddy and the garlic and onion field using four homogeneous calibration targets with reflectance values of 3%, 21%, 32%, and 51%. Owing to their high spatial resolution, the processed reflectance drone images showed the crop growth status, planting pattern, and artificial structures in detail.
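The target-based radiometric correction can be sketched as an empirical-line fit over the four reference targets (a sketch under assumed inputs; the paper does not specify the exact fitting procedure, and all names below are illustrative):

```python
import numpy as np

def empirical_line(band, values_over_targets,
                   target_reflectance=(0.03, 0.21, 0.32, 0.51)):
    """Empirical-line correction sketch using the four reference targets
    with 3%, 21%, 32%, and 51% reflectance mentioned in the text.
    values_over_targets holds the image values extracted over each
    target; a linear fit maps image values to surface reflectance,
    which is then applied to the whole band."""
    gain, offset = np.polyfit(values_over_targets, target_reflectance, 1)
    return gain * np.asarray(band, float) + offset
```

In practice the image values over each target would be taken as the mean of the pixels covering that target.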
Table 1. Information from the multispectral sensors on board the drone and Sentinel-2 satellite.

Band | Sentinel-2/MSI No. | Sentinel-2 central wavelength/bandwidth (nm) | Sentinel-2 spatial resolution (m) | Drone/RedEdge-MX Dual No. | Drone central wavelength/bandwidth (nm)
---|---|---|---|---|---
Blue/aerosol | 1 | 443/36 | 60 | 1 | 444/28
Blue | 2 | 490/96 | 10 | 2 | 475/32
Green | - | - | - | 3 | 531/14
Green | 3 | 560/45 | 10 | 4 | 560/27
Red | - | - | - | 5 | 650/16
Red | 4 | 665/39 | 10 | 6 | 668/14
Red edge | 5 | 705/20 | 20 | 7 | 705/10
Red edge | - | - | - | 8 | 717/12
Red edge | 6 | 740/18 | 20 | 9 | 740/18
Red edge | 7 | 783/28 | 20 | - | -
NIR | 8 | 842/141 | 10 | 10 | 842/57
NIR | 8A | 865/33 | 20 | - | -
WV | 9 | 945/27 | 60 | - | -
SWIR/Cirrus | 10 | 1375/76 | 60 | - | -
SWIR | 11 | 1610/142 | 20 | - | -
SWIR | 12 | 2190/240 | 20 | - | -

MSI: multispectral imager, NIR: near-infrared, WV: water vapor, SWIR: shortwave-infrared.
SR and VI were compared to identify the characteristics of the data measured by the satellite and drone, using the following steps. Six bands with central wavelengths of 490, 560, 665, 705, 740, and 842 nm (based on Sentinel-2) were selected to compare the imagery obtained from the two devices. Despite the different multispectral sensors, these bands have similar central wavelengths (Table 1). Four VIs (NDVI, green normalized difference vegetation index (GNDVI), enhanced vegetation index (EVI), and normalized difference red-edge index (NDRE)) were calculated using the SR measured by the Sentinel-2 satellite and drone (Table 2). These VIs utilized five bands, with central wavelengths of 490, 560, 665, 705, and 842 nm based on the Sentinel-2 satellite. The four vegetation indices theoretically range from –1 to 1, but over soil or vegetation, the values typically fall between 0 and 1: values closer to 0 indicate sparse vegetation, while values near 1 signify dense and healthy vegetation. Next, the spatial resolution of the SR and VI measured by the drone was converted to 10 m or 20 m to match that of the Sentinel-2 data. In the case of Sentinel-2, the SR at the blue, green, red, and NIR bands has a spatial resolution of 10 m, whereas the red-edge bands have a spatial resolution of 20 m. When upscaling the drone image to the scale of the Sentinel-2 data, the resampling method calculated the average (Assmann et al., 2020), excluding no-data values. The upscaled drone and Sentinel-2 images were clipped using a region-of-interest shapefile (Figs. 1b–d). Finally, the VIs measured by the satellite and drone were named VISatellite and VIDrone, respectively.
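The average-based upscaling step can be sketched as follows (a minimal NumPy sketch assuming the drone raster has already been aligned to the Sentinel-2 grid; function and variable names are illustrative):

```python
import numpy as np

def upscale_to_satellite(drone, factor):
    """Average-resample a drone raster to the Sentinel-2 pixel size.
    factor is the number of drone pixels per satellite pixel along each
    axis. NaN (no-data) pixels are excluded from the average, as in the
    resampling described in the text."""
    h, w = drone.shape
    h2, w2 = h // factor * factor, w // factor * factor  # trim ragged edges
    blocks = (drone[:h2, :w2]
              .reshape(h2 // factor, factor, w2 // factor, factor)
              .swapaxes(1, 2)
              .reshape(h2 // factor, w2 // factor, factor * factor))
    return np.nanmean(blocks, axis=-1)  # mean of each factor x factor block
```

For example, a drone mosaic resampled to a 0.1 m grid would use factor=100 to match the 10 m Sentinel-2 bands and factor=200 for the 20 m red-edge bands.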
Table 2. Vegetation indices calculated from the surface reflectance measured by the Sentinel-2 satellite and drone.
VI | Equation | Sentinel-2 band | Drone band |
---|---|---|---|
NDVI | (RNIR – RRed) / (RNIR + RRed) | B4, B8 | B6, B10 |
GNDVI | (RNIR – RGreen) / (RNIR + RGreen) | B3, B8 | B4, B10 |
EVI | 2.5 * (RNIR – RRed) / (RNIR + 6 * RRed – 7.5 * RBlue + 1) | B2, B4, B8 | B2, B6, B10 |
NDRE | (RNIR – RRededge) / (RNIR + RRededge) | B5, B8 | B7, B10 |
VI: vegetation index, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index, R: reflectance, B: band.
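The four indices in Table 2 can be computed directly from the matched bands (a minimal sketch; inputs may be scalars or NumPy arrays of reflectance in the range 0–1):

```python
def compute_vis(blue, green, red, rededge, nir):
    """Vegetation indices of Table 2; inputs are surface reflectances
    (0-1) at the matched Sentinel-2/drone bands."""
    return {
        "NDVI": (nir - red) / (nir + red),
        "GNDVI": (nir - green) / (nir + green),
        "EVI": 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1),
        "NDRE": (nir - rededge) / (nir + rededge),
    }
```

For Sentinel-2 this corresponds to bands B2, B3, B4, B5, and B8; for the drone, to bands 2, 4, 6, 7, and 10.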
Matchup data for the six bands and four VIs were established between the imagery obtained from the satellite and that of the drone. Although more drone measurements were conducted, only six images each from the rice paddy and the garlic and onion field were matched because of weather conditions. Furthermore, four images from the soybean field were matched. Seven image pairs were measured on the same date. In addition, three pairs with a one-day difference and six pairs with a two-day difference between the measurement dates of the Sentinel-2 satellite and the drone were used (Table 3).
Table 3. Measurement information of the satellite and drone.
No. | Site | Measurement date | Time difference | Air temperature (ºC) | Relative humidity (%) | |||
---|---|---|---|---|---|---|---|---|
Satellite | Drone | Satellite | Drone | Satellite | Drone | |||
1 | Naju | Aug. 15, 2020 | Aug. 14, 2020 | 1 day | 31.5 | 32.0 | 66.1 | 73.0 |
2 | Naju | Sep. 4, 2020 | Sep. 4, 2020 | 0 day | 27.9 | 28.9 | 60.2 | 56.3 |
3 | Naju | Sep. 24, 2020 | Sep. 24, 2020 | 0 day | 24.0 | 25.5 | 58.2 | 54.1 |
4 | Naju | Jul. 21, 2021 | Jul. 21, 2021 | 0 day | 31.8 | 31.6 | 57.4 | 59.0 |
5 | Naju | Aug. 10, 2021 | Aug. 10, 2021 | 0 day | 30.0 | 29.7 | 66.2 | 65.4 |
6 | Naju | Aug. 20, 2021 | Aug. 19, 2021 | 1 day | 28.4 | 28.8 | 61.6 | 57.0 |
7 | Muan | Mar. 18, 2020 | Mar. 20, 2020 | 2 days | 15.6 | 13.9 | 34.4 | 33.4 |
8 | Muan | Apr. 27, 2020 | Apr. 29, 2020 | 2 days | 17.0 | 21.0 | 32.0 | 34.4 |
9 | Muan | May 22, 2020 | May 20, 2020 | 2 days | 23.6 | 19.0 | 47.7 | 60.6 |
10 | Muan | Mar. 18, 2021 | Mar. 18, 2021 | 0 day | 17.4 | 15.6 | 51.1 | 58.0 |
11 | Muan | May 12, 2021 | May 14, 2021 | 2 days | 25.3 | 26.0 | 67.5 | 65.6 |
12 | Muan | May 22, 2021 | May 24, 2021 | 2 days | 22.1 | 21.3 | 70.6 | 71.5 |
13 | Gimje | Jul. 1, 2022 | Jun. 30, 2022 | 1 day | 30.9 | 29.4 | 70.7 | 81.5 |
14 | Gimje | Sep. 9, 2022 | Sep. 7, 2022 | 2 days | 25.6 | 24.3 | 79.9 | 90.1 |
15 | Gimje | Oct. 14, 2022 | Oct. 14, 2022 | 0 day | 19.8 | 19.7 | 79.6 | 80.8 |
16 | Gimje | Oct. 19, 2022 | Oct. 19, 2022 | 0 day | 14.6 | 15.2 | 51.3 | 51.4 |
Air temperature and relative humidity values are those at the measurement times of the Sentinel-2 satellite and drone.
The correlation coefficient, absolute bias, RMSE, normalized RMSE (NRMSE), and bias-corrected NRMSE were computed to compare the satellite and upscaled drone data. The correlation coefficient was calculated using the SciPy library (version 1.7.0) in Python (version 3.9.16). The absolute bias, which describes the degree of spread between satellite and drone pixels, was calculated as the sum of the absolute values of the differences between satellite and drone pixels (Eq. 1). The RMSE quantitatively describes the average error between the drone and satellite data (Eq. 2).
where Si, Di, and n denote the satellite pixel, the upscaled drone pixel, and the total number of data pairs, respectively. The NRMSE was calculated using the difference between the minimum and maximum values of the y-axis (Eq. 3). The NRMSE represents the RMSE as a percentage of the range of the variable (y-axis).
where Smaximum and Sminimum are the maximum and minimum values of the satellite data. To analyze errors after removing the bias between the satellite and drone data, a bias correction was performed using linear regression to minimize the bias between the two datasets, and the resulting score was defined as the bias-corrected NRMSE.
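The statistics described above can be sketched as follows (a minimal implementation; the averaging convention for the absolute bias and the regression direction for the bias correction are assumptions not fully specified in the text):

```python
import numpy as np
from scipy.stats import pearsonr

def compare(sat, drone):
    """Comparison statistics of Eqs. (1)-(3). sat and drone are matched
    1-D arrays of satellite and upscaled drone pixel values. The absolute
    bias is taken here as the mean of |S_i - D_i| (an assumption; the
    text describes the sum of absolute differences)."""
    s, d = np.asarray(sat, float), np.asarray(drone, float)
    r, _ = pearsonr(s, d)                        # correlation coefficient
    abs_bias = np.abs(s - d).mean()              # Eq. (1)
    rmse = np.sqrt(((s - d) ** 2).mean())        # Eq. (2)
    nrmse = 100 * rmse / (s.max() - s.min())     # Eq. (3)
    # Bias correction: linearly regress the drone values onto the
    # satellite values, then rescore (the direction is an assumption).
    slope, intercept = np.polyfit(d, s, 1)
    rmse_bc = np.sqrt(((s - (slope * d + intercept)) ** 2).mean())
    return {"r": r, "abs_bias": abs_bias, "RMSE": rmse, "NRMSE_%": nrmse,
            "bias_corrected_NRMSE_%": 100 * rmse_bc / (s.max() - s.min())}
```

A constant offset between the two sensors thus inflates the RMSE and NRMSE but vanishes from the bias-corrected NRMSE.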
The SRSatellite was evaluated using the SRDrone in the rice paddy (Fig. 4). The correlation coefficient ranged from 0.477 to 0.859 across the six bands (blue, green, red, red-edge at 705 nm (hereafter, rededge@705 nm), red-edge at 740 nm (hereafter, rededge@740 nm), and NIR wavelengths). In the rice paddy, the correlation coefficient was higher in the visible wavelength range than in the NIR wavelength range, which was consistent with the other statistical scores. The NRMSE ranged from 10.5% to 12.2% in the visible to rededge@705 nm range, whereas it was 35.8% at rededge@740 nm and 30.3% at NIR. The SRSatellite at the rededge@740 nm and NIR wavelengths was underestimated compared to the SRDrone. In particular, the SR values in the NIR range were narrower in the satellite data (0.271 to 0.538) than in the drone data (0.204 to 0.678) (Fig. 4f). In other words, the NIR reflectance measured by the satellite was less sensitive than that measured by the drone in the rice paddy.
The SRSatellite was evaluated using data measured by the drone in the garlic and onion field (Fig. 5). The correlation coefficients between the SR values measured by the Sentinel-2 satellite and the drone ranged from 0.418 to 0.649, lower than those in the rice paddy (0.477 to 0.859). The presence of furrows and ridges in the garlic and onion field caused a heterogeneous footprint. In particular, the bias-corrected NRMSE in the visible to rededge@705 nm range was higher in the garlic and onion field than in the rice paddy (Table 4). On the other hand, the bias-corrected NRMSE at the rededge@740 nm and NIR wavelengths in the garlic and onion field was lower than that in the rice paddy because of the underestimation of the SRSatellite.
The highest statistical scores for the SR at the red-edge and NIR wavelengths measured by the Sentinel-2 satellite and drone corresponded to the soybean field (Fig. 6). Unlike in the rice paddy and the garlic and onion field, the correlation coefficient was above 0.806 and the absolute bias was below 0.043. Although the NRMSE and bias-corrected NRMSE at visible wavelengths in the soybean field were higher than those of the rice paddy (Figs. 6a–c), they were adequate at the rededge@740 nm and NIR wavelengths (Figs. 6e, f). In addition, the slope between the Sentinel-2 satellite and drone data was close to 1. These values indicate that the relationship between the Sentinel-2 satellite and drone data was strongest in the soybean field. All statistical results for the spectral reflectance between Sentinel-2 and the drone are summarized in Table 4.
Table 4. Statistical results for the surface reflectance and vegetation indices measured by the drone and satellite in the rice paddy, garlic and onion, and soybean fields.
Crop | Index | R | Slope | Absolute bias | RMSE | NRMSE (%) | Bias-corrected NRMSE (%) |
---|---|---|---|---|---|---|---|
Rice paddy | Blue | 0.903 | 0.859 | 0.019 | 0.021 | 11.6 | 6.1 |
Green | 0.877 | 0.773 | 0.018 | 0.023 | 11.5 | 8.5 | |
Red | 0.908 | 0.759 | 0.017 | 0.024 | 10.5 | 8.3 | |
Rededge@705 nm | 0.913 | 0.792 | 0.022 | 0.029 | 12.2 | 9.7 | |
Rededge@740 nm | 0.569 | 0.303 | 0.025 | 0.072 | 35.8 | 17.9 | |
NIR | 0.477 | 0.273 | 0.065 | 0.081 | 30.3 | 16.9 | |
NDVI | 0.932 | 0.801 | 0.062 | 0.078 | 10.3 | 7.2 | |
GNDVI | 0.927 | 0.805 | 0.057 | 0.070 | 11.3 | 7.1 | |
EVI | 0.811 | 0.603 | 0.085 | 0.105 | 15.2 | 11.1 | |
NDRE | 0.939 | 0.780 | 0.061 | 0.077 | 13.3 | 8.5 | |
Garlic and onion | Blue | 0.522 | 0.452 | 0.033 | 0.041 | 21.1 | 11.3 |
Green | 0.566 | 0.436 | 0.033 | 0.041 | 21.7 | 13.7 | |
Red | 0.647 | 0.432 | 0.041 | 0.051 | 22.5 | 14.1 | |
Rededge@705 nm | 0.418 | 0.316 | 0.036 | 0.046 | 28.2 | 18.4 | |
Rededge@740 nm | 0.550 | 0.599 | 0.034 | 0.042 | 20.8 | 18.4 | |
NIR | 0.649 | 0.599 | 0.039 | 0.049 | 18.5 | 15.5 | |
NDVI | 0.707 | 0.416 | 0.119 | 0.148 | 26.2 | 13.8 | |
GNDVI | 0.649 | 0.415 | 0.095 | 0.116 | 24.4 | 13.2 | |
EVI | 0.745 | 0.474 | 0.079 | 0.100 | 20.9 | 13.1 | |
NDRE | 0.624 | 0.378 | 0.072 | 0.094 | 23.3 | 12.9 | |
Soybean | Blue | 0.850 | 1.053 | 0.018 | 0.023 | 14.6 | 10.2 |
Green | 0.881 | 0.993 | 0.017 | 0.022 | 11.6 | 9.0 | |
Red | 0.862 | 0.822 | 0.017 | 0.024 | 10.5 | 9.6 | |
Rededge@705 nm | 0.883 | 0.783 | 0.024 | 0.031 | 14.3 | 12.0 | |
Rededge@740 nm | 0.806 | 1.092 | 0.043 | 0.052 | 18.7 | 16.9 | |
NIR | 0.925 | 1.157 | 0.039 | 0.051 | 12.6 | 11.1 | |
NDVI | 0.969 | 0.999 | 0.038 | 0.055 | 6.6 | 6.4 | |
GNDVI | 0.975 | 1.058 | 0.049 | 0.059 | 8.5 | 5.6 | |
EVI | 0.977 | 1.105 | 1.105 | 0.071 | 8.3 | 6.0 | |
NDRE | 0.988 | 0.982 | 0.982 | 0.039 | 5.3 | 4.6 |
R: correlation coefficient, NIR: near-infrared, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index.
The five bands (blue, green, red, rededge@705 nm, and NIR) were used to calculate the four vegetation indices (NDVI, GNDVI, EVI, and NDRE). The differences between the VIs obtained using Sentinel-2 satellite and drone data from the rice paddy, garlic and onion, and soybean fields were evaluated.
In the rice paddy, the NRMSE of the four VIs from the combination of drone and Sentinel-2 satellite data ranged from 10.3% to 15.2%. NDVI and GNDVI showed similar statistical performance (Figs. 7a, b). After bias correction, the NRMSE of NDVI and GNDVI decreased to 7.2% and 7.1%, respectively. Similar characteristics were observed in the case of NDRE (Fig. 7c). However, the values for EVI were lower than those for NDVI, GNDVI, and NDRE. The bias-corrected NRMSE of EVI was 11.1%, and the slope between EVIDrone and EVISatellite was 0.603 (Fig. 7d).
All statistical parameters comparing the VIs measured by the Sentinel-2 satellite and the drone in the garlic and onion field were lower than those in the rice paddy (Fig. 8). In particular, the RMSE and NRMSE values for the four VIs were higher in the garlic and onion field because of the bias between the two measurements. VISatellite was overestimated compared to VIDrone for low values but underestimated for high values. After bias correction of VISatellite, the NRMSE ranged from 12.9% to 13.8%. The distribution of each VI was clustered depending on the measurement date for the rice paddy (Fig. 7), but the values were mixed for the garlic and onion field (Fig. 8).
The four VIs of the soybean field exhibited the best statistical scores compared to those of the rice paddy and the garlic and onion field. In particular, the slope of NDVI was 0.999, and the correlation coefficient and RMSE were 0.969 and 0.055, respectively (Fig. 9a). Even without bias correction between NDVIDrone and NDVISatellite, the NRMSE was only 6.6%, similar to the bias-corrected NRMSE (6.4%), owing to the low absolute bias (0.038). These high scores were also observed for GNDVI, EVI, and NDRE, whose NRMSE ranged from 5.3% to 8.5%. After bias correction, the NRMSE decreased to 4.6%–6.0%. All statistical results for the vegetation indices between Sentinel-2 and the drone are summarized in Table 4.
The distribution of absolute differences (Sentinel-2 satellite data minus bias-corrected drone data) for the four VIs was examined in the three areas. The bias correction of VIDrone was conducted regionally. The polygon in Fig. 10 represents the cropland, including the road and facilities at its edge. In the rice paddy, the absolute difference of VIs, calculated using normalization methods such as NDVI, GNDVI, and NDRE, was low in the middle of each field. However, pixels included in farm roads showed high absolute difference values (Figs. 10a–d). In the garlic and onion field, the spatial distribution of absolute differences was complex because of the influence of furrows and ridges (Figs. 10e–h). The absolute difference at the center of the soybean field was the lowest. Unlike the rice paddy, the spatial pattern of EVI in the soybean field was similar to that of other VIs (Figs. 10i–l).
The difference between VISatellite and VIDrone was analyzed based on surface heterogeneity. VIDrone was bias-corrected against VISatellite, and then the standard deviation of the VIDrone pixels located within each Sentinel-2 pixel was calculated. For NDVI, the difference between the maximum and minimum values in each range increased when the standard deviation was large (Fig. 11a). This trend was also observed for GNDVI (Fig. 11b). NDVI and GNDVI reflect the structural characteristics of vegetation as the crop grows and develops. Thus, error trends according to the standard deviation of each VIDrone appeared conspicuously. In the case of EVI and NDRE, the section with a standard deviation of 0 to 0.04 exhibited the smallest difference between the maximum and minimum values in the box plot (Figs. 11c, d). Nonetheless, the difference was large when the standard deviation of VIDrone was above 0.12.
In previous studies, the VIs measured by the Sentinel-2 satellite have also been evaluated using drone images. NDVISatellite exhibited different correlations with NDVIDrone depending on surface heterogeneity (Assmann et al., 2020; Di Gennaro et al., 2019). Naethe et al. (2023) reported that the Sentinel-2 NDVI had an R2 of 0.890 and an RMSE of 0.123 relative to NDVIDrone. Ryu et al. (2020b) compared the Sentinel-2 NDVI with NDVIDrone and obtained an R2 of 0.817 and an RMSE of 0.086 (n=135). Our results were similar: the RMSE values of NDVI were 0.078 (n=790), 0.148 (n=480), and 0.055 (n=1,044) for the rice paddy, garlic and onion, and soybean fields, respectively.
The slopes between VISatellite and VIDrone were below 1.0 for the rice paddy and the garlic and onion fields: VISatellite was overestimated at low values and underestimated at high values (Figs. 7 and 8). These error characteristics are consistent with previous studies. The Sentinel-2 NDVI was overestimated in open terrain and grassland (low NDVI values) but underestimated in forest (high NDVI values) (Isaev et al., 2023). The NDVIDrone values were higher than the NDVISatellite values in the upland field, in agreement with another study. The differences between VISatellite and VIDrone may be caused by the difference in spatial resolution (Bollas et al., 2021). Riihimäki et al. (2019) reported that the spatial resolution of satellite data affects the data distribution. Over the same area, the NDVI range between the minimum and maximum values of a low-spatial-resolution satellite image is narrower than that of a high-spatial-resolution image (Park et al., 2019). Accordingly, the VIDrone values spanned a wider range under heterogeneous surface conditions (Figs. 7 and 8). The SR was then analyzed to identify the cause of these error characteristics of the VIs.
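The range-narrowing effect of coarser spatial resolution can be illustrated with a simple block-averaging sketch on synthetic NDVI values (the field size, variability, and aggregation factor are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 100 x 100 high-resolution NDVI field with strong local variation.
ndvi_fine = np.clip(rng.normal(0.5, 0.2, (100, 100)), -1, 1)

# Aggregate to a 10 x 10 coarse grid: one coarse pixel is the mean of a
# 10 x 10 block, mimicking a lower-resolution sensor footprint.
ndvi_coarse = ndvi_fine.reshape(10, 10, 10, 10).mean(axis=(1, 3))

fine_range = ndvi_fine.max() - ndvi_fine.min()
coarse_range = ndvi_coarse.max() - ndvi_coarse.min()
assert coarse_range < fine_range  # averaging pulls extremes toward the mean
```

Because each coarse pixel averages many fine pixels, extreme fine-scale values are pulled toward the block mean, so the coarse image's minimum-to-maximum NDVI range is necessarily narrower over heterogeneous surfaces.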
The SR at the rededge@740 nm and NIR wavelengths measured by the Sentinel-2 satellite has a theoretical range of 0 to 1 (0% to 100%). Here, the maximum SR at the NIR wavelength was 0.538 in the rice paddy, whereas it reached 0.593 in the soybean field. Consequently, the maximum VISatellite values in the soybean field were higher than those in the rice paddy. This difference can be attributed to the fact that crops such as soybean completely cover the soil. Soybean grows until the plants cover the furrows, so 10% or less of the solar irradiance is transmitted during the growth stages from full pod to beginning seed. Moreover, the saturation of the SR at the NIR wavelength in the rice paddy may be influenced by water because rice is submerged during the cultivation period. Given that rice is an upright crop, soil and water are partially exposed depending on the planting interval, and these can influence the VIs (Khaliq et al., 2019; Sozzi et al., 2020). The SR at the NIR wavelength is highly sensitive to water and may show low values in its presence.
The agreement between VISatellite and VIDrone varied depending on the type of crop. The leaf structure affected the fraction of vegetation cover, which in turn was related to the maximum SR value at the NIR wavelength in this study. We also checked the NIR values in other areas. For example, in a large-scale cabbage field characterized by broad leaves and dense planting (coordinates: 128.741256°E, 37.615712°N), Sentinel-2 pixels with NIR SR values above 0.6 were identified. Even higher NIR values (above 0.7) have been reported in previous studies (Flood, 2017; Li et al., 2018). Therefore, the type of crop should be considered when interpreting the quality of Sentinel-2 satellite data over agricultural fields.
In addition to the characteristics of the crop, geometric errors can affect the difference between satellite and drone data. Jiang et al. (2022) showed that position errors in drone or satellite images cause differences in SR. Accordingly, heterogeneous pixels, which encompass various surface conditions, produced large differences in the NDVI and GNDVI values. In this study, the pixels at the field edges exhibited large differences because they included roads and structures (Fig. 10). Given that the drone images were geometrically corrected using ground control points, the geometric error of the satellite images may be even larger.
In the rice paddy and soybean fields, the RMSE of EVI was higher than that of the other VIs (NDVI, GNDVI, and NDRE). The statistical parameters of EVI were likely lower because of how the VIs are calculated: NDVI, GNDVI, and NDRE are computed by normalizing the SR at two different wavelengths, whereas the EVI formula includes an additive constant. Thus, errors in the SR measured by the satellite and drone propagate into the EVI values (Ryu et al., 2021). If the solar radiance changes dramatically because of clouds during a drone flight, the SR includes noise because it is calibrated against measurements of a reflectance panel taken before the flight. Although the downwelling light sensor was used during operation of the multispectral camera, the reflectance values of the mosaicked drone images were still affected by the light intensity. In addition, border noise occurs when mosaicking the drone images. Therefore, errors in the drone-measured SR may have a larger effect on non-normalized VIs such as EVI.
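The sensitivity difference follows from the formulas themselves: a purely multiplicative SR error (e.g., a uniform illumination or calibration scaling across bands) cancels in a ratio-based index such as NDVI, but not in EVI, whose denominator contains the additive constant. A minimal numeric check (the reflectance values and the 10% scaling factor are illustrative):

```python
import numpy as np

def ndvi(nir, red):
    # Ratio form: a common scale factor on both bands cancels out.
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    # The +1 constant breaks scale invariance, so band scaling shifts EVI.
    return 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)

nir, red, blue = 0.45, 0.08, 0.05
k = 1.10  # hypothetical 10% multiplicative error on all bands

assert np.isclose(ndvi(k * nir, k * red), ndvi(nir, red))
assert not np.isclose(evi(k * nir, k * red, k * blue), evi(nir, red, blue))
```

This is one plausible reading of why the normalized indices tolerated residual SR calibration noise better than EVI in this comparison; additive SR errors, by contrast, perturb all four indices.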
The performance of satellite data should be evaluated before applying a satellite-based crop monitoring system in precision agriculture. The goal of this study was to evaluate the error characteristics of VISatellite using drone images, depending on the crop type and surface heterogeneity. The conclusions of this study are summarized below:
The effect of crop structure was larger than that of the planting pattern (e.g., flat and ridge-furrow). Despite the presence of ridges and furrows in the soybean field, this area exhibited the best statistical agreement between the SR and VI values obtained from the Sentinel-2 satellite and drone. The SRSatellite values at the NIR wavelength were similar to the SRDrone values because the broad leaves of soybean completely covered the soil.
VISatellite was overestimated at low values but underestimated at high values compared to VIDrone in the rice paddy and the garlic and onion fields. SRSatellite was underestimated at the rededge@740 nm and NIR wavelengths but overestimated at visible wavelengths.
The range of SR and VI measured by the drone was wider than that measured by the Sentinel-2 satellite. This is due to the higher spatial resolution of the drone image compared to the satellite image.
The degree of surface heterogeneity affected the absolute difference between VISatellite and VIDrone. The difference was low at the center of the soybean and rice fields but high in mixed pixels (those including roads and structures). Thus, pixels at the field edges should be excluded from the analysis.
Given the spatial resolution of the Sentinel-2 satellite, when the canopy does not cover the soil in crops planted with a ridge-furrow pattern, the retrieved information contains relatively large errors, as in the garlic and onion fields. Thus, high-resolution drone data are recommended for diagnosing the condition of such crops.
The potential of Sentinel-2 satellite data for precision farming varies depending on the type of crop. Sentinel-2 data can be used to diagnose the growth of crops such as soybean and paddy rice, but difficulties are expected at the pixel level for crops such as garlic and onion. Higher-spatial-resolution data are required to determine the conditions of garlic and onion, and analyses at the agricultural parcel level should be conducted; additional studies in large garlic and onion fields are needed to confirm this. Our results can help users understand the error characteristics of the SR and VI measured by the Sentinel-2 satellite.
This research was funded by the Rural Development Administration (grant no. PJ016768).
No potential conflict of interest relevant to this article was reported.
Table 1. Information from the multispectral sensors on board the drone and Sentinel-2 satellite.
Band | Sentinel-2/MSI No. | Central wavelength/Bandwidth (nm) | Spatial resolution (m) | Drone/RedEdge-MX Dual No. | Central wavelength/Bandwidth (nm)
---|---|---|---|---|---
Blue/aerosol | 1 | 443/36 | 60 | 1 | 444/28
Blue | 2 | 490/96 | 10 | 2 | 475/32
Green | - | - | - | 3 | 531/14
Green | 3 | 560/45 | 10 | 4 | 560/27
Red | - | - | - | 5 | 650/16
Red | 4 | 665/39 | 10 | 6 | 668/14
Red edge | 5 | 705/20 | 20 | 7 | 705/10
Red edge | - | - | - | 8 | 717/12
Red edge | 6 | 740/18 | 20 | 9 | 740/18
Red edge | 7 | 783/28 | 20 | - | -
NIR | 8 | 842/141 | 10 | 10 | 842/57
NIR | 8A | 865/33 | 20 | - | -
WV | 9 | 945/27 | 60 | - | -
SWIR/Cirrus | 10 | 1375/76 | 60 | - | -
SWIR | 11 | 1610/142 | 20 | - | -
SWIR | 12 | 2190/240 | 20 | - | -
MSI: multispectral imager, NIR: near-infrared, WV: water vapor, SWIR: shortwave-infrared.
Table 2. Vegetation indices and the corresponding bands of the Sentinel-2 satellite and drone.
VI | Equation | Sentinel-2 band | Drone band |
---|---|---|---|
NDVI | (RNIR – RRed) / (RNIR + RRed) | B4, B8 | B6, B10 |
GNDVI | (RNIR – RGreen) / (RNIR + RGreen) | B3, B8 | B4, B10 |
EVI | 2.5 * (RNIR – RRed) / (RNIR + 6 * RRed – 7.5 * RBlue + 1) | B2, B4, B8 | B2, B6, B10 |
NDRE | (RNIR – RRededge) / (RNIR + RRededge) | B5, B8 | B7, B10 |
VI: vegetation index, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index, R: reflectance, NIR: near-infrared, B: band.
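The equations in Table 2 can be expressed as a short Python sketch. The band-to-argument mapping is illustrative (Sentinel-2 B2/B3/B4/B5/B8 or the corresponding drone bands), and the inputs are assumed to be surface reflectance values in [0, 1]:

```python
def compute_vis(blue, green, red, rededge, nir):
    """Four vegetation indices of Table 2 from surface reflectance bands."""
    ndvi = (nir - red) / (nir + red)
    gndvi = (nir - green) / (nir + green)
    # EVI uses the standard coefficients 2.5, 6, 7.5 and soil-adjustment constant 1.
    evi = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)
    ndre = (nir - rededge) / (nir + rededge)
    return {"NDVI": ndvi, "GNDVI": gndvi, "EVI": evi, "NDRE": ndre}
```

For example, `compute_vis(0.05, 0.10, 0.08, 0.20, 0.45)` evaluates all four indices for a single pixel; applied to NumPy arrays, the same function works per pixel over a whole raster.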
Table 3. Measurement information of the Sentinel-2 satellite and drone.
No. | Site | Measurement date (Satellite) | Measurement date (Drone) | Time difference | Air temperature (°C, Satellite) | Air temperature (°C, Drone) | Relative humidity (%, Satellite) | Relative humidity (%, Drone)
---|---|---|---|---|---|---|---|---
1 | Naju | Aug. 15, 2020 | Aug. 14, 2020 | 1 day | 31.5 | 32.0 | 66.1 | 73.0 |
2 | Naju | Sep. 4, 2020 | Sep. 4, 2020 | 0 day | 27.9 | 28.9 | 60.2 | 56.3 |
3 | Naju | Sep. 24, 2020 | Sep. 24, 2020 | 0 day | 24.0 | 25.5 | 58.2 | 54.1 |
4 | Naju | Jul. 21, 2021 | Jul. 21, 2021 | 0 day | 31.8 | 31.6 | 57.4 | 59.0 |
5 | Naju | Aug. 10, 2021 | Aug. 10, 2021 | 0 day | 30.0 | 29.7 | 66.2 | 65.4 |
6 | Naju | Aug. 20, 2021 | Aug. 19, 2021 | 1 day | 28.4 | 28.8 | 61.6 | 57.0 |
7 | Muan | Mar. 18, 2020 | Mar. 20, 2020 | 2 days | 15.6 | 13.9 | 34.4 | 33.4 |
8 | Muan | Apr. 27, 2020 | Apr. 29, 2020 | 2 days | 17.0 | 21.0 | 32.0 | 34.4 |
9 | Muan | May 22, 2020 | May 20, 2020 | 2 days | 23.6 | 19.0 | 47.7 | 60.6 |
10 | Muan | Mar. 18, 2021 | Mar. 18, 2021 | 0 day | 17.4 | 15.6 | 51.1 | 58.0 |
11 | Muan | May 12, 2021 | May 14, 2021 | 2 days | 25.3 | 26.0 | 67.5 | 65.6 |
12 | Muan | May 22, 2021 | May 24, 2021 | 2 days | 22.1 | 21.3 | 70.6 | 71.5 |
13 | Gimje | Jul. 1, 2022 | Jun. 30, 2022 | 1 day | 30.9 | 29.4 | 70.7 | 81.5 |
14 | Gimje | Sep. 9, 2022 | Sep. 7, 2022 | 2 days | 25.6 | 24.3 | 79.9 | 90.1 |
15 | Gimje | Oct. 14, 2022 | Oct. 14, 2022 | 0 day | 19.8 | 19.7 | 79.6 | 80.8 |
16 | Gimje | Oct. 19, 2022 | Oct. 19, 2022 | 0 day | 14.6 | 15.2 | 51.3 | 51.4 |
Values of air temperature and relative humidity are at the measurement times of the Sentinel-2 satellite and drone.
Table 4. Statistical results for surface reflectance and vegetation indices measured by the drone and satellite in the rice paddy, garlic and onion, and soybean fields.
Crop | Index | R | Slope | Absolute bias | RMSE | NRMSE (%) | Bias-corrected NRMSE (%) |
---|---|---|---|---|---|---|---|
Rice paddy | Blue | 0.903 | 0.859 | 0.019 | 0.021 | 11.6 | 6.1 |
 | Green | 0.877 | 0.773 | 0.018 | 0.023 | 11.5 | 8.5
 | Red | 0.908 | 0.759 | 0.017 | 0.024 | 10.5 | 8.3
 | Rededge@705 nm | 0.913 | 0.792 | 0.022 | 0.029 | 12.2 | 9.7
 | Rededge@740 nm | 0.569 | 0.303 | 0.025 | 0.072 | 35.8 | 17.9
 | NIR | 0.477 | 0.273 | 0.065 | 0.081 | 30.3 | 16.9
 | NDVI | 0.932 | 0.801 | 0.062 | 0.078 | 10.3 | 7.2
 | GNDVI | 0.927 | 0.805 | 0.057 | 0.070 | 11.3 | 7.1
 | EVI | 0.811 | 0.603 | 0.085 | 0.105 | 15.2 | 11.1
 | NDRE | 0.939 | 0.780 | 0.061 | 0.077 | 13.3 | 8.5
Garlic and onion | Blue | 0.522 | 0.452 | 0.033 | 0.041 | 21.1 | 11.3 |
 | Green | 0.566 | 0.436 | 0.033 | 0.041 | 21.7 | 13.7
 | Red | 0.647 | 0.432 | 0.041 | 0.051 | 22.5 | 14.1
 | Rededge@705 nm | 0.418 | 0.316 | 0.036 | 0.046 | 28.2 | 18.4
 | Rededge@740 nm | 0.550 | 0.599 | 0.034 | 0.042 | 20.8 | 18.4
 | NIR | 0.649 | 0.599 | 0.039 | 0.049 | 18.5 | 15.5
 | NDVI | 0.707 | 0.416 | 0.119 | 0.148 | 26.2 | 13.8
 | GNDVI | 0.649 | 0.415 | 0.095 | 0.116 | 24.4 | 13.2
 | EVI | 0.745 | 0.474 | 0.079 | 0.100 | 20.9 | 13.1
 | NDRE | 0.624 | 0.378 | 0.072 | 0.094 | 23.3 | 12.9
Soybean | Blue | 0.850 | 1.053 | 0.018 | 0.023 | 14.6 | 10.2 |
 | Green | 0.881 | 0.993 | 0.017 | 0.022 | 11.6 | 9.0
 | Red | 0.862 | 0.822 | 0.017 | 0.024 | 10.5 | 9.6
 | Rededge@705 nm | 0.883 | 0.783 | 0.024 | 0.031 | 14.3 | 12.0
 | Rededge@740 nm | 0.806 | 1.092 | 0.043 | 0.052 | 18.7 | 16.9
 | NIR | 0.925 | 1.157 | 0.039 | 0.051 | 12.6 | 11.1
 | NDVI | 0.969 | 0.999 | 0.038 | 0.055 | 6.6 | 6.4
 | GNDVI | 0.975 | 1.058 | 0.049 | 0.059 | 8.5 | 5.6
 | EVI | 0.977 | 1.105 | 1.105 | 0.071 | 8.3 | 6.0
 | NDRE | 0.988 | 0.982 | 0.982 | 0.039 | 5.3 | 4.6
R: correlation coefficient, NIR: near-infrared, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red edge index.
Jae-Hyun Ryu 1) · Jung-Gon Han 2) · Ho-yong Ahn 3) · Sang-Il Na 3) · Byungmo Lee 4) · Kyung-do Lee 3)†