Research Article

Korean J. Remote Sens. 2024; 40(5): 657-673

Published online: October 31, 2024

https://doi.org/10.7780/kjrs.2024.40.5.1.19

© Korean Society of Remote Sensing

Evaluation of Surface Reflectance and Vegetation Indices Measured by Sentinel-2 Satellite Using Drone Considering Crop Type and Surface Heterogeneity

Jae-Hyun Ryu1 , Hyun-Dong Moon2,3, Kyung-Do Lee4 , Jaeil Cho5,6, Ho-yong Ahn1*

1Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
2PhD Student, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
3PhD Student, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea
4Senior Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
5Professor, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
6Professor, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea

Correspondence to : Ho-yong Ahn
E-mail: hyahn85@korea.kr

Received: September 30, 2024; Revised: October 17, 2024; Accepted: October 18, 2024

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Satellite data are used in precision agriculture to optimize crop management. Thus, the planting pattern (e.g., flat and ridge-furrow) and crop type should be accurately reflected in the data. The purpose of this study was to identify the spatial characteristics of errors in the surface reflectance (SR) and vegetation index (VI) obtained from the Sentinel-2 satellite. Drone data were used to evaluate the suitability of the Sentinel-2 satellite for precision agriculture applications in agricultural fields. Four VIs (normalized difference vegetation index, green normalized difference vegetation index, enhanced vegetation index, and normalized difference red edge index) were calculated. The rice paddy exhibited a homogeneous surface, whereas garlic/onion and soybean fields showed high surface heterogeneity because of the presence of ridges and furrows. The SR values of the rice paddy, measured at near-infrared (NIR) wavelength using the Sentinel-2 satellite, were saturated. The VIs derived from both satellite and drone data exhibited a correlation above 0.811 and normalized root mean square error (NRMSE) below 11.1% after bias correction. The garlic and onion fields exhibited the worst results, with a bias-corrected NRMSE for VIs ranging between 12.9% and 13.8%. The soybean field, where the vegetation covered the surface almost completely, exhibited the best relationship between the Sentinel-2 and drone data. The correlation coefficient and bias-corrected NRMSE of VIs for the combination of the two devices were above 0.969 and below 6.4%, respectively. In addition, the SR at NIR had a correlation of 0.925 and a slope of 1.157, unlike in the rice paddy. These results indicate that crop structure has a greater effect than the planting pattern. The absolute difference between the VIs measured by the satellite and drone is influenced by the degree of surface heterogeneity. The errors are more pronounced at the farmland edges. Our study contributes to a better understanding of the characteristics of Sentinel-2 data for use in agricultural fields.

Keywords: UAV, Surface reflectance, Homogeneity, Crop

1. Introduction

Precision agriculture incorporates remote-sensing techniques to optimize crop management (Bansod et al., 2017; Liaghat and Balasundram, 2010). A sensor-based monitoring system can non-destructively diagnose the conditions of a crop (Ryu et al., 2020a). The multispectral camera onboard a drone has a ground sample distance of a few centimeters at a flight height of 150 m. Thus, pesticide spraying, variable fertilization, and irrigation management can be conducted using drone data (Hafeez et al., 2023; Sishodia et al., 2020). However, regularly capturing wide areas using drones is labor-intensive and costly (Pla et al., 2019). Satellite data can be an alternative to overcome these disadvantages. Because satellites regularly capture images of crop-growing areas, information such as the planting time, harvesting time, and spatial variation in the crop's conditions can be produced and provided to the farmer (Ali et al., 2021).

Abundant satellite data are freely available for monitoring crop growth and development in the field (Labib and Harris, 2018). The Sentinel-2 satellite, which is a representative agricultural observation satellite, has various bands from visible to shortwave infrared, and the spatial image resolution ranges from 10 to 60 m. Moreover, it is possible to observe the same area every 5 days (Song et al., 2021). The growth parameters, such as the leaf area index, biomass, and yield, can be estimated using data from the Sentinel-2 satellite (Bansod et al., 2017; Dong et al., 2020; Mao et al., 2022), and crop types can be classified (Maponya et al., 2020; Sonobe et al., 2018). The improved spatiotemporal resolution and surface reflectance (SR) at various wavelengths facilitate precision agriculture using satellite data (Kong et al., 2023). However, there are drawbacks regarding the usability of the freely available satellite data for precision agriculture.

The temporal resolution of Sentinel-2 satellite data can be unsatisfactory because of cloud and cloud shadow effects (Bukowiecki et al., 2021; Caparros-Santiago et al., 2023). If information about a specific period is not obtained for a long time, using satellite data for precision agriculture may be challenging. To minimize these limitations, fused satellite data have been produced as harmonized SR products of satellites such as Landsat-8/9 and Sentinel-2 (Dhillon et al., 2022; Kong et al., 2021). Furthermore, more than 130 microsatellites have been launched for the PlanetScope constellation (Frazier and Hemingway, 2021), which aims to observe specific areas every day (Roy et al., 2021). However, this constellation cannot escape the influence of clouds and cloud shadows, which are inherent limitations of satellite data.

The crop conditions can be monitored using the multispectral camera onboard a drone during specific growth periods for which satellite images have not been acquired (Martinez et al., 2021). Drone images provide more spatially detailed information about crops than satellite images (Messina et al., 2020). In addition, satellite and drone data can be jointly used after converting the spatial resolution of the drone image to match that of the satellite image (Jiang et al., 2022; Zhang et al., 2023). However, the information obtained from satellite data can be biased because its coarse spatial resolution cannot resolve surface heterogeneity.

Crops are cultivated using various planting patterns (e.g., flat and ridge-furrow), depending on the crop type, soil type, slope of the field, and climate. The flat planting pattern has advantages in managing water and nutrients, and the crops are easy to grow and harvest. The ridge-furrow planting pattern is used in areas with heavy precipitation or poor drainage. The height difference between the ridges and furrows prevents the crops from being submerged and avoids damage due to excess moisture (Verma et al., 2020). These planting patterns affect surface heterogeneity. Satellite data can provide inaccurate information about crop conditions when crops are planted in rows (Mazzia et al., 2020).

However, the degree of soil coverage can vary depending on the leaf structure of the crops, even if the ridge-furrow planting pattern is used. Therefore, prior to using drones and satellites together, the satellite and drone data should be compared to determine whether the information about crop conditions is accurate (Zhang et al., 2023). Most previous studies have focused on evaluating the normalized difference vegetation index (NDVI) measured by satellites (Verma et al., 2020). However, the multispectral imager includes not only red and near-infrared (NIR) bands but also blue, green, and red-edge bands. Thus, evaluating the SR in various wavelength bands is necessary and vegetation indices (VIs) should be calculated using the SR.

The main purpose of this study is to identify the characteristics of errors in the SR and VIs measured by the Sentinel-2 satellite depending on crop type, planting pattern, and surface heterogeneity in agricultural fields. Drone-based images, which have higher spatial resolution than satellite images, were used as a reference to assess the Sentinel-2 satellite data. Moreover, the suitability and limitations of the Sentinel-2 satellite data for precision agriculture were investigated in rice paddy, garlic, onion, and soybean crops.

2. Materials and Methods

2.1. Study Area

This study was conducted in three counties located in South Korea (Fig. 1a). One measurement site was a rice paddy located in Jeollanam-do Agricultural Research & Extension Services (Naju County, latitude of 35.0275°N, longitude of 126.8209°E). Rice was cultivated in 2020 and 2021. The rice was transplanted in early June, and the heading stage was observed in mid to late August. The rice was harvested from late September to early October. The total target study area (blue polygon in Fig. 1b), which comprised five paddies, was approximately 17,000 m². Given that the size of each paddy is approximately 30 × 95 m, the pixels located in the center of each rice paddy field have a fairly homogeneous footprint (Fig. 2a).

Fig. 1. Study area including heterogeneous and homogeneous footprints. The blue, red, and orange grids indicate the region of interest (based on 10 m). (a) Measurement sites. (b) Rice paddy with a homogeneous footprint. (c) Garlic and onion fields with heterogeneous footprints; this upland field includes many furrows and ridges. (d) Paddy for cultivating soybean; this paddy includes many furrows and ridges.
Fig. 2. Zoomed-in field images measured by drone. (a) Rice paddy. (b) Garlic and onion field including ridges and furrows. (c) Soybean field including ridges and furrows.

The second measurement site was an upland field located in the National Institute of Crop Science (Muan County, latitude of 34.9671°N, longitude of 126.4528°E). Garlic and onion were cultivated in this area during the 2019–2020 and 2020–2021 growing seasons. The growing period for garlic and onion was from October to late March/early June. The upland field featured numerous furrows and ridges, which resulted in a fairly heterogeneous footprint (Fig. 2b). In addition, bare soil and a concrete road were present at the edge of the study area. The total area was approximately 8,400 m² (red polygon in Fig. 1c).

The third measurement site was a field located in Gimje County (latitude of 35.7517°N, longitude of 126.8136°E) for cultivating soybeans. This area consisted of multiple furrows and ridges to mitigate damage from excess moisture because of poor drainage (Fig. 2c). Soybean was cultivated in this field from June 3, 2022, to November 2, 2022. Unlike the garlic and onion fields, the leaves of the soybean plants covered the entire field in August and September, which made the soil barely visible. The target area (orange polygon in Fig. 1d) encompassed approximately 27,600 m².

2.2. Dataset

Satellite images from Sentinel-2, one of the most prominent satellites used for crop data monitoring, were collected using the Sentinel Hub interface. The coordinate system and projection were set to WGS 1984 UTM Zone 52N and Transverse Mercator, respectively, to match the drone’s raster data. We downloaded the Sentinel-2 Level-2A orthoimage bottom-of-atmosphere-corrected reflectance data, which eliminated atmospheric effects. Subsequently, contaminated pixels caused by clouds and cloud shadows were filtered using scene classification and cloud information. Pixels with a cloud probability value above 0 were masked (Fig. 3).
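As a minimal sketch of this masking step, assuming the reflectance band, scene classification layer (SCL), and cloud probability layer have already been read into NumPy arrays (the SCL class codes follow the standard Sentinel-2 L2A convention; function and variable names are illustrative):

```python
import numpy as np

def mask_contaminated_pixels(reflectance, scl, cloud_prob):
    """Mask cloud- and shadow-affected pixels in a Sentinel-2 L2A band.

    scl: scene classification layer (3 = cloud shadow, 8/9 = cloud
    medium/high probability, 10 = thin cirrus).
    cloud_prob: cloud probability layer; as in this study, any pixel
    with a value above 0 is masked.
    """
    contaminated = np.isin(scl, [3, 8, 9, 10]) | (cloud_prob > 0)
    out = reflectance.astype(float)
    out[contaminated] = np.nan
    return out
```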

Fig. 3. Flowchart for comparing the surface reflectance and vegetation indices measured by the satellite and drone.

The SR of the crops (rice, garlic, onion, and soybean) was measured using a multispectral camera mounted on a drone (Fig. 3). The multispectral camera (RedEdge-MX Dual; MicaSense, Inc., Seattle, WA, USA) had 10 bands (Table 1). Drone measurements were conducted around solar noon on clear days. The flight altitude was set to 30 m or 50 m, resulting in a ground sample distance of 2.08 cm or 3.47 cm, respectively. The measured data were processed using the Pix4Dmapper software (version 4.3.31), and the information from seven ground control points, obtained using a GRX2 GNSS receiver (SOKKIA Corporation, Olathe, KS, USA), was used for geometric correction. A calibration reflectance panel provided by the manufacturer of the multispectral camera was used to convert the digital numbers of the drone raster data to reflectance in Pix4Dmapper. Moreover, to improve the accuracy of the measured values, a radiometric correction was conducted in the rice paddy and the garlic and onion field using four homogeneous calibration targets with reflectance values of 3%, 21%, 32%, and 51%. The processed drone reflectance images showed the crop growth status, planting pattern, and artificial structures in detail owing to their high spatial resolution.
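The target-based correction can be sketched as an empirical-line fit, one linear model per band. This is a simplified sketch under stated assumptions: the study performed panel calibration in Pix4Dmapper, and the target image values below are illustrative placeholders:

```python
import numpy as np

# Known reflectances of the four calibration targets used in this study.
TARGET_REFLECTANCES = np.array([0.03, 0.21, 0.32, 0.51])

def empirical_line_correction(band, target_values,
                              target_reflectances=TARGET_REFLECTANCES):
    """Fit gain/offset from calibration targets and apply them to a band.

    target_values: mean image values sampled over each target in this
    band (illustrative; extracted from the mosaic over the tarps).
    """
    gain, offset = np.polyfit(target_values, target_reflectances, deg=1)
    return gain * band + offset
```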

Table 1 Information from the multispectral sensors on board the drone and Sentinel-2 satellite

| Band | Sentinel-2/MSI band no. | Central wavelength/Bandwidth (nm) | Spatial resolution (m) | Drone/RedEdge-MX Dual band no. | Central wavelength/Bandwidth (nm) |
|---|---|---|---|---|---|
| Blue/aerosol | 1 | 443/36 | 60 | 1 | 444/28 |
| Blue | 2 | 490/96 | 10 | 2 | 475/32 |
| Green | – | – | – | 3 | 531/14 |
| Green | 3 | 560/45 | 10 | 4 | 560/27 |
| Red | – | – | – | 5 | 650/16 |
| Red | 4 | 665/39 | 10 | 6 | 668/14 |
| Red edge | 5 | 705/20 | 20 | 7 | 705/10 |
| Red edge | – | – | – | 8 | 717/12 |
| Red edge | 6 | 740/18 | 20 | 9 | 740/18 |
| Red edge | 7 | 783/28 | 20 | – | – |
| NIR | 8 | 842/141 | 10 | 10 | 842/57 |
| NIR | 8A | 865/33 | 20 | – | – |
| WV | 9 | 945/27 | 60 | – | – |
| SWIR/Cirrus | 10 | 1375/76 | 60 | – | – |
| SWIR | 11 | 1610/142 | 20 | – | – |
| SWIR | 12 | 2190/240 | 20 | – | – |

MSI: multispectral imager, NIR: near-infrared, WV: water vapor, SWIR: shortwave-infrared.



2.3. Matchup Data

SR and VI were compared to identify the characteristics of the data measured by the satellite and drone, using the following steps. Six bands with central wavelengths of 490, 560, 665, 705, 740, and 842 nm (based on Sentinel-2) were selected to compare the imagery obtained from the two devices. Despite the different multispectral sensors, these bands have similar central wavelengths (Table 1). Four VIs (the NDVI, green normalized difference vegetation index (GNDVI), enhanced vegetation index (EVI), and normalized difference red-edge index (NDRE)) were calculated using the SR measured by the Sentinel-2 satellite and drone (Table 2). These VIs use five bands, with wavelengths of 490, 560, 665, 705, and 842 nm based on the Sentinel-2 satellite. The four vegetation indices theoretically range from –1 to 1, but over soil or vegetation the values are typically between 0 and 1; values closer to 0 indicate sparse vegetation, while values near 1 signify dense and healthy vegetation. Next, the spatial resolution of the SR and VI measured by the drone was converted to 10 m or 20 m to match the spatial resolution of the Sentinel-2 data. In the case of Sentinel-2, the SR at the blue, green, red, and NIR bands has a spatial resolution of 10 m, whereas the red-edge bands have a spatial resolution of 20 m. When upscaling the drone image to match the Sentinel-2 grid, the resampling method averaged all valid pixels, excluding no-data values (Assmann et al., 2020). The upscaled drone and Sentinel-2 images were clipped using a region-of-interest shapefile (Figs. 1b–d). Finally, the VIs measured by the satellite and drone were named VISatellite and VIDrone, respectively.
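A minimal sketch of this upscaling step, assuming the drone mosaic is already co-registered with the Sentinel-2 grid and the pixel-size ratio is an integer (function and variable names are illustrative):

```python
import numpy as np

def upscale_mean(fine, factor):
    """Block-average a fine-resolution array to a coarser grid,
    excluding NaN (no-data) pixels from each block's mean.

    factor: integer ratio between the coarse and fine pixel sizes
    (several hundred for a centimeter-scale drone mosaic matched to
    the 10 m Sentinel-2 grid).
    """
    rows, cols = fine.shape[0] // factor, fine.shape[1] // factor
    blocks = fine[:rows * factor, :cols * factor].reshape(
        rows, factor, cols, factor)
    return np.nanmean(blocks, axis=(1, 3))
```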

Table 2 Vegetation indices calculated using the bands of the Sentinel-2 satellite and drone

| VI | Equation | Sentinel-2 bands | Drone bands |
|---|---|---|---|
| NDVI | (RNIR – RRed) / (RNIR + RRed) | B4, B8 | B6, B10 |
| GNDVI | (RNIR – RGreen) / (RNIR + RGreen) | B3, B8 | B4, B10 |
| EVI | 2.5 × (RNIR – RRed) / (RNIR + 6 × RRed – 7.5 × RBlue + 1) | B2, B4, B8 | B2, B6, B10 |
| NDRE | (RNIR – RRededge) / (RNIR + RRededge) | B5, B8 | B7, B10 |

VI: vegetation index, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index; NDRE: normalized difference red-edge index, R: Reflectance, NIR: near-infrared, B: Band.
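The index equations in Table 2 translate directly into per-pixel array operations. A minimal NumPy sketch, assuming the reflectance bands are given as equally shaped arrays:

```python
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    return (nir - green) / (nir + green)

def evi(nir, red, blue):
    # Unlike the normalized indices, EVI includes constants, so it is
    # more sensitive to errors in the input reflectance (see Discussion).
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def ndre(nir, rededge):
    return (nir - rededge) / (nir + rededge)
```

In the workflow described above, the drone VIs are computed at the native resolution and then block-averaged to the Sentinel-2 grid.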



Matchup data for the six bands and four VIs were established between the imagery obtained from the satellite and that of the drone. Although more drone measurements were conducted, only six images each from the rice paddy and the garlic and onion field were matched because of weather conditions. Furthermore, four images from the soybean field were matched. Seven image pairs were measured on the same date. In addition, three pairs with a one-day difference and six pairs with a two-day difference between the measurement date of the Sentinel-2 satellite and that of the drone were used (Table 3).

Table 3 Measurement information of the Sentinel-2 satellite and drone

| No. | Site | Measurement date (satellite) | Measurement date (drone) | Time difference | Air temperature (°C, satellite/drone) | Relative humidity (%, satellite/drone) |
|---|---|---|---|---|---|---|
| 1 | Naju | Aug. 15, 2020 | Aug. 14, 2020 | 1 day | 31.5/32.0 | 66.1/73.0 |
| 2 | Naju | Sep. 4, 2020 | Sep. 4, 2020 | 0 day | 27.9/28.9 | 60.2/56.3 |
| 3 | Naju | Sep. 24, 2020 | Sep. 24, 2020 | 0 day | 24.0/25.5 | 58.2/54.1 |
| 4 | Naju | Jul. 21, 2021 | Jul. 21, 2021 | 0 day | 31.8/31.6 | 57.4/59.0 |
| 5 | Naju | Aug. 10, 2021 | Aug. 10, 2021 | 0 day | 30.0/29.7 | 66.2/65.4 |
| 6 | Naju | Aug. 20, 2021 | Aug. 19, 2021 | 1 day | 28.4/28.8 | 61.6/57.0 |
| 7 | Muan | Mar. 18, 2020 | Mar. 20, 2020 | 2 days | 15.6/13.9 | 34.4/33.4 |
| 8 | Muan | Apr. 27, 2020 | Apr. 29, 2020 | 2 days | 17.0/21.0 | 32.0/34.4 |
| 9 | Muan | May 22, 2020 | May 20, 2020 | 2 days | 23.6/19.0 | 47.7/60.6 |
| 10 | Muan | Mar. 18, 2021 | Mar. 18, 2021 | 0 day | 17.4/15.6 | 51.1/58.0 |
| 11 | Muan | May 12, 2021 | May 14, 2021 | 2 days | 25.3/26.0 | 67.5/65.6 |
| 12 | Muan | May 22, 2021 | May 24, 2021 | 2 days | 22.1/21.3 | 70.6/71.5 |
| 13 | Gimje | Jul. 1, 2022 | Jun. 30, 2022 | 1 day | 30.9/29.4 | 70.7/81.5 |
| 14 | Gimje | Sep. 9, 2022 | Sep. 7, 2022 | 2 days | 25.6/24.3 | 79.9/90.1 |
| 15 | Gimje | Oct. 14, 2022 | Oct. 14, 2022 | 0 day | 19.8/19.7 | 79.6/80.8 |
| 16 | Gimje | Oct. 19, 2022 | Oct. 19, 2022 | 0 day | 14.6/15.2 | 51.3/51.4 |

Values of air temperature and relative humidity are at the measurement times of the Sentinel-2 satellite and drone.



2.4. Statistical Analysis

The correlation coefficient, absolute bias, root mean square error (RMSE), normalized RMSE (NRMSE), and bias-corrected NRMSE were computed to compare the satellite and the upscaled drone data. The correlation coefficient was calculated using the SciPy library (version 1.7.0) in Python (version 3.9.16). Absolute bias, which describes the degree of spread between satellite and drone pixels, was calculated as the mean of the absolute differences between satellite and drone pixels (Eq. 1). RMSE quantifies the average error between the drone and satellite data (Eq. 2).

$$\text{Absolute bias} = \frac{1}{n}\sum_{i=1}^{n}\left|S_i - D_i\right| \tag{1}$$

$$\text{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(S_i - D_i\right)^2} \tag{2}$$

where $S_i$, $D_i$, and $n$ indicate the satellite pixel value, the upscaled drone pixel value, and the total number of data points, respectively. NRMSE was calculated using the difference between the maximum and minimum values of the satellite data on the y-axis (Eq. 3). The NRMSE expresses the RMSE as a percentage of the range of the variable (y-axis).

$$\text{NRMSE} = \frac{\text{RMSE}}{S_{\text{maximum}} - S_{\text{minimum}}} \times 100 \tag{3}$$

where $S_{\text{maximum}}$ and $S_{\text{minimum}}$ are the maximum and minimum values of the satellite data. To analyze the errors remaining after removing the bias between the satellite and drone data, a bias correction was performed using linear regression to minimize the bias between the two datasets, and the resulting score was defined as the bias-corrected NRMSE.
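A minimal sketch of these metrics (Eqs. 1–3), assuming the satellite and upscaled drone values are matched one-dimensional arrays; the regression direction used for the bias correction is our assumption, since the text does not specify it:

```python
import numpy as np
from scipy import stats

def comparison_metrics(satellite, drone):
    """Correlation, absolute bias, RMSE, NRMSE, and bias-corrected
    NRMSE for matched satellite/drone samples."""
    s = np.asarray(satellite, dtype=float)
    d = np.asarray(drone, dtype=float)
    r = stats.pearsonr(s, d)[0]
    abs_bias = np.mean(np.abs(s - d))              # Eq. (1)
    rmse = np.sqrt(np.mean((s - d) ** 2))          # Eq. (2)
    value_range = s.max() - s.min()
    nrmse = rmse / value_range * 100.0             # Eq. (3)
    # Bias correction: linearly map drone values onto the satellite
    # scale, then recompute the NRMSE on the residual error.
    slope, intercept = np.polyfit(d, s, deg=1)
    rmse_bc = np.sqrt(np.mean((s - (slope * d + intercept)) ** 2))
    nrmse_bc = rmse_bc / value_range * 100.0
    return r, abs_bias, rmse, nrmse, nrmse_bc
```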

3. Results

3.1. Spectral Reflectance

The SRSatellite was evaluated using the SRDrone in the rice paddy (Fig. 4). The correlation coefficient ranged from 0.477 to 0.859 across the six bands (blue, green, red, red-edge at 705 nm (hereafter, rededge@705 nm), red-edge at 740 nm (hereafter, rededge@740 nm), and NIR wavelengths). In the rice paddy, the correlation coefficient was higher in the visible wavelength range than in the NIR wavelength range, which was consistent with the other statistical scores. The NRMSE ranged from 10.5% to 12.2% in the visible to rededge@705 nm range, while it was 35.8% at rededge@740 nm and 30.3% in the NIR range. The SRSatellite at the rededge@740 nm and NIR wavelengths was underestimated compared to the SRDrone. In particular, the SR values in the NIR range were narrower in the satellite data (0.271 to 0.538) than in the drone data (0.204 to 0.678) (Fig. 4f). In other words, the NIR reflectance measured by the satellite was less sensitive than that measured by the drone in the rice paddy.

Fig. 4. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the rice paddy: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

The SRSatellite was evaluated using data measured by the drone in the garlic and onion field (Fig. 5). The correlation coefficients between the SR values measured by the Sentinel-2 satellite and the drone ranged from 0.418 to 0.649. These values were lower than those in the rice paddy (correlation coefficients of 0.477 to 0.859). The presence of furrows and ridges in the garlic and onion field caused a heterogeneous footprint. In particular, the bias-corrected NRMSE in the visible to rededge@705 nm range was higher in the garlic and onion field than in the rice paddy (Table 4). On the other hand, the NRMSE at the rededge@740 nm and NIR wavelengths was lower in the garlic and onion field than in the rice paddy because of the underestimation of the SRSatellite in the rice paddy.

Fig. 5. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the garlic and onion field, including ridges and furrows: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

The statistical scores for the SR at the red-edge and NIR wavelengths measured by the Sentinel-2 satellite and drone were highest in the soybean field (Fig. 6). Unlike in the rice paddy and the garlic and onion field, the correlation coefficient was above 0.806 and the absolute bias was below 0.043. Although the NRMSE and bias-corrected NRMSE values at visible wavelengths in the soybean field were poorer than those of the rice paddy (Figs. 6a–c), they were adequate at the rededge@740 nm and NIR wavelengths (Figs. 6e, f). In addition, the slope between the Sentinel-2 satellite and drone data was close to 1. These values indicate that the relationship between the Sentinel-2 satellite and drone data was strongest in the soybean field. All statistical results for the spectral reflectance between Sentinel-2 and the drone are summarized in Table 4.

Fig. 6. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the soybean field: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

Table 4 Statistical results for the surface reflectance and vegetation indices measured from the drone and satellite in the rice paddy, garlic and onion, and soybean fields

| Crop | Index | R | Slope | Absolute bias | RMSE | NRMSE (%) | Bias-corrected NRMSE (%) |
|---|---|---|---|---|---|---|---|
| Rice paddy | Blue | 0.903 | 0.859 | 0.019 | 0.021 | 11.6 | 6.1 |
| | Green | 0.877 | 0.773 | 0.018 | 0.023 | 11.5 | 8.5 |
| | Red | 0.908 | 0.759 | 0.017 | 0.024 | 10.5 | 8.3 |
| | Rededge@705 nm | 0.913 | 0.792 | 0.022 | 0.029 | 12.2 | 9.7 |
| | Rededge@740 nm | 0.569 | 0.303 | 0.025 | 0.072 | 35.8 | 17.9 |
| | NIR | 0.477 | 0.273 | 0.065 | 0.081 | 30.3 | 16.9 |
| | NDVI | 0.932 | 0.801 | 0.062 | 0.078 | 10.3 | 7.2 |
| | GNDVI | 0.927 | 0.805 | 0.057 | 0.070 | 11.3 | 7.1 |
| | EVI | 0.811 | 0.603 | 0.085 | 0.105 | 15.2 | 11.1 |
| | NDRE | 0.939 | 0.780 | 0.061 | 0.077 | 13.3 | 8.5 |
| Garlic and onion | Blue | 0.522 | 0.452 | 0.033 | 0.041 | 21.1 | 11.3 |
| | Green | 0.566 | 0.436 | 0.033 | 0.041 | 21.7 | 13.7 |
| | Red | 0.647 | 0.432 | 0.041 | 0.051 | 22.5 | 14.1 |
| | Rededge@705 nm | 0.418 | 0.316 | 0.036 | 0.046 | 28.2 | 18.4 |
| | Rededge@740 nm | 0.550 | 0.599 | 0.034 | 0.042 | 20.8 | 18.4 |
| | NIR | 0.649 | 0.599 | 0.039 | 0.049 | 18.5 | 15.5 |
| | NDVI | 0.707 | 0.416 | 0.119 | 0.148 | 26.2 | 13.8 |
| | GNDVI | 0.649 | 0.415 | 0.095 | 0.116 | 24.4 | 13.2 |
| | EVI | 0.745 | 0.474 | 0.079 | 0.100 | 20.9 | 13.1 |
| | NDRE | 0.624 | 0.378 | 0.072 | 0.094 | 23.3 | 12.9 |
| Soybean | Blue | 0.850 | 1.053 | 0.018 | 0.023 | 14.6 | 10.2 |
| | Green | 0.881 | 0.993 | 0.017 | 0.022 | 11.6 | 9.0 |
| | Red | 0.862 | 0.822 | 0.017 | 0.024 | 10.5 | 9.6 |
| | Rededge@705 nm | 0.883 | 0.783 | 0.024 | 0.031 | 14.3 | 12.0 |
| | Rededge@740 nm | 0.806 | 1.092 | 0.043 | 0.052 | 18.7 | 16.9 |
| | NIR | 0.925 | 1.157 | 0.039 | 0.051 | 12.6 | 11.1 |
| | NDVI | 0.969 | 0.999 | 0.038 | 0.055 | 6.6 | 6.4 |
| | GNDVI | 0.975 | 1.058 | 0.049 | 0.059 | 8.5 | 5.6 |
| | EVI | 0.977 | 1.105 | – | 0.071 | 8.3 | 6.0 |
| | NDRE | 0.988 | 0.982 | – | 0.039 | 5.3 | 4.6 |

R: coefficient of correlation, NIR: Near infrared, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red edge index.



3.2. Vegetation Index

Five bands (blue, green, red, rededge@705 nm, and NIR) were used to calculate the four vegetation indices (NDVI, GNDVI, EVI, and NDRE). The differences between the VIs obtained using Sentinel-2 satellite and drone data from the rice paddy, garlic and onion, and soybean fields were evaluated.

In the rice paddy, the NRMSE of the four VIs from the combination of drone and Sentinel-2 satellite data ranged from 10.3% to 15.2%. NDVI and GNDVI showed similar statistical performance (Figs. 7a, b). After bias correction, the NRMSE of NDVI and GNDVI decreased to 7.2% and 7.1%, respectively. Similar characteristics were observed in the case of NDRE (Fig. 7d). However, the statistical scores for EVI were lower than those for NDVI, GNDVI, and NDRE. The bias-corrected NRMSE of EVI was 11.1% and the slope between EVIDrone and EVISatellite was 0.603 (Fig. 7c).

Fig. 7. Scatter plots for the combination of VIs obtained by the drone and Sentinel-2 satellite in the rice paddy: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

All statistical parameters comparing the VIs measured by the Sentinel-2 satellite and the drone in the garlic and onion field were lower than those in the rice paddy (Fig. 8). In particular, the RMSE and NRMSE values for the four VIs were higher in the garlic and onion field because of the bias between the two measurements. VISatellite was overestimated compared to VIDrone for low values, but it was underestimated for high values. After bias correction of VISatellite, the NRMSE ranged from 12.9% to 13.8%. The distribution of each VI was clustered depending on the measurement date for the rice paddy (Fig. 7), but the values were mixed for the garlic and onion field (Fig. 8).

Fig. 8. Scatter plots for the combination of drone-based VIs and Sentinel-2 satellite-based VIs for the garlic and onion field: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

The four VIs of the soybean field exhibited the highest statistical parameters compared to those of the rice paddy and the garlic and onion field. In particular, the slope of NDVI was 0.999, and the correlation coefficient and RMSE were 0.969 and 0.055, respectively (Fig. 9a). Even without bias correction between NDVIDrone and NDVISatellite, the NRMSE was just 6.6%, similar to the bias-corrected NRMSE (6.4%), as a result of the low absolute bias (0.038). These high scores were also seen for GNDVI, EVI, and NDRE: the NRMSE for the three VIs ranged from 5.3% to 8.5%, and after bias correction it ranged from 4.6% to 6.0%. All statistical results for the vegetation indices from Sentinel-2 and the drone are summarized in Table 4.

Fig. 9. Scatter plots for the combination of VIs measured by the Sentinel-2 satellite and drone in the soybean field: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

3.3. Error Characteristics of the Vegetation Index

The distribution of absolute differences (Sentinel-2 satellite data minus bias-corrected drone data) for the four VIs was examined in the three areas. The bias correction of VIDrone was conducted regionally. The polygon in Fig. 10 represents the cropland, including the road and facilities at its edge. In the rice paddy, the absolute difference of the normalized VIs (NDVI, GNDVI, and NDRE) was low in the middle of each field. However, pixels covering farm roads showed high absolute difference values (Figs. 10a–d). In the garlic and onion field, the spatial distribution of absolute differences was complex because of the influence of furrows and ridges (Figs. 10e–h). The absolute difference at the center of the soybean field was the lowest. Unlike in the rice paddy, the spatial pattern of EVI in the soybean field was similar to that of the other VIs (Figs. 10i–l).

Fig. 10. Absolute difference images between the vegetation indices measured by the satellite and drone (Sentinel-2 satellite minus bias-corrected drone). The polygon indicates the cropland. (a)–(d) NDVI, GNDVI, EVI, and NDRE in the rice paddy; (e)–(h) NDVI, GNDVI, EVI, and NDRE in the garlic and onion field; and (i)–(l) NDVI, GNDVI, EVI, and NDRE in the soybean field.

The difference between VISatellite and VIDrone was analyzed based on surface heterogeneity. VIDrone was subjected to bias correction to eliminate the bias with respect to VISatellite, and then the standard deviation of the VIDrone pixels located within one pixel of the Sentinel-2 satellite data was calculated. For NDVI, the difference between the maximum and minimum values in each range increased when the standard deviation was large (Fig. 11a). This trend was also observed for GNDVI (Fig. 11b). NDVI and GNDVI reflect the structural characteristics of vegetation as the crop grows and develops. Thus, error trends according to the standard deviation of each VIDrone appeared conspicuously. In the case of EVI and NDRE, the bins with a standard deviation of 0 to 0.04 exhibited the smallest difference between the maximum and minimum values in the box plot (Figs. 11c, d). Nonetheless, the difference was large when the standard deviation of VIDrone was above 0.12.

Fig. 11. Box plots of the differences between the vegetation index based on Sentinel-2 satellite data and the bias-corrected vegetation index measured by the drone, grouped by the standard deviation (SD) of the VIDrone pixels located within one pixel of the Sentinel-2 satellite data. Max. and Min. are the maximum and minimum values in each range. (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.
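A sketch of this heterogeneity analysis, reusing the block-reshape pattern from Section 2.3; the bin edges mirror the 0.04-wide SD intervals of Fig. 11 and are otherwise our assumption:

```python
import numpy as np

def block_std(fine, factor):
    """Standard deviation of the drone VI pixels within each
    Sentinel-2 pixel (fine array co-registered with the coarse grid)."""
    rows, cols = fine.shape[0] // factor, fine.shape[1] // factor
    blocks = fine[:rows * factor, :cols * factor].reshape(
        rows, factor, cols, factor)
    return np.nanstd(blocks, axis=(1, 3))

def group_by_heterogeneity(diff, sd, edges=(0.0, 0.04, 0.08, 0.12, np.inf)):
    """Group satellite-minus-drone VI differences by SD bin for box plots."""
    flat_diff, flat_sd = diff.ravel(), sd.ravel()
    return [flat_diff[(flat_sd >= lo) & (flat_sd < hi)]
            for lo, hi in zip(edges[:-1], edges[1:])]
```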

4. Discussion

In previous studies, the VIs measured by the Sentinel-2 satellite have been evaluated using drone images. The NDVISatellite measured by Sentinel-2 exhibited different correlations with NDVIDrone because of heterogeneity (Assmann et al., 2020; Di Gennaro et al., 2019). Naethe et al. (2023) reported that the NDVI produced by Sentinel-2 had an R² of 0.890 and an RMSE of 0.123 in the relationship with NDVIDrone. Ryu et al. (2020b) compared the statistical parameters of the Sentinel-2 NDVI and NDVIDrone and obtained an R² of 0.817 and an RMSE of 0.086 (n=135). Our results were similar to those of previous studies. The obtained RMSE values of NDVI were 0.078 (n=790), 0.148 (n=480), and 0.055 (n=1,044) for the rice paddy, garlic and onion, and soybean fields, respectively.

The slopes between VISatellite and VIDrone were below 1.0 for the rice paddy and the garlic and onion fields. VISatellite was overestimated at low values and underestimated at high values (Figs. 7 and 8). These error characteristics were consistent with previous studies. The Sentinel-2 NDVI was overestimated in open terrain and grasslands (low NDVI values) but underestimated in forests (high NDVI values) (Isaev et al., 2023). The NDVIDrone values were higher than the NDVISatellite values in the upland field, which agreed with another study. The differences between VISatellite and VIDrone may be caused by the spatial resolution (Bollas et al., 2021). Riihimäki et al. (2019) reported that the spatial resolution of satellite data is related to the distribution of the data values. In the same area, the range between the minimum and maximum NDVI values for satellite images with low spatial resolution is narrower than that for satellite images with high spatial resolution (Park et al., 2019). The VIDrone values had a wider range under surface heterogeneity conditions (Figs. 7 and 8). The SR was analyzed to identify the cause of these error characteristics of the VIs.

The SR at the rededge@740 nm and NIR wavelengths measured by the Sentinel-2 satellite has a theoretical range of 0 to 1 (0% to 100%). Here, the maximum SR at the NIR wavelength was 0.538 in the rice paddy, whereas it reached 0.593 in the soybean field. Consequently, the maximum VISatellite values in the soybean field were higher than those in the rice paddy. This difference can be attributed to the fact that crops like soybean completely cover the soil. Soybean plants grow to the extent that they cover the furrows, so 10% or less of the solar irradiance is transmitted to the soil between the full pod and beginning seed stages. Moreover, the saturation of the SR at the NIR wavelength in the rice paddy may be influenced by water because rice is submerged during the cultivation period. Given that rice is an upright crop, soil and water are partially exposed depending on the planting interval, and these can influence the VIs (Khaliq et al., 2019; Sozzi et al., 2020). The SR at the NIR wavelength is highly sensitive and may show low values in the presence of water.

The accuracy between VISatellite and VIDrone varied depending on the type of crop. The leaf structure affected the fraction of vegetation cover, which was related to the maximum SR value at the NIR wavelength in this study. We also checked the NIR values in other study areas. For example, in large-scale cabbage fields, which are characterized by broad leaves and dense planting (coordinates: 128.741256°E, 37.615712°N), pixels with NIR values above 0.6 were identified in the Sentinel-2 satellite data. Even higher NIR values (above 0.7) have been reported in previous studies (Flood, 2017; Li et al., 2018). Therefore, the type of crop should be considered when interpreting the quality of Sentinel-2 satellite data over agricultural fields.

In addition to the characteristics of the crop, geometric errors can affect the difference between the satellite and drone data. Jiang et al. (2022) showed that the position errors of drone or satellite images caused differences in SR. Therefore, heterogeneous pixels, which encompass various surface conditions, resulted in large differences in the values of NDVI and GNDVI. The edge pixels of the fields exhibited large differences because they included the road and structures in this study (Fig. 10). Given that the geometric correction of the drone images was conducted using ground control points, the geometric error of the satellite images may be even larger.

In the rice paddy and soybean fields, the RMSE of EVI was higher than that of the other VIs (NDVI, GNDVI, and NDRE). The statistical parameters of EVI were lower than those of the other VIs likely because of the way the VIs are calculated. NDVI, GNDVI, and NDRE are calculated by normalization using the SR at two different wavelengths, whereas the calculation of EVI includes a constant. Thus, errors in the SR measured by the satellite and drone can affect the EVI values (Ryu et al., 2021). If the solar radiance changes dramatically because of clouds during a drone flight, the SR includes noise because it is calculated based on measurements from a calibrated reflectance panel before the flight. Although the downwelling light was considered in the operation of the multispectral camera, the reflectance values of the mosaicked drone images were affected by the light intensity. In addition, border noise occurs when mosaicking drone images. Therefore, errors in the SR measured by the drone may have a larger effect on EVI than on the normalized VIs.
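A small numeric check (reflectance values are illustrative) shows why a common multiplicative illumination error, such as changing light intensity during a flight, cancels in the normalized indices but not in EVI, whose denominator contains an additive constant:

```python
# Illustrative canopy reflectances and a 10% illumination error
# applied equally to every band.
nir, red, blue, scale = 0.45, 0.05, 0.03, 1.10

def ndvi(n, r):
    return (n - r) / (n + r)

def evi(n, r, b):
    return 2.5 * (n - r) / (n + 6.0 * r - 7.5 * b + 1.0)

d_ndvi = ndvi(nir * scale, red * scale) - ndvi(nir, red)
d_evi = evi(nir * scale, red * scale, blue * scale) - evi(nir, red, blue)
print(f"dNDVI = {d_ndvi:+.4f}, dEVI = {d_evi:+.4f}")
# dNDVI = +0.0000 (the common factor cancels in the band ratio)
# dEVI  = +0.0416 (the constant +1 prevents cancellation)
```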

5. Conclusions

The performance of satellite data should be evaluated before applying a satellite-based crop monitoring system in precision agriculture. The goal of this study was to evaluate the error characteristics of VISatellite, using drone images, depending on the crop type and surface heterogeneity. The conclusions of this study are summarized below:

  • The effect of crop structure was larger than that of the planting pattern (e.g., flat and ridge-furrow). Despite the presence of ridges and furrows in the soybean field, this area exhibited the highest statistical parameters of the SR and VI obtained by combining the Sentinel-2 satellite and drone data. The SRSatellite values at the NIR wavelength were similar to those of the SRDrone because the broad leaves of soybean completely covered the soil.

  • VISatellite was overestimated for low values but underestimated for high values compared to VIDrone in the rice paddy and the garlic and onion fields. SRSatellite was underestimated at the rededge@740 nm and NIR wavelengths, whereas it was overestimated at visible wavelengths.

  • The range of SR and VI measured by the drone was wider than that measured by the Sentinel-2 satellite. This is due to the higher spatial resolution of the drone image compared to the satellite image.

  • The degree of surface heterogeneity affected the absolute difference between VISatellite and VIDrone. The difference was low in the center of the soybean and rice fields but high in mixed pixels (including roads and structures). Thus, the edge pixels of the field should be excluded from the analysis.

  • Given the spatial resolution of the Sentinel-2 satellite, if the canopy does not cover the soil in crops planted with ridges and furrows, the derived information contains relatively large errors, as in the garlic and onion fields. Thus, it is recommended to utilize high-resolution drone data to diagnose the condition of such crops.

The potential of Sentinel-2 satellite data for precision farming varies depending on the type of crop. Sentinel-2 satellite data can be used to diagnose the growth of crops such as soybean and paddy rice, but difficulties are expected at the pixel level for crops such as garlic and onion. Higher spatial resolution data are required to determine the conditions of garlic and onion, and analyses at the agricultural parcel level should be conducted. Additional studies in large garlic and onion fields are needed to confirm this. Our results can help users understand the error characteristics of the SR and VI measured by the Sentinel-2 satellite.

This research was funded by the Rural Development Administration (grant no. PJ016768).

No potential conflict of interest relevant to this article was reported.

  1. Ali, A. M., Savin, I., Poddubskiy, A., Abouelghar, M., Saleh, N., and Abutaleb, K., et al, 2021. Integrated method for rice cultivation monitoring using Sentinel-2 data and Leaf Area Index. The Egyptian Journal of Remote Sensing and Space, 24(3), 431-441. https://doi.org/10.1016/j.ejrs.2020.06.007
  2. Assmann, J. J., Myers-Smith, I. H., Kerby, J. T., Cunliffe, A. M., and Daskalova, G. N., 2020. Drone data reveal heterogeneity in tundra greenness and phenology not captured by satellites. Environmental Research Letters, 15(12), 125002. https://doi.org/10.1088/1748-9326/abbf7d
  3. Bansod, B., Singh, R., Thakur, R., and Singhal, G., 2017. A comparision between satellite based and drone based remote sensing technology to achieve sustainable development: A review. Journal of Agriculture and Environment for International Development, 111(2), 383-407. https://doi.org/10.12895/jaeid.20172.690
  4. Bollas, N., Kokinou, E., and Polychronos, V., 2021. Comparison of Sentinel-2 and UAV multispectral data for use in precision agriculture: An application from Northern Greece. Drones, 5(2), 35. https://doi.org/10.3390/drones5020035
  5. Bukowiecki, J., Rose, T., and Kage, H., 2021. Sentinel-2 data for precision agriculture?-A UAV-based assessment. Sensors, 21(8), 2861. https://doi.org/10.3390/s21082861
  6. Caparros-Santiago, J. A., Quesada-Ruiz, L. C., and Rodriguez-Galiano, V., 2023. Can land surface phenology from Sentinel-2 time-series be used as an indicator of Macaronesian ecosystem dynamics?. Ecological Informatics, 77, 102239. https://doi.org/10.1016/j.ecoinf.2023.102239
  7. Dhillon, M. S., Dahms, T., Kübert-Flock, C., Steffan-Dewenter, I., Zhang, J., and Ullmann, T., 2022. Spatiotemporal fusion modelling using STARFM: Examples of Landsat 8 and Sentinel-2 NDVI in Bavaria. Remote Sensing, 14(3), 677. https://doi.org/10.3390/rs14030677
  8. Di Gennaro, S. F., Dainelli, R., Palliotti, A., Toscano, P., and Matese, A., 2019. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sensing, 11(21), 2573. https://doi.org/10.3390/rs11212573
  9. Dong, T., Liu, J., Qian, B., He, L., Liu, J., and Wang, R., et al, 2020. Estimating crop biomass using leaf area index derived from Landsat 8 and Sentinel-2 data. ISPRS Journal of Photogrammetry and Remote Sensing, 168, 236-250. https://doi.org/10.1016/j.isprsjprs.2020.08.003
  10. Flood, N., 2017. Comparing Sentinel-2A and Landsat 7 and 8 using surface reflectance over Australia. Remote Sensing, 9(7), 659. https://doi.org/10.3390/rs9070659
  11. Frazier, A. E., and Hemingway, B. L., 2021. A technical review of planet smallsat data: Practical considerations for processing and using PlanetScope imagery. Remote Sensing, 13(19), 3930. https://doi.org/10.3390/rs13193930
  12. Hafeez, A., Husain, M. A., Singh, S. P., Chauhan, A., Khan, M. T., and Kumar, N., et al, 2023. Implementation of drone technology for farm monitoring & pesticide spraying: A review. Information Processing in Agriculture, 10(2), 192-203. https://doi.org/10.1016/j.inpa.2022.02.002
  13. Isaev, E., Kulikov, M., Shibkov, E., and Sidle, R. C., 2023. Bias correction of Sentinel-2 with unmanned aerial vehicle multispectral data for use in monitoring walnut fruit forest in western Tien Shan, Kyrgyzstan. Journal of Applied Remote Sensing, 17(2), 022204. https://doi.org/10.1117/1.JRS.17.022204
  14. Jiang, J., Johansen, K., Tu, Y. H., and McCabe, M. F., 2022. Multi-sensor and multi-platform consistency and interoperability between UAV, Planet CubeSat, Sentinel-2, and Landsat reflectance data. GIScience & Remote Sensing, 59(1), 936-958. https://doi.org/10.1080/15481603.2022.2083791
  15. Khaliq, A., Comba, L., Biglia, A., Ricauda Aimonino, D., Chiaberge, M., and Gay, P., 2019. Comparison of satellite and UAVbased multispectral imagery for vineyard variability assessment. Remote Sensing, 11(4), 436. https://doi.org/10.3390/rs11040436
  16. Kong, J., Ryu, Y., Huang, Y., Dechant, B., Houborg, R., and Guan, K., et al, 2021. Evaluation of four image fusion NDVI products against in-situ spectral-measurements over a heterogeneous rice paddy landscape. Agricultural and Forest Meteorology, 297, 108255. https://doi.org/10.1016/j.agrformet.2020.108255
  17. Kong, J., Ryu, Y., Jeong, S., Zhong, Z., Choi, W., and Kim, J., et al, 2023. Super resolution of historic Landsat imagery using a dual generative adversarial network (GAN) model with CubeSat constellation imagery for spatially enhanced longterm vegetation monitoring. ISPRS Journal of Photogrammetry and Remote Sensing, 200, 1-23. https://doi.org/10.1016/j.isprsjprs.2023.04.013
  18. Labib, S. M., and Harris, A., 2018. The potentials of Sentinel-2 and LandSat-8 data in green infrastructure extraction, using object based image analysis (OBIA) method. European Journal of Remote Sensing, 51(1), 231-240. https://doi.org/10.1080/22797254.2017.1419441
  19. Li, Y., Chen, J., Ma, Q., Zhang, H. K., and Liu, J., 2018. Evaluation of Sentinel-2A surface reflectance derived using Sen2Cor in North America. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 11(6), 1997-2021. https://doi.org/10.1109/JSTARS.2018.2835823
  20. Liaghat, S., and Balasundram, S. K., 2010. A review: The role of remote sensing in precision agriculture. American Journal of Agricultural and Biological Sciences, 5(1), 50-55. https://doi.org/10.3844/ajabssp.2010.50.55
  21. Mao, P., Ding, J., Jiang, B., Qin, L., and Qiu, G. Y., 2022. How can UAV bridge the gap between ground and satellite observations for quantifying the biomass of desert shrub community?. ISPRS Journal of Photogrammetry and Remote Sensing, 2, 361-376. https://doi.org/10.1016/j.isprsjprs.2022.08.021
  22. Maponya, M. G., Van Niekerk, A., and Mashimbye, Z. E., 2020. Pre-harvest classification of crop types using a Sentinel-2 time-series and machine learning. Computers and Electronics in Agriculture, 169, 105164. https://doi.org/10.1016/j.compag.2019.105164
  23. Martinez, J. L., Lucas-Borja, M. E., Plaza-Alvarez, P. A., Denisi, P., Moreno, M. A., and Hernández, D., et al, 2021. Comparison of satellite and drone-based images at two spatial scales to evaluate vegetation regeneration after post-fire treatments in a Mediterranean forest. Applied Sciences, 11(12), 5423. https://doi.org/10.3390/app11125423
  24. Mazzia, V., Comba, L., Khaliq, A., Chiaberge, M., and Gay, P., 2020. UAV and machine learning based refinement of a satellite-driven vegetation index for precision agriculture. Sensors, 20(9), 2530. https://doi.org/10.3390/s20092530
  25. Messina, G., Peña, J. M., Vizzari, M., and Modica, G. A., 2020. A comparison of UAV and satellites multispectral imagery in monitoring onion crop. An Application in the 'Cipolla Rossa di Tropea' (Italy). Remote Sensing, 12(20), 3424. https://doi.org/10.3390/rs12203424
  26. Naethe, P., Asgari, M., Kneer, C., Knieps, M., Jenal, A., and Weber, I., et al, 2023. Calibration and validation from ground to airborne and satellite level: Joint application of time-synchronous field spectroscopy, drone, aircraft and Sentinel-2 imaging. PFG-Journal of Photogrammetry, Remote Sensing and Geoinformation Science, 91(1), 43-58. https://doi.org/10.1007/s41064-022-00231-x
  27. Park, N. W., Kim, Y., and Kwak, G. H., 2019. An overview of theoretical and practical issues in spatial downscaling of coarse resolution satellite-derived products. Korean Journal of Remote Sensing, 35(4), 589-607. https://doi.org/10.7780/kjrs.2019.35.4.8
  28. Pla, M., Bota, G., Duane, A., Balagué, J., Curcó, A., and Gutiérrez, R., et al, 2019. Calibrating Sentinel-2 imagery with multispectral UAV derived information to quantify damages in Mediterranean rice crops caused by Western Swamphen (Porphyrio porphyrio). Drones, 3(2), 45. https://doi.org/10.3390/drones3020045
  29. Riihimäki, H., Luoto, M., and Heiskanen, J., 2019. Estimating fractional cover of tundra vegetation at multiple scales using unmanned aerial systems and optical satellite data. Remote Sensing of Environment, 224, 119-132. https://doi.org/10.1016/j.rse.2019.01.030
  30. Roy, D. P., Huang, H., Houborg, R., and Martins, V. S., 2021. A global analysis of the temporal availability of PlanetScope high spatial resolution multi-spectral imagery. Remote Sensing of Environment, 264, 112586. https://doi.org/10.1016/j.rse.2021.112586
  31. Ryu, J. H., Jeong, H., and Cho, J., 2020a. Performances of vegetation indices on paddy rice at elevated air temperature, heat stress, and herbicide damage. Remote Sensing, 12(16), 2654. https://doi.org/10.3390/rs12162654
  32. Ryu, J. H., Moon, H. D., Cho, J., Lee, K., Ahn, H., and So, K., et al, 2021. Response of structural, biochemical, and physiological vegetation indices measured from field-spectrometer and multi-spectral camera under drop stress caused by herbicide. Korean Journal of Remote Sensing, 37(6-1), 1559-1572. https://doi.org/10.7780/KJRS.2021.37.6.1.6
  33. Ryu, J. H., Na, S. I., and Cho, J., 2020b. Inter-Comparison of normalized difference vegetation index measured from different footprint sizes in cropland. Remote Sensing, 12(18), 2980. https://doi.org/10.3390/rs12182980
  34. Sishodia, R. P., Ray, R. L., and Singh, S. K., 2020. Applications of remote sensing in precision agriculture: A review. Remote Sensing, 12(19), 3136. https://doi.org/10.3390/rs12193136
  35. Song, X. P., Huang, W., Hansen, M. C., and Potapov, P., 2021. An evaluation of Landsat, Sentinel-2, Sentinel-1 and MODIS data for crop type mapping. Science of Remote Sensing, 3, 100018. https://doi.org/10.1016/j.srs.2021.100018
  36. Sonobe, R., Yamaya, Y., Tani, H., Wang, X., Kobayashi, N., and Mochizuki, K. I., 2018. Crop classification from Sentinel-2-derived vegetation indices using ensemble learning. Journal of Applied Remote Sensing, 12(2), 026019. https://doi.org/10.1117/1.JRS.12.026019
  37. Sozzi, M., Kayad, A., Marinello, F., Taylor, J., and Tisseyre, B., 2020. Comparing vineyard imagery acquired from Sentinel-2 and Unmanned Aerial Vehicle (UAV) platform. Oeno One, 54(2), 189-197. https://dx.doi.org/10.20870/oeno-one.2020.54.1.2557
  38. Verma, C., Tripathi, V. K., and Paikra, I., 2020. Effect of ridge and furrow system in soybean cultivation and feasibility of economics. International Journal of Chemical Studies, 8(3), 1755-1760. https://doi.org/10.22271/chemi.2020.v8.i3x.9451
  39. Zhang, J., Pan, Y., Tao, X., Wang, B., Cao, Q., and Tian, Y., et al, 2023. In-season mapping of rice yield potential at jointing stage using Sentinel-2 images integrated with high-precision UAS data. European Journal of Agronomy, 146, 126808. https://doi.org/10.1016/j.eja.2023.126808

Research Article

Korean J. Remote Sens. 2024; 40(5): 657-673

Published online October 31, 2024 https://doi.org/10.7780/kjrs.2024.40.5.1.19

Copyright © Korean Society of Remote Sensing.

Evaluation of Surface Reflectance and Vegetation Indices Measured by Sentinel-2 Satellite Using Drone Considering Crop Type and Surface Heterogeneity

Jae-Hyun Ryu1 , Hyun-Dong Moon2,3, Kyung-Do Lee4 , Jaeil Cho5,6, Ho-yong Ahn1*

1Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
2PhD Student, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
3PhD Student, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea
4Senior Researcher, National Agricultural Satellite Center, National Institute of Agricultural Sciences, Rural Development Administration, Wanju, Republic of Korea
5Professor, Department of Applied Plant Science, Chonnam National University, Gwangju, Republic of Korea
6Professor, BK21 FOUR Center for IT-Bio Convergence System Agriculture, Chonnam National University, Gwangju, Republic of Korea

Correspondence to:Ho-yong Ahn
E-mail: hyahn85@korea.kr

Received: September 30, 2024; Revised: October 17, 2024; Accepted: October 18, 2024

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Satellite data are used in precision agriculture to optimize crop management. Thus, the planting pattern (e.g., flat and ridge-furrow) and crop type should be accurately reflected in the data. The purpose of this study was to identify the spatial characteristics of errors in the surface reflectance (SR) and vegetation index (VI) obtained from the Sentinel-2 satellite. Drone data were used to evaluate the suitability of the Sentinel-2 satellite for precision agriculture applications in agricultural fields. Four VIs (normalized difference vegetation index, green normalized difference vegetation index, enhanced vegetation index, and normalized difference red edge index) were calculated. The rice paddy exhibited a homogeneous surface, whereas garlic/onion and soybean fields showed high surface heterogeneity because of the presence of ridges and furrows. The SR values of the rice paddy, measured at near-infrared (NIR) wavelength using the Sentinel-2 satellite, were saturated. The VIs derived from both satellite and drone data exhibited a correlation above 0.811 and normalized root mean square error (NRMSE) below 11.1% after bias correction. The garlic and onion fields exhibited the worst results, with a bias-corrected NRMSE for VIs ranging between 12.9% and 13.8%. The soybean field, where the vegetation covered the surface almost completely, exhibited the best relationship between the Sentinel-2 and drone data. The correlation coefficient and bias-corrected NRMSE of VIs for the combination of the two devices were above 0.969 and below 6.4%, respectively. In addition, the SR at NIR had a correlation of 0.925 and a slope of 1.157, unlike in the rice paddy. These results indicate that crop structure has a greater effect than the planting pattern. The absolute difference between the VIs measured by the satellite and drone is influenced by the degree of surface heterogeneity. The errors are more pronounced at the farm-land edges. Our study contributes to a better understating of the characteristics of Sentinel-2 data for use in agricultural fields.

Keywords: UAV, Surface reflectance, Homogeneity, Crop

1. Introduction

Precision agriculture incorporates remote-sensing techniques to optimize crop management (Bansod et al., 2017; Liaghat and Balasundram, 2010). A sensor-based monitoring system can non-destructively diagnose the conditions of a crop (Ryu et al., 2020a). The multispectral camera onboard a drone has a sampling distance of about several centimeters at a height of 150 m from the ground. Thus, pesticide spraying, variable fertilization, and irrigation management can be conducted using drone data (Hafeez et al., 2023; Sishodia et al., 2020). However, regularly capturing wide areas using drones consumes the labor force and requires economic efforts (Pla et al., 2019). Satellite data can be an alternative to overcome these disadvantages. Because satellites regularly capture images of crop-growing areas, information such as the planting time, harvesting time, and spatial variation depending on the crop’s conditions can be produced and provided to the farmer (Ali et al., 2021).

Abundant satellite data are freely available for monitoring crop growth and development in the field (Labib and Harris, 2018). The Sentinel-2 satellite, which is a representative agricultural observation satellite, has various bands from visible to shortwave infrared, and the spatial image resolution ranges from 10 to 60 m. Moreover, it is possible to observe the same area every 5 days (Song et al., 2021). The growth parameters, such as the leaf area index, biomass, and yield, can be estimated using data from the Sentinel-2 satellite (Bansod et al., 2017; Dong et al., 2020; Mao et al., 2022), and crop types can be classified (Maponya et al., 2020; Sonobe et al., 2018). The improved spatiotemporal resolution and surface reflectance (SR) at various wavelengths facilitate precision agriculture using satellite data (Kong et al., 2023). However, there are drawbacks regarding the usability of the freely available satellite data for precision agriculture.

The temporal resolution of Sentinel-2 satellite data can be unsatisfactory because of cloud and cloud shadow effects (Bukowiecki et al., 2021; Caparros-Santiago et al., 2023). If information about a specific period is not obtained for a long time, using satellite data for precision agriculture may be challenging. To minimize these limitations, fused satellite data have been produced as a harmonized SR product of satellites such as Landsat-8/9 and Sentinel-2 (Dhillon et al., 2022; Kong et al., 2021). Furthermore, more than 130 microsatellites have been launched for the PlanetScope constellation (Frazier et al., 2021), which makes an effort to observe specific areas every day (Roy et al., 2021). However, this constellation cannot escape the influence of clouds and cloud shadows, which are inherent limitations of satellite data.

The crop conditions can be monitored using the multispectral camera onboard a drone during specific growth periods for which satellite images have not been acquired (Martinez et al., 2021). Drone images can provide spatially detailed information about crops compared to satellite images (Messina et al., 2020). In addition, satellite and drone data can be jointly used after converting the spatial resolution of the drone image to match that of the satellite image (Jiang et al., 2022; Zhang et al., 2023). However, the information obtained from satellite data can be biased because of the low spatial resolution associated with surface heterogeneity.

Crops are cultivated using various planting patterns (e.g., flat and ridge-furrow), depending on the crop type, soil type, slope of the field, and climate. The flat planting pattern has advantages in managing water and nutrients, and the crops are easy to grow and harvest. The ridge-furrow planting pattern is used in areas with heavy precipitation or poor drainage. The height difference between the ridges and furrows prevents the crops from being submerged and avoids damage due to excess moisture (Verma et al., 2020). These planting patterns affect surface heterogeneity. Satellite data can provide inaccurate information about crop conditions when crops are planted in rows (Mazzia et al., 2020).

However, the degree of soil coverage can vary depending on the leaf structure of the crops, even when the ridge-furrow planting pattern is used. Therefore, before using drones and satellites together, the satellite and drone data should be compared to determine whether the information about crop conditions is accurate (Zhang et al., 2023). Most previous studies have focused on evaluating the normalized difference vegetation index (NDVI) measured by satellites (Verma et al., 2020). However, multispectral imagers include not only red and near-infrared (NIR) bands but also blue, green, and red-edge bands. Thus, the SR should be evaluated in various wavelength bands, and vegetation indices (VIs) should be calculated from the SR.

The main purpose of this study is to identify the characteristics of errors in the SR and VIs measured by the Sentinel-2 satellite depending on the crop type, planting pattern, and surface heterogeneity in agricultural fields. Drone-based images, which have a higher spatial resolution than satellite images, were used as a reference to assess the Sentinel-2 satellite data. Moreover, the suitability and limitations of the Sentinel-2 satellite data for precision agriculture were investigated for rice paddy, garlic, onion, and soybean crops.

2. Materials and Methods

2.1. Study Area

This study was conducted in three counties located in South Korea (Fig. 1a). One measurement site was a rice paddy located in Jeollanam-do Agricultural Research & Extension Services (Naju County, latitude of 35.0275°N, longitude of 126.8209°E). Rice was cultivated in 2020 and 2021. The rice was transplanted in early June, and the heading stage was observed in mid to late August. The rice was harvested from late September to early October. The total target study area (blue polygon in Fig. 1b), which comprised five paddies, was approximately 17,000 m2. Given that the size of each paddy is approximately 30 × 95 m, the pixels located in the center of each rice paddy field have a fairly homogeneous footprint (Fig. 2a).

Figure 1. Study area including heterogeneous and homogeneous footprints. The blue, red, and orange grids indicate the region of interest (based on 10 m). (a) Measurement sites. (b) Rice paddy with a homogeneous footprint. (c) Garlic and onion fields with heterogeneous footprints; this upland field includes many furrows and ridges. (d) Paddy for cultivating soybean; this paddy includes many furrows and ridges.
Figure 2. Zoomed-in field images measured by drone. (a) Rice paddy. (b) Garlic and onion field including ridges and furrows. (c) Soybean field including ridges and furrows.

The second measurement site was an upland field located in the National Institute of Crop Science (Muan County, latitude of 34.9671°N, longitude of 126.4528°E). Garlic and onion were cultivated in this area during the 2019–2020 and 2020–2021 growing seasons. The growing period for garlic and onion extended from October to late May/early June. The upland field featured numerous furrows and ridges, which resulted in a fairly heterogeneous footprint (Fig. 2b). In addition, bare soil and a concrete road were present at the edge of the study area. The total area was approximately 8,400 m2 (red polygon in Fig. 1c).

The third measurement site was a field located in Gimje County (latitude of 35.7517°N, longitude of 126.8136°E) for cultivating soybeans. This area consisted of multiple furrows and ridges to mitigate damage from excess moisture because of poor drainage (Fig. 2c). Soybean was cultivated in this field from June 3, 2022, to November 2, 2022. Unlike the garlic and onion fields, the leaves of the soybean plants covered the entire field in August and September, which made the soil barely visible. The target area (orange polygon in Fig. 1d) encompassed approximately 27,600 m2.

2.2. Dataset

Satellite images from Sentinel-2, one of the most prominent satellites used for crop data monitoring, were collected using the Sentinel Hub interface. The coordinate system and projection were set to WGS 1984 UTM Zone 52N and Transverse Mercator, respectively, to match the drone’s raster data. We downloaded the Sentinel-2 Level-2A orthoimage bottom-of-atmosphere-corrected reflectance data, which eliminated atmospheric effects. Subsequently, contaminated pixels caused by clouds and cloud shadows were filtered using scene classification and cloud information. Pixels with a cloud probability value above 0 were masked (Fig. 3).

Figure 3. Flowchart for comparing the surface reflectance and vegetation indices measured by the satellite and drone.
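
For illustration, a minimal sketch of the cloud filtering step described above, assuming the Level-2A reflectance band and the cloud probability layer have already been downloaded as co-registered numpy arrays; the array names and sizes are hypothetical, not from this study:

```python
import numpy as np

def mask_clouds(reflectance: np.ndarray, cloud_probability: np.ndarray) -> np.ndarray:
    """Mask (set to NaN) every pixel whose cloud probability is above 0."""
    masked = reflectance.astype("float64").copy()
    masked[cloud_probability > 0] = np.nan
    return masked

# Hypothetical 10 x 10 reflectance tile with a small cloudy patch.
sr = np.random.uniform(0.0, 0.6, size=(10, 10))
cloud_prob = np.zeros((10, 10))
cloud_prob[2:4, 5:7] = 35.0          # cloud probability in percent
sr_clean = mask_clouds(sr, cloud_prob)
```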

The SR of the crops (rice, garlic, onion, and soybean) was measured using a multispectral camera mounted on a drone (Fig. 3). The multispectral camera (RedEdge-MX Dual; MicaSense, Inc., Seattle, WA, USA) had 10 bands (Table 1). Drone measurements were conducted around solar noon on clear days. The flight altitude was set to 30 m or 50 m, resulting in a ground sample distance of 2.08 cm or 3.47 cm, respectively. The measured data were processed using the Pix4Dmapper software (version 4.3.31), and information from seven ground control points, obtained using a GRX2 GNSS receiver (SOKKIA Corporation, Olathe, KS, USA), was used for geometric correction. A calibration reflectance panel provided by the manufacturer of the multispectral camera was used in Pix4Dmapper to convert the digital numbers of the drone raster data to reflectance. Moreover, to improve the accuracy of the measured values, a radiometric correction was conducted for the rice paddy and the garlic and onion field using four homogeneous calibration targets with reflectance values of 3%, 21%, 32%, and 51%. Owing to their high spatial resolution, the processed spectral reflectance drone images showed the crop growth status, planting pattern, and artificial structures in detail.
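
A common way to use such reference targets is the empirical line method; the sketch below assumes that approach, and the measured band values are placeholders rather than values from this study:

```python
import numpy as np

# Known reflectances of the four calibration targets (3%, 21%, 32%, 51%).
known_reflectance = np.array([0.03, 0.21, 0.32, 0.51])
# Hypothetical band values extracted from the drone mosaic over each target.
measured = np.array([0.05, 0.24, 0.36, 0.55])

# Per-band linear fit: corrected = gain * measured + offset.
gain, offset = np.polyfit(measured, known_reflectance, deg=1)

def radiometric_correction(band: np.ndarray) -> np.ndarray:
    """Apply the fitted empirical line to one reflectance band."""
    return gain * band + offset
```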

Table 1. Information on the multispectral sensors onboard the drone and Sentinel-2 satellite.

Band | Sentinel-2/MSI band | Central wavelength/Bandwidth (nm) | Spatial resolution (m) | Drone/RedEdge-MX Dual band | Central wavelength/Bandwidth (nm)
Blue/aerosol | 1 | 443/36 | 60 | 1 | 444/28
Blue | 2 | 490/96 | 10 | 2 | 475/32
Green | – | – | – | 3 | 531/14
Green | 3 | 560/45 | 10 | 4 | 560/27
Red | – | – | – | 5 | 650/16
Red | 4 | 665/39 | 10 | 6 | 668/14
Red edge | 5 | 705/20 | 20 | 7 | 705/10
Red edge | – | – | – | 8 | 717/12
Red edge | 6 | 740/18 | 20 | 9 | 740/18
Red edge | 7 | 783/28 | 20 | – | –
NIR | 8 | 842/141 | 10 | 10 | 842/57
NIR | 8A | 865/33 | 20 | – | –
WV | 9 | 945/27 | 60 | – | –
SWIR/Cirrus | 10 | 1375/76 | 60 | – | –
SWIR | 11 | 1610/142 | 20 | – | –
SWIR | 12 | 2190/240 | 20 | – | –

MSI: multispectral imager, NIR: near-infrared, WV: water vapor, SWIR: shortwave-infrared.



2.3. Matchup Data

The SR and VIs were compared to identify the characteristics of the data measured by the satellite and drone, using the following steps. Six bands with central wavelengths of 490, 560, 665, 705, 740, and 842 nm (based on Sentinel-2) were selected to compare the imagery obtained from the two devices. Despite the different multispectral sensors, these bands have similar central wavelengths (Table 1). Four VIs (the NDVI, green normalized difference vegetation index (GNDVI), enhanced vegetation index (EVI), and normalized difference red-edge index (NDRE)) were calculated using the SR measured by the Sentinel-2 satellite and drone (Table 2). These VIs utilize five bands, with wavelengths of 490, 560, 665, 705, and 842 nm based on the Sentinel-2 satellite. The four VIs theoretically range from –1 to 1, but over soil or vegetation the values are typically between 0 and 1; values closer to 0 indicate sparse vegetation, while values near 1 signify dense and healthy vegetation. Next, the spatial resolution of the SR and VIs measured by the drone was converted to 10 m or 20 m to match the spatial resolution of the Sentinel-2 data. In the case of Sentinel-2, the SR at the blue, green, red, and NIR bands has a spatial resolution of 10 m, whereas the red-edge bands have a spatial resolution of 20 m. When upscaling the drone image to the scale of the Sentinel-2 data, the resampling method calculated the average (Assmann et al., 2020), excluding no-data values, as sketched below. The upscaled drone and Sentinel-2 images were then clipped using a region-of-interest shapefile (Figs. 1b–d). Finally, the VIs measured by the satellite and drone were named VISatellite and VIDrone, respectively.
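
A minimal sketch of the average-resampling step, assuming the drone raster has already been aligned to the Sentinel-2 grid so that each 10 m (or 20 m) satellite pixel corresponds to an integer block of drone pixels; the block size and array contents are illustrative:

```python
import numpy as np

def upscale_mean(drone: np.ndarray, block: int) -> np.ndarray:
    """Average the non-NaN drone pixels inside each block x block window."""
    h, w = drone.shape
    h, w = h - h % block, w - w % block            # drop incomplete edge blocks
    tiles = drone[:h, :w].reshape(h // block, block, w // block, block)
    return np.nanmean(tiles, axis=(1, 3))          # no-data (NaN) values excluded

# Hypothetical drone VI raster upscaled to a 2 x 2 grid of coarse pixels.
vi_drone = np.random.uniform(0.1, 0.9, size=(960, 960))
vi_drone[:5, :5] = np.nan                          # pretend a no-data patch
vi_upscaled = upscale_mean(vi_drone, block=480)
```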

Table 2. Vegetation indices calculated from the surface reflectance measured by the Sentinel-2 satellite and drone.

VI | Equation | Sentinel-2 bands | Drone bands
NDVI | (RNIR – RRed) / (RNIR + RRed) | B4, B8 | B6, B10
GNDVI | (RNIR – RGreen) / (RNIR + RGreen) | B3, B8 | B4, B10
EVI | 2.5 × (RNIR – RRed) / (RNIR + 6 × RRed – 7.5 × RBlue + 1) | B2, B4, B8 | B2, B6, B10
NDRE | (RNIR – RRededge) / (RNIR + RRededge) | B5, B8 | B7, B10

VI: vegetation index, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index, R: reflectance, NIR: near-infrared, B: band.
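
The four indices in Table 2 can be written directly as functions of the matched reflectance bands; a minimal numpy sketch, in which the r_* arguments stand for the reflectance arrays of the corresponding Sentinel-2 or drone bands:

```python
import numpy as np

def ndvi(r_nir: np.ndarray, r_red: np.ndarray) -> np.ndarray:
    return (r_nir - r_red) / (r_nir + r_red)

def gndvi(r_nir: np.ndarray, r_green: np.ndarray) -> np.ndarray:
    return (r_nir - r_green) / (r_nir + r_green)

def evi(r_nir: np.ndarray, r_red: np.ndarray, r_blue: np.ndarray) -> np.ndarray:
    # Unlike the normalized indices, EVI includes constants (2.5, 6, 7.5, 1).
    return 2.5 * (r_nir - r_red) / (r_nir + 6.0 * r_red - 7.5 * r_blue + 1.0)

def ndre(r_nir: np.ndarray, r_rededge: np.ndarray) -> np.ndarray:
    return (r_nir - r_rededge) / (r_nir + r_rededge)
```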



Matchup data for the six bands and four VIs were established between the imagery obtained from the satellite and that of the drone. Although more drone measurements were conducted, only six image pairs each from the rice paddy and the garlic and onion field could be matched because of weather conditions, and four image pairs were matched for the soybean field. Seven pairs were measured on the same date; in addition, three pairs with a one-day difference and six pairs with a two-day difference between the Sentinel-2 and drone measurement dates were used (Table 3).

Table 3. Measurement information for the Sentinel-2 satellite and the drone.

No. | Site | Satellite date | Drone date | Time difference | Air temperature, satellite (°C) | Air temperature, drone (°C) | Relative humidity, satellite (%) | Relative humidity, drone (%)
1 | Naju | Aug. 15, 2020 | Aug. 14, 2020 | 1 day | 31.5 | 32.0 | 66.1 | 73.0
2 | Naju | Sep. 4, 2020 | Sep. 4, 2020 | 0 day | 27.9 | 28.9 | 60.2 | 56.3
3 | Naju | Sep. 24, 2020 | Sep. 24, 2020 | 0 day | 24.0 | 25.5 | 58.2 | 54.1
4 | Naju | Jul. 21, 2021 | Jul. 21, 2021 | 0 day | 31.8 | 31.6 | 57.4 | 59.0
5 | Naju | Aug. 10, 2021 | Aug. 10, 2021 | 0 day | 30.0 | 29.7 | 66.2 | 65.4
6 | Naju | Aug. 20, 2021 | Aug. 19, 2021 | 1 day | 28.4 | 28.8 | 61.6 | 57.0
7 | Muan | Mar. 18, 2020 | Mar. 20, 2020 | 2 days | 15.6 | 13.9 | 34.4 | 33.4
8 | Muan | Apr. 27, 2020 | Apr. 29, 2020 | 2 days | 17.0 | 21.0 | 32.0 | 34.4
9 | Muan | May 22, 2020 | May 20, 2020 | 2 days | 23.6 | 19.0 | 47.7 | 60.6
10 | Muan | Mar. 18, 2021 | Mar. 18, 2021 | 0 day | 17.4 | 15.6 | 51.1 | 58.0
11 | Muan | May 12, 2021 | May 14, 2021 | 2 days | 25.3 | 26.0 | 67.5 | 65.6
12 | Muan | May 22, 2021 | May 24, 2021 | 2 days | 22.1 | 21.3 | 70.6 | 71.5
13 | Gimje | Jul. 1, 2022 | Jun. 30, 2022 | 1 day | 30.9 | 29.4 | 70.7 | 81.5
14 | Gimje | Sep. 9, 2022 | Sep. 7, 2022 | 2 days | 25.6 | 24.3 | 79.9 | 90.1
15 | Gimje | Oct. 14, 2022 | Oct. 14, 2022 | 0 day | 19.8 | 19.7 | 79.6 | 80.8
16 | Gimje | Oct. 19, 2022 | Oct. 19, 2022 | 0 day | 14.6 | 15.2 | 51.3 | 51.4

Air temperature and relative humidity values are those at the measurement times of the Sentinel-2 satellite and drone.



2.4. Statistical Analysis

The correlation coefficient, absolute bias, root mean square error (RMSE), normalized RMSE (NRMSE), and bias-corrected NRMSE were computed to compare the satellite data with the upscaled drone data. The correlation coefficient was calculated using the SciPy library (version 1.7.0) of the Python language (version 3.9.16). The absolute bias, which describes the average deviation between satellite and drone pixels, was calculated as the mean of the absolute differences between the satellite and drone pixels (Eq. 1). The RMSE quantifies the average error between the drone and satellite data (Eq. 2).

$$\text{Absolute bias} = \frac{\sum_{i=1}^{n}\left|S_{i}-D_{i}\right|}{n} \tag{1}$$

$$\text{RMSE} = \sqrt{\frac{\sum_{i=1}^{n}\left(S_{i}-D_{i}\right)^{2}}{n}} \tag{2}$$

where Si, Di, and n denote the satellite pixel value, the upscaled drone pixel value, and the total number of data points, respectively. The NRMSE was calculated using the difference between the maximum and minimum values of the satellite data on the y-axis (Eq. 3); it expresses the RMSE as a percentage of the range of the variable.

$$\text{NRMSE} = \frac{\text{RMSE}}{S_{maximum}-S_{minimum}} \times 100 \tag{3}$$

where Smaximum and Sminimum are the maximum and minimum values of the satellite data, respectively. To analyze the errors remaining after the bias between the satellite and drone data was removed, a bias correction was performed using linear regression to minimize the bias between the two datasets, and the resulting score was defined as the bias-corrected NRMSE.
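
A minimal sketch of these statistics, assuming the satellite and upscaled drone values are numpy arrays; the bias correction here regresses the drone values onto the satellite values with SciPy, which is our reading of the paper's linear-regression step:

```python
import numpy as np
from scipy import stats

def compare(satellite: np.ndarray, drone: np.ndarray) -> dict:
    s, d = satellite.ravel(), drone.ravel()
    ok = ~(np.isnan(s) | np.isnan(d))              # keep valid pairs only
    s, d = s[ok], d[ok]

    r, _ = stats.pearsonr(s, d)                    # correlation coefficient
    abs_bias = np.mean(np.abs(s - d))              # Eq. (1)
    rmse = np.sqrt(np.mean((s - d) ** 2))          # Eq. (2)
    nrmse = rmse / (s.max() - s.min()) * 100.0     # Eq. (3)

    # Bias correction: map the drone values onto the satellite scale, then
    # recompute the NRMSE on the residual errors.
    fit = stats.linregress(d, s)
    d_corrected = fit.slope * d + fit.intercept
    rmse_bc = np.sqrt(np.mean((s - d_corrected) ** 2))
    nrmse_bc = rmse_bc / (s.max() - s.min()) * 100.0

    return {"r": r, "absolute_bias": abs_bias, "rmse": rmse,
            "nrmse_pct": nrmse, "bias_corrected_nrmse_pct": nrmse_bc}
```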

3. Results

3.1. Spectral Reflectance

The SRSatellite was evaluated using the SRDrone in the rice paddy (Fig. 4). The correlation coefficient ranged from 0.477 to 0.859 across the six bands (blue, green, red, red-edge at 705 nm (hereafter, rededge@705 nm), red-edge at 740 nm (hereafter, rededge@740 nm), and NIR). In the rice paddy, the correlation coefficient was higher in the visible range than at the NIR wavelength, which was consistent with the other statistical scores. The NRMSE ranged from 10.5% to 12.2% from the visible to the rededge@705 nm bands, whereas it was 35.8% at rededge@740 nm and 30.3% at NIR. The SRSatellite at the rededge@740 nm and NIR wavelengths was underestimated compared with the SRDrone. In particular, the SR values at NIR showed a narrower range in the satellite data (0.271 to 0.538) than in the drone data (0.204 to 0.678) (Fig. 4f). In other words, the NIR reflectance measured by the satellite was less sensitive than that measured by the drone in the rice paddy.

Figure 4. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the rice paddy: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

The SRSatellite was evaluated using data measured by the drone in the garlic and onion field (Fig. 5). The correlation coefficients between the SR values measured by the Sentinel-2 satellite and the drone ranged from 0.418 to 0.649, lower than those in the rice paddy (0.477 to 0.859). The presence of furrows and ridges in the garlic and onion field caused a heterogeneous footprint. In particular, the bias-corrected NRMSE from the visible to the rededge@705 nm bands was higher in the garlic and onion field than in the rice paddy (Table 4). On the other hand, the NRMSE at the rededge@740 nm and NIR wavelengths was lower in the garlic and onion field than in the rice paddy because of the underestimation of the SRSatellite in the paddy.

Figure 5. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the garlic and onion field, including ridges and furrows: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

The highest statistical scores for the SR at the red-edge and NIR wavelengths measured by the Sentinel-2 satellite and drone corresponded to the soybean field (Fig. 6). Unlike in the rice paddy and the garlic and onion field, the correlation coefficient was above 0.806 and the absolute bias was below 0.043. Although the NRMSE and bias-corrected NRMSE at visible wavelengths in the soybean field were slightly worse than those of the rice paddy (Figs. 6a–c), they were adequate at the rededge@740 nm and NIR wavelengths (Figs. 6e, f). In addition, the slope between the Sentinel-2 satellite and drone data was close to 1. These values indicate that the relationship between the Sentinel-2 satellite and drone data was strongest in the soybean field. All statistical results for the spectral reflectance comparison between Sentinel-2 and the drone are summarized in Table 4.

Figure 6. Scatter plots for the combination of drone-based surface reflectance (SRDrone) and satellite-based surface reflectance (SRSatellite) in the soybean field: (a) blue wavelength, (b) green wavelength, (c) red wavelength, (d) red-edge wavelength at 705 nm, (e) red-edge wavelength at 740 nm, and (f) near-infrared wavelength.

Table 4. Statistical results for the surface reflectance and vegetation indices measured by the drone and satellite in the rice paddy, garlic and onion, and soybean fields.

Crop | Index | R | Slope | Absolute bias | RMSE | NRMSE (%) | Bias-corrected NRMSE (%)
Rice paddy | Blue | 0.903 | 0.859 | 0.019 | 0.021 | 11.6 | 6.1
Rice paddy | Green | 0.877 | 0.773 | 0.018 | 0.023 | 11.5 | 8.5
Rice paddy | Red | 0.908 | 0.759 | 0.017 | 0.024 | 10.5 | 8.3
Rice paddy | Rededge@705 nm | 0.913 | 0.792 | 0.022 | 0.029 | 12.2 | 9.7
Rice paddy | Rededge@740 nm | 0.569 | 0.303 | 0.025 | 0.072 | 35.8 | 17.9
Rice paddy | NIR | 0.477 | 0.273 | 0.065 | 0.081 | 30.3 | 16.9
Rice paddy | NDVI | 0.932 | 0.801 | 0.062 | 0.078 | 10.3 | 7.2
Rice paddy | GNDVI | 0.927 | 0.805 | 0.057 | 0.070 | 11.3 | 7.1
Rice paddy | EVI | 0.811 | 0.603 | 0.085 | 0.105 | 15.2 | 11.1
Rice paddy | NDRE | 0.939 | 0.780 | 0.061 | 0.077 | 13.3 | 8.5
Garlic and onion | Blue | 0.522 | 0.452 | 0.033 | 0.041 | 21.1 | 11.3
Garlic and onion | Green | 0.566 | 0.436 | 0.033 | 0.041 | 21.7 | 13.7
Garlic and onion | Red | 0.647 | 0.432 | 0.041 | 0.051 | 22.5 | 14.1
Garlic and onion | Rededge@705 nm | 0.418 | 0.316 | 0.036 | 0.046 | 28.2 | 18.4
Garlic and onion | Rededge@740 nm | 0.550 | 0.599 | 0.034 | 0.042 | 20.8 | 18.4
Garlic and onion | NIR | 0.649 | 0.599 | 0.039 | 0.049 | 18.5 | 15.5
Garlic and onion | NDVI | 0.707 | 0.416 | 0.119 | 0.148 | 26.2 | 13.8
Garlic and onion | GNDVI | 0.649 | 0.415 | 0.095 | 0.116 | 24.4 | 13.2
Garlic and onion | EVI | 0.745 | 0.474 | 0.079 | 0.100 | 20.9 | 13.1
Garlic and onion | NDRE | 0.624 | 0.378 | 0.072 | 0.094 | 23.3 | 12.9
Soybean | Blue | 0.850 | 1.053 | 0.018 | 0.023 | 14.6 | 10.2
Soybean | Green | 0.881 | 0.993 | 0.017 | 0.022 | 11.6 | 9.0
Soybean | Red | 0.862 | 0.822 | 0.017 | 0.024 | 10.5 | 9.6
Soybean | Rededge@705 nm | 0.883 | 0.783 | 0.024 | 0.031 | 14.3 | 12.0
Soybean | Rededge@740 nm | 0.806 | 1.092 | 0.043 | 0.052 | 18.7 | 16.9
Soybean | NIR | 0.925 | 1.157 | 0.039 | 0.051 | 12.6 | 11.1
Soybean | NDVI | 0.969 | 0.999 | 0.038 | 0.055 | 6.6 | 6.4
Soybean | GNDVI | 0.975 | 1.058 | 0.049 | 0.059 | 8.5 | 5.6
Soybean | EVI | 0.977 | 1.105 | 1.105 | 0.071 | 8.3 | 6.0
Soybean | NDRE | 0.988 | 0.982 | 0.982 | 0.039 | 5.3 | 4.6

R: correlation coefficient, NIR: near-infrared, NDVI: normalized difference vegetation index, GNDVI: green normalized difference vegetation index, EVI: enhanced vegetation index, NDRE: normalized difference red-edge index.



3.2. Vegetation Index

Five bands (blue, green, red, rededge@705 nm, and NIR) were used to calculate the four vegetation indices (NDVI, GNDVI, EVI, and NDRE). The differences between the VIs obtained using Sentinel-2 satellite and drone data were evaluated for the rice paddy, garlic and onion, and soybean fields.

In the rice paddy, the NRMSE of the four VIs from the combination of drone and Sentinel-2 satellite data ranged from 10.3% to 15.2%. NDVI and GNDVI showed similar statistical performance (Figs. 7a, b). After bias correction, the NRMSE of NDVI and GNDVI decreased to 7.2% and 7.1%, respectively. Similar characteristics were observed for NDRE (Fig. 7d). However, the scores for EVI were lower than those for NDVI, GNDVI, and NDRE. The bias-corrected NRMSE of EVI was 11.1%, and the slope between EVIDrone and EVISatellite was 0.603 (Fig. 7c).

Figure 7. Scatter plots for the combination of VIs obtained by the drone and Sentinel-2 satellite in the rice paddy: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

All statistical parameters comparing the VIs measured by the Sentinel-2 satellite and the drone in the garlic and onion field were worse than those in the rice paddy (Fig. 8). In particular, the RMSE and NRMSE values for the four VIs were higher in the garlic and onion field because of the bias between the two measurements. VISatellite was overestimated compared with VIDrone at low values but underestimated at high values. After bias correction, the NRMSE ranged from 12.9% to 13.8%. The distribution of each VI was clustered by measurement date in the rice paddy (Fig. 7), whereas the values were mixed in the garlic and onion field (Fig. 8).

Figure 8. Scatter plots for the combination of drone-based VIs and Sentinel-2 satellite-based VIs for the garlic and onion field: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

The four VIs of the soybean field exhibited the best statistical scores compared with those of the rice paddy and the garlic and onion field. In particular, the slope of NDVI was 0.999, and the correlation coefficient and RMSE were 0.969 and 0.055, respectively (Fig. 9a). Even without bias correction between NDVIDrone and NDVISatellite, the NRMSE was only 6.6%, similar to the bias-corrected NRMSE (6.4%), as a result of the low absolute bias (0.038). These high scores were also observed for GNDVI, EVI, and NDRE: the NRMSE for the three VIs ranged from 5.3% to 8.5% and decreased to 4.6%–6.0% after bias correction. All statistical results for the vegetation index comparison between Sentinel-2 and the drone are summarized in Table 4.

Figure 9. Scatter plots for the combination of VIs measured by the Sentinel-2 satellite and drone in the soybean field: (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.

3.3. Error Characteristics of the Vegetation Index

The distribution of the absolute differences (Sentinel-2 satellite data minus bias-corrected drone data) for the four VIs was examined in the three areas. The bias correction of VIDrone was conducted regionally. The polygon in Fig. 10 represents the cropland, including the road and facilities at its edge. In the rice paddy, the absolute difference of the normalization-based VIs, such as NDVI, GNDVI, and NDRE, was low in the middle of each field. However, pixels covering farm roads showed high absolute differences (Figs. 10a–d). In the garlic and onion field, the spatial distribution of the absolute differences was complex because of the influence of furrows and ridges (Figs. 10e–h). The absolute difference at the center of the soybean field was the lowest. Unlike in the rice paddy, the spatial pattern of EVI in the soybean field was similar to that of the other VIs (Figs. 10i–l).

Figure 10. Absolute difference images between the vegetation indices measured by the satellite and drone (Sentinel-2 satellite minus bias-corrected drone). The polygon indicates the cropland. (a)–(d) NDVI, GNDVI, EVI, and NDRE in the rice paddy; (e)–(h) NDVI, GNDVI, EVI, and NDRE in the garlic and onion field; and (i)–(l) NDVI, GNDVI, EVI, and NDRE in the soybean field.

The difference between VISatellite and VIDrone was analyzed with respect to surface heterogeneity. VIDrone was bias-corrected to eliminate the bias with respect to VISatellite, and the standard deviation of the VIDrone pixels located within each Sentinel-2 pixel was then calculated. For NDVI, the difference between the maximum and minimum values in each range increased with the standard deviation (Fig. 11a). The same trend was observed for GNDVI (Fig. 11b). NDVI and GNDVI reflect the structural characteristics of vegetation as the crop grows and develops; thus, the error trends according to the standard deviation of VIDrone appeared conspicuously. For EVI and NDRE, the bin with a standard deviation of 0 to 0.04 exhibited the smallest difference between the maximum and minimum values in the box plot (Figs. 11c, d). Nonetheless, the difference was large when the standard deviation of VIDrone was above 0.12.

Figure 11. Box plots of the differences between the vegetation index based on Sentinel-2 satellite data and the bias-corrected vegetation index measured by the drone, grouped by the standard deviation (SD) of the VIDrone pixels located within one Sentinel-2 pixel. Max. and Min. are the maximum and minimum values in each range. (a) NDVI, (b) GNDVI, (c) EVI, and (d) NDRE.
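
A minimal sketch of this heterogeneity analysis, assuming the bias-corrected drone VI raster is aligned with the Sentinel-2 grid; the block size, array contents, and 0.04-wide SD bins are illustrative:

```python
import numpy as np

def blockwise(arr: np.ndarray, block: int, func) -> np.ndarray:
    """Apply func (e.g., np.nanstd) over each block x block window."""
    h, w = arr.shape
    h, w = h - h % block, w - w % block
    tiles = arr[:h, :w].reshape(h // block, block, w // block, block)
    return func(tiles, axis=(1, 3))

vi_drone = np.random.uniform(0.2, 0.9, size=(800, 800))    # bias-corrected VIDrone
vi_satellite = np.random.uniform(0.2, 0.9, size=(8, 8))    # matching Sentinel-2 pixels

sd = blockwise(vi_drone, 100, np.nanstd)                   # within-pixel heterogeneity
diff = vi_satellite - blockwise(vi_drone, 100, np.nanmean) # satellite minus drone

# Group the differences into SD bins (0-0.04, 0.04-0.08, ...) for box plots.
bin_index = np.digitize(sd.ravel(), np.arange(0.04, 0.20, 0.04))
```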

4. Discussion

In previous studies, the VIs measured by the Sentinel-2 satellite have been evaluated using drone images. The NDVISatellite measured by Sentinel-2 exhibited different correlations with NDVIDrone depending on surface heterogeneity (Assmann et al., 2020; Di Gennaro et al., 2019). Naethe et al. (2023) reported that the NDVI produced by Sentinel-2 had an R2 of 0.890 and an RMSE of 0.123 in relation to NDVIDrone. Ryu et al. (2020b) compared the Sentinel-2 NDVI and NDVIDrone and obtained an R2 of 0.817 and an RMSE of 0.086 (n=135). Our results were similar to those of previous studies: the RMSE values of NDVI were 0.078 (n=790), 0.148 (n=480), and 0.055 (n=1,044) for the rice paddy, the garlic and onion field, and the soybean field, respectively.

The slopes between VISatellite and VIDrone were below 1.0 for the rice paddy and the garlic and onion field. VISatellite was overestimated at low values and underestimated at high values (Figs. 7 and 8). These error characteristics are consistent with previous studies: the Sentinel-2 NDVI was overestimated in open terrain and grassland (low NDVI values) but underestimated in forest (high NDVI values) (Isaev et al., 2023). The NDVIDrone values were higher than the NDVISatellite values in the upland field, which also agrees with a previous study. The differences between VISatellite and VIDrone may be caused by the spatial resolution (Bollas et al., 2021). Riihimäki et al. (2019) reported that the spatial resolution of satellite data is related to the data distribution. In the same area, the range between the minimum and maximum NDVI values is narrower for satellite images with low spatial resolution than for those with high spatial resolution (Park et al., 2019). Accordingly, the VIDrone values had a wider range under heterogeneous surface conditions (Figs. 7 and 8). The SR was therefore analyzed to identify the cause of these error characteristics of the VIs.

The SR at the rededge@740 nm and NIR wavelengths measured by the Sentinel-2 satellite has a theoretical range of 0 to 1 (0% to 100%). Here, the maximum SR at the NIR wavelength was 0.538 in the rice paddy, whereas it reached 0.593 in the soybean field. Consequently, the maximum VISatellite values in the soybean field were higher than those in the rice paddy. This difference can be attributed to the fact that crops like soybean completely cover the soil. Soybean grows to the extent that the plants cover the furrows, so 10% or less of the solar irradiance is transmitted to the ground from the full-pod to beginning-seed stages. Moreover, the saturation of the SR at the NIR wavelength in the rice paddy may be influenced by water because rice is submerged during the cultivation period. Given that rice is an upright crop, soil and water are partially exposed depending on the planting interval, and these can influence the VIs (Khaliq et al., 2019; Sozzi et al., 2020). The SR at the NIR wavelength is highly sensitive to water and may show low values in its presence.

The agreement between VISatellite and VIDrone varied depending on the type of crop. The leaf structure affected the fraction of vegetation cover, which was related to the maximum SR value at the NIR wavelength in this study. We also checked the NIR values in other areas. For example, in large-scale cabbage fields, which are characterized by broad leaves and dense planting (coordinates: 128.741256°E, 37.615712°N), Sentinel-2 pixels with NIR values above 0.6 were identified. Even higher NIR values (above 0.7) have been reported in previous studies (Flood, 2017; Li et al., 2018). Therefore, the type of crop should be considered when interpreting the quality of Sentinel-2 satellite data over agricultural fields.

In addition to the characteristics of the crop, geometric errors can affect the difference between the satellite and drone data. Jiang et al. (2022) showed that position errors in drone or satellite images cause differences in the SR. Accordingly, heterogeneous pixels, which encompass various surface conditions, resulted in large differences in the NDVI and GNDVI values. In this study, the outskirt pixels of the fields exhibited large differences because they included roads and structures (Fig. 10). Given that the geometric correction of the drone images was conducted using ground control points, the geometric error of the satellite images may be even larger.

In the rice paddy and soybean field, the RMSE of EVI was higher than that of the other VIs (NDVI, GNDVI, and NDRE). The statistical scores of EVI were likely worse than those of the other VIs because of the way the VIs are calculated: NDVI, GNDVI, and NDRE are based on normalization of the SR at two different wavelengths, whereas the calculation of EVI includes constants. Thus, errors in the SR measured by the satellite and drone can directly affect the EVI values (Ryu et al., 2021). If the solar radiance changes dramatically because of clouds during a drone flight, the SR includes noise because it is calculated from measurements of a calibrated reflectance panel taken before the flight. Although the downwelling light was accounted for in the operation of the multispectral camera, the reflectance values of the mosaicked drone images were affected by the light intensity. In addition, border noise occurs when mosaicking drone images. Therefore, errors in the SR measured by the drone may be amplified in VIs that are not based on normalization.

5. Conclusions

The performance of satellite data should be evaluated before applying a satellite-based crop monitoring system in precision agriculture. The goal of this study was to evaluate the error characteristics of the VISatellite, using drone images, depending on the crop type and surface heterogeneity. The conclusions of this study are summarized below:

  • The effect of crop structure was larger than that of the planting pattern (e.g., flat and ridge-furrow). Despite the presence of ridges and furrows in the soybean field, this field exhibited the best statistical scores for the SR and VIs in the comparison between the Sentinel-2 satellite and drone data. The SRSatellite values at the NIR wavelength were similar to those of the SRDrone because the broad leaves of soybean completely covered the soil.

  • VISatellite was overestimated at low values but underestimated at high values compared with VIDrone in the rice paddy and the garlic and onion field. SRSatellite was underestimated at the rededge@740 nm and NIR wavelengths but overestimated at visible wavelengths.

  • The range of SR and VI measured by the drone was wider than that measured by the Sentinel-2 satellite. This is due to the higher spatial resolution of the drone image compared to the satellite image.

  • The degree of surface heterogeneity affected the absolute difference between VISatellite and VIDrone. The difference was low in the center of the soybean and rice fields but high in the mixed pixels (including the road and structure). Thus, the outskirt pixels of the field should be excluded from the analysis.

  • Given the spatial resolution of the Sentinel-2 satellite, if the canopy does not cover the soil in crops planted with ridges and furrows, the retrieved information contains relatively large errors, as in the garlic and onion fields. Thus, it is recommended to utilize high-resolution drone data to diagnose the condition of such crops.

The potential of Sentinel-2 satellite data for precision farming varies depending on the type of crop. Sentinel-2 satellite data can be used to diagnose the growth of crops such as soybean and paddy rice, but difficulties are expected at the pixel level for crops such as garlic and onion. Higher-spatial-resolution data are required to determine the conditions of garlic and onion, and analyses at the agricultural parcel level should be conducted. Additional studies in large garlic and onion fields are needed to confirm this. Our results can help users understand the error characteristics of the SR and VIs measured by the Sentinel-2 satellite.

Acknowledgments

This research was funded by the Rural Development Administration (grant no. PJ016768).

Conflict of Interest

No potential conflict of interest relevant to this article was reported.



References

  1. Ali, A. M., Savin, I., Poddubskiy, A., Abouelghar, M., Saleh, N., and Abutaleb, K., et al, 2021. Integrated method for rice cultivation monitoring using Sentinel-2 data and Leaf Area Index. The Egyptian Journal of Remote Sensing and Space, 24(3), 431-441. https://doi.org/10.1016/j.ejrs.2020.06.007
  2. Assmann, J. J., Myers-Smith, I. H., Kerby, J. T., Cunliffe, A. M., and Daskalova, G. N., 2020. Drone data reveal heterogeneity in tundra greenness and phenology not captured by satellites. Environmental Research Letters, 15(12), 125002. https://doi.org/10.1088/1748-9326/abbf7d
  3. Bansod, B., Singh, R., Thakur, R., and Singhal, G., 2017. A comparision between satellite based and drone based remote sensing technology to achieve sustainable development: A review. Journal of Agriculture and Environment for International Development, 111(2), 383-407. https://doi.org/10.12895/jaeid.20172.690
  4. Bollas, N., Kokinou, E., and Polychronos, V., 2021. Comparison of Sentinel-2 and UAV multispectral data for use in precision agriculture: An application from Northern Greece. Drones, 5(2), 35. https://doi.org/10.3390/drones5020035
  5. Bukowiecki, J., Rose, T., and Kage, H., 2021. Sentinel-2 data for precision agriculture?-A UAV-based assessment. Sensors, 21(8), 2861. https://doi.org/10.3390/s21082861
  6. Caparros-Santiago, J. A., Quesada-Ruiz, L. C., and Rodriguez-Galiano, V., 2023. Can land surface phenology from Sentinel-2 time-series be used as an indicator of Macaronesian ecosystem dynamics?. Ecological Informatics, 77, 102239. https://doi.org/10.1016/j.ecoinf.2023.102239
  7. Dhillon, M. S., Dahms, T., Kübert-Flock, C., Steffan-Dewenter, I., Zhang, J., and Ullmann, T., 2022. Spatiotemporal fusion modelling using STARFM: Examples of Landsat 8 and Sentinel-2 NDVI in Bavaria. Remote Sensing, 14(3), 677. https://doi.org/10.3390/rs14030677
  8. Di Gennaro, S. F., Dainelli, R., Palliotti, A., Toscano, P., and Matese, A., 2019. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sensing, 11(21), 2573. https://doi.org/10.3390/rs11212573
  9. Dong, T., Liu, J., Qian, B., He, L., Liu, J., and Wang, R., et al, 2020. Estimating crop biomass using leaf area index derived from Landsat 8 and Sentinel-2 data. ISPRS Journal of Photogrammetry and Remote Sensing, 168, 236-250. https://doi.org/10.1016/j.isprsjprs.2020.08.003
  10. Flood, N., 2017. Comparing Sentinel-2A and Landsat 7 and 8 using surface reflectance over Australia. Remote Sensing, 9(7), 659. https://doi.org/10.3390/rs9070659
  11. Frazier, A. E., and Hemingway, B. L., 2021. A technical review of planet smallsat data: Practical considerations for processing and using PlanetScope imagery. Remote Sensing, 13(19), 3930. https://doi.org/10.3390/rs13193930
  12. Hafeez, A., Husain, M. A., Singh, S. P., Chauhan, A., Khan, M. T., and Kumar, N., et al, 2023. Implementation of drone technology for farm monitoring & pesticide spraying: A review. Information Processing in Agriculture, 10(2), 192-203. https://doi.org/10.1016/j.inpa.2022.02.002
  13. Isaev, E., Kulikov, M., Shibkov, E., and Sidle, R. C., 2023. Bias correction of Sentinel-2 with unmanned aerial vehicle multispectral data for use in monitoring walnut fruit forest in western Tien Shan, Kyrgyzstan. Journal of Applied Remote Sensing, 17(2), 022204. https://doi.org/10.1117/1.JRS.17.022204
  14. Jiang, J., Johansen, K., Tu, Y. H., and McCabe, M. F., 2022. Multi-sensor and multi-platform consistency and interoperability between UAV, Planet CubeSat, Sentinel-2, and Landsat reflectance data. GIScience & Remote Sensing, 59(1), 936-958. https://doi.org/10.1080/15481603.2022.2083791
  15. Khaliq, A., Comba, L., Biglia, A., Ricauda Aimonino, D., Chiaberge, M., and Gay, P., 2019. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sensing, 11(4), 436. https://doi.org/10.3390/rs11040436
  16. Kong, J., Ryu, Y., Huang, Y., Dechant, B., Houborg, R., and Guan, K., et al, 2021. Evaluation of four image fusion NDVI products against in-situ spectral-measurements over a heterogeneous rice paddy landscape. Agricultural and Forest Meteorology, 297, 108255. https://doi.org/10.1016/j.agrformet.2020.108255
  17. Kong, J., Ryu, Y., Jeong, S., Zhong, Z., Choi, W., and Kim, J., et al, 2023. Super resolution of historic Landsat imagery using a dual generative adversarial network (GAN) model with CubeSat constellation imagery for spatially enhanced long-term vegetation monitoring. ISPRS Journal of Photogrammetry and Remote Sensing, 200, 1-23. https://doi.org/10.1016/j.isprsjprs.2023.04.013
  18. Labib, S. M., and Harris, A., 2018. The potentials of Sentinel-2 and LandSat-8 data in green infrastructure extraction, using object based image analysis (OBIA) method. European Journal of Remote Sensing, 51(1), 231-240. https://doi.org/10.1080/22797254.2017.1419441
  19. Li, Y., Chen, J., Ma, Q., Zhang, H. K., and Liu, J., 2018. Evaluation of Sentinel-2A surface reflectance derived using Sen2Cor in North America. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 11(6), 1997-2021. https://doi.org/10.1109/JSTARS.2018.2835823
  20. Liaghat, S., and Balasundram, S. K., 2010. A review: The role of remote sensing in precision agriculture. American Journal of Agricultural and Biological Sciences, 5(1), 50-55. https://doi.org/10.3844/ajabssp.2010.50.55
  21. Mao, P., Ding, J., Jiang, B., Qin, L., and Qiu, G. Y., 2022. How can UAV bridge the gap between ground and satellite observations for quantifying the biomass of desert shrub community?. ISPRS Journal of Photogrammetry and Remote Sensing, 2, 361-376. https://doi.org/10.1016/j.isprsjprs.2022.08.021
  22. Maponya, M. G., Van Niekerk, A., and Mashimbye, Z. E., 2020. Pre-harvest classification of crop types using a Sentinel-2 time-series and machine learning. Computers and Electronics in Agriculture, 169, 105164. https://doi.org/10.1016/j.compag.2019.105164
  23. Martinez, J. L., Lucas-Borja, M. E., Plaza-Alvarez, P. A., Denisi, P., Moreno, M. A., and Hernández, D., et al, 2021. Comparison of satellite and drone-based images at two spatial scales to evaluate vegetation regeneration after post-fire treatments in a Mediterranean forest. Applied Sciences, 11(12), 5423. https://doi.org/10.3390/app11125423
  24. Mazzia, V., Comba, L., Khaliq, A., Chiaberge, M., and Gay, P., 2020. UAV and machine learning based refinement of a satellite-driven vegetation index for precision agriculture. Sensors, 20(9), 2530. https://doi.org/10.3390/s20092530
  25. Messina, G., Peña, J. M., Vizzari, M., and Modica, G. A., 2020. A comparison of UAV and satellites multispectral imagery in monitoring onion crop. An Application in the 'Cipolla Rossa di Tropea' (Italy). Remote Sensing, 12(20), 3424. https://doi.org/10.3390/rs12203424
  26. Naethe, P., Asgari, M., Kneer, C., Knieps, M., Jenal, A., and Weber, I., et al, 2023. Calibration and validation from ground to airborne and satellite level: Joint application of time-synchronous field spectroscopy, drone, aircraft and Sentinel-2 imaging. PFG-Journal of Photogrammetry, Remote Sensing and Geoinformation Science, 91(1), 43-58. https://doi.org/10.1007/s41064-022-00231-x
  27. Park, N. W., Kim, Y., and Kwak, G. H., 2019. An overview of theoretical and practical issues in spatial downscaling of coarse resolution satellite-derived products. Korean Journal of Remote Sensing, 35(4), 589-607. https://doi.org/10.7780/kjrs.2019.35.4.8
  28. Pla, M., Bota, G., Duane, A., Balagué, J., Curcó, A., and Gutiérrez, R., et al, 2019. Calibrating Sentinel-2 imagery with multispectral UAV derived information to quantify damages in Mediterranean rice crops caused by Western Swamphen (Porphyrio porphyrio). Drones, 3(2), 45. https://doi.org/10.3390/drones3020045
  29. Riihimäki, H., Luoto, M., and Heiskanen, J., 2019. Estimating fractional cover of tundra vegetation at multiple scales using unmanned aerial systems and optical satellite data. Remote Sensing of Environment, 224, 119-132. https://doi.org/10.1016/j.rse.2019.01.030
  30. Roy, D. P., Huang, H., Houborg, R., and Martins, V. S., 2021. A global analysis of the temporal availability of PlanetScope high spatial resolution multi-spectral imagery. Remote Sensing of Environment, 264, 112586. https://doi.org/10.1016/j.rse.2021.112586
  31. Ryu, J. H., Jeong, H., and Cho, J., 2020a. Performances of vegetation indices on paddy rice at elevated air temperature, heat stress, and herbicide damage. Remote Sensing, 12(16), 2654. https://doi.org/10.3390/rs12162654
  32. Ryu, J. H., Moon, H. D., Cho, J., Lee, K., Ahn, H., and So, K., et al, 2021. Response of structural, biochemical, and physiological vegetation indices measured from field-spectrometer and multi-spectral camera under crop stress caused by herbicide. Korean Journal of Remote Sensing, 37(6-1), 1559-1572. https://doi.org/10.7780/KJRS.2021.37.6.1.6
  33. Ryu, J. H., Na, S. I., and Cho, J., 2020b. Inter-Comparison of normalized difference vegetation index measured from different footprint sizes in cropland. Remote Sensing, 12(18), 2980. https://doi.org/10.3390/rs12182980
  34. Sishodia, R. P., Ray, R. L., and Singh, S. K., 2020. Applications of remote sensing in precision agriculture: A review. Remote Sensing, 12(19), 3136. https://doi.org/10.3390/rs12193136
  35. Song, X. P., Huang, W., Hansen, M. C., and Potapov, P., 2021. An evaluation of Landsat, Sentinel-2, Sentinel-1 and MODIS data for crop type mapping. Science of Remote Sensing, 3, 100018. https://doi.org/10.1016/j.srs.2021.100018
  36. Sonobe, R., Yamaya, Y., Tani, H., Wang, X., Kobayashi, N., and Mochizuki, K. I., 2018. Crop classification from Sentinel-2-derived vegetation indices using ensemble learning. Journal of Applied Remote Sensing, 12(2), 026019. https://doi.org/10.1117/1.JRS.12.026019
  37. Sozzi, M., Kayad, A., Marinello, F., Taylor, J., and Tisseyre, B., 2020. Comparing vineyard imagery acquired from Sentinel-2 and Unmanned Aerial Vehicle (UAV) platform. Oeno One, 54(2), 189-197. https://dx.doi.org/10.20870/oeno-one.2020.54.1.2557
  38. Verma, C., Tripathi, V. K., and Paikra, I., 2020. Effect of ridge and furrow system in soybean cultivation and feasibility of economics. International Journal of Chemical Studies, 8(3), 1755-1760. https://doi.org/10.22271/chemi.2020.v8.i3x.9451
  39. Zhang, J., Pan, Y., Tao, X., Wang, B., Cao, Q., and Tian, Y., et al, 2023. In-season mapping of rice yield potential at jointing stage using Sentinel-2 images integrated with high-precision UAS data. European Journal of Agronomy, 146, 126808. https://doi.org/10.1016/j.eja.2023.126808