We say: why not both?
In precision agriculture, numerous decision tools have been developed to help farmers decide where and how to plant, and to facilitate practices such as irrigation, monitoring and harvest-time estimation. These tools rely heavily on remotely sensed data from ground sensors, airborne platforms or satellites.
However, many end-users still frame the choice as a satellite-versus-drone dilemma, when in fact the two data sources are complementary. Let’s drop the stereotype that pits the two platforms against each other and dive into the major differences between the two data sources.
When should satellites be used?
Satellites with open data policies provide free imagery, but at a coarse spatial resolution (the minimum area that can be distinguished on the ground). Most constellations offering free data have pixels covering at least 10 meters on the ground (except for the panchromatic band, which achieves a finer spatial resolution at the cost of spectral resolution).
Moreover, satellites are mainly used at a continental or regional scale, where the aim is a wider field of view. In precision farming, this means capturing fewer details in favour of understanding the global heterogeneity of a farm or plot.
Various satellite constellations provide more than optical imagery (RGB (Red-Green-Blue) and multispectral); some capture a whole spectrum using hyperspectral imagery. In other words, they let us measure the light reflected by each pixel (the minimum entity viewed by a sensor) and obtain its spectral signature. This helps distinguish vegetation from soil, or identify mixed pixels: pixels containing both soil and vegetation, for instance.
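As a simple illustration of how a spectral signature separates vegetation from soil: healthy vegetation reflects strongly in the near-infrared (NIR) and weakly in the red, so an index such as NDVI computed per pixel makes the two classes easy to split. The reflectance values and the 0.4 threshold below are invented for illustration, not taken from any real acquisition:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index, computed per pixel."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Toy 2x2 reflectance rasters: top row vegetation-like, bottom row soil-like
red = np.array([[0.05, 0.06], [0.30, 0.28]])
nir = np.array([[0.50, 0.55], [0.35, 0.33]])

v = ndvi(red, nir)
vegetation_mask = v > 0.4  # illustrative threshold, not a universal rule
print(vegetation_mask)     # top row True (vegetation), bottom row False (soil)
```

Mixed pixels fall between the two extremes, which is exactly why they are harder to classify at coarse resolutions.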
Given their revisit time (the period a satellite needs to image the same spot again), satellite imagery has proven useful when a temporal analysis is needed. For this purpose, many scientists use both free and purchased imagery from Sentinel-2, PlanetScope, Landsat 8 and Landsat 9, along with other constellations [1, 2].
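To sketch what such a temporal analysis looks like, the toy example below averages NDVI over a plot for three hypothetical acquisition dates (all values invented for illustration); the per-date means form a simple growth time series:

```python
import numpy as np

# Hypothetical plot-scale NDVI rasters from successive revisits,
# spaced 5 days apart (roughly Sentinel-2's revisit time)
acquisitions = {
    "2020-02-03": np.array([[0.55, 0.60], [0.58, 0.57]]),
    "2020-02-08": np.array([[0.60, 0.66], [0.63, 0.61]]),
    "2020-02-13": np.array([[0.66, 0.71], [0.69, 0.68]]),
}

# A plot-level time series is simply the mean NDVI per date
series = {date: float(img.mean()) for date, img in acquisitions.items()}
print(series)  # rising means suggest the crop is greening up
```

Real analyses work the same way, only with full rasters, cloud masking, and many more dates.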
When should drones be used?
When it comes to UAVs (Unmanned Aerial Vehicles), or drones, they are mainly known for their higher spatial resolution, a consequence of their low flight altitudes, which lets them capture more details in a given farm or plot. Various studies in precision agriculture have demonstrated the usefulness of UAVs in modelling biomass, nitrogen, yield and other crop biophysical parameters from destructive and non-destructive sampling methods. Some have highlighted their efficacy for thorough temporal monitoring of fields, despite the long administrative processes needed to obtain flight authorization in certain countries. Like satellites, drones can also carry other types of sensors, for example LiDAR, hyperspectral and thermal payloads.
Now that we have covered how satellites and drones may be used, what about when they can’t be used?
In general, for optical satellites, obstructions such as clouds and shadows can make imagery difficult to use. However, many scientists are now finding solutions to ‘de-cloud’ satellite imagery [3, 4, 5]. For drones, the biggest issue is the difficult administrative authorization process in many countries, which keeps users from unleashing the full power of UAVs for scientific and research purposes.
Below is a mapping of both UAV and satellite images.
The above maps show NDVI values for both platforms, with acquisition dates of 2020-02-13 for Sentinel-2 (S2) and 2020-02-14 for the UAV.
Comparing the two maps, we can observe that the red-circled areas share the same average values in both the UAV and satellite images, whereas the blue-circled areas show intense green values, indicating high vigour in that zone.
The maps differ in the detection of low NDVI values, circled in black on the UAV imagery but absent from the satellite map. This is due to the high spatial resolution of the UAV, which enables capturing more details.
We then calculated some statistics for the same plot:
The difference in spatial resolution is apparent in the number of pixels covering the plot in the UAV and satellite images. Another remark concerns the minimum and maximum values: the min/max is 0.17/0.94 for the UAV, versus 0.59/0.74 for the satellite. Unlike the Sentinel-2 imagery, the UAV captured enough detail that an entire pixel can represent only soil or only vegetation, hence the wider range of registered min/max values.
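This effect can be reproduced with a small simulation: averaging many fine pixels into one coarse pixel shrinks the observed min/max range, because extreme pure-soil or pure-vegetation pixels get blended away. The grid sizes and the NDVI distribution below are assumptions chosen for illustration, not the article’s data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 100 m plot: UAV at 1 m GSD (100x100 pixels),
# satellite at 10 m GSD (10x10 pixels over the same area)
uav = np.clip(rng.normal(0.65, 0.15, size=(100, 100)), 0.0, 1.0)

# Each simulated satellite pixel averages a 10x10 block of UAV pixels
satellite = uav.reshape(10, 10, 10, 10).mean(axis=(1, 3))

print(uav.size, satellite.size)          # far more UAV pixels for the same plot
print(uav.min(), uav.max())              # wide range: pure-soil/vegetation pixels
print(satellite.min(), satellite.max())  # narrower range after block averaging
```

The means stay close, but the coarse image’s extremes are pulled toward the plot average, mirroring the 0.17/0.94 versus 0.59/0.74 contrast observed above.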
- Du, M., Noguchi, N., Itoh, A., & Shibuya, Y. (2017). Multi-temporal monitoring of wheat growth by using images from satellite and unmanned aerial vehicle. International Journal of Agricultural and Biological Engineering, 10(5). doi:10.25165/j.ijabe.20171005.3180.
- Bhuyar, N. (2020). Crop classification with multi-temporal satellite image data. International Journal of Engineering Research & Technology, 9(6). doi:10.17577/IJERTV9IS060208.
- Meraner, A., Ebel, P., Zhu, X. X., & Schmitt, M. (2020). Cloud removal in Sentinel-2 imagery using a deep residual neural network and SAR-optical data fusion. ISPRS Journal of Photogrammetry and Remote Sensing, 166, 333-346. doi:10.1016/j.isprsjprs.2020.05.013.
- Liu, J., Wang, X., Chen, M., Liu, S., Zhou, X., Shao, Z., & Liu, P. (2014). Thin cloud removal from single satellite images. Optics Express, 22, 618-632.
- Lin, C.-H., Tsai, P.-H., Lai, K.-H., & Chen, J.-Y. (2013). Cloud removal from multitemporal satellite images using information cloning. IEEE Transactions on Geoscience and Remote Sensing, 51, 232-241. doi:10.1109/TGRS.2012.2197682.