Open Access
Detection of potato beetle damage using remote sensing from small unmanned aircraft systems
E. Raymond Hunt, Silvia I. Rondon
Abstract
Colorado potato beetle (CPB) adults and larvae devour leaves of potato and other solanaceous crops and weeds, and may quickly develop resistance to pesticides. With early detection of CPB damage, more options are available for precision integrated pest management, which reduces the amount of pesticides applied in a field. Remote sensing with small unmanned aircraft systems (sUAS) has potential for CPB detection because low flight altitudes allow image acquisition at very high spatial resolution. A five-band multispectral sensor and up-looking incident light sensor were mounted on a six-rotor sUAS, which was flown at altitudes of 60 and 30 m in June 2014. Plants went from visibly undamaged to having some damage in just 1 day. Whole-plot normalized difference vegetation index (NDVI) and the number of pixels classified as damaged (0.70 ≤ NDVI ≤ 0.80) were not correlated with visible CPB damage ranked from least to most. Area of CPB damage estimated using object-based image analysis was highly correlated to the visual ranking of damage. Furthermore, plant height calculated using structure-from-motion point clouds was related to CPB damage, but this method required extensive operator intervention for success. Object-based image analysis has potential for early detection based on high spatial resolution sUAS remote sensing.

1.

Introduction

Potatoes (Solanum tuberosum L.) were the first crop for which insecticides were routinely used and currently require more pesticides than other major crops.1,2 One of the most important insect pests of potato is the Colorado potato beetle (CPB) Leptinotarsa decemlineata (Coleoptera: Chrysomelidae); larvae and adults are voracious leaf eaters that can rapidly defoliate a field of potatoes.3 CPB quickly develops resistance to insecticides, leading to an unsustainable cycle of crop failure and spread of resistant populations.4,5 Integrated pest management (IPM) is a collection of control methods, including insecticides, that considers the whole system to keep insect damage to acceptable levels.2,5–7 Early detection allows a larger range of IPM options, reducing the amount of insecticides applied for control.

Insect defoliation of a crop canopy may be remotely sensed by the reduction of leaf area or biomass as measured by spectral vegetation indices, such as the normalized difference vegetation index (NDVI).8,9 Early detection of CPB damage using satellites depends on overpass frequency, cloud cover, sensor pixel size, and the delivery speed of data to the user. Pixel size is an important determinant for early detection because larger pixels require more visible foliar damage before a change can be detected. Small unmanned aircraft systems (sUAS) acquire imagery at low altitudes for higher spatial resolution and may be ideal for early detection of damage from insects.10 Frequent monitoring of crops during the growing season with unmanned aircraft was envisioned long before the technology evolved to make it practical.11–13

Aerial photographs and remotely sensed imagery with very small pixel sizes have usually been interpreted visually, but this becomes burdensome with large numbers of images. The scale-invariant feature transform is used to mosaic a large number of images (stitching).14 Two methods for analysis of the stitched images have gained attention over the past 15 years. First is object-based image analysis (OBIA), which capitalizes on high spatial resolution to group adjacent pixels with similar spectral and textural properties for classification of land-cover objects.15–18 Second is photogrammetric structure from motion (SfM), which is used to generate image point clouds and digital surface elevations from large numbers of overlapping images.19–21 The resulting digital elevation data are used to generate orthomosaic images. Furthermore, the SfM point clouds can also be used to determine plant height for estimating growth and biomass.22–25

In contrast to spatial variation in soils, which is the basis for precision agriculture, insect emergence from the soil or immigration from other fields is essentially random. This increases the dependence on early detection of insect damage with remote sensing and automated processing. In order to evaluate possible algorithms for sUAS remote sensing, we set up an experiment in which CPB were added to irrigated potatoes in order to vary the amount of infestation during the tuber initiation and bulking growth stages. Ten flights were conducted over 15 days in June 2014 to acquire imagery just before and just after canopy damage from CPB became visible. The objective was to compare traditional spectral indices, OBIA classification, and SfM plant height at the time when CPB damage just becomes visible.

2.

Methods

The study was conducted at Oregon State University’s Hermiston Agricultural Research and Extension Center (HAREC) located in Hermiston, Oregon (45.82021°N and 119.28364°W, 180-m elevation). July is the hottest month with average high and low temperatures of 32°C and 14°C, respectively. The average annual precipitation is 266 mm, with 51 mm during the growing season. The soil type is an Adkins Sandy Loam (coarse-loamy, mixed, superactive, mesic Xeric Haplocalcids).

Small plots of potatoes (Solanum tuberosum L. “Ranger Russet”) were established on April 22, 2014, using a randomized block design with four treatments and four replications [Fig. 1(a)]. Plot size was 9.2 m × 2.6 m (three rows wide). No insecticides were applied to the plants; one application of acetamiprid (Assail 30SG, United Phosphorus, Inc., King of Prussia, Pennsylvania) was made on the soil around the experiment to control CPB migrating from surrounding areas. Irrigation, herbicide application, and fungicide application followed commercial practices.26 Fertilization was about 450 kg/ha nitrogen, 310 kg/ha phosphorus, 220 kg/ha potassium, and 80 kg/ha sulfur.

Fig. 1

(a) Plot layout for additional CPB on “Ranger Russet” potatoes. Treatments consisted of placing additional CPB per plant: control, 0; low, 1.5; medium, 4.5; and high, 7.5. (b) Color-infrared orthomosaic image from flights on June 23, 2014, at 60-m altitude above ground level. (c) Color-infrared orthomosaic from flights on June 24, 2014, at 60-m altitude.


CPB were collected early in the season and maintained on potato plants in bug dorms (BioQuip Products, Inc., Rancho Dominguez, California). Before introduction into the colony, beetles were checked for the presence of parasitoids. The colony was maintained at a temperature of 20 ± 5°C and 50% to 60% relative humidity. Potato plants were changed twice a week and bug dorms were changed weekly.

On June 9, 2014, different numbers of CPB were placed in each plot: low, 1.5 CPB/plant; medium, 4.5 CPB/plant; and high, 7.5 CPB/plant [Fig. 1(a)]. The control treatment had no additional CPB; any larvae or adults found in the control plots either emerged from the soil or migrated from other plots. There was no apparent plant damage on June 23, 2014 [Fig. 1(b)]. However, on the next day, June 24, 2014, visible plant damage was obvious [Fig. 1(c)]. The first CPB population survey was conducted on July 2, 2014; 10 plants per plot were randomly selected and inspected for eggs, larvae, and adults.

All sUAS flights were conducted under a Certificate of Authorization from the United States Federal Aviation Administration. Ten flights of a “Spreading Wings” S800 hexacopter (DJI, Shenzhen, Guangdong, China) were made over the plots between 15:00 and 16:00 hours from June 10, 2014 to June 24, 2014. The sUAS was flown using an autopilot, first at 60 m and then at 30 m altitude above ground level. The sensor was a six-channel Mini Multi Camera Array (mini-MCA, Tetracam, Inc., Chatsworth, California). Five channels were narrow bands (center wavelength ±10 nm) in the blue (470 nm), green (550 nm), red (660 nm), red-edge (710 nm), and near-infrared (NIR, 810 nm). The sixth channel was used for an upward-looking incident light sensor.27 The focal length of the mini-MCA was 9.6 mm, so ground sample distances (pixel sizes on the ground) were about 30 and 15 mm for 60- and 30-m altitude, respectively.
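As a check on these values, ground sample distance follows from similar triangles: GSD = altitude × detector pixel pitch / focal length. Below is a minimal Python sketch; the 4.8-μm pixel pitch is back-calculated from the numbers above (30-mm GSD at 60-m altitude with a 9.6-mm focal length) and is not taken from the mini-MCA specification.

```python
def ground_sample_distance(altitude_m, focal_length_mm=9.6, pixel_pitch_um=4.8):
    """GSD (m/pixel) = altitude * detector pixel pitch / focal length.

    The 4.8-um pitch is an assumption inferred from the reported 30-mm GSD
    at 60-m altitude, not a published mini-MCA specification.
    """
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

for altitude in (60.0, 30.0):
    gsd_mm = ground_sample_distance(altitude) * 1000.0
    print(f"{altitude:.0f} m altitude -> {gsd_mm:.0f} mm GSD")
# Prints 30 mm and 15 mm, matching the values reported in the text.
```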

Because the objective was to compare imagery before and after detection of CPB damage, only the flights on June 23, 2014 and June 24, 2014 were analyzed. If there were any previsual damage symptoms, it was assumed that the symptoms would be most detectable in the June 23, 2014 images. The images acquired on both days were initially processed using Tetracam’s PixelWrench-2 software to reformat the 10-bit raw imagery to 16-bit Tagged Image File Format (*.tif) images. Then, PixelWrench-2 was used to ratio the digital numbers from the first five channels with digital numbers from the incident light sensor channel to calculate apparent surface reflectance from 0% to 100%, scaled from 0 to 65535, respectively. Agisoft Photoscan Pro (version 1.2.6, Agisoft LLC, St. Petersburg, Russia) was used to create five-band orthomosaic images and three-dimensional digital surface elevation models from photogrammetric point clouds. Unfortunately, ground control points were not established, so the orthomosaic images were created using only the sUAS log files. Furthermore, because the mini-MCA lenses were not well calibrated, the result was a pronounced curvature of the surface elevation model generated from the point cloud after optimization of camera parameters.28
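A minimal NumPy sketch of the incident-light ratioing step, assuming each band and the incident-light-sensor (ILS) channel have been loaded as arrays of digital numbers; the function and scaling shown here are illustrative, not PixelWrench-2’s actual implementation.

```python
import numpy as np

def apparent_reflectance(band_dn, ils_dn):
    """Ratio band digital numbers to incident-light digital numbers,
    then scale 0-100% reflectance onto the 16-bit range 0-65535.

    Illustrative only; PixelWrench-2's internal processing may differ.
    """
    ratio = band_dn.astype(np.float64) / np.maximum(ils_dn.astype(np.float64), 1.0)
    return np.clip(ratio * 65535.0, 0.0, 65535.0).astype(np.uint16)
```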

The Environment for Visualizing Images (ENVI) version 5.3 (Harris Geospatial Solutions, Boulder, Colorado) was used to calculate various spectral vegetation indices. Results from the different indices were highly correlated. Therefore, only NDVI was used:

Eq. (1)

NDVI = (R_NIR − R_R) / (R_NIR + R_R),
where R_NIR is the near-infrared reflectance and R_R is the red reflectance.8 NDVI was calculated in two ways. First, each plot was outlined as a separate region of interest. Areas with low R_NIR inside the plot boundaries were included in the regions of interest, whereas those along the plot boundaries (mostly shadows) were avoided. Mean reflectances of each band were calculated inside the region-of-interest boundaries. Second, NDVI was calculated pixel by pixel using the orthomosaic, which was then used for determining CPB damage either by NDVI thresholds or by OBIA classification.
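The two calculations generally differ: the NDVI of plot-mean reflectances is a ratio of means, not a mean of pixel-wise ratios. A minimal NumPy sketch, assuming `nir` and `red` are reflectance arrays and `roi` is a Boolean mask for one plot (all names illustrative):

```python
import numpy as np

def ndvi_pixelwise(nir, red):
    """Pixel-by-pixel NDVI map (Eq. 1); small epsilon avoids divide-by-zero."""
    return (nir - red) / (nir + red + 1e-12)

def ndvi_plot_mean(nir, red, roi):
    """Whole-plot NDVI from the mean reflectance of each band inside the
    region of interest: a ratio of means, not a mean of ratios."""
    nir_mean, red_mean = nir[roi].mean(), red[roi].mean()
    return (nir_mean - red_mean) / (nir_mean + red_mean)
```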

The ENVI 5.3 Feature Extraction module was used for OBIA classification.29 First, the NDVI image was segmented into objects by determining gradients of NDVI within a kernel of 9×9 pixels.30 The watershed transform31 was used for initial segmentation of the image. Based on the cumulative frequency distribution of NDVI gradients (potential object boundaries), the scale parameter was used to define the steepness of the NDVI gradient for selecting object boundaries.32 Based on the cumulative frequency distribution of object mean NDVI, the merge parameter was used to determine whether adjacent objects were sufficiently similar to be combined.32 Both the scale and merge parameters were set to 70 to define fewer objects and to aggressively merge adjacent objects. Then, a series of spectral, spatial, and textural variables was calculated for each image object, including mean NDVI, maximum NDVI, minimum NDVI, texture range, texture mean, texture variance, and texture entropy. Using two plots as training areas (102 and 302), we developed simple classification rules using object mean NDVI to classify the orthomosaics into three classes: healthy plants, CPB damage, and soil. Classification rules using object maximum NDVI, minimum NDVI, texture range, or texture variance produced similar classifications; creating rules with several variables did not improve the classifications.
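ENVI Feature Extraction is proprietary, but the segment-then-classify idea can be sketched with scikit-image as a rough stand-in: watershed segmentation of the NDVI gradient image, followed by a rule on object mean NDVI. ENVI’s scale and merge parameters have no direct equivalent here, so this sketch illustrates the workflow rather than the exact ENVI algorithm.

```python
import numpy as np
from skimage.filters import sobel
from skimage.measure import regionprops
from skimage.segmentation import watershed

def classify_ndvi_objects(ndvi):
    """Segment an NDVI image and classify each object by its mean NDVI.

    Returns an integer map: 0 = bare soil, 1 = CPB damage, 2 = healthy plants.
    A rough stand-in for ENVI Feature Extraction, not a reimplementation.
    """
    gradient = sobel(ndvi)            # NDVI gradients mark candidate boundaries
    segments = watershed(gradient)    # initial watershed segmentation
    classes = np.zeros(ndvi.shape, dtype=np.uint8)
    for region in regionprops(segments, intensity_image=ndvi):
        mean_ndvi = region.mean_intensity          # rule on object mean NDVI
        if mean_ndvi < 0.70:
            cls = 0                                # bare soil
        elif mean_ndvi <= 0.80:
            cls = 1                                # CPB damage
        else:
            cls = 2                                # healthy plants
        classes[segments == region.label] = cls
    return classes
```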

The 16 plots were visually ranked from the least damaged to the most damaged. Side-by-side comparisons of both the color infrared and true-color images were made for all plots individually to determine if one plot had more canopy damage than the next plot. Ties were assigned the average ranking of the two plots. Spearman rank correlation coefficients (rs) were calculated and t-tests were used to determine significance.33
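A minimal SciPy sketch of this rank-correlation test; the damaged-area values below are synthetic stand-ins for the 16 per-plot measurements, not the study’s data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
damage_rank = np.arange(1, 17)                     # visual ranking of 16 plots
damaged_area = damage_rank + rng.normal(0, 3, 16)  # synthetic noisy example

rs, p_value = stats.spearmanr(damage_rank, damaged_area)
print(f"rs = {rs:.2f}, p = {p_value:.4f}")
# SciPy's p-value is based on a t statistic with n - 2 = 14 degrees of
# freedom, the same t-test used for the correlations reported below.
```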

3.

Results and Discussion

There was no indication of potato leaf loss or plant damage from the images acquired on June 23, 2014 [Fig. 1(b)]. There were visibly damaged areas in all plots on the very next day, June 24, 2014 [Fig. 1(c)]. From the visual ranking, the least impacted plots were 201 and 401, and the plot with the most damage was 102 [Fig. 1(c)]. Neither the number of CPB found during the census nor the visual ranking of damage was related to the treatments of artificially applied CPB, based on a Kruskal–Wallis one-way analysis of variance test33 (data not shown). Furthermore, the CPB population was not related to the rank of visual damage [Fig. 2(a)], with rs = 0.046. Whole-plot average NDVI was not related (rs = 0.23) to the visual ranking of damage [Fig. 2(b)]. Plot NDVI was high for all plots, at both 30- and 60-m altitudes, occurring within a narrow range of 0.84 to 0.89 [Fig. 2(b)]. High NDVI was expected because on June 22, 2013, the measured canopy leaf area index was 3.0 m² m⁻² and remotely sensed NDVI was 0.85 with the same potato variety.34
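For reference, a Kruskal–Wallis test of this kind can be run with scipy.stats.kruskal; the per-plot values below are synthetic placeholders for the four replicates of each treatment, not the study’s data.

```python
from scipy import stats

# Synthetic per-plot values for the four treatments (four replicates each).
control = [3, 7, 11, 14]
low = [1, 6, 10, 16]
medium = [4, 5, 12, 13]
high = [2, 8, 9, 15]

h_stat, p_value = stats.kruskal(control, low, medium, high)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")  # large p => no treatment effect
```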

Fig. 2

CPB damage per plot was ranked from least to most. (a) The number of CPB counted on ten plants on July 2, 2014. (b) Whole-plot mean NDVI on June 24, 2014, from an altitude of 30 m. The Spearman rank correlation coefficients were not significant.


For undamaged areas, pixel-based NDVI showed a much larger range, from 0.78 to 0.93 (Fig. 3), which was not apparent in the whole-plot averages [Fig. 2(b)]. Canopy damage in the color-infrared images [Fig. 1(c)] corresponded to areas of low NDVI [Fig. 3(b)]. The cumulative frequency distribution of pixel NDVI from both altitudes, 30 and 60 m, was used to determine that a threshold of NDVI ≤ 0.80 was a good criterion for distinguishing damaged from undamaged plants within a plot. A threshold of NDVI ≥ 0.70 was a good criterion for separating vegetation from cultivated bare soil.
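Together these two thresholds yield the three-class rule used throughout (soil: NDVI < 0.70; CPB damage: 0.70 ≤ NDVI ≤ 0.80; healthy: NDVI > 0.80). A minimal NumPy sketch, assuming `ndvi` is the pixel-wise NDVI array:

```python
import numpy as np

def threshold_classes(ndvi):
    """Classify NDVI pixels: 0 = soil, 1 = CPB damage, 2 = healthy plants."""
    classes = np.full(ndvi.shape, 2, dtype=np.uint8)  # default: healthy
    classes[ndvi <= 0.80] = 1                         # 0.70 <= NDVI <= 0.80
    classes[ndvi < 0.70] = 0                          # cultivated bare soil
    return classes
```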

Fig. 3

NDVI on (a) June 23, 2014 and (b) June 24, 2014, from orthomosaic images acquired at 30 m above ground level. Plots are aligned the same as in Fig. 1(a).


The number of pixels classified as damaged from the NDVI threshold was not related (rs = 0.27) to the visual ranking of CPB damage [Fig. 4(a)]. The ranking of visual damage was highly correlated (rs = 0.85, t = 6.08 with 14 degrees of freedom) to the plot area classified as CPB damaged by feature extraction [Fig. 4(b)]. However, for most plots, the total plot area classified as damaged was less for feature extraction than for the NDVI threshold, because some pixels with low NDVI were included in image objects that had higher mean NDVI. The difference between pixel classifications using an NDVI threshold [Fig. 4(a)] and using objects from NDVI feature extraction [Fig. 4(b)] may not be important, depending on the overall goals. If the goal were to detect any plant damage and its approximate location, then an NDVI threshold would be sufficient. However, if the area of damage were used to trigger different options based on severity, then the feature extraction method would be preferred because of its higher correlation with the visual damage ranking.

Fig. 4

Analysis of the June 24, 2014, images acquired from an altitude of 30 m. (a) Area (% of plot) classified with CPB damage using 0.70 ≤ NDVI ≤ 0.80. (b) Relative plot area classified with CPB damage using feature extraction of the NDVI image.


For the most impacted plot [plot 102, Fig. 1(a)], the area classified as damaged was less than 10%, which was too small to have a strong effect on whole-plot mean NDVI (0.10 × 0.75 + 0.90 × 0.85 = 0.84). Based on the plot area of 24 m², an equivalent satellite pixel size would be about 5 m, which is close to the 6-m multispectral resolution of the SPOT 7 satellite. These calculations suggest that a multispectral satellite with a 5-m pixel size would not detect the changes that occurred over just 1 day. The WorldView-2 and WorldView-3 multispectral sensors have 1.84-m pixel resolution, for a pixel area of 3.4 m². Assuming the same damaged area occurred in just one WorldView-2 pixel [(2.4 m² × 0.75 + 1.0 m² × 0.85)/3.4 m² = 0.78], canopy damage would be visible. However, if the canopy damage were divided among 8 WorldView-2 pixels (the maximum number possible based on plot dimensions), then the amount of damage would not be detectable. Based on these calculations, it is likely that high-resolution commercial satellites would not be able to detect CPB damage just after it occurred. As the 2014 growing season progressed, CPB damage accumulated to over 75% of the area in each plot, so at some point later in the growing season, CPB damage would become detectable by satellites with larger pixel sizes. Therefore, remote sensing by aircraft, manned or unmanned, remains the best option for early detection of CPB damage. However, studies have not been conducted that show whether manned or unmanned aircraft are more cost effective.
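The mixed-pixel reasoning above amounts to an area-weighted average (an approximation, since NDVI is a ratio and mixes only approximately linearly with area fraction):

```python
def mixed_pixel_ndvi(fraction_damaged, ndvi_damaged=0.75, ndvi_healthy=0.85):
    """Approximate NDVI of a large pixel as the area-weighted mean of its
    damaged and healthy fractions, using the values assumed in the text."""
    return fraction_damaged * ndvi_damaged + (1.0 - fraction_damaged) * ndvi_healthy

print(mixed_pixel_ndvi(0.10))       # whole plot in 5-m pixels: 0.84, undetectable
print(mixed_pixel_ndvi(2.4 / 3.4))  # all damage in one WorldView-2 pixel: ~0.78
```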

Is there a means of acquiring more accurate information about the amount of CPB damage to a potato canopy? SfM point clouds are intermediate photogrammetric products during the creation of orthomosaic images from numerous overlapping images. With the higher spatial resolution available from low-altitude sUAS, digital surface models show spatial variations in plant height.22–25 We constructed digital surface models from NDVI, true-color data, color-infrared data, and single bands. On June 23, 2014, the canopy surface was uniform except for the areas between rows [Fig. 5(a)]. Areas of the canopy that were visibly lower on June 24, 2014 [Fig. 5(b)] corresponded with areas of CPB damage [Figs. 1(c) and 3(b)]. The digital surface models had overall curvature; there was an overall convex shape in Fig. 5(a) and an overall concave shape in Fig. 5(b). The digital surface curvatures were least in the center of the image, so plots 102 (the most damaged) and 302 (ranked 7th) were selected for comparisons.

After the overall curvature was taken into consideration, the maximum height of the potato canopy above the soil surface was constant for the 16 plots on both days (3D Multimedia 1). With a single image, variation in height could be attributed to nonuniform increases in plant growth instead of CPB damage. Furthermore, areas of increased growth are expected to have increased NDVI, so similar correlations between canopy surface elevation and NDVI would result.22–25 Comparison of the two images showed that the lower areas of the canopy on June 24 were clearly the result of leaf removal; it also showed that frequent monitoring is required for distinguishing CPB damage, but daily flights as done in this study would be costly and burdensome. However, estimates of plant height could be made by crop simulation models incorporating weather and phenotypic differences, so differences between expected height and the digital surface elevation models would provide information similar to that from more frequent sUAS flights.
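The text does not specify how the curvature was removed; one common approach is to fit a low-order trend surface to the digital surface model (DSM) by least squares and subtract it, leaving residual heights. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def detrend_dsm(dsm):
    """Remove a quadratic trend surface from a DSM (2-D array of elevations).

    An illustrative correction for the bowl-shaped curvature described above;
    the paper does not state the exact method used.
    """
    rows, cols = np.mgrid[0:dsm.shape[0], 0:dsm.shape[1]]
    x, y, z = cols.ravel().astype(float), rows.ravel().astype(float), dsm.ravel()
    # Design matrix for z ~ 1 + x + y + x^2 + x*y + y^2
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    trend = (A @ coeffs).reshape(dsm.shape)
    return dsm - trend  # residual heights above/below the fitted trend
```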

Fig. 5

Three-dimensional perspectives looking east on (a) June 23, 2014 and (b) June 24, 2014. Damage from CPB created depressions in the canopy surface elevation model visible after 1 day. The images were mosaics of 59 and 55 near-infrared images for (a) and (b), respectively. Altitude of the sUAS was 30 m above ground level, which resulted in a 15-mm ground sample distance. The full point cloud models can be viewed as a multimedia 3-D PDF file (3D Multimedia 1, PDF, 45 MB) [URL: http://dx.doi.org/10.1117/1.JARS.11.2.026013.1].


The canopy areas assessed visually as damaged were either soil or shadows created by gaps in the canopy [Fig. 6(a)]. Within a plot, CPB damage was defined when the image object’s mean NDVI was between 0.70 and 0.80 [Fig. 6(c)]. There was more area classified as damaged from the threshold criteria applied to pixel-based NDVI [Fig. 6(b)] compared to the criteria for the image objects [Fig. 6(c)].

Fig. 6

(a) True color subset of plots 302 (top) and 102 (bottom) from the June 24, 2014, orthomosaic acquired at 30-m altitude. These two plots were near the center of the point cloud, which had the least overall curvature. Plot 102 was visually ranked 16th (the most visual damage), whereas plot 302 was ranked 7th. Edge pixels were not included in the analyses. (b) NDVI with color ranges selected to show damage. (c) Rule-based classification of damage (red) and no damage (green). (d) Relative elevation above ground level determined from the point cloud.


Canopy surface elevation generated from the SfM point clouds showed depressions at the same locations where damage was indicated by NDVI. However, the area classified as damaged was larger [Fig. 6(d)]. Comparing the three methods [Figs. 6(b)–6(d)] with the true-color image [Fig. 6(a)], the amount of area classified as CPB damaged is subjective, whether by visual interpretation or by the choice of algorithm for detection.35,36

There are many options in workflows for acquiring and analyzing sUAS imagery.37–40 For determining in-season nitrogen fertilizer requirements, transects of single images over large fields would be most cost effective, because variation in fertilizer requirements is largely caused by variation in soil properties.41 Each image should have very high spatial resolution to determine plant cover and the chlorophyll content of single leaves. With insect pests, the pattern of damage is unpredictable, so frequent coverage of the whole field may be required. However, if the spatial distribution of damage is clumped [as in Fig. 3(b)], then pixel sizes could be somewhat larger and would still be effective.

4.

Conclusions

Leaf and plant damage caused by CPB was spatially unpredictable and appeared over just 1 day, so frequent sUAS flights with extensive coverage were needed for early detection. We compared three methods for supervised classification of early CPB damage: pixel-based NDVI thresholds, object-based image analysis, and plant height. Using an orthomosaic image, all three methods found small areas of CPB damage on the day that the damage was first visually detectable. Based on calculations, and ignoring problems with cloud cover, satellite data with 5-m pixels would not have been effective for monitoring CPB damage. We are intrigued by the potential for using plant height from SfM point clouds, because undamaged plants could serve as field-by-field references, making the overall assessment somewhat more objective. When compared to the visual damage ranking, feature extraction based on object-based image analysis was the most accurate method for detecting the relative amount of plant damage. However, both feature extraction and plant height required extensive operator intervention for success. Because the different methods for classification of CPB damage did not result in similar areas of damage, it is necessary for precision IPM to ascertain whether early detection is sufficient or whether accurate estimates of the amount of damage are required.

Acknowledgments

The project was funded in part by Boeing Research & Technology, Kent, Washington, United States. We thank Alan E. Bruce and Robert W. Turner from Boeing Research & Technology for the initial image processing. Also, we thank Josh J. Brungardt from Paradigm ISR (Bend, Oregon, United States) for managing the UAS and FAA COA. Finally, we thank Philip B. Hamm for facilitating the research at the Hermiston Agricultural Research and Extension Center.

References

1. J. D. Hare, “Ecology and management of the Colorado potato beetle,” Ann. Rev. Entomol. 35, 81–100 (1990). http://dx.doi.org/10.1146/annurev.en.35.010190.000501

2. A. Alyokhin, “Colorado potato beetle management on potatoes: current challenges and future prospects,” Fruit Veg. Cereal Sci. Biotechnol. 3(Special Issue 1), 10–19 (2009).

3. A. Alyokhin, M. Udalov and G. Benkovskaya, “The Colorado potato beetle,” in Insect Pests of Potato, pp. 11–29, Academic Press, Oxford, United Kingdom (2013).

4. A. Alyokhin et al., “Colorado potato beetle resistance to insecticides,” Am. J. Potato Res. 85, 395–413 (2008). http://dx.doi.org/10.1007/s12230-008-9052-0

5. A. Alyokhin et al., “The red queen in a potato field: integrated pest management versus chemical dependency in Colorado potato beetle control,” Pest Manage. Sci. 71, 343–356 (2015). http://dx.doi.org/10.1002/ps.2015.71.issue-3

6. S. I. Rondon, “Building an integrated pest management for potatoes in North America: where to start?,” Acta Hortic. 960, 371–384 (2012). http://dx.doi.org/10.17660/ActaHortic.2012.960.54

7. S. I. Rondon, “Pest management strategies for potato insect pests in the Pacific Northwest of the United States,” in Insecticides—Pest Engineering, pp. 309–332, InTech, Rijeka, Croatia (2012).

8. J. W. Rouse et al., “Monitoring vegetation systems in the Great Plains with ERTS,” in Third Earth Resources Technology Satellite-1 Symposium, Vol. 1, pp. 309–317, National Aeronautics and Space Administration, Washington (1974).

9. C. J. Tucker, “Red and photographic infrared linear combinations for monitoring vegetation,” Remote Sens. Environ. 8, 127–150 (1979). http://dx.doi.org/10.1016/0034-4257(79)90013-0

10. J. Yue et al., “The application of unmanned aerial vehicle remote sensing in quickly monitoring crop pests,” Intell. Autom. Soft Comput. 18, 1043–1052 (2012). http://dx.doi.org/10.1080/10798587.2008.10643309

11. W. Wester-Ebbinghaus, “Aerial photography by radio controlled model helicopter,” Photogramm. Rec. 10, 85–92 (1980). http://dx.doi.org/10.1111/j.1477-9730.1980.tb00006.x

12. R. D. Jackson and J. W. Youngblood, “Agriculture’s eye in the sky. Forever plane could give continuous crop data,” Crops Soils Mag. 36(1), 15–18 (1983).

13. S. R. Herwitz et al., “Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support,” Comput. Electron. Agric. 44, 49–61 (2004). http://dx.doi.org/10.1016/j.compag.2004.02.006

14. D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” Int. J. Comput. Vision 60, 91–110 (2004). http://dx.doi.org/10.1023/B:VISI.0000029664.99615.94

15. T. Blaschke, “Object based image analysis for remote sensing,” ISPRS J. Photogramm. Remote Sens. 65, 2–16 (2010). http://dx.doi.org/10.1016/j.isprsjprs.2009.06.004

16. A. S. Laliberte et al., “Object-oriented image analysis for mapping shrub encroachment from 1937 to 2003 in southern New Mexico,” Remote Sens. Environ. 93, 198–210 (2004). http://dx.doi.org/10.1016/j.rse.2004.07.011

17. A. Laliberte et al., “Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring,” Photogramm. Eng. Remote Sens. 76, 661–672 (2010). http://dx.doi.org/10.14358/PERS.76.6.661

18. D. Liu and F. Xia, “Assessing object-based classification: advantages and limitations,” Remote Sens. Lett. 1, 187–194 (2010). http://dx.doi.org/10.1080/01431161003743173

19. D. Turner, A. Lucieer and C. Watson, “An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SfM) point clouds,” Remote Sens. 4, 1392–1410 (2012). http://dx.doi.org/10.3390/rs4051392

20. S. Harwin and A. Lucieer, “Assessing the accuracy of georeferenced point clouds produced via multi-view stereopsis from unmanned aerial vehicle (UAV) imagery,” Remote Sens. 4, 1573–1599 (2012). http://dx.doi.org/10.3390/rs4061573

21. M. J. Westoby et al., “‘Structure-from-motion’ photogrammetry: a low-cost, effective tool for geoscience applications,” Geomorphology 179, 300–314 (2012). http://dx.doi.org/10.1016/j.geomorph.2012.08.021

22. J. Bendig, A. Bolten and G. Bareth, “UAV-based imaging for multi-temporal, very high resolution crop surface models to monitor crop growth variability,” Photogramm. Fernerkundung Geoinf. 2013, 551–562 (2013). http://dx.doi.org/10.1127/1432-8364/2013/0200

23. J. Bendig et al., “Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging,” Remote Sens. 6, 10395 (2014). http://dx.doi.org/10.3390/rs61110395

24. J. Bendig et al., “Combining UAV-based plant height from crop surface models, visible, and near-infrared vegetation indices for biomass monitoring in barley,” Int. J. Appl. Earth Obs. Geoinf. 39, 79–87 (2015). http://dx.doi.org/10.1016/j.jag.2015.02.012

25. N. Tilly, H. Aasen and G. Bareth, “Fusion of plant height and vegetation indices for the estimation of barley biomass,” Remote Sens. 7, 11449 (2015). http://dx.doi.org/10.3390/rs70911449

26. B. G. Hopkins et al., “Evaluation of potato production best management practices,” Am. J. Potato Res. 84, 19–27 (2007). http://dx.doi.org/10.1007/BF02986295

27. Agisoft LLC, “Bowl effect after optomization (sic) with GCPs,” http://www.agisoft.com/forum/index.php?topic=3940.msg20483#msg20483 (April 2017).

28. Harris Geospatial Solutions, “Feature extraction reference,” http://harrisgeospatial.com/docs/FeatureExtractionReference.html (April 2017).

29. Harris Geospatial Solutions, “List of attributes,” http://www.harrisgeospatial.com/docs/attributelist.html (April 2017).

30. J. B. T. M. Roerdink and A. Meijster, “The watershed transform: definitions, algorithms, and parallelization strategies,” Fundamenta Informaticae 41, 187–228 (2001).

31. R. G. D. Steel and J. H. Torrie, Principles and Procedures of Statistics, McGraw-Hill, New York (1960).

32. E. R. Hunt Jr. et al., “Detection of nitrogen deficiency in potatoes using small unmanned aircraft systems,” in Proc. 12th Int. Conf. on Precision Agriculture, 1431 (2014).

33. X. H. Liu, A. K. Skidmore and H. Van Oosten, “Integration of classification methods for improvement of land-cover map accuracy,” ISPRS J. Photogramm. Remote Sens. 56, 257–268 (2002). http://dx.doi.org/10.1016/S0924-2716(02)00061-8

34. D. Lu and Q. Weng, “A survey of image classification methods and techniques for improving classification performance,” Int. J. Remote Sens. 28, 823–870 (2007). http://dx.doi.org/10.1080/01431160600746456

35. J. Torres-Sánchez et al., “Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management,” PLoS One 8(3), e58210 (2013). http://dx.doi.org/10.1371/journal.pone.0058210

36. J. Torres-Sánchez, F. López-Granados and J. M. Peña, “An automatic object-based method for optimal thresholding in UAV images: application for vegetation detection in herbaceous crops,” Comput. Electron. Agric. 114, 43–52 (2015). http://dx.doi.org/10.1016/j.compag.2015.03.019

37. A. J. Mathews, “Object-based spatiotemporal analysis of vine canopy vigor using an inexpensive unmanned aerial vehicle remote sensing system,” J. Appl. Remote Sens. 8(1), 085199 (2014). http://dx.doi.org/10.1117/1.JRS.8.085199

38. E. Salamí, C. Barrado and E. Pastor, “UAV flight experiments applied to the remote sensing of vegetated areas,” Remote Sens. 6, 11051 (2014). http://dx.doi.org/10.3390/rs61111051

39. E. R. Hunt Jr. et al., “Remote sensing with simulated unmanned aircraft imagery for precision agriculture applications,” IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 7, 4566–4571 (2014). http://dx.doi.org/10.1109/JSTARS.2014.2317876

40. S. Heinhold, “Radiometric multi-spectral or hyperspectral camera array using matched area sensors and a calibrated ambient light collection device,” U.S. Patent US 2014/0022381 A1 (2014).

41. X. Jin, “Segmentation-based image processing system,” U.S. Patent US 2012/8260048 B2 (2012).

Biography

E. Raymond Hunt received his BS degree in botany at Ohio University in 1978 and PhD in botany at the University of Michigan in 1984. His current research involves remote sensing for precision agriculture with unmanned aircraft systems. Past research includes remote sensing using shortwave infrared reflectances for plant water content, hyperspectral remote sensing for invasive species detection, and modeling ecosystem biogeochemical cycles.

Silvia I. Rondon is a professor in the Crop and Soils Department and an Extension Entomologist. She received her BA and MS degrees in entomology from the Agraria University in Lima, Peru, and her PhD from the University of Illinois at Urbana-Champaign. In 2002, she was a postdoctoral associate at the University of Florida in Gainesville. Her areas of expertise are pest management, insect ecology, and biological control of tree fruits and high-value vegetables.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
E. Raymond Hunt and Silvia I. Rondon "Detection of potato beetle damage using remote sensing from small unmanned aircraft systems," Journal of Applied Remote Sensing 11(2), 026013 (2 May 2017). https://doi.org/10.1117/1.JRS.11.026013
Received: 3 October 2016; Accepted: 27 March 2017; Published: 2 May 2017
Keywords: remote sensing, visualization, clouds, image analysis, feature extraction, 3D modeling, satellites
