Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986601 (2016) https://doi.org/10.1117/12.2244381
This PDF file contains the front matter associated with SPIE Proceedings Volume 9866, including the Title Page, Copyright information, Table of Contents, and Conference Committee listing.
Unmanned Ground Vehicles in High-Throughput Phenotyping
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986602 (2016) https://doi.org/10.1117/12.2234052
Unmanned aerial vehicles (UAVs) have advantages over manned vehicles for agricultural remote sensing: flying UAVs is less expensive, scheduling is more flexible, and lower altitudes and speeds provide better spatial resolution for imaging. The main disadvantage is that, at lower altitudes and speeds, only small areas can be imaged. However, on large farms with contiguous fields, high-quality images can be collected regularly by using UAVs with appropriate sensing technologies that enable high-quality image mosaics to be created with sufficient metadata and ground-control points. In the United States, rules governing the use of aircraft are promulgated and enforced by the Federal Aviation Administration (FAA), and rules governing UAVs are currently in flux; operators must apply for appropriate permission to fly UAVs. In the summer of 2015, Texas A&M University's agricultural research agency, Texas A&M AgriLife Research, embarked on a comprehensive program of UAV remote sensing at its 568-ha Brazos Bottom Research Farm. The farm comprises numerous fields where crops, including cotton, corn, sorghum, and wheat, are grown in plots or complete fields. After gaining FAA permission to fly at the farm, the research team used multiple fixed-wing and rotary-wing UAVs along with various sensors to collect images over all parts of the farm at least once per week. This article reports details of the flight operations and the sensing and analysis protocols, and it includes lessons learned in the process of developing a UAV remote-sensing effort of this sort.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986603 (2016) https://doi.org/10.1117/12.2234051
Developed agriculture uses massive amounts of energy in a myriad of forms, from the energy embodied in the chemicals used to control pests and diseases, through fertilisers, to the tractors themselves and the fuel that powers them. Much of this energy is wasted when it goes off-target; it is expensive and will become more so in the future. Smarter machines should use the minimum amount of energy to turn the natural environment into useful agriculture, cutting out wasted energy and reducing costs. As agricultural engineers we continually look for ways to make crop and animal production processes more efficient, and we have developed the concept of Precision Farming, in which we recognise the natural variability found on our farms and change the management and treatments to suit. This variability takes both spatial and temporal forms. Spatial variability can be understood and managed by creating yield maps and soil maps. Temporal variability is often fundamentally linked to changes in weather over time, resulting in the need for real-time management. In industry, we used to have production lines mass-producing one item and are now moving to flexible manufacturing, where each item is made individually. In agriculture we can see a similar approach in reducing the scale of treatments from farm scale, to field scale, to sub-field scale, and even to individual plant treatment.
David Gouache, Katia Beauchêne, Agathe Mini, Antoine Fournier, Benoit de Solan, Fred Baret, Alexis Comar
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986604 (2016) https://doi.org/10.1117/12.2229389
Digital and image analysis technologies in greenhouses have become commonplace in plant science research and have started to move into the plant breeding industry. However, the core of plant breeding work takes place in fields. We present successive technological developments that have allowed remote sensing approaches to migrate into, and be applied at scale in, field-based crop genetics and physiology research, through a number of projects carried out in France. These projects have allowed us to develop combined sensor-plus-vector systems, from tractor-mounted and UAV (unmanned aerial vehicle)-mounted spectroradiometry to autonomous-vehicle-mounted spectroradiometry, RGB (red-green-blue) imagery, and Lidar. We have tested these systems for deciphering the genetics of complex plant improvement targets such as the robustness of wheat and maize to nitrogen and water deficiency. Our results from wheat experiments indicate that these systems can be used both to screen genetic diversity for nitrogen stress tolerance and to decipher the genetics behind this diversity. We present our view of the next critical steps, in technology and in data analysis, required to reach cost-effective implementation in industrial plant breeding programs. If this can be achieved, these technologies will contribute greatly to resolving the equation of increasing food supply in the resource-limited world that lies ahead.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986605 (2016) https://doi.org/10.1117/12.2228790
High-throughput phenotyping (HTP) is an emerging frontier across many basic and applied plant science disciplines. RGB imaging is the most widely used modality in HTP for extracting image-based phenotypes such as pixel volume or projected area. These image-based phenotypes are further used to derive plant physical parameters including fresh biomass, dry biomass, and water use efficiency. In this paper, we investigated the robustness of regression models that predict fresh biomass of maize plants from image-based phenotypes. Data used in this study came from three different experiments and were grouped into five datasets, two for model development and three for independent model validation. Three image-derived phenotypes were investigated: BioVolume, Projected.Area.1, and Projected.Area.2. Models were assessed with R2, bias, and RMSEP (root mean squared error of prediction). The results showed that almost all models validated with high R2 values, indicating that these digital phenotypes can be used to rank plant biomass on a relative basis. However, when accurate prediction of plant biomass is needed, models that relate image-based phenotypes to plant biomass must be carefully constructed: our results show that the range of plant size and the genotypic diversity of the calibration sets relative to the validation sets have a large impact on model accuracy. Large maize plants cause systematic bias as they grow toward the top-view camera, so excluding top-view images from modeling can benefit experiments involving large maize plants.
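The three validation metrics named above can be computed in a few lines. The following sketch uses one common set of definitions (residual taken as observed minus predicted); the paper does not spell out its exact formulas, and the biomass values are hypothetical.

```python
import numpy as np

def validation_metrics(y_obs, y_pred):
    """R^2, bias, and RMSEP for an independent validation set."""
    y_obs = np.asarray(y_obs, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_obs - y_pred
    bias = resid.mean()                    # mean over/under-prediction
    rmsep = np.sqrt(np.mean(resid ** 2))   # root mean squared error of prediction
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return r2, bias, rmsep

# Hypothetical fresh-biomass values (g) and model predictions:
r2, bias, rmsep = validation_metrics([120.0, 85.0, 210.0, 150.0],
                                     [115.0, 90.0, 200.0, 155.0])
```

With this residual convention, a positive bias means the model under-predicts on average.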
Seth C. Murray, Leighton Knox, Brandon Hartley, Mario A. Méndez-Dorado, Grant Richardson, J. Alex Thomasson, Yeyin Shi, Nithya Rajan, Haly Neely, et al.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986607 (2016) https://doi.org/10.1117/12.2228323
The next generation of plant breeding progress requires accurate estimates of plant growth and development parameters at routine intervals within large field experiments. Hand measurements are laborious and time consuming, and the most promising tools under development are sensors carried by ground vehicles or unmanned aerial vehicles, each specific vehicle having unique limitations. Previously available ground vehicles have been restricted mostly to monitoring shorter crops or early growth in corn and sorghum, since plants taller than a meter could be damaged by a tractor or spray rig passing over them. Here we have designed two self-propelled ground vehicles with adjustable heights that can clear mature corn and sorghum without damage (over three meters of clearance) and will work for shorter row crops as well; one has already been constructed. In addition to regular RGB image capture, sensor suites are incorporated to estimate plant height, vegetation indices, canopy temperature, and photosynthetically active solar radiation, all referenced to individual plots using RTK GPS. These ground vehicles will be useful for validating data collected from unmanned aerial vehicles and for supporting hand measurements taken on plots.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 986608 (2016) https://doi.org/10.1117/12.2229513
A multi-view stereo vision system for true 3D reconstruction, modeling and phenotyping of plants was created that successfully resolves many of the shortcomings of traditional camera-based 3D plant phenotyping systems. This novel system incorporates several features including: computer algorithms, including camera calibration, excessive-green based plant segmentation, semi-global stereo block matching, disparity bilateral filtering, 3D point cloud processing, and 3D feature extraction, and hardware consisting of a hemispherical superstructure designed to hold five stereo pairs of cameras and a custom designed structured light pattern illumination system. This system is nondestructive and can extract 3D features of whole plants modeled from multiple pairs of stereo images taken at different view angles. The study characterizes the systems phenotyping performance for 3D plant features: plant height, total leaf area, and total leaf shading area. For plants having specified leaf spacing and size, the algorithms used in our system yielded satisfactory experimental results and demonstrated the ability to study plant development where the same plants were repeatedly imaged and phenotyped over the time.
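The excess-green segmentation step mentioned above is a standard technique (ExG = 2G − R − B responds strongly to green vegetation). A minimal sketch, with an assumed fixed threshold since the paper's actual thresholding scheme is not given:

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """Segment plant pixels with the excess-green index ExG = 2G - R - B.

    `rgb` is an HxWx3 uint8 array; `threshold` is a hypothetical fixed
    cutoff, not a value taken from the paper.
    """
    r = rgb[..., 0].astype(np.int16)   # widen to avoid uint8 overflow
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    exg = 2 * g - r - b
    return exg > threshold

# Toy image: one green pixel (plant) and one gray pixel (background)
img = np.array([[[10, 200, 10], [100, 100, 100]]], dtype=np.uint8)
mask = excess_green_mask(img)
```

Gray pixels give ExG near zero, so only strongly green pixels survive the threshold.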
Unmanned Ground and Aerial Vehicles in High-Throughput Phenotyping
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660B (2016) https://doi.org/10.1117/12.2224423
High-Throughput Phenotyping (HTP) is a discipline for rapidly identifying plant architectural and physiological responses to environmental factors such as heat and water stress. Experiments conducted since 2010 at Maricopa, Arizona, with a three-fold sensor group, including thermal infrared radiometers, active visible/near-infrared reflectance sensors, and acoustic plant height sensors, have shown the validity of HTP with a tractor-based system. However, results from these experiments also show that the accuracy of plant phenotyping is limited by the system's inability to discriminate plant components and their local environmental conditions. This limitation may be overcome with plant imaging and laser scanning, which can help map details in plant architecture and sunlit/shaded leaves. To test the capability for mapping cotton plants with a laser system, a track-mounted platform consisting of a scanning LIDAR driven by Arduino-controlled stepper motors was deployed in 2015 over full-canopy and defoliated cotton crops. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at 0.1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). These tests showed that an autonomous LIDAR platform can reduce HTP logistical problems and provide the capability to accurately map cotton plants and cotton bolls.
A prototype track-mounted platform was developed to test the use of LIDAR scanning for High-Throughput Phenotyping (HTP). The platform was deployed in 2015 at Maricopa, Arizona over a senescent cotton crop. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at <1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). Scanning data mapped canopy heights and widths and detected cotton bolls.
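The conversion from a 2-D LIDAR sweep to canopy heights can be illustrated with a short sketch. The downward-facing mounting, angle convention (measured from nadir), and parameter values here are assumptions for illustration, not the platform's actual configuration:

```python
import math

def scan_to_heights(ranges_m, start_deg=-45.0, step_deg=0.1667, sensor_height_m=3.0):
    """Convert one 2-D LIDAR scan (ranges at fixed angular steps) into
    canopy heights beneath a downward-facing sensor.

    Each beam i is at angle start_deg + i * step_deg from nadir; the
    canopy height is the sensor height minus the vertical component of
    the return range.
    """
    heights = []
    for i, r in enumerate(ranges_m):
        theta = math.radians(start_deg + i * step_deg)
        vertical = r * math.cos(theta)      # nadir component of the return
        heights.append(sensor_height_m - vertical)
    return heights

# One simulated sweep of three returns near nadir:
canopy = scan_to_heights([3.0, 2.6, 2.1], start_deg=-0.1667)
```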
Joe Mari J. Maja, Todd Campbell, Joao Camargo Neto, Philip Astillo
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660C (2016) https://doi.org/10.1117/12.2228929
One of the major criteria used for advancing experimental lines in a breeding program is yield performance. Obtaining yield performance data requires machine picking each plot with a cotton picker, modified to weigh individual plots. Harvesting thousands of small field plots requires a great deal of time and resources. The efficiency of cotton breeding could be increased significantly while the cost could be decreased with the availability of accurate methods to predict yield performance.
This work investigates the feasibility of using image processing with a commercial off-the-shelf (COTS) camera mounted on a small unmanned aerial vehicle (sUAV) to collect standard RGB images for predicting cotton yield on small plots. An orthomosaic image was generated from multiple images and used to process multiple segmented plots. A Gaussian blur was used to remove the high-frequency component of the images, which corresponds to the cotton pixels, and an image-subtraction technique was used to recover a high-frequency (cotton-pixel) image. The cotton pixels were then separated using k-means clustering with five classes. The percentage cotton area was computed as the high-frequency (cotton-pixel) area divided by the total area of the plot. Preliminary results (five flights, three altitudes) showed that cotton cover on multiple pre-selected 227 sq. m plots averaged 8%, which translates to approximately 22.3 kg of cotton. The yield-prediction equation generated from the test site was then used on a separate validation site and produced a prediction error of less than 10%. In summary, the results indicate that a COTS camera with an appropriate image processing technique can produce results comparable to those from expensive sensors.
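The blur-subtract-cluster pipeline described above can be sketched as follows. The blur width, the use of scipy for both steps, and the choice of the highest-mean residual cluster as cotton are all assumptions, not details from the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.cluster.vq import kmeans2

def cotton_fraction(gray, sigma=5.0, k=5):
    """Estimate the fraction of a plot covered by cotton pixels.

    `gray` is a 2-D array in [0, 1].  Bright cotton bolls survive as
    large positive residuals after subtracting a Gaussian-blurred copy;
    k-means with `k` classes then partitions the residuals, and the
    class with the highest mean residual is treated as cotton.
    """
    highpass = gray - gaussian_filter(gray, sigma)   # keep high-frequency detail
    vals = highpass.reshape(-1, 1).astype(float)
    np.random.seed(0)                                # deterministic initialization
    centroids, labels = kmeans2(vals, k, minit="points")
    cotton_class = int(np.argmax(centroids))
    return float(np.mean(labels == cotton_class))
```

Dividing the cotton-pixel count by the plot's pixel count gives the percentage cotton area used in the yield regression.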
Yeyin Shi, Seth C. Murray, William L. Rooney, John Valasek, Jeff Olsenholler, N. Ace Pugh, James Henrickson, Ezekiel Bowden, Dongyan Zhang, et al.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660E (2016) https://doi.org/10.1117/12.2228737
Recent development of unmanned aerial systems has created opportunities to automate field-based high-throughput phenotyping by lowering flight operational cost and complexity and by allowing more flexible revisit times and higher image resolution than satellite or manned airborne remote sensing. In this study, flights were conducted over corn and sorghum breeding trials in College Station, Texas, with a fixed-wing unmanned aerial vehicle (UAV) carrying two multispectral cameras and a high-resolution digital camera. The objectives were to establish the workflow and to investigate the ability of UAV-based remote sensing to automate the collection of plant traits for developing genetic and physiological models. Most important among these traits are plant height and plant count, which are currently collected manually at high labor cost. Vegetation indices were calculated for each breeding cultivar from mosaicked, radiometrically calibrated multi-band imagery so that they could be correlated with ground-measured plant heights, populations, and yield across high-genetic-diversity breeding cultivars. Growth curves were profiled from the aerially measured time-series height and vegetation index data. The next step of this study will be to investigate the correlations between aerial measurements and ground truth measured manually in the field and from lab tests.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660F (2016) https://doi.org/10.1117/12.2228872
Variety choice is the most important production decision farmers make, because high-yielding varieties can increase profit with no additional production costs. Yield improvement has therefore been the major objective of peanut (Arachis hypogaea L.) breeding programs worldwide, but the current approach of selecting for yield under optimal production conditions is slow and cannot keep pace with the demands of population growth and climate change. To improve the rate of genetic gain, breeders have targeted physiological traits such as leaf chlorophyll content measured with a SPAD chlorophyll meter, the Normalized Difference Vegetation Index (NDVI) derived from canopy reflectance in visible and near-infrared (NIR) wavelength bands, and canopy temperature (CT) measured manually with infrared (IR) thermometers at the canopy level; but their use for routine selection has been hampered by the time required to walk hundreds of plots. Recent developments in remote sensing-based high-throughput phenotyping platforms using unmanned aerial vehicles (UAVs) have shown good potential for future breeding advancements. We therefore initiated a study to evaluate the suitability of digital imagery, NDVI, and CT collected from a UAV platform for peanut variety differentiation. Peanut is unusual in that it sets its yield underground and is resilient to drought and heat, which makes yield difficult to estimate before harvest, although breeding programs need early yield estimation. Twenty-six peanut cultivars and breeding lines were grown in replicated plots and either optimally or deficiently irrigated under rain-exclusion shelters at Suffolk, Virginia. At the beginning of the maturity growth stage, approximately a month before digging, NDVI and CT were measured with ground-based sensors at the same time as red, green, blue (RGB) images were taken with a Sony camera mounted on a UAV platform. Disease ratings were also taken pre-harvest. Ground- and UAV-derived vegetation indices were analyzed for disease and yield prediction and are presented in this paper.
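The NDVI used throughout these studies has a standard definition from the red and NIR reflectance bands. A minimal sketch (the epsilon guard against zero denominators is an implementation detail, not from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel or per plot."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Dense canopy reflects strongly in NIR and absorbs red:
canopy = ndvi(0.50, 0.05)   # high NDVI, roughly 0.82
soil   = ndvi(0.30, 0.25)   # low NDVI, roughly 0.09
```

Because the arguments accept arrays, the same function works on whole calibrated band rasters.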
Sindhuja Sankaran, Lav R. Khot, Juan Quirós, George J. Vandemark, Rebecca J. McGee
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660G (2016) https://doi.org/10.1117/12.2228550
In plant breeding, one of the biggest obstacles to genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. The major objective of this research was therefore to evaluate the feasibility of high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices were strongly correlated (p < 0.05) with seed yield of the legume crops. These results support the potential of UAS-based sensing technology to rapidly measure such phenotyping traits.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660H (2016) https://doi.org/10.1117/12.2227720
The grape industry relies on regular crop assessment to aid in the day-to-day and seasonal management of the crop. More specifically, there are six key nutrients of interest to viticulturists in the growing of wine grapes, namely nitrogen, potassium, phosphorus, magnesium, zinc, and boron. The traditional method of determining the levels of these nutrients is collection and chemical analysis of petiole samples from the grape vines themselves. We collected ground-level observations of the spectra of the grape vines, using a hyperspectral spectrometer (0.4–2.5 µm), at the same time that petiole samples were harvested. We then interpolated the data to a consistent 1 nm spectral resolution before comparing it to the nutrient data, which came from both the industry-standard petiole analysis and an additional leaf-level analysis. The data were collected for two different grape cultivars, during both bloom and veraison, to provide variability while also considering the impact of temporal/seasonal change. A narrow-band NDI (Normalized Difference Index) approach, as well as a simple ratio index, was used to determine the correlation of the reflectance data with the nutrient data. The analysis was limited to the silicon photodiode range to increase the utility of our approach for wavelength-specific cameras (via spectral filters) on a low-cost drone platform. The NDI-generated correlation coefficients were as high as 0.80 and 0.88 for bloom and veraison, respectively; the ratio index produced coefficients identical to two decimal places, 0.80 and 0.88. These results bode well for eventual non-destructive, accurate, and precise assessment of vineyard nutrient status.
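A brute-force narrow-band NDI search of the kind described above can be sketched as follows. The function name, data layout, and exhaustive-pair strategy are illustrative assumptions; only the NDI formula and the use of a correlation coefficient come from the abstract:

```python
import numpy as np

def best_ndi_pair(spectra, nutrient, wavelengths):
    """Find the band pair whose narrow-band NDI best correlates with a nutrient.

    spectra:     (n_samples, n_bands) reflectance at 1 nm resolution
    nutrient:    (n_samples,) lab-measured nutrient values
    wavelengths: (n_bands,) wavelengths in nm (e.g. restricted to the
                 silicon photodiode range, roughly 400-1000 nm)
    Returns (|r|, lambda_1, lambda_2) for the best pair.
    """
    n_bands = spectra.shape[1]
    best = (0.0, None, None)
    for i in range(n_bands):
        for j in range(i + 1, n_bands):
            ndi = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j] + 1e-12)
            r = abs(np.corrcoef(ndi, nutrient)[0, 1])
            if r > best[0]:
                best = (r, wavelengths[i], wavelengths[j])
    return best
```

At 1 nm resolution over the silicon range this is roughly 600 bands and 180,000 pairs, which is still fast enough to run exhaustively.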
Ke Ding, Amar Raheja, Subodh Bhandari, Robert L. Green
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660I (2016) https://doi.org/10.1117/12.2228695
Historically, investigation of turfgrass characteristics has been limited to visual ratings. Although relevant information may result from such evaluations, final inferences may be questionable because of the subjective way in which the data are collected. Recent advances in computer vision allow researchers to objectively measure turfgrass characteristics such as percent ground cover, turf color, and turf quality from digital images. This paper focuses on developing a methodology for automated assessment of turfgrass quality from aerial images. Images of several turfgrass plots of varying quality were gathered using a camera mounted on an unmanned aerial vehicle, and the quality of these plots was also evaluated with visual ratings. The goal was to use the aerial images to generate quality evaluations on a regular basis for the optimization of water treatment. The aerial images are used to train a neural network, a nonlinear classifier commonly used in machine learning, on features such as intensity, color, and texture extracted from the images; the textures are calculated using Gabor filters and the co-occurrence matrix. The output of the trained model is a rating of the grass, which is compared to the visual ratings. Currently, the quality and the color of the turfgrass, measured as the greenness of the grass, are evaluated. Other classifiers, such as support vector machines, and simpler linear models, such as ridge regression and LARS regression, are also used, and the performance of each model is compared. The results show encouraging potential for using machine learning techniques for the evaluation of turfgrass quality and color.
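The gray-level co-occurrence matrix behind the texture features can be sketched as follows. The pixel offset, number of gray levels, and the contrast statistic shown are illustrative defaults rather than the paper's settings:

```python
import numpy as np

def cooccurrence(gray, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy).

    `gray` is a 2-D array of values in [0, 255]; intensities are first
    quantized to `levels` bins, then pairs (pixel, offset neighbor) are
    tallied and normalized to joint probabilities.
    """
    q = (np.asarray(gray) * levels // 256).astype(int)
    h, w = q.shape
    glcm = np.zeros((levels, levels), dtype=float)
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    glcm /= glcm.sum()
    return glcm

def contrast(glcm):
    """Haralick contrast: sum over i, j of p(i, j) * (i - j)^2."""
    i, j = np.indices(glcm.shape)
    return float(np.sum(glcm * (i - j) ** 2))
```

A uniform patch has zero contrast, while fine-grained texture (such as sparse turf) concentrates mass off the diagonal and scores high.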
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660J (2016) https://doi.org/10.1117/12.2227214
Vegetation health and vigor can be assessed with data from multi- and hyperspectral airborne and satellite-borne sensors using index products such as the normalized difference vegetation index (NDVI). Recent advances in unmanned aerial systems (UAS) technology have created the opportunity to access these same image data sets more cost-effectively and with higher temporal and spatial resolution. Another advantage of these systems is the ability to gather data in almost any weather, including complete cloud cover, conditions in which data have not previously been available from traditional platforms. The ability to collect under these varied meteorological and temporal conditions presents researchers and producers with new challenges. In particular, cloud shadows and self-shadowing by vegetation must be taken into consideration in imagery collected from UAS platforms to avoid variation in NDVI due to changes in illumination within a single scene and between collection flights. A workflow is presented to compensate for variations in vegetation indices due to shadows and variation in illumination levels in high-resolution imagery collected from UAS platforms. Other calibration methods that producers may currently be using produce NDVI products that still contain shadow boundaries and illumination variations, whereas the final NDVI mosaic from this workflow does not.
Haly L. Neely, Cristine L. S. Morgan, Scott Stanislav, Gregory Rouze, Yeyin Shi, J. Alex Thomasson, John Valasek, Jeff Olsenholler
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660K (2016) https://doi.org/10.1117/12.2228732
The goal of precision agriculture is to increase crop yield while maximizing the use efficiency of farm resources. In this application, UAV-based systems give agricultural researchers an opportunity to study crop response to environmental and management factors in real time without disturbing the crop. The spatial variability of soil properties, which drive crop yield and quality, cannot be changed, so keen agronomic choices made with soil variability in mind have the potential to increase profits. Additionally, measuring crop stress over time and in response to management and environmental conditions may enable agronomists and plant breeders to make more informed decisions about variety selection than traditional end-of-season yield and quality measurements allow. In a previous study, seed-cotton yield was measured over four years and compared with soil variability as mapped by a proximal soil sensor. Soil properties had a significant effect on seed-cotton yield, but the effect was not consistent across years owing to different precipitation conditions. However, when seed-cotton yield was compared to the normalized difference vegetation index (NDVI), measured with a multispectral camera on a UAV, predictions improved, and further improvement was seen when soil-only pixels were removed from the analysis. Ongoing studies are using UAV-based data to uncover the thresholds for stress and yield potential. Long-term goals of this research include detecting stress before yield is reduced and selecting better-adapted varieties.
John Valasek, James V. Henrickson III, Ezekiel Bowden, Yeyin Shi, Cristine L. S. Morgan, Haly L. Neely
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660L (2016) https://doi.org/10.1117/12.2228894
As small unmanned aircraft systems become increasingly affordable, reliable, and formally recognized under federal regulation, they become increasingly attractive as novel platforms for civil applications. This paper details the development and demonstration of fixed-wing unmanned aircraft systems for precision agriculture tasks such as soil-moisture-content estimation and high-throughput phenotyping. Rationale for sensor, vehicle, and ground equipment selections is provided, along with flight operation procedures developed for minimal crew. Preliminary imagery results are presented and analyzed; they demonstrate that fixed-wing unmanned aircraft systems modified to carry non-traditional sensors over extended endurance durations can provide high-quality data usable for serious scientific analysis.
E. Raymond Hunt Jr., Silvia I. Rondon, Philip B. Hamm, Robert W. Turner, Alan E. Bruce, Josh J. Brungardt
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660N (2016) https://doi.org/10.1117/12.2224139
Remote sensing with small unmanned aircraft systems (sUAS) has potential applications in agriculture because low flight altitudes allow image acquisition at very high spatial resolution. We set up experiments at the Oregon State University Hermiston Agricultural Research and Extension Center with different platforms and sensors to assess the advantages and disadvantages of sUAS for precision farming. In 2013, we conducted an experiment with 4 levels of N fertilizer and followed the changes in the normalized difference vegetation index (NDVI) over time. In late June, there were no differences in chlorophyll content or leaf area index (LAI) among the 3 higher application rates. Consistent with the field data, only plots with the lowest rate of applied N were distinguished by low NDVI. In early August, N deficiency was detectable by NDVI, but it was too late to mitigate losses in potato yield and quality. Populations of the Colorado potato beetle (CPB) may increase rapidly, devouring the shoots, so early detection and treatment could prevent yield losses. In 2014, we conducted an experiment with 4 levels of CPB infestation. Over one day, damage from CPB in some plots increased from 0 to 19%. A visual ranking of damage was not correlated with the total number of CPB or with treatment. Plot-scale vegetation indices were not correlated with damage, although the damaged area determined by object-based feature extraction was highly correlated. Methods based on object-based image analysis of sUAS data have potential for early detection and reduced cost.
Carlos Espinoza Zúñiga, Lav R. Khot, Pete Jacoby, Sindhuja Sankaran
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660O (2016) https://doi.org/10.1117/12.2228791
Increased water demands have forced the agriculture industry to investigate better irrigation management strategies in crop production. Efficient irrigation systems, improved irrigation scheduling, and selection of crop varieties with better water-use efficiency can all help conserve water. In an ongoing experiment carried out in the Red Mountain American Viticultural Area near Benton City, Washington, subsurface drip irrigation treatments at depths of 30, 60, and 90 cm, delivered at 15, 30, and 60% of evapotranspiration demand, were applied using pulse and continuous irrigation. These treatments were compared to continuous surface irrigation applied at 100% of evapotranspiration demand. Thermal infrared and multispectral images were acquired using an unmanned aerial vehicle during the growing season. The results indicated no difference in yield among treatments (p < 0.05); however, there was a statistically significant difference in leaf temperature between surface and subsurface irrigation (p < 0.05). The normalized difference vegetation index (NDVI) obtained from the analysis of multispectral images showed statistically significant differences among treatments when surface and subsurface irrigation methods were compared, and similar differences in vegetation index values were observed when irrigation rates were compared. These results show the applicability of aerial thermal infrared and multispectral images for characterizing plant responses to different irrigation treatments and for applying such information to irrigation scheduling or the high-throughput selection of water-use-efficient crop varieties in plant breeding.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660P (2016) https://doi.org/10.1117/12.2223368
This paper describes a proposed unmanned aerial system platform to be outfitted with high-resolution sensors. The proposed system is to be tethered to a moveable ground station, which may be a research vessel or some form of ground vehicle (e.g., car, truck, or rover). The sensors include, at a minimum: a camera, a thermal infrared sensor, a normalized difference vegetation index (NDVI) camera, a global positioning system (GPS) receiver, and a light detection and ranging (LIDAR) sensor. The purpose of this paper is to provide an overview of existing methods for detecting pollution from failing septic systems and to introduce the proposed system. Future work will examine the high-resolution data from the sensors and integrate them through a process called information fusion. Typically, this process uses the popular and well-published Kalman filter (or one of its nonlinear formulations, such as the extended Kalman filter). However, future work will explore a new strategy based on variable structure estimation for the information-fusion portion of the data processing. It is hypothesized that fusing data from the thermal and NDVI sensors will be more accurate and reliable for a multitude of applications, including the detection of pollution entering the Chesapeake Bay area.
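The Kalman-filter fusion the abstract above refers to can be sketched in its simplest scalar form; the sensor values and variances below are illustrative assumptions, not data from the paper:

```python
def kalman_update(estimate, variance, measurement, meas_variance):
    """One scalar Kalman-filter update: fuse a prior estimate with a
    new measurement, weighting each by its inverse variance."""
    gain = variance / (variance + meas_variance)
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1.0 - gain) * variance
    return new_estimate, new_variance

# Fuse a reading from one sensor (prior) with a reading from a second,
# lower-noise sensor (illustrative values and variances).
est, var = 25.0, 4.0                            # prior estimate, variance
est, var = kalman_update(est, var, 27.0, 1.0)   # second sensor, variance 1
# The fused estimate lies closer to the lower-variance measurement,
# and the fused variance is smaller than either input variance.
```

The appeal of this scheme for multi-sensor platforms is that each new measurement, whatever its source, is absorbed by the same update rule, with noisier sensors automatically given less weight.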
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, 98660Q (2016) https://doi.org/10.1117/12.2224248
Unmanned ground vehicles have been utilized over the last few decades in an effort to increase the efficiency of agriculture, in particular by reducing labor needs. Unmanned vehicles have been used for a variety of purposes, including soil sampling, irrigation management, precision spraying, mechanical weeding, and crop harvesting. In this paper, unmanned ground vehicles implemented by researchers or commercial operations are characterized through a comparison to other vehicles used in agriculture, namely airplanes and UAVs. An overview of the trade-offs among configurations, control schemes, and data collection technologies is provided. Emphasis is given to the use of unmanned ground vehicles in food crops, including a discussion of environmental impacts and economics. Factors considered regarding the future trends and potential issues of unmanned ground vehicles include development, management, and performance. Also included is a strategy for demonstrating to farmers the safety and profitability of implementing the technology.