Source: KAIROSYS INC. submitted to NRP
ALFALFA BLOOM PREDICTION USING PLANT IMAGES AND PROCESSING TECHNIQUES FOR THE PURPOSE OF IMPROVING POLLINATION USING ALFALFA LEAFCUTTING BEES
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1009472
Grant No.
2016-33610-25369
Cumulative Award Amt.
$99,363.00
Proposal No.
2016-00980
Multistate No.
(N/A)
Project Start Date
Aug 1, 2016
Project End Date
Sep 30, 2017
Grant Year
2016
Program Code
[8.13]- Plant Production and Protection-Engineering
Recipient Organization
KAIROSYS INC.
16645 PLUM RD
CALDWELL, ID 83607
Performing Department
(N/A)
Non Technical Summary
The purpose of this proposal is to conduct research that will provide alfalfa seed growers a bloom forecast application to improve their pollination practices and increase yield. Most U.S. growers outside of California use Alfalfa Leafcutting Bees (ALB) for pollination because they increase yield by more than 50% over honey bees. Growers spend ~$400/acre for ALBs, yet our customer diligence indicates that a further 250 lbs/acre of yield is lost because growers are unable to respond adequately to changing bloom conditions in the field.

Using test plots provided by our commercial partner and a USDA research lab, we will develop a heuristic for alfalfa bloom prediction, using imaging techniques to track the stages of plant development and comparing spectral images with manual observation. Ground sensors will be deployed to calculate the Growing Degree Days and photoperiod accumulated through the development progression, allowing us to build predictive models for bloom onset with bud development markers as a basis. Sensors would also track weather, weed, and pest infestations to provide better control windows for bee release and to localize and reduce pesticide use.

When commercialized, growers will use a smartphone application to receive a reliable forecast for bloom onset. We estimate we can double the decision window for growers to make the two most important pollination decisions: 1) bee release timing (incubation delay) and 2) pesticide application, resulting in significantly increased yield and revenue while simultaneously improving bee health.
Animal Health Component
0%
Research Effort Categories
Basic
50%
Applied
0%
Developmental
50%
Classification

Knowledge Area (KA)    Subject of Investigation (SOI)    Field of Science (FOS)    Percent
205                    7210                              1060                      50%
205                    1640                              2070                      30%
315                    3085                              1060                      20%
Goals / Objectives
This research seeks to set in place an improved pollination management protocol by forecasting alfalfa bloom using advanced imaging techniques. With this information easily accessed through smart devices, growers will be better equipped to address the following decisions:
- When should bees be released for optimal pollination?
- Should bees be released in one batch or several batches?
- What pesticides and herbicides need to be applied?
- When should pesticides and herbicides be applied?
- Should bee incubation be paused? If so, when and for how long?

During Phase I (including internal development):
- Aerial imaging using small Unmanned Aerial Systems (sUAS) of alfalfa plots (internal funds for equipment)
- Ground-level imaging of alfalfa test plots using RGB and multispectral cameras
- Environmental monitoring using ground sensors - temperature, humidity, photoperiod, etc. (internal funds for cloud-connected hardware); a minimal Growing Degree Day sketch appears at the end of this section
- Image analysis to create a heuristic for bud development stages
- Predictive models for onset of bloom using bud development markers and environmental sensor data
- Development of alpha software app (internal funds)

Technical questions to be addressed during Phase I:
- Can spectrometry and vegetation indices be used to detect differences of bloom development from green vegetation with sufficient resolution?
- What vegetation indices and imaging methodologies are best suited for estimating plant progression?
- Can onset of bloom be predicted with sensor data?

Proposed during Phase II (including internal development):
- Aerial imaging using small Unmanned Aerial Systems (sUAS) of alfalfa fields planted with different varietals
- Field-level testing and adjustment of the Phase I model for prediction accuracy and timing of bloom onset
- Modeling and analysis to select appropriate vegetation indices for pests and weeds
- Modeling and correlation of pest occurrence with Degree Day accumulation, photoperiod, and other environmental factors
- Assessing the impacts of cross-field variation and crop varietal on Phase I model accuracy and completing any refinements necessary
- Assessing the impact of delayed introduction of bees on crop yield
- Assessing the impact of water stress on progression from bud development to bloom
- Development of a hardware and software pre-commercialization product and field testing (internal funds and potential industrial partner funds)
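To make the environmental-monitoring objective above concrete, the following is a minimal base-R sketch of Growing Degree Day (GDD) accumulation from daily temperature readings. The base temperature and the example data are illustrative placeholders, not values taken from this project.

    # Minimal sketch: accumulate Growing Degree Days (GDD) from daily
    # minimum/maximum air temperatures. The base temperature is a placeholder;
    # the project would substitute the threshold appropriate for alfalfa.
    gdd_accumulate <- function(tmin, tmax, base_temp = 5) {
      daily_mean <- (tmin + tmax) / 2
      daily_gdd  <- pmax(daily_mean - base_temp, 0)  # no negative contributions
      cumsum(daily_gdd)                              # running total over the season
    }

    # Hypothetical week of sensor readings (degrees C)
    tmin <- c(8, 9, 7, 10, 11, 9, 8)
    tmax <- c(21, 24, 22, 26, 27, 23, 22)
    gdd_accumulate(tmin, tmax)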
Project Methods
The primary purpose of Phase I is to demonstrate the feasibility of predicting bloom using imaging analysis. Since Phase I is limited in funding and duration, certain important objectives listed in the section above will be completed concurrently with internal funds. Moreover, certain tasks associated with Phase I research will be initiated prior to the proposed June 1 funding date using internal funds.

Aerial imaging using sUAS: The primary method for capturing images will be a DJI small unmanned aerial system (sUAS). The DJI Phantom 2 is a quadcopter mounted with a camera gimbal that carries two sensors, an RGB camera and a NIR-GB camera. The Phantom 2 has waypoint navigation capability that allows it to fly to the same position at different times. The typical flow of sUAS imaging is:
- Planning the mission (evaluate terrain, altitude, and sUAS tasks)
- sUAS image capture (one pass each for the RGB and NIR sensors)
- Image pre-processing (ortho-rectification and mosaicking)

sUAS images will be collected at two different altitudes from four 1-5 acre grower plots. The fields may not all be seeded with the same variety, but are proposed to be second- or third-year crops of a dormant variety with typical purple blooms. Images will be captured once a week throughout the season (typically May 15 - Aug 15) and at least three times a week during bud development and early bloom.

The resolution of the images captured by the sUAS system depends on the camera specifications, imager speed, and imager distance from the crop. Gaussian blur will be applied to sample images obtained at lower altitudes and speeds to simulate the resolution from greater heights and higher speeds. If blurring proves to be a good proxy for altitude and speed, additional image datasets will be created during post-processing for prediction evaluation without additional flight time.

Ground-level imaging using RGB and multispectral camera: A high-resolution portable spectral imager, the ASD FieldSpec Pro, will be used to obtain ground-level images. A telescopic monopod setup (capable of 10 m extension) will be used to increase the field of view to at least a 3-5 square meter area. A 12 megapixel (MP) point-and-shoot RGB camera will also be used to obtain comparative low-resolution images. We will evaluate the use of reflectance reference targets (e.g., Spectralon) for radiometric correction and correlation of ground to aerial images. At each site, an on-ground image, an sUAS hover image, and a manual observation will be captured.

On-ground images will also be captured at 4 sites spread over two 0.25-acre ALB experimentation plots managed by the USDA ARS in Logan, UT. These are historical plots with good documentation and control of cultural practices, which will serve as a robust benchmark and baseline. These plots will be used for continuity into a Phase II study to understand bud and bloom progression in the absence of pollination and to analyze the yield impact of pollination delay.

Environmental monitoring: Cloud-connected sensor hardware from our internal development will be used to collect temperature, humidity, photoperiod, and soil moisture. In each of the grower and USDA plots, sensor hardware will be placed at appropriate sites, preferably co-located with the imaging sites chosen in Objective 2.

Image analysis to create heuristics for onset of bloom: In this objective, we will analyze all the images obtained to evaluate the predictive capability of each image acquisition method. Predictive capability will have two components - accuracy and timing.
The percentage of stems having bud clusters with maturing buds will be determined in a 3-5 square meter area. Vegetation Index (VI) signatures will be calculated for all images. A prediction accuracy score will be assigned based on the variance between the rate of bud development progression indicated by the VI and that indicated by manual observation. A prediction timing score will be assigned based on the earliest stage of bud development that can be differentiated from subsequent stages of bud development.

We anticipate that light-use-efficiency indices such as the photochemical reflectance index (PRI), structure insensitive pigment index (SIPI), and red green ratio (RGR) will provide the best differentiators of the color pigmentation useful in bud detection. Common broadband greenness indicators such as the Normalized Difference Vegetation Index (NDVI) will be used to normalize the green leaf scattering, thereby increasing the sensitivity when used in combination with light-use-efficiency indices. If additional sensitivity is required for detection, narrowband greenness indicators such as red-edge NDVI will be used (band-ratio forms of these indices are sketched at the end of this section).

The purpose of evaluating multiple imaging sensor and image acquisition platform configurations is to determine a cost-performance matrix. This will guide the development of the hardware and software during Phase II and eventual commercialization. The process may also uncover the potential to introduce multiple products into the market depending on grower needs (e.g., sUAS with multispectral camera vs. handheld camera with RGB). Each of these products will share the same smartphone application for decision support.

Development of software app: A smartphone application will be developed using internal funds. In Phase I, the app will be designed to interrogate a database and retrieve the status of plant progression towards onset of bloom. The analysis and storage of the heuristics and results from image processing will be maintained in the cloud. This process and application flow closely mimics the development of Incusense, our existing bee incubator monitoring application, where the progression towards emergence of bees is shown. This objective will be funded internally.
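As a reference for the indices named above, the following is a minimal base-R sketch using commonly published band-ratio definitions of NDVI, red-edge NDVI, PRI, SIPI, and a simplified RGR. The band centers and reflectance values are illustrative assumptions; the project's actual band selections may differ.

    # Minimal sketch: commonly published band-ratio forms of the vegetation
    # indices discussed above, computed from narrowband reflectance values.
    # Band centers and the reflectance inputs are illustrative placeholders.
    veg_indices <- function(R445, R531, R550, R570, R670, R680, R705, R750, R800) {
      list(
        NDVI         = (R800 - R670) / (R800 + R670),   # broadband greenness
        rededge_NDVI = (R750 - R705) / (R750 + R705),   # narrowband greenness
        PRI          = (R531 - R570) / (R531 + R570),   # light-use efficiency
        SIPI         = (R800 - R445) / (R800 - R680),   # pigment ratio
        RGR          = R670 / R550                      # simplified red:green ratio
      )
    }

    # Hypothetical reflectance sample (fractions, 0-1)
    veg_indices(R445 = 0.04, R531 = 0.06, R550 = 0.09, R570 = 0.07,
                R670 = 0.05, R680 = 0.05, R705 = 0.12, R750 = 0.40, R800 = 0.45)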

Progress 08/01/16 to 09/30/17

Outputs
Target Audience: 1. Alfalfa Grower Community 2. Alfalfa seed companies 3. USDA Researchers 4. University Researchers 5. General Public
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Although this project is not intended or designed for training or professional development, some training and development opportunities were realized. 1) Under the guidance of USDA ARS scientists, students/young researchers and technicians gained experience in monitoring alfalfa plant progression and developing techniques for estimating plant floral resources. 2) Under the guidance of Kairosys, undergraduate students seeking careers in agriculture gained valuable experience and professional development.
How have the results been disseminated to communities of interest? Agronomists: Kairosys presented results from the study to agronomists within the S&W (strategic partner) network. Growers: Kairosys and S&W agronomists presented partial results to participating growers. The participating growers were already using the incubation monitoring solution developed by Kairosys. The impact of the holistic solution combining the incubation product and the bloom prediction product was disseminated and well received by the grower community.
What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? IMPACT: The purpose of this proposal is to conduct research that will provide alfalfa seed growers a pollination management tool that enables synchronization of Alfalfa Leafcutting Bee (ALB) release with prediction of alfalfa bloom. These protocols will help increase yield and improve bee health and recovery. Most U.S. growers outside of California use ALBs for pollination because they increase yield by more than 50% over honey bees. Growers deploy ~$400/acre worth of ALBs. Kairosys' customer diligence indicates that a further 250 lbs/acre of yield can be gained by responding adequately to changing field bloom conditions. Kairosys will develop a heuristic for alfalfa bloom prediction using data from grower sites, USDA plots, and its commercial partner's greenhouse. Employing imaging and machine learning techniques, we will track stages of plant development and compare spectral images with manual observation. Sensors deployed to calculate the Growing Degree Days and photoperiod accumulated through the development progression will enable predictive models for bloom onset using bud development markers as a basis. Sensors also track weather, weed, and pest infestations to provide better control windows for bee release and to localize and reduce pesticide use. When commercialized, growers will receive a reliable forecast for bloom onset and bee emergence through a smartphone application. This will roughly double the management window for growers to make the two most important pollination decisions: 1) bee release timing to synchronize with bloom and 2) pesticide application, resulting in significantly increased yield and revenue while simultaneously improving bee health.

Accomplishments: The primary goal in Phase I was to create a preliminary heuristic model for understanding bloom progression and predicting bloom onset. Our technical objectives were to explore:
- Aerial imaging using small Unmanned Aerial Systems (sUAS) of alfalfa plots
- Ground-level imaging of alfalfa test plots using RGB and multispectral cameras
- Environmental monitoring using ground sensors - temperature, humidity, photoperiod
- Image analysis to create a heuristic model for onset of bloom
- Predictive models for onset of bloom

To accomplish these objectives, the Kairosys team gathered a variety of data (sUAS, RGB, NIR, hyperspectral, environmental sensors, manual floral counts, and seed yield) from multiple locations (three grower sites, two greenhouse experiments, and two USDA plots). At the request of S&W and participating growers, we included yield measurement and the associated correlation to bloom prediction models in the Phase I studies (originally targeted for Phase II).

Color threshold modeling of bloom progression: RGB images from USDA plots (2 plots, 4 sites each, 14 days), grower fields (3 locations, ground and aerial, 35 days), and the S&W greenhouse (39 locations, 40 days) were used to compute a bloom intensity based on the count of pixels meeting color thresholding criteria on hue and saturation; the RGB color thresholding model was fit against manual flower counts, with thresholds of approximately 270 for hue and 0.5 for saturation and value. A minimal sketch of this thresholding step appears below. R statistical software with the imager and caret packages was used for this analysis, with 30% of the data as training samples. The Adjusted R2 for the test sample is ~0.80, with the residuals showing mostly underestimation of predicted values. At high manual floral counts, this could be due to flowers being shielded from the camera by vegetation.
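The following is a minimal base-R sketch of the kind of hue/saturation thresholding described above. The threshold window, the pixel data, and the bloom-intensity definition here are illustrative assumptions, not the fitted values from the project.

    # Minimal sketch: estimate bloom intensity as the fraction of pixels whose
    # HSV values fall inside a purple-bloom window. Thresholds are placeholders.
    bloom_intensity <- function(r, g, b,
                                hue_min = 240, hue_max = 300,  # degrees
                                sat_min = 0.5, val_min = 0.5) {
      hsv <- rgb2hsv(r, g, b, maxColorValue = 255)  # rows: h, s, v; h in [0, 1]
      hue_deg <- hsv["h", ] * 360
      is_bloom <- hue_deg >= hue_min & hue_deg <= hue_max &
                  hsv["s", ] >= sat_min & hsv["v", ] >= val_min
      mean(is_bloom)  # fraction of pixels classified as bloom
    }

    # Hypothetical handful of pixels (0-255 per channel)
    r <- c(150, 40, 200, 120); g <- c(60, 120, 30, 110); b <- c(200, 50, 220, 180)
    bloom_intensity(r, g, b)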
In one grower field, two bloom levels were simulated by cutting back 100 ft2 in 4 plant rows and designating an adjacent 100 ft2 as a control. Bloom in the cutback cohort was delayed by ~15 days. Pro-rated yield for the cutback region was 224 lbs/ac, while the uncut region yielded 592 lbs/ac. For two other uncut fields, yields of 1303 lbs/ac and 362 lbs/ac, respectively, were reported. The bloom intensities predicted by our models show a 3x difference and correlate well with the difference in measured yields. The estimation of bloom progression from the color thresholding model, as applied to three grower fields, shows that when the bees are introduced and pollination begins, there is a rapid reduction in flower number. At this point there is an interplay between the loss of flowers and the creation of new flowers, which could translate directly to yield. If new flower production is not sufficient to sustain the bees, the bees could drift away (Pitts-Singer 2013b). If pollination occurs too rapidly due to a surfeit of bees, the resulting stress could shut down subsequent flower production (Carlson 1928, Free 1993, Strickler 1999).

Hyperspectral imaging of leaves, buds, and flowers: Significant spectral differences (using an ASD 350-2500 nm instrument with contact probe) are observed among isolated targets of leaves, buds, and flowers. These include samples from two locations (grower site and greenhouse) collected over two days, and sample variance is minimal. These observations indicate that Shortwave Infrared (SWIR) bands may not be necessary, and that imaging bands in the visible and NIR will be sufficient to delineate several predictive wavelengths and vegetation indices. This is a promising result, since SWIR filters are significantly more expensive than visible/NIR filters. Based on this preliminary finding, the hsdar R package was used to derive several light-use efficiency, leaf pigment, narrowband greenness, and broadband greenness vegetation indices (Section 8). Figure 6 shows four of the many indices that show significant differences in response to leaves, buds, and flowers, namely the Normalized Difference Vegetation Index (NDVI), Structure Insensitive Pigment Index (SIPI), Anthocyanin Index (ACI), and Red-Edge Position (REP); a standard linear-interpolation form of REP is sketched below. NDVI and ACI indices obtained from grower fields show critical differences in plant development and senescence. Established stands showed much higher NDVI than spring seeding at the beginning of the season, and fields with greater seed showed a rapid decline of NDVI later in the season, which could be used as a signal for harvest. The ACI indices used in combination with NDVI showed a strong correlation to the differences in yield between grower fields.

Aerial imaging: A DJI Phantom 3 small Unmanned Aerial System (sUAS) was used to capture 12 MP RGB and NIR-GB images at a height of 35 ft at two grower sites on 35 days from June 6 to August 11. In addition, RGB images were obtained in "hover" mode over a specific spot that included a reflectance marker. The color thresholding model applied to the hover images shows strong correlation (>0.9 R2) to the bloom intensity predicted by ground images, while the images at 35 ft did not possess adequate resolution for predicting bloom intensity. The NIR-GB images are useful in predicting the progress of green vegetation and are able to detect post-pollination senescence in the crop.
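As a point of reference for the REP index mentioned above, the sketch below uses the widely published linear-interpolation form of the red-edge position. This is an assumption for illustration, since the report does not state which REP definition the hsdar package applied, and the reflectance values are placeholders.

    # Minimal sketch: red-edge position (REP) via the common linear-interpolation
    # method using four bands (670, 700, 740, 780 nm). Values are placeholders.
    rep_linear <- function(R670, R700, R740, R780) {
      R_inflect <- (R670 + R780) / 2                  # reflectance at the inflection point
      700 + 40 * (R_inflect - R700) / (R740 - R700)   # interpolated red-edge wavelength (nm)
    }

    rep_linear(R670 = 0.05, R700 = 0.12, R740 = 0.38, R780 = 0.45)  # ~720 nm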
Machine learning algorithms for bud and bloom detection: Multispectral and hyperspectral data were used in conjunction with RGB images to develop a supervised machine learning algorithm to detect bud and bloom progression. Sample point software was used to count plant features, and a partial least squares regression (PLSR) algorithm (written in R) was used to extract the principal wavelengths of interest from the multispectral and hyperspectral data (a minimal sketch of this wavelength-selection step follows the conclusion below). This work will be further fine-tuned with data from subsequent seasons. The goal is to create unique filters that respond to alfalfa plant phenology.

Conclusion: Models generated from imaging can predict the progression of bloom in the alfalfa plant. Ongoing work will strengthen these models and also include the prediction of bud progression. Bloom differences between fields predicted by the models accurately predict differences in yields between the fields. Hyperspectral data can be used to identify unique combinations of wavelengths for creating custom filters to monitor plant progression.
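To illustrate the wavelength-selection step described above, the following is a minimal sketch using plsr() from the R pls package on simulated spectra. The simulated data, the number of components, and the use of absolute regression coefficients to rank wavelengths are assumptions for illustration, not the project's actual pipeline.

    # Minimal sketch: rank wavelengths by their PLSR regression coefficients,
    # using simulated reflectance spectra and a simulated flower count response.
    library(pls)

    set.seed(1)
    wavelengths <- seq(400, 900, by = 10)               # nm
    n <- 60
    spectra <- matrix(runif(n * length(wavelengths)), nrow = n)
    colnames(spectra) <- paste0("R", wavelengths)
    # Simulated response loosely tied to a few "informative" bands
    flower_count <- 5 * spectra[, "R550"] - 3 * spectra[, "R680"] + rnorm(n, sd = 0.2)

    dat <- data.frame(flower_count = flower_count, spectra = I(spectra))
    fit <- plsr(flower_count ~ spectra, ncomp = 5, data = dat, validation = "CV")

    # Rank wavelengths by the magnitude of their regression coefficients
    coefs <- drop(coef(fit, ncomp = 5))
    head(sort(abs(coefs), decreasing = TRUE))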

Publications


    Progress 08/01/16 to 07/31/17

    Outputs
    Target Audience: 1. Alfalfa Grower Community: Two growers in Canyon County, ID who allowed us to capture images of the bloom progression in the field. 2. Alfalfa seed companies: S&W Seed Company, which has become a partner in this project. 3. USDA and University Researchers: USDA lab researchers in Logan, UT; professors at Boise State University and Northwest Nazarene University. 4. General public and end consumers: Idaho Department of Commerce (Global Entrepreneurship Mission advisory council).
    Changes/Problems: We decided to conduct a wider study in the greenhouse in addition to our grower field study. This was possible because we deepened our commercial partnership with S&W Seed Company. As part of this study we conducted an initial experiment on yield measurement under different bloom conditions. This is of great interest to us and our commercial partner and will accelerate our understanding and the adoption of our solution.
    What opportunities for training and professional development has the project provided? Nothing Reported
    How have the results been disseminated to communities of interest? Preliminary results have been communicated to our partner, S&W Seed Company.
    What do you plan to do during the next reporting period to accomplish the goals? - Complete the modeling and analysis of the data. - Generate the appropriate questions for Phase II research.

    Impacts
    What was accomplished under these goals?
    - RGB and NIR aerial imaging using small Unmanned Aerial Systems (sUAS) of alfalfa plots (internal funds for equipment)
    - Ground-level imaging of alfalfa test plots using RGB, NIR, and hyperspectral cameras
    - Environmental monitoring using ground sensors - temperature, humidity, photoperiod, etc. (internal funds for cloud-connected hardware)
    - Simulated differences in bloom progression in a test plot and collected seeds for yield analysis and correlation with the bloom prediction model

    Publications


      Progress 08/01/16 to 01/31/17

      Outputs
      Target Audience: 1. Alfalfa Grower Community: Two growers in Canyon County, ID who allowed us to capture images of the bloom progression in the field. 2. Alfalfa seed companies: S&W Seed Company, which has become a partner in this project. 3. USDA and University Researchers: USDA lab researchers in Logan, UT; professors at Boise State University and Northwest Nazarene University. 4. General public and end consumers: Idaho Department of Commerce (Global Entrepreneurship Mission advisory council).
      Changes/Problems: We decided to conduct a wider study in the greenhouse in addition to our grower field study. This was possible because we deepened our commercial partnership with S&W Seed Company. As part of this study we conducted an initial experiment on yield measurement under different bloom conditions. This is of great interest to us and our commercial partner and will accelerate our understanding and the adoption of our solution.
      What opportunities for training and professional development has the project provided? Nothing Reported
      How have the results been disseminated to communities of interest? Preliminary results have been communicated to our partner, S&W Seed Company.
      What do you plan to do during the next reporting period to accomplish the goals? - Complete the modeling and analysis of the data. - Generate the appropriate questions for Phase II research.

      Impacts
      What was accomplished under these goals?
      - RGB and NIR aerial imaging using small Unmanned Aerial Systems (sUAS) of alfalfa plots (internal funds for equipment)
      - Ground-level imaging of alfalfa test plots using RGB, NIR, and hyperspectral cameras
      - Environmental monitoring using ground sensors - temperature, humidity, photoperiod, etc. (internal funds for cloud-connected hardware)
      - Simulated differences in bloom progression in a test plot and collected seeds for yield analysis and correlation with the bloom prediction model

      Publications