Notable Grants

CAREER: From Underground to Space: An AI Infrastructure for Multiscale 3D Crop Modeling and Assessment

A crop’s traits and 3D structure (the shape and architecture of plants, including both above- and below-ground parts) are the attributes that chiefly influence crop growth and yield, and they provide critical evidence for plant phenotyping (the assessment and characterization of plant traits). Crop yield can be predicted by assessing 3D plant structures with crop sensing methods. However, crop sensing results at different scales are usually analyzed in isolation, which overlooks essential connections. Moreover, although root systems play a central role in plant function, current methods assess crops mainly from above-ground structure because roots are difficult to access. Current approaches use satellites for remote sensing and drones for local sensing, enabling crop assessment at varying scales; however, these observations are difficult to integrate effectively, and the resulting volume of data is formidable to process.

The overarching objective of this project is to develop a novel AI infrastructure that integrates these observations to model and assess 3D crop structures at multiple scales and enhances below-ground sensing capabilities. Using this infrastructure, 3D crop structures can be estimated accurately at the individual-plant, farm, and satellite scales, facilitating crop assessment and yield prediction. The project dramatically enhances and accelerates the ability of growers and agronomists to assess critical structural variation in crop fields, for both above- and below-ground components, enabling large-scale crop management. The project also benefits students from the high school to the Ph.D. level by applying multi-scale 3D models of above- and below-ground crop structures to immersive education methods (virtual reality (VR), augmented reality (AR), and online learning), which are well suited to the challenges of distance learning, especially for subjects such as agriculture that require field study.
The multi-scale sensing system can also estimate 3D landscape structures and large-scale crop structures, and it can be applied in other areas, such as Arctic sea ice modeling, forestry, and climate change studies. This project aims to connect a plant’s structural phenotypes below and above ground and to link in-situ measurements to satellite sensing data, enabling non-destructive crop root sensing and root-system status estimation from observations of above-ground plant growth, while at the same time enabling satellite imagery to furnish more local and detailed information for these assessments. The project establishes a method for 3D crop sensing at the individual-plant, crop-field, and satellite scales to provide multi-scale structural evidence for crop assessment and yield prediction. It also develops a novel AI neural network that senses root structures and predicts traits from observations of above-ground plant structures.

This project investigates methods for satellite-based 3D sensing and non-destructive below-ground root sensing. Novel AI infrastructures are explored to address critical issues in computer vision and remote sensing: efficient integration of multi-scale sensing, 3D structure prediction, and spatial-temporal 4D inference. Such an approach can lower the barrier to operational adoption of satellite and in-situ imagery assessments, grounded in a scientifically underpinned, multi-scale, 3D assessment workflow. In addition to its practical implications for agriculture professionals, the project explores novel AI solutions in computer vision and remote sensing. Crop structures are highly diverse, complex, and constantly changing, which makes agriculture an ideal research domain for investigating novel AI methods. This research advances AI by 1) substantially improving the fusion of remote sensing modalities from sensors mounted on different devices, 2) significantly enhancing learning capability by connecting sensing outputs across multiple scales, 3) enabling 3D structure prediction for objects across different domains, and 4) providing future-status prediction based on 4D spatial-temporal neural networks.

Funder: National Science Foundation

Amount: $549,928

PI: Guoyu Lu, College of Engineering