Deep Learning and Image Processing for Transformational Environmental Science

This cluster focuses on the application of Image Processing and Machine Learning to a range of problems in Environmental/Ecological Sciences. High-resolution imaging sensors have enabled the production of huge amounts of data that are information-dense and relatively easy to share, and a number of promising opportunities exist to leverage them. Using image processing techniques such as object detection and segmentation, for example, a wide range of modelling activities can be performed to quantify aspects of plants and animals, for instance for phenotyping and other classification tasks. Furthermore, the proliferation of high-quality, inexpensive sensors brings a proportional increase in the amount of data that must be analyzed. A number of promising directions lie at the intersection of machine learning and signal processing, enabling tasks such as automatically “filtering” datasets (for example, to remove uninteresting clips), automatic identification of individual animals, and many more.
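As a rough illustration of the kind of automated filtering mentioned above, the sketch below uses a generic COCO-pretrained object detector from torchvision to flag camera-trap frames that contain no confident detections. The specific model, preprocessing, and score threshold are illustrative assumptions rather than a prescribed pipeline; in practice a detector trained on wildlife imagery would be preferred.

```python
# Minimal sketch (assumes torchvision >= 0.13, a COCO-pretrained detector as a
# stand-in for a dedicated wildlife model, and an illustrative 0.5 threshold)
# for flagging camera-trap frames worth keeping.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

@torch.no_grad()
def frame_is_interesting(image_path: str, score_threshold: float = 0.5) -> bool:
    """Return True if the detector finds any object above the score threshold."""
    img = read_image(image_path)              # uint8 tensor, shape (C, H, W)
    prediction = model([preprocess(img)])[0]  # dict with boxes, labels, scores
    return bool((prediction["scores"] > score_threshold).any())
```

Frames that return False could then be dropped or down-weighted before any manual review or downstream identification step.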

Project 1 - Deep Learning for Automated Plant Phenotyping

Climate-related phenotypic changes in plants are perhaps most prevalent in leaves, where evapotranspiration, temperature regulation, and photosynthetic intensity are adjusted through changes in leaf appearance, such as size, shape, length-to-width ratio, and the form of the apex, base, lobing, and teeth. This project would leverage herbarium digitization efforts, which over the past two decades have made millions of images of herbarium specimens available. Using these collections, we will apply modern computer vision and data-driven machine learning approaches to automatically extract leaf contours and quantify their phenotypic properties. By assembling large datasets of extracted phenotypes together with corresponding climate data, we will perform large-scale analyses of the correlations between leaf phenotypes and climate conditions at unprecedented taxonomic and geographic scales.
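As a hedged sketch of what the contour-extraction step could look like, the example below uses classical Otsu thresholding in OpenCV to pull a leaf outline from a specimen image and compute a few simple shape phenotypes. It assumes a single leaf on a light, uncluttered mounting sheet; a real pipeline would replace the thresholding with learned segmentation and handle labels, stamps, and overlapping material.

```python
# Minimal sketch (assumes one dark leaf on a light, uncluttered mounting sheet;
# a production pipeline would use learned segmentation instead of thresholding).
import cv2

def leaf_phenotypes(image_path: str) -> dict:
    """Extract the largest contour and return simple shape descriptors."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Otsu threshold, inverted so the darker leaf becomes foreground.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Take the largest external contour as the leaf outline.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    leaf = max(contours, key=cv2.contourArea)

    # Simple phenotypes: area, perimeter, and length-to-width ratio of the
    # minimum-area bounding rectangle.
    area = cv2.contourArea(leaf)
    perimeter = cv2.arcLength(leaf, True)
    (_, _), (w, h), _ = cv2.minAreaRect(leaf)
    length_width_ratio = max(w, h) / max(min(w, h), 1e-6)

    return {
        "area_px": area,
        "perimeter_px": perimeter,
        "length_width_ratio": length_width_ratio,
    }
```

Descriptors extracted this way, joined with each specimen's collection metadata, would form the phenotype-climate tables described above.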

Primary Advisor:

Aaron Dollar (Professor, Mechanical Engineering & Materials Science and Computer Science)

Collaborators:

Dr. Nelson Rios (Head of Biodiversity Informatics Research, Yale Peabody Museum)

Dr. Patrick Sweeney (Senior Collections Manager, Botany, Yale Peabody Museum)

Project 2 - Texture-based Analysis of Environmental Image Data

Textural information in images can provide a rich suite of information about their content that is not immediately apparent from simpler image characteristics such as color or saturation. This theme includes two projects related to texture: 1) Correlating animal health metrics: relating texture in images, such as those from a camera-trap campaign, to the health of the individual animals documented. 2) Remote sensing of landscapes: differentiating plantation from natural forest regeneration in tropical forest landscapes from satellite images, for instance, is not yet possible with traditional approaches; texture-based methods can likely extract considerably more information from remote sensing imagery than is currently possible.
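As a rough sketch of a texture-based feature for the remote-sensing task, the example below computes gray-level co-occurrence matrix (GLCM) statistics for an image patch with scikit-image. The quantization level, offsets, and choice of downstream classifier are illustrative assumptions, not the project's prescribed method.

```python
# Minimal sketch (assumes single-band patches, 32 gray levels, and a small set
# of offsets/orientations) of GLCM texture features for landscape patches.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch: np.ndarray, levels: int = 32) -> np.ndarray:
    """Return a flat vector of Haralick-style texture statistics for one patch."""
    # Quantize to a small number of gray levels so the co-occurrence matrix stays compact.
    scale = max(float(patch.max()), 1.0)
    quantized = (patch.astype(float) / scale * (levels - 1)).astype(np.uint8)

    # Co-occurrence counts at several distances and orientations.
    glcm = graycomatrix(
        quantized,
        distances=[1, 2, 4],
        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
        levels=levels,
        symmetric=True,
        normed=True,
    )

    # Summary statistics per distance/angle pair, flattened into one feature vector.
    props = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])
```

Feature vectors like this, labeled with known plantation and natural-regeneration patches, could then be fed to a standard classifier; similar descriptors could serve as candidate covariates in the animal-health analysis.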

Primary Advisor:

Holly Rushmeier (Professor, Computer Science)

Collaborators:

Nyeema Harris (Associate Professor, School of the Environment)

Liza Comita (Professor, School of the Environment)

Postdoctoral Positions

Two Postdoctoral Positions are open at Yale in image-based techniques for Environmental/Ecological Science. Click HERE to learn more.