This is an idea proposed in 2024 as a Cambridge Computer Science Part III or MPhil project, and is available to be worked on. It will be supervised by Anil Madhavapeddy and Sadiq Jaffer as part of the Remote Sensing of Nature project.
This project investigates how to build models of landscape evolution driven by remote sensing data, which we can use to better predict deforestation, flooding and fire risks. Diffusion models are now widespread for image generation and are increasingly being applied to video.[1] In addition, the GenCast project from Google DeepMind used a diffusion model ensemble for weather forecasting, achieving accuracy competitive with traditional numerical methods.[2]
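For orientation, the core mechanism is the same across these applications: a diffusion model is trained to reverse a fixed noising process by predicting the noise added to a clean sample. In the standard DDPM formulation (general background, not specific to the cited papers), the training objective is

$$
\mathcal{L}(\theta) \;=\; \mathbb{E}_{x_0,\,\epsilon,\,t}\Big[\big\lVert \epsilon - \epsilon_\theta\big(\sqrt{\bar{\alpha}_t}\,x_0 + \sqrt{1-\bar{\alpha}_t}\,\epsilon,\; t\big)\big\rVert^2\Big],
$$

where $x_0$ is a clean sample (an image, or here a short sequence of satellite frames), $\epsilon \sim \mathcal{N}(0, I)$, and $\bar{\alpha}_t$ is the cumulative noise schedule; generation runs the learned denoiser $\epsilon_\theta$ in reverse from pure noise.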
The goal of this project is to train a video diffusion model on time series of optical and radar satellite tiles and to evaluate its performance in predicting changes in land use and land cover (such as deforestation or flooding).[3] A stretch goal is to build a user interface on top of this model to predict and visualise the effects of a given change in land cover over time.
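To make the data framing concrete, the sketch below (illustrative only; the file layout, band counts and the tiny denoiser are assumptions rather than a specification) stacks per-date, co-registered optical and radar tiles into a "video" tensor and runs one DDPM-style training step with a placeholder spatio-temporal denoiser:

```python
# Illustrative sketch only: tile paths, band layout and the placeholder
# denoiser are assumptions, not part of the proposal.
import glob

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

T_STEPS = 1000                                  # number of diffusion timesteps
betas = torch.linspace(1e-4, 0.02, T_STEPS)     # linear noise schedule
alpha_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative \bar{alpha}_t


def load_tile_series(tile_dir: str) -> torch.Tensor:
    """Stack per-date, co-registered optical+radar tiles (saved as .npy
    arrays of shape (C, H, W)) into a single (T, C, H, W) 'video' tensor."""
    frames = [np.load(path) for path in sorted(glob.glob(f"{tile_dir}/*.npy"))]
    return torch.from_numpy(np.stack(frames)).float()


class Denoiser(nn.Module):
    """Placeholder spatio-temporal denoiser (timestep conditioning omitted);
    a real project would use a video diffusion backbone such as a 3D U-Net."""

    def __init__(self, channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(channels, 32, kernel_size=3, padding=1),
            nn.SiLU(),
            nn.Conv3d(32, channels, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # (B, C, T, H, W) -> predicted noise


def training_step(model: nn.Module, x0: torch.Tensor,
                  optimizer: torch.optim.Optimizer) -> float:
    """One DDPM-style step: noise each tile sequence at a random timestep and
    train the model to predict the injected noise."""
    b = x0.shape[0]
    t = torch.randint(0, T_STEPS, (b,))
    ab = alpha_bar[t].view(b, 1, 1, 1, 1)
    noise = torch.randn_like(x0)
    x_t = ab.sqrt() * x0 + (1.0 - ab).sqrt() * noise
    loss = F.mse_loss(model(x_t, t), noise)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A batch is formed by permuting each (T, C, H, W) series to (C, T, H, W) and stacking; a real implementation would swap in a proper video diffusion backbone with timestep and conditioning inputs, and evaluate by comparing sampled future frames against held-out dates using land-cover change metrics.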
"GenCast: Diffusion-based ensemble forecasting for medium range weather", arXiv:2312.15796
↩︎︎"Video Diffusion Models: A Survey" (May 2024), https://video-diffusion.github.io.
↩︎︎"DiffusionSat: A Generative Foundation Model for Satellite Imagery" (Dec 2023)
↩︎︎