Environmental monitoring
As part of the ASU Center for Global Discovery and Conservation Science, we are developing a variety of environmental robotics solutions to enable data collection in forests and coral reefs alike, at unprecedented spatial and temporal scales. Our methodology for sampling, modeling, and prediction considers both in-situ and ex-situ labeling of samples. Cheap but potentially noisier remotely sensed data can guide the collection of water, leaf, soil, or air samples for ex-situ analysis, improving prediction accuracy. Observations can be assimilated to update probabilistic predictive models, which in turn guide the allocation of resources such as UAVs. Bayesian optimization in the contextual bandit setting is a powerful method for closing the loop on such decision problems, and deep reinforcement learning offers a way to learn policies that optimize robotic sampling in uncertain and unstructured environments. A minimal sketch of one such loop appears below.
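As a concrete illustration, the sketch below closes a simple sampling loop with GP-UCB, a standard Bayesian-optimization acquisition rule: fit a probabilistic model to observations collected so far, then send the UAV to the candidate site with the highest upper confidence bound. The data, kernel choice, and exploration weight are illustrative placeholders, not our deployed pipeline.

```python
# Minimal GP-UCB sketch for choosing the next sampling site.
# All data and parameters here are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Candidate sampling locations (x, y) and noisy in-situ observations
# already collected, e.g. a chlorophyll or soil-moisture proxy.
candidates = rng.uniform(0, 100, size=(500, 2))
visited = rng.uniform(0, 100, size=(10, 2))
observations = np.sin(visited[:, 0] / 20) + 0.1 * rng.standard_normal(10)

# Fit a probabilistic predictive model to the observations so far.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=15.0), alpha=1e-2)
gp.fit(visited, observations)

# Upper confidence bound: trade off exploitation (high predicted mean)
# against exploration (high predictive uncertainty).
mean, std = gp.predict(candidates, return_std=True)
beta = 2.0  # exploration weight
ucb = mean + beta * std
next_site = candidates[np.argmax(ucb)]
print("Next UAV sampling waypoint:", next_site)
```

After the sample at the chosen waypoint is analyzed, its label is appended to the training set and the model is refit, closing the loop.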
Precision agriculture
We are developing smart robotic systems to improve the efficiency and yield of farm operations. Our goal is to provide specialty crop growers with a data-driven deployment strategy that makes synergistic use of a networked robotic system working interactively with a human scout.
Semantic object mapping
With applications in geology and disaster response, we are exploring methods to organize and analyze large aerial image datasets in order to search for and map semantic objects such as damaged infrastructure, rocks, and trees, as sketched below.
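A minimal sketch of how such a search might be structured, assuming tiled inference over a georeferenced orthomosaic with a GDAL-style affine geotransform; the `detect` callable and its output format are hypothetical placeholders for any trained detector.

```python
# Hedged sketch: tile a large georeferenced aerial image, run a
# detector on each tile, and georeference the detections.
import numpy as np

def tile_image(image, tile=512, stride=448):
    """Yield (row, col, window) tiles with overlap to avoid cutting objects."""
    h, w = image.shape[:2]
    for r in range(0, max(h - tile, 1), stride):
        for c in range(0, max(w - tile, 1), stride):
            yield r, c, image[r:r + tile, c:c + tile]

def pixels_to_geo(px, py, transform):
    """Apply a GDAL-style 6-tuple affine geotransform to pixel coordinates."""
    x0, dx, rx, y0, ry, dy = transform
    return x0 + px * dx + py * rx, y0 + px * ry + py * dy

def map_objects(image, transform, detect):
    """Collect georeferenced detections as (class, lon, lat, score) tuples.

    `detect` is a placeholder: any model returning
    (class_name, (bx, by, bw, bh), score) per detection in tile coords.
    """
    results = []
    for r, c, window in tile_image(image):
        for cls, (bx, by, bw, bh), score in detect(window):
            cx, cy = c + bx + bw / 2, r + by + bh / 2  # box center in pixels
            lon, lat = pixels_to_geo(cx, cy, transform)
            results.append((cls, lon, lat, score))
    return results
```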
Geomorphological Analysis Using Unpiloted Aircraft Systems, Structure from Motion, and Deep Learning
Cyber-physical systems (CPS) challenge
We have developed the OpenUAV testbed for robotics education and research. Supported by the NSF Cyber-Physical Systems (CPS) program, the testbed includes standardized unpiloted aerial vehicle (UAV) hardware and an end-to-end simulation stack built on open-source technologies. The testbed facilitated UAV competitions held at the TIMPA airfield in Tucson, Arizona, in 2016, 2018, and 2019. In these events, teams from Vanderbilt University, University of Arizona, UCLA, University of Pennsylvania, Embry-Riddle Aeronautical University, Halmstad University (Sweden), and Arizona State University demonstrated, with varying degrees of autonomy, the deployment and retrieval of sensor probes and other objects. The first competition in 2016 was motivated by Microsoft Research's Project Premonition, which also funded the hardware for the participating teams.
In May 2020, we held an online competition, SoilScope -- Mars Edition. This Mars 2020-inspired mission scenario for the 2020 NSF CPS Challenge was a one-month virtual event, emulating an autonomous probe-deployment science mission by a rover-and-drone duo at the Jezero crater landing site. The outdoor event for this competition will take place in Fall 2020.
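To give a flavor of scripting against such a simulation stack, the snippet below commands a simulated sortie with MAVSDK-Python against a PX4 SITL instance; the choice of simulator, the UDP port, and the mission timing are assumptions for illustration, not the exact testbed or competition configuration.

```python
# Hedged sketch of a simulated probe-deployment sortie via MAVSDK-Python.
# Assumes a PX4 SITL vehicle listening on the default UDP port 14540.
import asyncio
from mavsdk import System

async def run():
    drone = System()
    await drone.connect(system_address="udp://:14540")  # assumed SITL address

    # Wait until the simulated vehicle reports a valid global/home position.
    async for health in drone.telemetry.health():
        if health.is_global_position_ok and health.is_home_position_ok:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(15)  # hover while a probe would be released (placeholder)
    await drone.action.return_to_launch()

asyncio.run(run())
```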
Robotics and AI for the arts
Projection mapping, interactive art installations, drone shows -- these are examples of robotics, computer vision, and AI blending seamlessly to inspire the public and promote STEM education in a fun and engaging setting.
We foresee the technology developed for STEM research translating into audio-visual art projects, driven by students with diverse interests and backgrounds.