NSF Research Experiences for Undergraduates

Visual Media REU

Title: Computational Imaging and Mixed-Reality for Visual Media Creation and Visualization 

Dates: May 30 - July 28, 2023, at Arizona State University

Apply here: https://forms.gle/JxKNysdDMGn8Rm367 (note: you need a Google account to fill out the form)

(Note: if you are having trouble with the form, please contact Dr. Jayasuriya at sjayasur@asu.edu and he can send you the application to fill out via email.)



Welcome to the Visual Media REU webpage at Arizona State University! This first-year REU site will be hosted at the ASU main campus and will recruit eight undergraduate students per year for 10 weeks each summer. The REU will provide integrative research experiences at the intersection of computational imaging, computer vision and graphics, and virtual/augmented/mixed reality (VR/AR/MR), developing systems and algorithms for visual media creation, visualization, and analysis. Students will receive a paid stipend for performing research during these 10 weeks and will be mentored by faculty and graduate students at ASU.

REU Site Objectives

a) Introduce students to general research practices by immersing them in research activities;

b) Engage students in the integrated design of visual sensing systems and algorithms including computational photography, computer vision, machine learning, VR/AR/MR;

c) Provide exposure to innovative applications in the media arts and sciences;

d) Motivate REU students to innovate, pursue research careers, and attend graduate school;

e) Provide cross-cutting training: enhance presentation and demonstration skills, patent and publication preparations;

f) Provide strong experiences in multidisciplinary research.


Potential Projects for Summer 2022 (to be updated for Summer 2023 shortly):

Students will be paired with mentor(s) to guide them in research topics. Below we list sample research topics that students can potentially participate in. Note that these are not all the projects being offered; other projects and ideas will be available at the start of the program. Students are also welcome to discuss and propose research projects in collaboration with their mentors.



Dynamic Light Transport

Students will work with data from a fast projector-camera system that can capture the linear transport of light in a moving scene. Techniques will include machine learning, optical flow estimation, and speeding up acquisition using coded patterns. Applications include virtual relighting and seeing around corners (non-line-of-sight imaging).

Dr. Suren Jayasuriya (https://web.asu.edu/imaging-lyceum)

Lensless Video Reconstruction

Research on building lensless cameras, which use an optical diffuser above the sensor in place of a lens. Machine learning algorithms will help reconstruct high-quality video from these new types of cameras.

Dr. Suren Jayasuriya (https://web.asu.edu/imaging-lyceum)


Illumination Estimation for Augmented Reality

Augmented reality scenes look more convincing when virtual objects are lit consistently with the real environment, which requires estimating the scene's illumination. Students who select this project will implement various techniques for such illumination estimation, e.g., automated visual inspection of reflective objects or machine learning to infer illumination patterns from camera frames. Students will use virtual environment engines, such as the Unity Game Engine or the Unreal Engine, as platforms to compose augmented reality scenes, and will evaluate the runtime performance and quality of illumination estimation through user studies.


Dr. Robert LiKamWa (https://meteor.ame.asu.edu/)



Reconfigurable Visual Computing Pipelines

Adaptive real-time configuration requires altering the resolution and frame rate of pixel transmission across the computer system depending on the needs of the visual application. Based on the resolution and frame-rate needs of particular regions within a frame, the system can discard pixels before they incur expensive memory read/write operations. As part of the REU experience, students will use FPGA tools to design interface architectures that perform this pixel selection, along with operating system drivers to control the selection process. They will use these designs to characterize the benefits of adaptive configuration for the energy efficiency and task performance of visual systems.



Dr. Robert LiKamWa (https://meteor.ame.asu.edu/)

The Air Around Us

The goal of this project is to bring together experts in art, design, engineering, community partnerships, and more to take a comprehensive approach toward creating awareness, visibility, and action in response to the unhealthy air conditions we are all living within. Key innovative approaches we will pursue include: a) new kinds of inexpensive citizen-science platforms to measure air quality, b) integrating heterogeneous data from government sources with citizen-science data, c) impactful visualization and sonification for public access to and interaction with the data, d) insightful analytics from the multimodal data to inform policy-making, and e) the development of scalable practices that individuals and institutions can put into action.


Volumetric Capture

The goal of this project is to bring together experts in art, design, and engineering to assist the Geometric Media Lab (GML) in developing mobile/portable solutions for volumetric capture using smartphones. Key initiatives we will pursue include: a) creating high-quality volumetric capture from a single camera source, b) integrating AI/ML tools to help overcome dimensional limitations, c) streamlining a process for outputting volumetric captures that can be used inside game engines, and d) developing scalable practices that can shift the landscape and industries of volumetric capture.



Dr. Pavan Turaga and Max Bernstein (https://pavanturaga.com/geometric-media-lab/)


Digitally Augmented Violins 

This project will build and augment violins and other string instruments through a processual and experiential research methodology, designing hardware and sound/signal processing to be intuitive and appealing to traditional performers. Existing projects have included haptic violin shoulder rests and camera-based visualization providing real-time audiovisual feedback for performers.

Dr. Seth Thorn (https://www.seththorn.net/)



Visualizing Computational Fluid Dynamics Data in Virtual Reality

As supercomputers become faster and more powerful, large-scale numerical simulations of fluid flows on these machines produce ever-increasing amounts of data. Visualizing the three-dimensional flow field from such large simulations on standard computer screens is both cumbersome and difficult to interpret. Virtual reality offers the opportunity to visualize the data in its native three-dimensional space, allowing a better understanding of the physics. Students working on this project will create a virtual environment using the Unity Game Engine in which the simulation data can be loaded and visualized with a virtual reality headset. They will then evaluate the effectiveness of this approach compared to traditional computer-screen visualizations.



Dr. Mohamed Houssem Kasbaoui (https://kasbaoui.bitbucket.io/)


Summer 2023 - Current plans

We will be offering an in-person REU this summer. 


Eligibility:

  • Undergraduate students majoring in a Science, Technology, Engineering, and/or Mathematics (STEM) discipline (note: interdisciplinary and digital media programs will also be considered)
  • US citizens or permanent residents
  • Undergraduates in good academic standing


What the REU offers:

  • Exposure to exciting research topics in computer vision and graphics, computational imaging, mobile systems, and mixed/augmented reality
  • Exposure to the exciting world of machine learning and its applications
  • Work with experienced student mentors and experts in the field
  • Paid travel and accommodation expenses
  • A competitive stipend for per diem expenses
  • Visits to Arizona attractions
  • Opportunities to meet new people and make new friends

First Round Application Deadline: February 10th, 2023. Applications will also be considered on a rolling basis after the deadline if positions remain open.