When you apply, we will ask you to provide a ranking of prospective mentors and select from a list of research projects that you would potentially be interested in working on. To frame the social relevance of the research, each project is aligned with at least one Sustainable Development Goal (SDG) defined by the United Nations.
Your mentor rankings and project selections are important because they determine who will evaluate your application. We therefore encourage you to explore each mentor's website to learn more about the research activities of each lab. Note that some mentors listed below have more than one project.

SDGs: Reduced Inequalities; Industry, Innovation, and Infrastructure
SDG: Good Health and Well-Being
International research has shown multiple benefits to health and well-being from exposure to natural environments ("forest bathing"). Virtual reality-based nature immersion may offer value to populations with limited access to real-life nature immersion experiences, such as elderly residents of care homes, and to people temporarily undergoing a stressful experience from which respite would be welcome, such as chemotherapy or dialysis patients. In this project, the summer intern will work with an interdisciplinary team of researchers to explore fundamental questions informing the optimal deployment of VR technologies for this purpose.
Mentor: Zhu-Tian Chen
SDGs: Decent Work and Economic Growth; Sustainable Cities and Communities
Augmented Reality (AR) has shown potential to assist individuals across a wide spectrum of disabilities, including those with visual impairments, hearing difficulties, and cognitive disorders, by leveraging its real-time, multi-modal capabilities. This project seeks to harness the power of AR through an intelligent agent designed to assist individuals in their daily tasks. Our AR agent will serve as an assistive tool to foster an inclusive environment for individuals in various contexts, including workplaces, educational settings, and public areas, making our daily environments safer, more intelligent, and more welcoming for everyone, especially those in vulnerable situations. Participants in this project will have the opportunity to work on cutting-edge AR interfaces, delve into AI integration, and apply human-centric design principles. Students will advance their technical skills in AR and AI and their understanding of how these technologies can be applied to enrich human experiences across different domains.
Democratizing AI: Using Visualizations to Bridge Learning Inequality
Mentor: Qianwen Wang
SDG: Reduced Inequalities
AI is being progressively employed in medical domains, including medical image diagnostics and personalized medicine. While these applications offer significant advantages, especially in regions and among patients with limited medical resources, they also bring forth challenges such as potential biases in decisions (e.g., disparities in accuracy based on race and gender) and a lack of transparency (e.g., patients do not know why a certain decision is made and whether they should trust it). The objective of this project is to create interactive visualization tools that empower users to comprehend the decisions made by medical AI models, detect potential biases, and make informed decisions for their health and well-being. In this project, students will explore different explainable AI algorithms and combine them with interactive visualization techniques to decode AI decision-making processes, breaking them down into simpler and understandable components.
Computational Support for Recovery from Addiction
Mentor: Lana Yarosh
SDG: Good Health and Well-Being
Addiction and alcoholism are among the greatest threats to people’s health and well-being. Early recovery is a particularly sensitive time, with as many as three-fourths of people experiencing relapse. Computing provides an opportunity to connect people with the support they need to get into and stay in recovery. Students may contribute to the development and deployment of interactive technologies that support recovery, or to empirical work to better understand opportunities for computing to amplify people's recovery. For example, we may work on a self-tracking and visualization system to help people in recovery become more resilient to stress, or we may investigate how newcomers are welcomed and supported in an online recovery forum.
Opioid Overdose Detection and Bystander Intervention
Mentor: Lana Yarosh
SDG: Good Health and Well-Being
Opioid overdose is one of the greatest modern threats to public health, rapidly becoming the most common cause of death for people between the ages of 18 and 44. We are developing a system called CORRA (Community Overdose Response Respiratory Alert) so that no one needs to die of an overdose in a public space (e.g., light rail trains, buses, public libraries). CORRA currently uses thermal sensing to detect whether somebody has stopped breathing. If it detects that a person has stopped breathing, it contacts emergency services while mobilizing bystanders to administer Narcan. This summer, we are exploring multimodal sensing approaches for more robust detection, running testing studies, and working with members of our community to understand the acceptability and feasibility of such systems.
Multisensory Robot Task Learning from Human Demonstrations
Mentor: Karthik Desingh
SDGs: Industry, Innovation, and Infrastructure; Sustainable Cities and Communities
Robotics has long been a driver of industrial innovation and a key enabler of sustainable communities. This project aims to develop fundamental methods that allow future assistive robots to understand their environment and interpret tasks demonstrated by humans through a multisensory setup incorporating visual, audio, and haptic sensors. The targeted tasks include everyday activities such as washing dishes, making coffee, and rearranging household objects. Our approach involves developing algorithms capable of deciphering human language narration, gestures, and human-object interactions during task demonstrations. By leveraging deep learning techniques, we will model these tasks from multisensory observational data. The learned models will be evaluated by enabling robots to replicate tasks using imitation learning techniques. Ultimately, we envision a future where robots can be trained by end users for specific applications, significantly benefiting small-scale industries and sustainable communities—particularly in assisting aging populations.
Towards Safer Robotic Colorectal Surgeries: Visual Scene Recognition in the Abdominal Cavity to Prevent Inadvertent Surgical Injuries
Mentor: Junaed Sattar
SDG: Good Health and Well-Being
This project will focus on developing automated strategies for surgical robots to understand the visual images seen through a laparoscopic camera during colorectal surgery. The end goals are to prevent inadvertent surgical injuries, minimize post-surgical recovery cost and time, and improve patient well-being. The project will build upon the work of the Interactive Robotics and Vision Laboratory, particularly in real-time visual scene classification and panoptic segmentation, and in deep-learned saliency-based attention modeling for human-robot interaction in surgical tasks.
Finding Trash: Algorithmic Search Strategies for Locating Debris with Autonomous Underwater Robots
Mentor: Junaed Sattar
SDGs: Clean Water and Sanitation; Life Below Water
This project will focus on developing search strategies for autonomous underwater robots to find trash and debris underwater, with the long-term goal of removing such debris to conserve the health of aquatic and marine ecosystems. The project will build upon the work of the Interactive Robotics and Vision Laboratory, in which underwater robots equipped with imaging sensors have been used to detect and localize trash underwater in cooperation with human divers. The majority of underwater trash, which is also a significant health hazard, comprises plastic materials known to deform easily. This project will combine the trash detection capabilities of robots with algorithmic search strategies to efficiently find, and eventually remove, these items, which pose a significant threat to the health of marine flora and fauna.