When you apply, we will ask you to provide a ranking of prospective mentors and select from a list of research projects that you would potentially be interested in working on. To frame the social relevance of the research, each project is aligned with at least one Sustainable Development Goal (SDG) defined by the United Nations.

Your mentor rankings and project selections are important because they will determine which people will evaluate your application. We therefore encourage applicants to explore each mentor’s website to learn more about the individual research activities of each lab. 


Reducing Inequitable Barriers for Engaging with Virtual Reality Technologies
Mentors: Isayas Berhe Adhanom and Evan Suma Rosenberg
SDG: Gender Equality

Half or more of all people who have ever used VR technology have at some point experienced cybersickness – feelings of nausea, disorientation, and eye strain – and cybersickness is becoming a major obstacle to the wider deployment of VR for socially beneficial purposes in areas such as education, psychotherapy, job training, implicit/unconscious bias reduction training, cultural heritage, manufacturing, design, and more. Because cybersickness disproportionately affects women, developing strategies to predict and prevent its onset or mitigate its severity is especially important to ensure equal opportunity of access to these emerging technologies. The REU student will work with a team of students and faculty from Computer Science and Kinesiology to help design, develop, and evaluate novel cybersickness mitigation techniques.

Exploring Virtual Reality Nature Immersion for Well-Being
Mentor: Victoria Interrante
SDGs: Climate Action, Life on Land

International research has shown multiple benefits to health and well-being from exposure to natural environments (“forest-bathing”). For populations who have limited access to real-life nature immersion experiences, such as elderly residents of care homes, virtual reality-based nature exposure may offer some of the same value. In this project, the summer intern will work with an interdisciplinary team of researchers in forest therapy and designers from local VR company MindVue to explore fundamental questions informing the optimal deployment of VR technologies for this purpose.

Education and VR: Opportunities and Challenges for Social Embodied Learning
Mentor: Lana Yarosh
SDG: Quality Education

Educational VR provides the promise of immersion and engagement, potentially leading to better learning outcomes both in K-12 schooling and in higher education. Developing effective systems in these contexts requires substantial consideration of social, pedagogical, and technological factors across many stakeholders (e.g., students, instructors, parents, staff). Working on this project, you will conduct empirical qualitative investigations with stakeholders to understand their needs and challenges (including potential ethical concerns regarding the use of VR in education) and leverage these findings to design and build novel systems that make VR-based education a reality. Some examples of these systems may include instructional interactive volumetric video, immersive 360 video field trips, collaboration spaces and rooms for team-based learning, and augmented and tangible reality learning games.

Computational Support for Recovery from Addiction
Mentor: Lana Yarosh
SDG: Good Health and Well-Being

Addiction and alcoholism are among the greatest threats to people’s health and well-being. Early recovery is a particularly sensitive time, with as many as three-fourths of people experiencing relapse. Computing provides an opportunity to connect people with the support they need to get into and stay in recovery. Working closely with members of the recovery community, we will conduct and analyze qualitative formative work to understand people's needs and values. Based on insights gained from this work, students will contribute to the development and deployment of interactive technologies for computational recovery support. For example, we may work on a conversational agent that helps connect people with support, or web-based writing tools that help people write more helpful comments in online health communities.

Design of a Smartphone App for Sensitive Self-disclosures
Mentors: Fernando Maestre and Lana Yarosh
SDG: Good Health and Well-Being

People living with stigmatized identities or conditions (e.g., members of the LGBTQA+ community, people living with chronic conditions such as HIV) face high levels of stigma from society. For many, disclosing a stigmatized condition to others is an important step toward receiving social support. Informed by interview data gathered thus far and a low-fidelity prototype, we aim to develop a functional prototype of a smartphone app that facilitates the disclosure of sensitive information in both in-person and remote settings.

Human-Centered AI for Mental Illness on Social Media Sites
Mentor: Stevie Chancellor
SDG: Good Health and Well-Being

Online communities support people living with mental illnesses such as depression, those going through crises, and those dealing with substance use disorders. Professor Chancellor's work focuses on studying these communities from two angles: 1) building more performant and efficient models to detect the presence of these behaviors, and 2) building AI technology that is more compassionate and ethical in these high-stakes scenarios. This project will use computational techniques from natural language processing, applied machine learning, and data mining to study these communities. Students will also learn what it takes to develop a data pipeline for projects like this, including data gathering, annotation, and cleaning; model training and engineering; model testing and evaluation on real-world examples; and the ethical outcomes of these systems.

Sort-Empty-Clean-Dry using Robots for Effective Recycling
Mentor: Karthik Desingh
SDGs: Responsible Consumption and Production, Sustainable Cities and Communities

The challenge of sustainable waste management begins at the sorting stage in our homes, offices, and other public places, where we separate trash into different bins: compost, recycling, and waste. While Materials Recycling Facilities (MRFs) sort the materials collected from recycling bins, they are often ineffective due to “recycling contamination” introduced at the sorting stage. While there are various types of recycling contamination, this project will focus on food waste contamination at the sorting stage (e.g., recyclable take-home food boxes, jars, and containers that have not been emptied or rinsed out). Failure to empty, clean, and dry (ECD) such recyclables makes the recycling process less effective. This project aims to develop robotic technology to reduce food waste contamination and support effective recycling: a robot with two arms can sort trash following the ECD principles to maximize the value of recyclables. In this project, the student will work toward building a robotic system that senses objects at the disposal stage, sorts them into their respective categories, and empties, cleans, and dries recyclable items before they go into the recycling bins.

Sensing Objects in Healthcare Environments for Improving Patient Care Through Telepresence Robots
Mentor: Karthik Desingh
SDG: Good Health and Well-Being

Providing healthcare via telepresence robots can change how the healthcare industry operates, help it meet future demands, and make healthcare accessible to remote locations and underdeveloped communities. This project aims to deliver impactful inventions in sensing objects in healthcare environments so that a telepresence robot can interact with them. Objects in healthcare environments have challenging properties (e.g., textureless, reflective, transparent, translucent, deformable) and are present in challenging circumstances (e.g., in contact with liquids, tissue, and bio-materials). So while these robots will be teleoperated, they must be capable of intelligently interacting with this complex world, gaining a level of autonomy that decreases the workload of the healthcare professional rather than adding to it. In this project, the student will work toward hardware and software inventions to sense these objects with challenging properties under challenging circumstances.

Reducing Dependency on Large Datasets for Visual Underwater Human-Robot Collaboration
Mentor: Junaed Sattar
SDG: Life Below Water

This project will focus on reducing the dependency on large datasets for vision-based autonomous underwater robotics, particularly when collaborating with human partners. Underwater robots capable of following divers and understanding their hand gestures, actions, and intent are essential for collaborative task execution underwater, but developing such capabilities with machine learning requires a significant amount of imagery. Collecting such datasets can be extremely costly and dangerous to humans, robots, and the environment, and annotating them is an even more laborious process. This project will look into creating visual learning-based methods that do not require large-scale datasets for underwater human-robot collaborative missions.

Autonomous Docking for Sustainable, Long-Term AUV Deployment
Mentor: Junaed Sattar
SDG: Life Below Water

This project will be focused on designing a robot vision guidance system for autonomously docking underwater robots to a floating docking station. The goal of this project is to allow autonomous underwater vehicles to be used in long-term deployment for a variety of applications, by providing a platform on the surface of the water to 'dock' with for charging, data transfer, and a number of other possible purposes. The Interactive Robotics and Vision Lab (IRVLab) has a working prototype of a surface docking station which will serve as a physical platform to validate this research. Skills that will help the undergraduate researcher succeed include a working knowledge of electronics, robotics, and computer vision. 

Underwater HRI