Brown, Cornell researchers develop VR software that uses robot proxy for enhanced remote collaboration

New technology allows for hands-on collaboration over a distance, regardless of size differences between facilities

The idea for VRoxy was sparked by Cornell PhD student Mose Sakashita’s frustrations working remotely as a teaching assistant during the COVID-19 pandemic.

Photo Courtesy of Juan Siliezar / Brown University

Virtual reality has the potential to make remote collaboration far more feasible in the workplace, according to Brown researchers. With headsets, robots can act as proxies for remote workers in healthcare facilities or classrooms.

But current VR technology has limitations when it comes to spatial awareness. If the space in which a person operates a VR headset doesn’t map neatly onto the space a robot proxy operates in, movement can become challenging. On Oct. 29, a team of researchers from Brown and Cornell presented new VR software that aims to alleviate this problem at the 2023 Association for Computing Machinery Symposium on User Interface Software and Technology.

The software — called VRoxy, a play on the term “virtual reality proxy” — allows remote collaborators to use a small space to physically interact with others in much larger facilities via a robot proxy. 

“When people focus on VR interactions, … they mainly focus on collaboration with digital aspects,” Mose Sakashita, a PhD student at Cornell studying human-computer interaction, told The Herald. “We wanted to focus on collaboration around the physical objects in the physical space.”


“How can we facilitate complex tasks that cannot be supported by video-conferencing systems like Zoom?” he asked.

Sakashita, the lead author of a recent paper describing the project, was among those presenting at last week’s symposium. 

The idea for VRoxy was sparked by his frustrations working remotely as a teaching assistant during the COVID-19 pandemic. “It was really difficult for me to engage in collaboration,” Sakashita said. “Those dynamic or nuanced cues like head direction … or my body position in relation to physical objects (were) completely missing.” But robots have the capacity for these nuanced movements, he said. So “we started working on projects that use robotic embodiment to enhance the sense of being physically together.”

Existing VR software allows users to control robots remotely, but it requires them to have a space that mimics their human collaborators’ facilities in size and layout. “They're likely not going to have the same lab space or the infrastructure for it,” Brandon Woodard GS, a PhD student at Brown studying human-computer interaction, told The Herald. 

Woodard worked on the methodologies and testing of VRoxy remotely, operating a robot located at Cornell’s Ithaca campus all the way from Providence. 

VRoxy users see a 3D rendering of the remote space they are working in. When the user is navigating this space, “you're kind of in this cartoon world,” Woodard said. “If we were constantly just running a 360-degree (live) video stream, … there (could) be a lot of glitches.”

The user navigates this space by walking around and using teleportation links, represented by circles on the ground. If the user steps into the circle, they can teleport a much greater distance in the VR space than they are actually traveling in their physical location. 
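The teleportation-link mechanic described above can be sketched roughly as follows. This is an illustrative reconstruction only, not code from the VRoxy system; all names and distances are hypothetical.

```python
def check_teleport(user_pos, links, trigger_radius=0.5):
    """Return a remote-space destination if the user stands inside a link circle.

    user_pos: (x, y) position in the small local room, in meters.
    links: list of (circle_center, remote_destination) pairs -- each circle
           occupies a small patch of local floor but maps to a point far
           away in the rendered remote facility.
    """
    for center, destination in links:
        dx = user_pos[0] - center[0]
        dy = user_pos[1] - center[1]
        if (dx * dx + dy * dy) ** 0.5 <= trigger_radius:
            return destination  # jump the viewpoint to this remote point
    return None  # otherwise, keep ordinary 1:1 walking


# A single step into a link can move the viewpoint tens of meters
# in the remote space, even though the user barely moved locally.
dest = check_teleport((1.2, 0.9), [((1.0, 1.0), (30.0, 45.0))])
print(dest)  # (30.0, 45.0)
```

The key design point is the asymmetry: local motion stays 1:1 for fine positioning, while the circles absorb the large traversals a small room cannot physically accommodate.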

VRoxy “allows a remote person to be as mobile as anyone else who is physically in the same space” — if not more, wrote Jeff Huang, associate professor of computer science and Woodard’s PhD advisor, in an email to The Herald. With just a few steps, users can even “move” to a completely different building, activating a different robot, Woodard explained.

Once the user navigates to their desired workspace within the facility, a 360-degree video feed is displayed. “So now, instead of seeing a 3D copy of the room, you actually see the robot’s camera view. You can see people there, and you can fully look around and start interacting with people,” Sakashita said. 

As the user moves around, the robot can mimic their non-verbal cues, such as head rotation, facial expression, eye gaze and pointing gestures, Sakashita explained. 


Once the user has completed their desired tasks in a particular location, they can switch back into navigation mode, where the 360-degree camera feed is disabled and they once again see the 3D rendering.
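The two-mode behavior the article describes — a lightweight 3D rendering for navigation and the robot's live 360-degree feed for interaction — amounts to a simple state machine. The sketch below is hypothetical and does not reflect the actual VRoxy implementation; class and method names are invented for illustration.

```python
class ProxySession:
    """Toggles between a rendered navigation view and a live camera view."""

    NAVIGATION = "navigation"    # cartoon-style 3D copy of the remote space
    INTERACTION = "interaction"  # robot's live 360-degree camera view

    def __init__(self):
        self.mode = self.NAVIGATION
        self.camera_streaming = False

    def arrive_at_workspace(self):
        """User reached a workspace: enable the robot's live feed."""
        self.mode = self.INTERACTION
        self.camera_streaming = True

    def leave_workspace(self):
        """User is done here: drop back to the rendered view and
        stop streaming, avoiding the glitches of a constant live feed."""
        self.mode = self.NAVIGATION
        self.camera_streaming = False


session = ProxySession()
session.arrive_at_workspace()   # now seeing the robot's camera view
session.leave_workspace()       # back in the 3D rendering
```

Streaming only while stationary at a workspace is what lets navigation stay smooth: the rendered world needs no video bandwidth at all.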

Though the software is still in its infancy, the team is already working on improvements. “We have a mobile robot now, (but) it can only point and reference things,” Woodard said. “The next step is to have a robot that can grab objects and manipulate them.”

With future improvements, the software has the potential to be implemented in various fields, he added. Some of the team’s target scenarios include remote teaching and telehealth. 

“I can (also) really see remote surgery benefiting from something like this, especially in countries where they don't really have the infrastructure they may need,” Woodard said.

In any scenario, physical presence and non-verbal cues help facilitate a better understanding of colleagues’ intentions and enhance the social aspects of collaboration, researchers told The Herald. 

“Zoom and video conferencing tools have focused on the face and spoken word as the primary channel for communication,” Huang wrote. “Being able to share and interact at the room level allows a fuller whole-person social experience.”

Liliana Cunha

Liliana Cunha is a staff writer covering Science and Research. She is a sophomore from Pennsylvania concentrating in Cognitive Neuroscience. In her free time, she loves to play music and learn new instruments.


All Content © 2024 The Brown Daily Herald, Inc.