Imagine grabbing a block from a stack on a table, throwing a ball toward a creature in your backyard in a game of Pokemon GO or lighting up a dim room, only to realize that the block, the ball, the Pokemon pal and the illuminating lantern do not actually exist.
A new technology called ‘Portal-ble’ now brings these imagined possibilities to life. The software constructs virtual objects in an alternate reality depicted through the screen of a smartphone, and with it, one’s hand can traverse the boundary into this virtual landscape, blurring the lines between the real world and augmented reality.
Developed by Assistant Professor of Computer Science Jeff Huang in collaboration with students in his lab and other University faculty members, the software incorporates virtual objects into a person’s surroundings. The technology also uses an attached infrared sensor to recreate a user’s hand and show its interaction with nonexistent objects in this virtual world. The sensor is connected to a mini-computer in the form of a compute stick that processes the input information and sends a wireless signal to the phone to simulate the image, Huang said.
The researchers have made the software available to the public for download so that anyone can build upon it to develop their own applications. The phone’s camera alone is sufficient to run the technology, Huang said, but for an optimal user experience, an infrared sensor is needed to provide depth perception and expand the field of view.
The idea was born about three years ago, a time when virtual reality and augmented reality devices were popular but in need of improvement. At the time, VR systems existed primarily as wearable headsets, and the researchers hoped to change that by using a portable, everyday item that many people already owned, the smartphone, to allow users to partake in these alternate realities while still interacting with their real environments. Smartphones are capable of rapidly processing images, and their use eliminates the need to wear cumbersome devices. Huang also hoped to improve upon the unsatisfying experience in some augmented reality environments, like the act of swiping a screen to capture Pokemon in Pokemon GO.
In scenarios like Pokemon GO, “all your interactions are happening on a screen where the (actions) are supposed to be interpreted as part of the 3D environment, so there’s a disjunction here,” Jing Qian GS, lead author and developer of Portal-ble, said.
The researchers sought to make interaction with the augmented environment as natural as possible, keeping the screen as a mere window onto the world rather than the means of interacting with it.
“That’s exactly the most interesting part of it — how do we sort of trick ourselves into making these virtual objects as close to reality as possible?” Huang said.
The technology corrects for natural errors in the way people interact with the objects, such as grasping an object too firmly or too loosely, Huang said. Users also learn and adapt as they continue to use the product because the app can train people by vibrating, for instance, when a person’s hand gets close to the virtual object they plan to touch.
One of the challenges of using a phone to simulate three-dimensional objects is the loss of depth perception that occurs when someone moves from watching the world with two eyes to viewing everything through a single camera. But the infrared sensor attachment in Portal-ble solves this problem, Huang said.
Assistant Professor of the Practice of Engineering and Computer Science Ian Gonsher, who was not involved in the creation of Portal-ble, wrote in an email to The Herald that the product does increase accessibility to augmented reality. But “as this kind of augmentation becomes increasingly overlaid onto our experiences, it becomes increasingly difficult to determine what is real and what is not,” Gonsher cautioned.
In one application, Portal-ble was used by RISD students to map a 3D garden. Another program allows a user to carry a virtual lantern while the software recreates the shadows and light contrast that someone would see if they were actually lighting up the room, Huang said.
The technology allows for the creation of many other programs for a variety of applications in the future. Portal-ble could theoretically be used in a medical setting to view and virtually touch the internal components of a human body. In another example, Huang described how a user could throw a dart and hit a target without the perfect physical coordination necessary in the real world.
Holding up a smartphone and the attached sensor can grow tiring over time, but Qian, who presented on Portal-ble at the ACM Symposium on User Interface Software and Technology this week, expects smartphones to become more lightweight in years to come.
“Apart from directly controlling (augmented or virtual reality environments) with your mind — which is not possible yet — the most practical option is to use bare hands. To get there, Portal-ble provides the much needed systematic analysis of the usability issues that may arise,” Mayank Goel, assistant professor at Carnegie Mellon’s Institute for Software Research, who also attended the UIST 2019 conference, wrote in an email to The Herald.
“I know my students and I are going to use it as soon as we get a chance!” Goel wrote to The Herald.
Clarification: A previous version of this article named Jiang Qian as one of Portal-ble's developers, but it is more accurate to say that he is lead author and developer of Portal-ble.