
Computer science professor strives to improve remote collaborations through XR headsets

Susmit Shannigrahi, assistant professor of computer science

When Tennessee Tech Assistant Professor Susmit Shannigrahi slips on an extended reality headset, his surroundings become part of a new immersive experience.

Virtual objects appear, merging with the real world.

He can grab a floating keyboard, adjust its angle and enter letters and numbers. He can view various screens and then swipe them away with a wave of his hand. And so much more.

This emerging technology, known as XR, encompasses augmented, virtual and mixed realities, and it’s the “future of work and school,” Shannigrahi said – when the internet is ready for it.

But some barriers remain.

Through his research in networking, Shannigrahi has been focusing on those barriers that pertain to speed and distance.

“We need something that can support low latency,” he said. “Internet things are bound by physical distance, so there’s a limit to how fast you can go. For example, if you’re sending something from here to California, it will take anywhere from 100 to 150 milliseconds.”
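As a rough illustration of that physical bound, the sketch below estimates the propagation-delay floor for a cross-country link. The distance, fiber velocity factor and path factor are illustrative assumptions, not figures from the research project.

```python
# Back-of-envelope propagation-delay floor for a Tennessee-to-California
# link. The distance, fiber velocity factor and path factor are illustrative
# assumptions, not measurements from the project.

SPEED_OF_LIGHT_KM_S = 299_792      # speed of light in vacuum, km per second
FIBER_VELOCITY_FACTOR = 0.67       # light travels at roughly 2/3 c in optical fiber
GREAT_CIRCLE_KM = 3_200            # rough straight-line distance, TN to the West Coast
FIBER_PATH_FACTOR = 1.5            # real fiber routes are longer than great circles

def one_way_propagation_ms(distance_km: float) -> float:
    """Minimum one-way delay imposed by the speed of light in fiber."""
    fiber_speed_km_s = SPEED_OF_LIGHT_KM_S * FIBER_VELOCITY_FACTOR
    return distance_km / fiber_speed_km_s * 1_000

path_km = GREAT_CIRCLE_KM * FIBER_PATH_FACTOR
one_way = one_way_propagation_ms(path_km)
print(f"One-way propagation floor: {one_way:.1f} ms")      # roughly 24 ms
print(f"Round-trip floor:          {2 * one_way:.1f} ms")   # roughly 48 ms
# Routing, queuing and processing delays push observed latency well above
# this physical floor.
```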

That’s too slow, he said, for XR to be useful in applications like remote troubleshooting, surgery and other interactions that must happen in real time.

“If you’re in a room and everyone’s wearing a headset, it works great,” Shannigrahi said. “But if one person is in New York and another person is in California, the network doesn’t really support it. You cannot break the speed of light – so what are the mechanisms that will make XR headsets usable for people who are collaborating remotely? How can we make the network more suitable for these devices?”

To find answers, Shannigrahi has been collaborating with Colorado State University and the University of Nebraska Omaha, having been awarded an 18-month planning grant from the National Science Foundation. Tennessee Tech students Alex Marti, who has since graduated, and Nathan Melton, along with a couple of CSU students, have also been involved in the project, which was funded through September 2022. Another proposal is in the works.

“Our work looks at how to make extended reality headsets useful for everyday interactions,” Shannigrahi said. “For example, what are the challenges of creating a hybrid classroom where remote students join in the augmented reality space? Through a survey of the AR and VR research community, we found that the current network does not support such use cases well. We are using Microsoft HoloLens headsets to look at how to make such collaborations a reality.”

Within the next decade, Shannigrahi envisions remote collaborations in hybrid environments, including industrial settings, using XR technology.

“If you’re troubleshooting remotely, you could project something and say, ‘Turn that knob’ or ‘Tighten that screw,’” he said.

Without the lag, of course, since lag can cause additional problems like cyber sickness.

“If it’s below 30 milliseconds, you’ll be fine,” Shannigrahi said. “But if it’s more than that in a virtual scenario, the lag between what is said to you and when your brain can interpret it can make you physically sick.”

How can the problem be solved? One way, Shannigrahi believes, is by predicting what’s going to happen.

“Think of playing chess,” he said. “When one player makes a move, the other can think of the next possible moves. So, in virtual reality, if I’m picking up an object, the computer program might be able to precompute the next moves and send choices to the user.”
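A minimal sketch of that idea is below; the interaction names and transition probabilities are hypothetical placeholders rather than data from the project. It simply ranks the likeliest follow-up actions so they could be prepared before the user performs them.

```python
# Minimal sketch of the "precompute the next moves" idea. The interaction
# names and probabilities below are hypothetical placeholders, not data
# from the project.

# Hypothetical transition model: after a given interaction, which
# interactions tend to follow and how often (a real system would learn
# this from past sessions).
TRANSITIONS = {
    "grab_object": {"rotate_object": 0.6, "place_object": 0.3, "release_object": 0.1},
    "rotate_object": {"place_object": 0.7, "release_object": 0.3},
}

def precompute_candidates(current_action: str, top_k: int = 2) -> list[str]:
    """Return the most likely next interactions so their scene updates can
    be rendered, or prefetched from a nearby server, before the user acts."""
    candidates = TRANSITIONS.get(current_action, {})
    return sorted(candidates, key=candidates.get, reverse=True)[:top_k]

# When the headset reports a "grab_object" event, speculatively prepare the
# two most probable follow-up states instead of waiting on a network round trip.
print(precompute_candidates("grab_object"))  # ['rotate_object', 'place_object']
```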

As part of their research, Shannigrahi and others on the team used cloud computing to observe and quantify the latency of remote interactions from Tennessee to Colorado and California as well as to Asia and Europe.

“We ran experiments to see how long it takes, and we’re trying to come up with new networking models that can support the latency needed by the headsets,” he said.
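A simplified version of that kind of experiment times a round trip to servers in different regions. The sketch below measures a TCP handshake; the hostnames are placeholders, not the cloud endpoints the team actually used.

```python
# Simplified sketch of measuring round-trip latency to servers in different
# regions by timing a TCP handshake. Hostnames are placeholders, not the
# cloud endpoints the team actually measured.

import socket
import time

ENDPOINTS = {
    "us-west (placeholder)": ("xr-test-us-west.example.com", 443),
    "europe (placeholder)": ("xr-test-eu.example.com", 443),
}

def tcp_rtt_ms(host: str, port: int, timeout: float = 3.0) -> float:
    """Approximate round-trip time as the duration of a TCP connect."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1_000

for label, (host, port) in ENDPOINTS.items():
    try:
        print(f"{label}: {tcp_rtt_ms(host, port):.1f} ms")
    except OSError as err:
        print(f"{label}: unreachable ({err})")
```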

The team also collected data from others in the field through survey questions like, “What’s the biggest hurdle for your research in AR and VR?” “How do you use the network?” “What kind of latency do you see?” and “What kind of infrastructure do you have in your university or organization?”

Shannigrahi is ready for the next stage of the research project.

“The goal now is to propose what we are going to do to fix the headsets,” he said. “That will involve 5G or 6G, software-defined networks and edge computing.”

It’s just a matter of time… and reality.
