‘Explainable AI’ can efficiently detect AR/VR cybersickness


May 02, 2023

(Nanowerk News) Exposure to augmented reality (AR) or virtual reality (VR) environments can cause people to experience cybersickness — a specific type of motion sickness with symptoms ranging from dizziness to nausea — and existing research on reducing symptom severity often relies on a one-size-fits-all approach.

However, Khaza Anuarul Hoque, assistant professor in the Department of Electrical Engineering and Computer Science at the University of Missouri, and a team of researchers are working to develop a personalized approach to identifying cybersickness by focusing on its root causes, which can be different for everyone.

This roller coaster simulation in virtual reality was used by Khaza Anuarul Hoque and a team of researchers to simulate and detect cybersickness. (Image: Khaza Anuarul Hoque)

“Cybersickness is not universal. For example, one simulation may trigger cybersickness in me while the same simulation may not cause cybersickness in others,” said Hoque, who also leads the Dependable Cyber-Physical Systems Laboratory at MU. “One of the problems people usually encounter when wearing virtual reality or augmented reality headsets is that the user experience can worsen after a while, including symptoms of nausea and vomiting, especially if the user is immersed in a virtual environment where a lot of movement is involved. It can depend on many factors, including a person’s gender, age and experience.”

Hoque said he wanted to approach the problem from a new angle using explainable AI, because it has the potential to transform the AR and VR industry.

“Explainable AI is a great tool to help with this because a conventional machine learning or deep learning algorithm can only tell you what its predictions and decisions are, whereas explainable AI can also tell the user how and why the AI is making those decisions,” Hoque said. “So rather than imposing static mitigation techniques on all users, it is more effective if we know why a particular person is experiencing cybersickness and provide the exact mitigation that person needs. Explainable AI can help us do that without compromising the user experience.”

In addition to observing his own students experience cybersickness, Hoque notes that academic and industry approaches to identifying cybersickness over the past five to seven years have often focused on data-driven techniques such as machine learning (ML) and deep learning (DL).

“Such approaches are often ‘black boxes’, and as such they are less transparent,” says Hoque. “I also realized that explaining cybersickness DL models can significantly improve model understanding and provide insight into why and how these AI models arrive at certain decisions. In addition, by identifying and understanding which critical features can cause cybersickness, we can help designers develop more effective cybersickness detection models.”

Hoque says explainable AI can also help software developers identify the most important features needed to optimize models that teach AI how to identify someone experiencing cybersickness. This is especially important for users wearing stand-alone VR headsets.
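To make the idea concrete, here is a minimal sketch of what "explaining" a cybersickness prediction can look like: a toy linear classifier is trained on hypothetical per-session features, and each prediction is then attributed to individual features so a developer can see *which* signals drove the flag. The feature names, data, and model are all illustrative assumptions for this sketch — they are not the features or models from the published papers, and real explainable-AI tooling (e.g. SHAP-style attribution) is far more sophisticated.

```python
# Sketch: attribute a cybersickness prediction to individual features.
# All names and numbers below are hypothetical, for illustration only.

# Hypothetical per-session features:
# (head rotation speed in deg/s, minutes immersed, dropped frames/min)
# Label: 1 = user reported cybersickness, 0 = no symptoms.
data = [
    ((120.0, 25.0, 8.0), 1),
    ((30.0, 10.0, 1.0), 0),
    ((90.0, 40.0, 6.0), 1),
    ((20.0, 5.0, 0.0), 0),
]
features = ["head_rotation_speed", "minutes_immersed", "dropped_frames"]

def normalize(rows):
    """Scale each feature column to [0, 1] so weights are comparable."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [
        tuple((v - l) / (h - l) if h > l else 0.0
              for v, l, h in zip(r, lo, hi))
        for r in rows
    ]

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Tiny perceptron: just enough to get per-feature weights."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

rows, labels = zip(*data)
scaled = normalize(list(rows))
w, b = train_perceptron(scaled, list(labels))

# "Explanation" for one session: per-feature contribution w_i * x_i,
# ranked by magnitude -- a crude analogue of what attribution tools do.
session = scaled[0]
contrib = sorted(zip(features, (wi * xi for wi, xi in zip(w, session))),
                 key=lambda t: -abs(t[1]))
for name, c in contrib:
    print(f"{name}: {c:+.3f}")
```

The point of the ranked contributions is the one Hoque makes above: instead of applying the same mitigation to everyone, a developer could target whichever factor dominates a given user's prediction.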

This research was recently presented at three top conferences for AR/VR research:

“LiteVR: Interpretable and Lightweight Cybersickness Detection using Explainable AI,” presented at the IEEE Virtual Reality Conference, March 25–29, 2023.

“VR-LENS: Super Learning-based Cybersickness Detection and Explainable AI-Guided Deployment in Virtual Reality,” presented at the ACM Conference on Intelligent User Interfaces, March 27–31, 2023.

“TruVR: Trustworthy Cybersickness Detection using Explainable Machine Learning,” presented at the International Symposium on Mixed and Augmented Reality (ISMAR), October 17–21, 2022.




