U of A Fuses AI and Immersive Tech to Shape the Future of Learning and Industry

Students trying their XR applications during the implementation phase of their projects.
Ehsan Azimi, assistant professor of electrical and computer engineering and computer science at the University of Arizona.
At the University of Arizona, the convergence of artificial intelligence and extended reality (XR) is transforming how students learn, experiment, and innovate. Through pioneering courses, hands-on labs, and interdisciplinary collaboration, the U of A is creating a dynamic ecosystem where virtual and augmented experiences aren’t just imagined—they’re built.
One cornerstone of the university's XR education efforts is the course Introduction to Extended Reality, taught by Ehsan Azimi, assistant professor of electrical and computer engineering and computer science. With more than 14 years of experience in XR, Azimi is helping shape the future of immersive technology by providing students from diverse disciplines with both a solid theoretical foundation and hands-on training in XR systems.
“This course is carefully structured to blend theoretical foundations with industry-level, hands-on problem-solving experiences,” says Azimi, whose research bridges human-computer interaction, artificial intelligence, and robotics. “Students don’t just learn about XR—they actively build and innovate with it.”

Dr. Azimi and students visit the U of A nanofabrication cleanroom to model and implement an XR-based digital twin for immersive training.
The course begins by introducing essential concepts such as spatial transformations, camera modeling, interaction modalities, user-centered design and perception. Students also explore advanced topics like digital twins—virtual replicas used for simulation and analysis—alongside voice and gesture input, eye gaze tracking, and immersive training systems that use virtual, augmented and mixed reality. XR itself serves as an umbrella term encompassing various technologies: virtual reality (VR), which creates entirely immersive digital environments; augmented reality (AR), which overlays virtual elements onto the real world; and mixed reality (MR), which contextually integrates digital content with physical surroundings.
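The spatial transformations mentioned above are the mathematical backbone of XR: every virtual object must be positioned relative to the headset and the world using rigid-body transforms. As a rough illustration (not taken from the course materials), a 4x4 homogeneous transformation matrix combines a rotation and a translation and can be applied to a 3D point like this:

```python
import math

def make_transform(theta_z, tx, ty, tz):
    """Build a 4x4 homogeneous transform: a rotation about the z-axis
    by theta_z radians, followed by a translation (tx, ty, tz)."""
    c, s = math.cos(theta_z), math.sin(theta_z)
    return [
        [c,  -s,  0.0, tx],
        [s,   c,  0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply_transform(T, p):
    """Apply homogeneous transform T to a 3D point p = (x, y, z)."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Rotate a point 90 degrees about z, then shift it 1 unit along x —
# e.g., repositioning a virtual object in the headset's coordinate frame.
T = make_transform(math.pi / 2, 1.0, 0.0, 0.0)
print(apply_transform(T, (1.0, 0.0, 0.0)))  # approximately (1.0, 1.0, 0.0)
```

Chaining such matrices is also how a digital twin stays registered to its physical counterpart: the same composition of transforms maps sensor-tracked positions into the virtual scene.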
In the second half of the semester, students apply these concepts using state-of-the-art hardware like Microsoft HoloLens and Meta Quest, gaining hands-on experience in developing immersive applications. The course is open to both undergraduate and graduate students from various disciplines.
AI: The Engine Behind Smart Immersive Systems
Artificial intelligence plays a critical role in this learning journey.
"AI makes XR systems smarter and more responsive, enabling dynamic user interactions, advanced natural language processing and automated environment generation,” Azimi says. Students explore how AI-enhanced XR is already transforming fields from surgery and mining to semiconductor education and entertainment.
Such AI-powered XR applications are reshaping industries by enabling advanced simulation, training and visualization. In medicine, augmented reality can help surgeons visualize tumors in real time during operations. In mining, XR is used for remote inspections and hazard simulations. In manufacturing, trainees can fully immerse themselves in a digital twin of a nanofabrication cleanroom, learning the steps of semiconductor manufacturing through realistic virtual environments.
“By immersing students in both theory and practice, we prepare them for the future of work—whether that’s designing AI-powered XR systems or deploying them in critical industries,” Azimi says.
Building an XR Ecosystem Across Campus
Complementing this academic foundation are collaborative initiatives like the AI and XR Studio course—co-taught by Ash Black, Bryan Carter, and Matthew Briggs—which combines storytelling, experiential media design, and AI-driven world-building.
“Students often find themselves working with technologies that are just beginning to enter the public sphere, which keeps the learning process fresh, exciting, and highly relevant,” says Briggs. “What’s especially rewarding is the way students take ownership of these tools to build immersive, creative projects in just a few days, transforming ideas into rich, interactive experiences. They are not just exploring emerging technologies; they are using them to imagine and construct better, alternative worlds.”
The AI Core’s XR Lab further deepens student engagement through research in spatial computing, generative AI and interactive environments. “Right now, AI models like ChatGPT and Claude are trained on massive amounts of language data, but XR is what’s going to allow AI to understand the physical world,” says Black. “That’s why this technology is so exciting—it’s not just about what’s written, but about how the world actually functions. This is the future of computing, and we’re preparing students to be at the forefront of it.”