\ vɪ‧kɑ‧rɪ‧ɒs \
We aim to enable users to vicariously experience a remote environment, real or virtual. Through approaches in immersive mixed reality, telerobotics, and biomechanical simulation and control, we seek to advance the understanding of human interaction with remote environments.


Team Leader


PhD Candidate


XR Developer


Research Intern


Postdoc Researcher


VR Developer


Postdoc Researcher


Fellow Researcher

Sunny Katyara

Dario Mazzanti

Abdeldjallil Naceri

Patricia Yáñez Piqueras

Brendan Emery

Mohammad Fattahi Sani

Our research stands on THREE main pillars

To enhance Real-time Immersive Interaction in Telerobotics

We study how the combination of immersive mixed reality (MR) interfaces, intuitive control devices, and real-time data from remote sensors (RGB-D cameras, microphones, F/T sensors, etc.) can enable a high-fidelity perception-action loop, offering a real-time immersive interaction experience to the human user in telerobotics applications.

To advance the state of the art in Immersive Training with eXtended Reality (xR)

VR is a promising technology for safety training, providing risk-free, immersive learning, among other benefits. We explore how xR technologies, which combine VR with spatially contextualized physical interaction, can make training sessions more effective in the acquisition of safety behaviour while increasing trainees' engagement.

To understand Human Interaction Behaviour through Predictive Simulation and xR

The overarching objective of our research is to improve our knowledge of how humans interact with remote environments, real or virtual. We investigate how predictive full-body biomechanical simulations and biophysiological parameter tracking, combined with xR, can help us understand a human user's behaviour during their interaction with remote environments.


Immersive Remote Telerobotics (IRT) Interface

The interface facilitates intuitive real-time remote teleoperation while exploiting the inherent benefits of VR, including immersive visualization, free selection of the user's viewpoint, and fluid interaction through natural action interfaces.


Robot Manipulators, Multi-cam streaming, Jetson AGX Xavier, HTC Vive Pro, UE4, Point-cloud streaming, Telepresence, Gstreamer, FFmpeg

Immersive Remote Visualization using Foveated Rendering

A remote 3D data visualization framework that exploits the natural acuity fall-off of the human eye to facilitate the processing, transmission, buffering, and VR rendering of dense point clouds and 3D-reconstructed scenes.


Sampling, ElasticFusion, HTC Vive Pro Eye (Gaze tracking), CUDA, OpenGL-SL, OpenMP, Gstreamer, FFmpeg, UE4
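The core idea of acuity-driven sampling can be sketched in a few lines. The following is an illustrative NumPy sketch under our own simplifying assumptions (the function name and the inverse-linear acuity model are hypothetical; the lab's actual pipeline runs on the GPU with CUDA and gaze data from the HTC Vive Pro Eye):

```python
import numpy as np

def foveated_subsample(points, gaze_dir, rng=None,
                       fovea_deg=5.0, min_keep=0.05):
    """Subsample a point cloud using an acuity fall-off around the gaze.

    points   : (N, 3) array of points in the eye/camera frame.
    gaze_dir : unit 3-vector, current gaze direction.
    Keeps every point inside the foveal cone; outside it, the keep
    probability decays with angular eccentricity, floored at min_keep.
    """
    rng = rng if rng is not None else np.random.default_rng()
    # Angular eccentricity of each point relative to the gaze direction.
    dirs = points / np.linalg.norm(points, axis=1, keepdims=True)
    ecc = np.degrees(np.arccos(np.clip(dirs @ gaze_dir, -1.0, 1.0)))
    # Hypothetical inverse-linear acuity model: full density in the
    # fovea, then density ~ fovea_deg / eccentricity, floored at min_keep.
    keep_p = np.where(ecc <= fovea_deg, 1.0,
                      np.maximum(min_keep, fovea_deg / ecc))
    return points[rng.random(len(points)) < keep_p]
```

The effect is that bandwidth and rendering load concentrate where the user is actually looking, while the periphery is transmitted at a fraction of its original density.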

Truncated Signed Distance Fields (TSDFs), 3D Reconstruction, and real-time haptic feedback in remote exploration applications

A unified system that estimates obstacle-avoidance vectors while simultaneously generating a TSDF-based 3D reconstruction of the environment, in real time.


TSDF, Open3D, CUDA, ROS, Real-time 3D reconstruction, Robot Manipulators, “Feel-where-you-don't-see"
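To illustrate how a TSDF supports “feel-where-you-don't-see” feedback: each voxel stores the truncated signed distance to the nearest surface, so both the proximity to an obstacle and a direction pointing away from it can be read directly from the field and its gradient. Below is a minimal NumPy sketch under our own assumptions (the function name and central-difference scheme are hypothetical; the actual system builds the TSDF with Open3D and CUDA):

```python
import numpy as np

def tsdf_avoidance(tsdf, voxel, idx):
    """Obstacle-avoidance cue from a TSDF voxel grid.

    tsdf  : (X, Y, Z) array of truncated signed distances (metres),
            positive outside surfaces, negative inside.
    voxel : edge length of one voxel (metres).
    idx   : integer (i, j, k) of an interior voxel near the query point.
    Returns (distance, direction): the signed distance stored at the
    voxel, and a unit vector pointing away from the nearest surface,
    estimated via the central-difference gradient of the field.
    """
    i, j, k = idx
    grad = np.array([
        tsdf[i + 1, j, k] - tsdf[i - 1, j, k],
        tsdf[i, j + 1, k] - tsdf[i, j - 1, k],
        tsdf[i, j, k + 1] - tsdf[i, j, k - 1],
    ]) / (2.0 * voxel)
    n = np.linalg.norm(grad)
    direction = grad / n if n > 0 else np.zeros(3)
    return float(tsdf[i, j, k]), direction
```

A haptic loop can then render a repulsive force along `direction` whose magnitude grows as `distance` shrinks, even for surfaces outside the operator's current view.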

Mixed Reality Training for Industrial Workers

In the field of safety training, immersive virtual training environments are seen to be more effective than conventional methods in the acquisition of safety behaviour, and can increase trainees' engagement.


UE4, Quest System, Cyberith Virtualizer, HTC Vive Focus 3, Oculus Quest 2

Predictive Full-body Biomechanical Motion Simulation (PFM-Sim)

Simulations are an important means of studying biomechanical phenomena in industrial scenarios, e.g., falls from height. “What if?” simulations help predict outcomes by simulating the technologies being designed before they are built.


OpenSim, SCONE, Matlab / Simulink, Musculoskeletal model


Robot Teleoperativo 2

This project advances the results from the Robot Teleoperativo 1 project, bringing the developed technologies closer to the application domain through extensive field testing and use-case demonstrations.

Role: Project Coordinator

Duration: 1 Jan 2021 to 31 Dec 2023

Sponsor: Istituto Nazionale per l'Assicurazione contro gli Infortuni sul Lavoro (INAIL)

More Details: https://advr.iit.it/projects/inail-scc/teleoperazione

Caduta dall’Alto

(Falling from Heights)

The project aims at the design and development of novel strategies and solutions for preventing accidents and protecting workers who operate at heights. It will focus on advances in wearable sensing and actuation technologies, including technologies for fall-impact reduction and smart monitoring, as well as on new paradigms in immersive training for workers.

Role: Project Co-Coordinator

Duration: 1 Jan 2021 to 31 Dec 2023

Sponsor: Istituto Nazionale per l'Assicurazione contro gli Infortuni sul Lavoro (INAIL)

Robot Teleoperativo 1

This project aims at enhancing occupational safety in hazardous environments through substitution, i.e., removing the worker from the unsafe area and having robotic technologies carry out the same tasks through remote robotic teleoperation. To achieve this goal, a collaborative robotic system is proposed, composed of a mobile manipulator FIELD robot teleoperated from the immersive haptic PILOT station.

Role: Project Coordinator

Duration: 1 Sep 2017 to 31 Dec 2020 (Completed)

Sponsor: Istituto Nazionale per l'Assicurazione contro gli Infortuni sul Lavoro (INAIL)

More Details: https://advr.iit.it/projects/inail-scc/teleoperazione



  • Y. Tefera et al., “Towards Foveated Rendering for Immersive Remote Telerobotics,” in 5th International Workshop on Virtual, Augmented, and Mixed-Reality for Human-Robot Interactions (VAM-HRI), Mar. 7, 2022, Sapporo, Japan.

  • S. Katyara et al., “Fusing Visuo-Tactile Perception into Kernelized Synergies for Robust Grasping and Fine Manipulation of Non-rigid Objects,” arXiv preprint. [Collab. w/ APRIL Lab]
  • A. Naceri et al., “The Vicarios Virtual Reality Interface for Remote Robotic Teleoperation,” Journal of Intelligent & Robotic Systems 101 (4), pp. 1-16.
  • G. Barresi et al., “Exploring the Embodiment of a Virtual Hand in a Spatially Augmented Respiratory Biofeedback Setting,” Frontiers in Neurorobotics, 114. [Collab. w/ Rehab Technologies]
  • L. S. Mattos et al., “μRALP and beyond: Micro-technologies and systems for robot-assisted endoscopic laser microsurgery,” Frontiers in Robotics and AI, 240. [Collab. w/ Biomedical Robotics Lab]

  • A. Acemoglu et al., “Operating from a distance: robotic vocal cord 5G telesurgery on a cadaver,” Annals of Internal Medicine 173 (11), pp. 940-941. [Collab. w/ Biomedical Robotics Lab]
  • J. Lee et al., “Microscale precision control of a computer-assisted transoral laser microsurgery system,” IEEE/ASME Transactions on Mechatronics 25 (2), pp. 604-615. [Collab. w/ Biomedical Robotics Lab]

  • M. F. Sani et al., “Towards Sound-source Position Estimation using Mutual Information for Next Best View Motion Planning,” 2019 19th International Conference on Advanced Robotics (ICAR), pp. 24-29.
  • A. Naceri et al., “Towards a virtual reality interface for remote robotic teleoperation,” 2019 19th International Conference on Advanced Robotics (ICAR), pp. 284-289.
  • A. Acemoglu et al., “The CALM system: New generation computer-assisted laser microsurgery,” 2019 19th International Conference on Advanced Robotics (ICAR), pp. 641-646. [Collab. w/ Biomedical Robotics Lab]


Get In Touch!

Vicarios Mixed Reality and Simulations Lab

Advanced Robotics

Istituto Italiano di Tecnologia (IIT)

Via Morego 30, 16163, Genova, Italia