Virtual Dashboard Design for Grasping Operations in Teleoperation Systems

Di Tecco, Antonio; Camardella, Cristian; Leonardis, Daniele; Loconsole, Claudio; Frisoli, Antonio
2024-01-01

Abstract

A known issue in teleoperation concerns dexterity in grasping objects, which can be severely limited when not properly supported by stereoscopic vision and force/tactile feedback. This work presents the development and testing of a hardware/software subsystem that implements a virtual dashboard within the Sully teleoperation system. The subsystem captures images, gathers object information, and displays a virtual dashboard that informs the operator about the distance and reachability of objects, complementing the stereoscopic view rendered in a virtual reality headset. The hardware consists of two front cameras, used as the eyes of the teleoperated robot, and two Time-of-Flight (ToF) sensors that estimate the distance between the teleoperated robot and the Object of Interest (OoI). The software processes the images captured by the cameras to identify objects, detects the OoI, and estimates its distance from the robot using the ToF measurements. Performance results showed an optimal trade-off between image quality, frame rate, and user experience: a mean accuracy of 88% was achieved in identifying and detecting OoIs, and an average mean absolute percentage error (MAPE) of 1.40% in estimating the robot-to-OoI distance at 120 cm.
Year: 2024
ISBN: 979-8-3503-7799-6
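
As a rough illustration of the pipeline the abstract describes (detect the Object of Interest in the camera images, fuse the two ToF readings, and surface distance and reachability on the virtual dashboard), the sketch below shows how such a dashboard update could be assembled. It is not the authors' implementation: the function names, the fusion rule (a simple mean of the two ToF sensors), and the 80 cm reach threshold are illustrative assumptions; the MAPE helper merely restates the standard metric behind the reported 1.40% figure.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class DashboardInfo:
        label: str          # class of the detected Object of Interest (OoI)
        distance_cm: float  # estimated robot-to-OoI distance
        reachable: bool     # whether the OoI lies within the assumed reach limit

    def estimate_distance_cm(tof_left_cm: float, tof_right_cm: float) -> float:
        """Fuse the two ToF readings; a plain mean is assumed here."""
        return mean([tof_left_cm, tof_right_cm])

    def build_dashboard_info(ooi_label: str, tof_left_cm: float, tof_right_cm: float,
                             reach_limit_cm: float = 80.0) -> DashboardInfo:
        """Assemble the overlay data shown to the operator in the VR headset."""
        distance = estimate_distance_cm(tof_left_cm, tof_right_cm)
        return DashboardInfo(ooi_label, distance, distance <= reach_limit_cm)

    def mape(true_cm: list[float], est_cm: list[float]) -> float:
        """Mean absolute percentage error, the metric reported in the abstract."""
        return 100.0 * mean(abs(t - e) / t for t, e in zip(true_cm, est_cm))

    if __name__ == "__main__":
        # Made-up readings for an OoI at roughly 120 cm from the robot.
        info = build_dashboard_info("bottle", 119.2, 121.0)
        print(info)
        print(f"MAPE vs. a 120 cm ground truth: {mape([120.0], [info.distance_cm]):.2f}%")

In a full system, the detection step (identifying the OoI in the stereo images) would supply the label passed to build_dashboard_info; that part is omitted here because the detector used by the authors is not described in this record.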
Files in this record:

File: MetroXRAINE2024_Conference.pdf (not available)
Description: Virtual Dashboard Design for Grasping Operations in Teleoperation Systems
Type: Publisher's PDF
License: Publisher's copyright
Size: 379.4 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/573852
