This work describes an immersive control system for the Spot robot by Boston Dynamics, designed to track the head movements of an operator wearing the Meta Quest 2, while also using joystick commands for locomotion control.
Authors:
- Ali Yousefi, [email protected]
- Zoe Betta, [email protected]
- Giovanni Mottola, [email protected]
- Carmine Tommaso Recchiuto, [email protected]
- Antonio Sgorbissa, [email protected]
©2024 RICE Lab - DIBRIS, University of Genova
If you use this work in an academic context, please cite:
@inproceedings{Yousefi2024zedoculusspot,
title={Immersive control of a quadruped robot with Virtual Reality Eye-Wear},
DOI={10.1109/ro-man60168.2024.10731469},
author={Yousefi, Ali and Betta, Zoe and Mottola, Giovanni and Recchiuto, Carmine Tommaso and Sgorbissa, Antonio},
booktitle={2024 33rd IEEE International Conference on Robot and Human Interactive Communication (ROMAN)},
year={2024}
}
This repository provides a modified version of the software available on zed-oculus. Since the IMU and touch-input data are required for this work, main.cpp is modified so that it reads the angular velocities and touch inputs through the ts.HeadPose.AngularVelocity, InputState.Thumbstick[ovrHand_Right], and InputState.Thumbstick[ovrHand_Left] attributes, and sends them to the process executed by main.py. Additionally, since the ZED camera is not connected to the user PC with a USB cable, main.cpp is further modified to open the ZED camera from a socket input, by changing the init_parameters values passed to zed.open(init_parameters), following the method HERE.
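On the Python side, the stream written by the modified main.cpp has to be parsed before it can be fed to the controller. The sketch below assumes a simple message layout of seven space-separated floats (three angular-velocity components plus two thumbstick axes per hand); this layout is an illustrative assumption, not the repository's actual wire format:

```python
def parse_hmd_message(line):
    """Parse one space-separated HMD message from the named pipe.

    Assumed layout (illustrative only):
    wx wy wz  right_x right_y  left_x left_y
    """
    fields = [float(v) for v in line.split()]
    if len(fields) != 7:
        raise ValueError(f"expected 7 fields, got {len(fields)}")
    return {
        "angular_velocity": tuple(fields[0:3]),   # ts.HeadPose.AngularVelocity
        "right_thumbstick": tuple(fields[3:5]),   # InputState.Thumbstick[ovrHand_Right]
        "left_thumbstick": tuple(fields[5:7]),    # InputState.Thumbstick[ovrHand_Left]
    }
```

A message such as `"0.1 -0.2 0.0 0.0 1.0 0.5 0.0"` then yields a dictionary with the head angular velocity and both thumbstick readings ready for the control loop.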
Furthermore, the scripts folder is added to this package; it contains the software developed for the head-tracking task and for locomotion with the joysticks. The script files are described as follows:
| Script | Description |
|---|---|
| main.py | Executed by the main.cpp file. Uses the SpotInterface and Controller class methods for the head-tracking and locomotion tasks. |
| controller.py | Provides a simple closed-loop controller based on the simple-pid Python module through the method get_hmd_controls(setpoints). Additionally, computes the locomotion control signals from the touch-input reference signals with the method get_touch_controls(setpoints). |
| spot_interface.py | Initializes the Lease, E-Stop, Power, RobotState, and RobotCommand clients. Provides the methods for sending the control signals to the robot, set_controls(controls, dt), and for receiving the robot's angular velocities, get_body_vel(). |
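The closed-loop controller in controller.py relies on the simple-pid package, whose PID objects are called with the current measurement and return the control signal. The sketch below uses a hand-rolled stand-in with the same call style so it runs without the dependency; the gains and the per-axis pairing are illustrative assumptions, not the repository's tuned values:

```python
class MiniPID:
    """Minimal PID with the same call style as simple_pid.PID (illustrative)."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self._integral = 0.0
        self._prev_error = None

    def __call__(self, measurement, dt=0.01):
        error = self.setpoint - measurement
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative


def get_hmd_controls(setpoints, measurements, pids):
    """Track per-axis HMD angular-velocity setpoints (axis pairing is assumed)."""
    controls = []
    for pid, sp, meas in zip(pids, setpoints, measurements):
        pid.setpoint = sp
        controls.append(pid(meas))
    return controls
```

In the real controller.py the measurements would come from get_body_vel() and the setpoints from the HMD angular velocities, closing the loop between head motion and robot body rates.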
The system architecture for this work is shown below:
A Wi-Fi bridge can be implemented on the Raspberry Pi board by following this tutorial. Once it is ready, the components of the local network can be configured with the following IP addresses:
| Component | IP Address |
|---|---|
| Raspberry Pi 4 Model B | 192.168.220.1 (server, Ethernet and wireless) |
| Jetson Nano | 192.168.220.50 (client, Ethernet) |
| OMEN PC | 10.42.0.210 (client, wireless) |
| Spot robot | 10.42.0.211 (client, wireless) |
For the purpose of this work, the IMU data generated by the HMD are transmitted to the main.py control process through a named pipe. Moreover, the stereo camera images are transmitted from the robot to the HMD through a socket, following the method of this tutorial. Sending the control signals and receiving the robot state data is done via gRPC.
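Before the control signals go out over gRPC, the thumbstick axes have to be scaled into body-frame velocity commands. A minimal mapping sketch is shown below; the deadzone, the axis assignment (left stick for translation, right stick for rotation), and the speed limits are all assumptions for illustration, not the repository's actual values:

```python
V_MAX = 1.0     # assumed maximum linear speed [m/s]
W_MAX = 1.0     # assumed maximum yaw rate [rad/s]
DEADZONE = 0.1  # assumed thumbstick deadzone


def _shape(axis):
    """Suppress small stick noise around the neutral position."""
    return 0.0 if abs(axis) < DEADZONE else axis


def touch_to_velocity(left_xy, right_xy):
    """Map thumbstick axes to (v_x, v_y, w_z) body velocities (illustrative)."""
    lx, ly = left_xy
    rx, _ = right_xy
    v_x = _shape(ly) * V_MAX    # forward/backward from left stick Y
    v_y = -_shape(lx) * V_MAX   # lateral from left stick X
    w_z = -_shape(rx) * W_MAX   # yaw rate from right stick X
    return v_x, v_y, w_z
```

The resulting (v_x, v_y, w_z) triple is the kind of value set_controls(controls, dt) would forward to the robot through the RobotCommand client.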
- Windows 64 bits
- Spot SDK
- simple-pid
- ZED SDK 3.x and its dependencies (CUDA). Last tested with ZED SDK 3.2.2
- Oculus SDK (1.17 or later)
- GLEW included in the ZED SDK dependencies folder
- SDL
Download the sample and follow the instructions below:
- Create a folder called "build" in the source folder
- Open cmake-gui and select the source and build folders
- Generate the Visual Studio Win64 solution
- Open the resulting solution and change configuration to Release. You may have to modify the path of the dependencies to match your configuration
- Build the solution
- On the robot side (Linux/Jetson), build and run the streaming sender using the method shown HERE.
- On the user side (Windows), run ZED_Stereo_Passthrough.exe in a terminal as follows:
```
./ZED_Streaming_Receiver <ip:port>
```
Once it is executed, the stereo passthrough from the ZED camera to the Oculus starts. Moreover, it will automatically run the main.py script, which provides the control system and the interface with the robot.
For future work, we aim to address the limitations of the current study. These include implementing a more efficient communication system to control the robot from greater distances, and ensuring that the HMD provides a field of view comparable to that of the tablet-based controller.