Unity simulation environment for AGH Space Systems robotics projects.
- Runtime Prerequisites
- Getting Started
- IMU Simulation
- GPS Simulation
- RealSense Cameras
- RealSense Publisher Plugin
- Creating a New Map
## Runtime Prerequisites

- CMake 3.20 or newer
- any C++20-capable compiler
## Getting Started

Clone the repository to your ROS 2 workspace and build it:
> [!NOTE]
> This is only needed for standalone usage. You can skip this step if you use `kalman_robot`, since `unity_sim` is already included there as a submodule.
```bash
git clone git@github.com:agh-space-systems-rover/unity_sim.git src/unity_sim
colcon build --symlink-install
source install/setup.bash
```
Once the workspace is built, find out which Unity version is required by this project:

```bash
ros2 run unity_sim unity_version
```

This command will yield something like `Unity 2023.1.9f1`. (The exact version may differ.)
To set up this version of Unity on your system, you'll need Unity Hub. It can be installed by following the official instructions.
> [!TIP]
> Users on Arch Linux can install `unityhub` from the AUR.
Once Unity Hub is installed, it should prompt you to sign in to your account in order to activate a personal license. After logging in, you can install the required version of Unity. Note that Unity Hub only provides download links for the latest versions, so you'll need to access the Unity Archive to find the specific version you need. From the archive, you can click a button to open Unity Hub and begin the download.
Once the download is complete, it's a good idea to test your installation by running one of the demo projects. You can then close Unity Hub and proceed to run the simulation:
> [!IMPORTANT]
> Make sure that the runtime prerequisites are installed:
>
> ```bash
> # Arch Linux
> sudo pacman -S cmake base-devel
>
> # Ubuntu/Debian
> sudo apt install cmake build-essential
>
> # Fedora
> sudo dnf install cmake gcc-c++
> ```
```bash
ros2 launch unity_sim unity_sim.launch.py
```
On first run, the project will take a while to start, as it needs to download some packages from the web and import all the assets. Future runs will be significantly faster, although Unity may still take a dozen or so seconds to start. It is therefore convenient to launch the simulation with the above command in a separate terminal window, while you restart other nodes independently.
If you wish to upgrade the simulation to a newer version of Unity, please open it using Unity Hub. The project directory is located at `unity_project/unity_sim` and will need to be manually selected in Unity Hub.
## IMU Simulation

The simulation provides a virtual IMU sensor. It is a standalone C# script that can be attached to any GameObject of choice.
The sensor works by comparing the GameObject's position between consecutive physics frames, which allows it to recover velocity. The velocity is then compared between consecutive frames to compute acceleration. Angular values are derived in a similar manner from the object's rotation.
You can configure the IMU's topic and report frequency in the settings of the script component. By default, data is published to `/imu/data`.
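The finite-difference scheme can be pictured with a minimal sketch like the one below. It is illustrative only: the class and field names are made up, and the actual script in this repository is more elaborate (e.g. it handles publishing to ROS).

```csharp
using UnityEngine;

// Minimal sketch of the finite-difference approach described above.
// Names are illustrative; this is not the repository's actual IMU script.
public class ImuSketch : MonoBehaviour
{
    private Vector3 lastPosition;
    private Vector3 lastVelocity;
    private Quaternion lastRotation;

    void Start()
    {
        lastPosition = transform.position;
        lastRotation = transform.rotation;
    }

    void FixedUpdate()
    {
        float dt = Time.fixedDeltaTime;

        // Velocity from consecutive positions, acceleration from consecutive velocities.
        Vector3 velocity = (transform.position - lastPosition) / dt;
        Vector3 acceleration = (velocity - lastVelocity) / dt;

        // Angular velocity from the rotation delta between frames
        // (assumes small per-frame rotations).
        Quaternion delta = transform.rotation * Quaternion.Inverse(lastRotation);
        delta.ToAngleAxis(out float angleDeg, out Vector3 axis);
        Vector3 angularVelocity = axis * (angleDeg * Mathf.Deg2Rad / dt);

        lastPosition = transform.position;
        lastVelocity = velocity;
        lastRotation = transform.rotation;

        // ...package acceleration, angularVelocity and orientation
        // into an IMU message at the configured report frequency...
    }
}
```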
## GPS Simulation

By default, a simulated GPS unit is attached as a prefab to Kalman. It is a GameObject with an attached C# script that searches for GPSBaseStation objects in the scene and uses them as reference points to calculate the GPS position of the unit. The position is published to `/gps/fix`.
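The underlying conversion can be sketched as a flat-earth approximation around a reference station, as below. This is an illustration under assumed conventions (a single station, +Z pointing north), not the repository's actual code, which may combine several stations and account for the world's yaw.

```csharp
using UnityEngine;

// Sketch: derive a lat/lon fix from the unit's offset relative to a single
// GPSBaseStation using an equirectangular (flat-earth) approximation.
// Names and conventions are illustrative.
public static class GpsMathSketch
{
    const double EarthRadius = 6378137.0; // WGS-84 equatorial radius, meters

    public static (double lat, double lon) Fix(
        Vector3 unitPos, Vector3 stationPos,
        double stationLat, double stationLon)
    {
        // Unity's X/Z plane is horizontal; assume +Z points to true north here.
        Vector3 offset = unitPos - stationPos;
        double north = offset.z;
        double east = offset.x;

        double lat = stationLat + (north / EarthRadius) * Mathf.Rad2Deg;
        double lon = stationLon + east / (EarthRadius *
            System.Math.Cos(stationLat * Mathf.Deg2Rad)) * Mathf.Rad2Deg;
        return (lat, lon);
    }
}
```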
You can add an instance of the GPSProbe prefab to your scene and use a button in its custom editor UI to log information about the base stations and the current GPS position of the probe.
Additionally, the probe will log the yaw of the entire Unity world relative to true north, which can be copied into the IMU's Yaw Offset field to ensure that the IMU correctly points north.
GPSProbe can also be used to bulk-find GPS coordinates of waypoints specified as X,Z positions in a text file, formatted like this:
```
W1 -1.4 2.3
W2 0.5 1.2
...
```
This way you can easily get a list of earth-referenced waypoints that can be imported into the Ground Station.
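Each line is simply a waypoint name followed by its X and Z world coordinates, so reading the file reduces to something like this hypothetical snippet (the actual GPSProbe parsing code may differ):

```csharp
using System.Globalization;
using UnityEngine;

// Sketch: parse "NAME X Z" waypoint lines into names and world positions.
public static class WaypointParserSketch
{
    public static (string name, Vector3 pos) ParseLine(string line)
    {
        string[] parts = line.Split(
            new[] { ' ' }, System.StringSplitOptions.RemoveEmptyEntries);
        float x = float.Parse(parts[1], CultureInfo.InvariantCulture);
        float z = float.Parse(parts[2], CultureInfo.InvariantCulture);
        // Y (height) is irrelevant for the GPS conversion, so it is zeroed.
        return (parts[0], new Vector3(x, 0f, z));
    }
}
```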
## RealSense Cameras

In Unity, each RealSense is a prefab consisting of a single GameObject that contains a camera and a control script. The script renders the camera at a set rate and applies a shader that embeds depth information in the alpha channel of the image.

When a frame is available, it is read by the script onto the CPU, and from there, native C++ code sends the unmodified binary buffer over a Unix socket to a `unity_rs_publisher` ROS node.

The node republishes the received data as image messages along with `camera_info`. Notably, the node subscribes to `{camera}/unity_rs_publisher/meta` topics to receive the camera metadata needed to construct the messages.
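The Unity-side half of this pipeline can be pictured roughly as follows. This sketch reads frames back from the GPU and writes them to a Unix socket directly from C#; in the actual project, the socket transfer is handled by the native plugin described below, and the socket path and class names here are made up. (`UnixDomainSocketEndPoint` requires Unity's .NET Standard 2.1 profile.)

```csharp
using System.Net.Sockets;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of the camera-to-socket flow described above. Illustrative only:
// the real implementation forwards the buffer through a native C++ plugin.
public class RsFeedSketch : MonoBehaviour
{
    public RenderTexture target; // camera renders here; depth is packed in alpha
    private Socket socket;

    void Start()
    {
        // Hypothetical socket path; the real one is defined by the plugin.
        socket = new Socket(AddressFamily.Unix, SocketType.Stream, ProtocolType.Unspecified);
        socket.Connect(new UnixDomainSocketEndPoint("/tmp/unity_rs_publisher"));
    }

    void Update() // in practice, throttled to the camera's configured rate
    {
        // Read the rendered frame back to the CPU asynchronously...
        AsyncGPUReadback.Request(target, 0, request =>
        {
            if (request.hasError) return;
            byte[] frame = request.GetData<byte>().ToArray();
            // ...and forward the unmodified binary buffer over the socket.
            socket.Send(frame);
        });
    }
}
```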
## RealSense Publisher Plugin

This Unity native plugin forwards live feeds from simulated RealSense cameras, over Unix sockets, to a ROS 2 publisher node.

If the simulation starts and the plugin is not found at `./unity_project/unity_sim/Assets/Simulation/RealSense/UnityRSPublisherPlugin.so`, it will be compiled automatically. In order to rebuild the plugin, remove the file and restart the simulation.
## Creating a New Map

During the European Rover Challenge, you typically receive a 3D model of the terrain before the competition. You can use this model to create a new terrain in Unity:
- Clean the imported 3D geometry.
- Create a custom shader that colours points based on their height (pass the colour directly to the Surface output; do not use Principled/Diffuse).
- Disable any tonemappers and post-processing effects.
- Render the heightmap using an orthographic camera.
- Save it as a 16-bit grayscale PNG.
- Convert it to RAW using GIMP.
- Create a new scene in Unity and save it to `Assets/Simulation/Scenes/NameOfMyNewScene.unity`.
- Use Window -> Terrain Toolbox in Unity to import the heightmap and create a terrain.
- Move the newly generated `Assets/Terrain` directory to the `Assets/Simulation/Scenes/NameOfMyNewScene` folder.
- Optionally copy over the global post-processing volume and a skybox from another scene.
You will now have the terrain set up in Unity. While you can later paint textures on it, this is not necessary for the simulation.
Moving on, you can add the robot and geographic reference points:
- Instantiate the Kalman prefab in the scene.
- You can now drive on the terrain.
- Instantiate the FollowCamera prefab.
- Configure the FollowCamera component on the FollowCamera object to follow the Kalman object (to adjust the look-at position, add an empty GameObject as a child of Kalman and follow that).
- Now your view follows the robot.
- Add at least 3 GPSBaseStation prefabs. Configure each one with the assumed lat/long of the base station.
- Use GPSProbe to find the world's yaw relative to true north and enter that value as the IMU's Yaw Offset in Kalman -> BaseLink -> IMU.
- Now `/imu/data` and `/gps/fix` messages will be published with the correct data. You should be able to see Kalman's pose on the map in the Ground Station.
To keep the repository size minimal, only the basic ArUco dictionaries are included. If you need to simulate a world with a different type of ArUco tag, you must add a new dictionary to the project:
- Modify generate_aruco_textures.py to include the new dictionary.
- Run the script in its directory to generate new textures in `Assets/Simulation/ArUco/Textures`.
- Select the new textures in Unity and set their filter mode to Point (no filter) and their wrap mode to Clamp.
- If the texture size is not a power of 2, set Advanced -> Non Power of 2 to None.
- Modify ArUcoTag.cs to include the new dictionary.
You should now be able to access the new dictionary in the ArUcoTag prefab.
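If the tag script selects dictionaries via an enumeration, the ArUcoTag.cs change could look like this purely hypothetical sketch (the actual identifiers in ArUcoTag.cs may differ):

```csharp
// Hypothetical sketch; the real enum and entry names in ArUcoTag.cs may differ.
public enum ArUcoDictionary
{
    Dict4x4_50,
    Dict5x5_100,
    Dict7x7_250, // newly added dictionary, matching generate_aruco_textures.py
}
```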