
Underwater Robotics Simulator using Unity3D and ROS

Simulations are a crucial part of the design and deployment process, as they give us an opportunity to test our software and algorithms before deploying them in the physical world. This ensures that the operators are well versed with the system and reduces the risk of deploying the system untested. In this project, we have built an open-source simulator to test the control system for Remotely Operated Vehicles (ROVs).

We have used Unity3D, a game engine with a built-in physics engine, to simulate the environment in which the ROV is deployed, and the Robot Operating System (ROS) to build the control system for it.
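The exact bridge between Unity3D and ROS is not spelled out here. As one possible illustration, the sketch below publishes a velocity command from the Unity side over a rosbridge_suite websocket; the topic name /rov/cmd_vel and the geometry_msgs/Twist message type are assumptions for the example, not details taken from the project.

```csharp
using System;
using System.Globalization;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Minimal sketch of the Unity-to-ROS link over a rosbridge websocket.
// The topic and message type are illustrative, not taken from the project.
public class RosBridgePublisher
{
    private readonly ClientWebSocket socket = new ClientWebSocket();

    public async Task ConnectAsync(string url = "ws://localhost:9090")
    {
        await socket.ConnectAsync(new Uri(url), CancellationToken.None);
        // Advertise the topic once, following the rosbridge v2 protocol.
        await SendJsonAsync("{\"op\":\"advertise\",\"topic\":\"/rov/cmd_vel\",\"type\":\"geometry_msgs/Twist\"}");
    }

    // Publish a velocity command: linear.x drives forward, angular.z yaws.
    public Task PublishTwistAsync(double forward, double yaw)
    {
        string f = forward.ToString(CultureInfo.InvariantCulture);
        string y = yaw.ToString(CultureInfo.InvariantCulture);
        string msg =
            "{\"op\":\"publish\",\"topic\":\"/rov/cmd_vel\",\"msg\":{" +
            "\"linear\":{\"x\":" + f + ",\"y\":0,\"z\":0}," +
            "\"angular\":{\"x\":0,\"y\":0,\"z\":" + y + "}}}";
        return SendJsonAsync(msg);
    }

    private Task SendJsonAsync(string json)
    {
        byte[] bytes = Encoding.UTF8.GetBytes(json);
        return socket.SendAsync(new ArraySegment<byte>(bytes),
                                WebSocketMessageType.Text, true, CancellationToken.None);
    }
}
```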

The Mission Statement

The tasks undertaken by the ROV after deployment are:

  1. Release the ROV from the control station.
  2. Find and follow the pipeline.
  3. Search for the leak in the pipeline and mark its coordinates.
  4. Cross the gate checkpoints on the way back.
  5. Generate a 3D map of the surroundings.

The Scene

The Scene of the simulator consists of hilly terrain filled with water to form a lake. The water contains a pipe with a leak and four gates. The Scene also contains a control station that houses the ROV. The seabed is a separate object used both for texturing purposes and as a reference object for various functions involving depth information. The following images show the Scene of the simulation.

Along with functionality, aesthetics also play a major role in the simulator. We have created the seabed using simplex noise and custom-built shaders to simulate the underwater environment. A distortion effect is added to the camera view to depict the light distortion that occurs when the water is moving, along with underwater fog. Light diffusion effects are also added to simulate how sunlight diffuses through the water and falls on underwater objects.
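As an illustration of the procedural seabed, the sketch below fills a heightmap with noise. It assumes the seabed is a Unity Terrain rather than a custom mesh, and it substitutes Unity's built-in Mathf.PerlinNoise for the simplex noise used in the project, since Unity has no built-in simplex function.

```csharp
using UnityEngine;

// Sketch of procedural seabed generation. The project uses simplex noise;
// this version substitutes Unity's built-in Perlin noise and assumes the
// seabed is a Unity Terrain rather than a custom mesh.
public class SeabedGenerator : MonoBehaviour
{
    public Terrain terrain;          // the seabed terrain to fill
    public float noiseScale = 0.01f; // horizontal frequency of the noise
    public float heightScale = 0.3f; // fraction of the terrain's maximum height

    void Start()
    {
        TerrainData data = terrain.terrainData;
        int res = data.heightmapResolution;
        float[,] heights = new float[res, res];

        for (int y = 0; y < res; y++)
        {
            for (int x = 0; x < res; x++)
            {
                // SetHeights expects values normalized to [0, 1].
                heights[y, x] = Mathf.PerlinNoise(x * noiseScale, y * noiseScale) * heightScale;
            }
        }
        data.SetHeights(0, 0, heights);
    }
}
```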


The Control Station

[Image: Control Station]


The Terrain

[Image: The Terrain]


The Underwater Scene

[Image: The Underwater Scene]


The User Interface

The User Interface (UI) allows the operator to monitor the state of the vehicle and helps them control the ROV. The layout of the UI is shown in the following image:

[Image: UI layout]


Overview of the Simulator

The simulation starts with the ROV attached to the control station, ready for launch. As soon as the user presses 'R', the ROV is released from the station into the water. From here on, the operator controls the ROV in its 6 degrees of freedom to find the pipeline, using the two camera views along with the other sensor data listed in the User Interface section. The degrees of freedom and their controls are as follows:

The ROV is aligned with the pipeline and then follows it to find the leak. When the leak is visible in the downward-facing camera, we consider the leak found, its coordinates are marked, and the ROV then proceeds towards the gates suspended in the water, which further test the controls of the vehicle. Once all four gates have been crossed, we consider the mission a success.
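Only the 'R' key for release is specified here; how the vehicle is attached to the control station is not described. A minimal sketch of the launch step, assuming a hypothetical FixedJoint docking the ROV to the station, could look like this:

```csharp
using UnityEngine;

// Sketch of the launch step: pressing 'R' releases the ROV from the station.
// The FixedJoint used for docking is an assumption for illustration.
public class RovLauncher : MonoBehaviour
{
    public FixedJoint dockingJoint; // hypothetical joint holding the ROV to the station
    public Rigidbody rovBody;       // the ROV's rigidbody

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R) && dockingJoint != null)
        {
            Destroy(dockingJoint);       // detach from the control station
            rovBody.isKinematic = false; // hand the vehicle over to the physics engine
        }
    }
}
```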

The vehicle experiences a buoyant force according to its submerged volume. This is achieved by dividing the vehicle mesh into triangles and determining which of them are fully or partially submerged. An upward force is applied to the individual submerged triangles, and the cumulative effect of those forces is seen as the buoyant force on the body. To balance this force, a constant downward thrust from the vehicle's thrusters is needed to keep the ROV stable at a given depth, which is only possible once the control system is integrated from the ROS side. So, to get the desired behaviour when using Unity3D alone, gravity is switched off.
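A simplified sketch of this idea is shown below. Instead of clipping partially submerged triangles, it treats a triangle as submerged when its centroid is below the water plane, and it applies the standard per-triangle hydrostatic pressure force, which, summed over a closed hull with outward-facing normals, reproduces the Archimedes force. It illustrates the structure of the computation, not the simulator's exact implementation.

```csharp
using UnityEngine;

// Simplified per-triangle buoyancy, assuming a closed hull mesh with
// outward-facing normals. Partially submerged triangles are not clipped here.
[RequireComponent(typeof(Rigidbody), typeof(MeshFilter))]
public class SimpleBuoyancy : MonoBehaviour
{
    public float waterLevel = 0f;      // world-space y of the water surface
    public float waterDensity = 1000f; // kg/m^3, fresh water

    Rigidbody body;
    Vector3[] verts;
    int[] tris;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        verts = mesh.vertices;
        tris = mesh.triangles;
        body.useGravity = false; // as described above, gravity is off when running Unity alone
    }

    void FixedUpdate()
    {
        for (int i = 0; i < tris.Length; i += 3)
        {
            // Triangle vertices in world space.
            Vector3 a = transform.TransformPoint(verts[tris[i]]);
            Vector3 b = transform.TransformPoint(verts[tris[i + 1]]);
            Vector3 c = transform.TransformPoint(verts[tris[i + 2]]);

            Vector3 centroid = (a + b + c) / 3f;
            float depth = waterLevel - centroid.y;
            if (depth <= 0f) continue; // centroid above the surface: skip

            Vector3 cross = Vector3.Cross(b - a, c - a); // length = 2 * area, along the outward normal
            float area = cross.magnitude * 0.5f;
            if (area < 1e-6f) continue;

            // Hydrostatic pressure acts along the inward normal; summing these
            // forces over a closed submerged surface yields the buoyant force.
            float pressure = waterDensity * Physics.gravity.magnitude * depth;
            body.AddForceAtPosition(-pressure * area * cross.normalized, centroid);
        }
    }
}
```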

Blender

All the Prefabs in the simulation were modeled in Blender. Objects are exported from Blender as Filmbox (.fbx) files so they can be imported into Unity3D, and an object's properties are defined after it is imported into Unity. The major difference lies in how shaders built in Blender behave in Unity, so new shader definitions are sometimes required after import.

The Sensors

The sensors deployed on the ROV are as follows (a sketch of one simulated sensor is given after the list):

  • Two cameras: front- and downward-facing.
  • Off-the-Shelf Sonar
  • Inertial Measurement Unit (IMU)
  • Pressure Sensor
  • Temperature Sensor
  • Proximity Sensor
  • Navigation Lights
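How each sensor is simulated is not described here. As one example, a pressure sensor can be derived from the ROV's depth below the water surface using the hydrostatic relation P = P_atm + ρ g h; a minimal sketch:

```csharp
using UnityEngine;

// Illustrative sketch of a simulated hydrostatic pressure sensor: it converts
// the ROV's depth below the water surface into absolute pressure.
public class SimulatedPressureSensor : MonoBehaviour
{
    public float waterLevel = 0f;              // world-space y of the water surface
    public float waterDensity = 1000f;         // kg/m^3
    const float AtmosphericPressure = 101325f; // Pa at the surface

    // Absolute pressure (in pascals) at the sensor's current position.
    public float ReadPressure()
    {
        float depth = Mathf.Max(0f, waterLevel - transform.position.y);
        return AtmosphericPressure + waterDensity * Physics.gravity.magnitude * depth;
    }
}
```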