TUM LSR Technik Autonomer Systeme WS15/16
Group 6 (Laurenz Altenmüller, Frederik Ebert, Quirin Körner, Konrad Vowinckel)
https://www.lsr.ei.tum.de/en/courses/vorlesungen/wintersemester/master/technik-autonomer-systeme/
See installation instructions.
roslaunch tas_simulator startSimulation.launch
roslaunch tas_simulator startNavigation.launch
roslaunch tas hardware.launch
roslaunch tas odom.launch
roslaunch tas move_base.launch
More information on the ROS network setup can be found on the ROS Wiki.
Prerequisites: Append the ROS_HOSTNAME line to both the car's and your client machine's .bashrc:
export ROS_HOSTNAME=$(hostname).local
- if you are using ROS in a virtual machine, make sure it is set to bridged network mode.
- ssh into the car to launch your files: `ssh <username>@vettel.local` (or create an SSH config file so you can just run `ssh vettel` and skip the password prompt; a sample config entry is shown after this list)
- on your laptop, start a terminal and connect to the car via Ethernet or Wi-Fi
- set the ROS master location:
export ROS_MASTER_URI=http://vettel.local:11311
- start your local nodes, e.g. `rviz` or `rosrun image_view image_view image:=/px4flow/camera_image`
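For the SSH shortcut mentioned above, a possible `~/.ssh/config` entry is shown below; the username is a placeholder for whatever account exists on the car, and combining it with `ssh-copy-id` also removes the password prompt:

```
Host vettel
    HostName vettel.local
    User <username>
```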
Laurenz: Odometry
Konrad: Trajectory Rollout
Frederik: SBPL, [LQR Controller](#lqr-controller)
Quirin: Parking, Rotated Template Matching
The enhanced odometry stack uses the following external nodes:
- [robot_localization ekf_localization_node](http://wiki.ros.org/robot_localization#ekf_localization_node) as the extended Kalman filter
- laserscan_multi_merger for merging front and back laser scans
- laser_scan_matcher for laser-based odometry estimation
- Custom px4flow_node fork for communication with visual odometry sensor
- mtnode.py from the xsens_driver package for IMU communication
- rosserial_arduino and rosserial_python for communication with the motor encoder microcontroller
The tas_odometry package contains the following nodes created by us:
- perfect_odometry, which analyzes Gazebo link states to provide "perfect" odometry during simulation
- imu_bias_compensation, which tries to compensate for constant acceleration offsets of the IMU (used when the EKF's gravity compensation is not enabled)
- optflow_odometry, which converts data from the PX4Flow sensor into usable odometry (twist) messages. It derives the covariance values for the EKF from the sensor's image quality data.
- motor_odometry, which generates odometry (twist) messages from the encoder inter-tick times reported by the encoder microcontroller (a minimal sketch of this conversion follows the list)
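The following is only an illustrative sketch of such an inter-tick-time-to-twist conversion, not the actual motor_odometry node; the topic names, tick resolution, wheel circumference and covariance value are made-up example values:

```
// Sketch: convert encoder inter-tick times into forward-velocity twist messages.
// Topic names and encoder constants are assumptions for illustration only.
#include <ros/ros.h>
#include <std_msgs/Float32.h>
#include <geometry_msgs/TwistWithCovarianceStamped.h>

static ros::Publisher twist_pub;

// Assumed encoder geometry: ticks per wheel revolution and wheel circumference [m].
static const double TICKS_PER_REV = 12.0;
static const double WHEEL_CIRCUMFERENCE = 0.33;

// Callback for the inter-tick time (seconds) reported by the encoder MCU.
void tickTimeCallback(const std_msgs::Float32::ConstPtr& msg)
{
  geometry_msgs::TwistWithCovarianceStamped twist;
  twist.header.stamp = ros::Time::now();
  twist.header.frame_id = "base_link";

  // Distance per tick divided by time between ticks gives the forward speed.
  const double dist_per_tick = WHEEL_CIRCUMFERENCE / TICKS_PER_REV;
  if (msg->data > 1e-6)
    twist.twist.twist.linear.x = dist_per_tick / msg->data;

  // Constant variance on v_x as a placeholder; the real node can scale this.
  twist.twist.covariance[0] = 0.01;

  twist_pub.publish(twist);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "motor_odometry_sketch");
  ros::NodeHandle nh;
  twist_pub = nh.advertise<geometry_msgs::TwistWithCovarianceStamped>("motor_odometry/twist", 10);
  ros::Subscriber sub = nh.subscribe("motor_ticks", 10, tickTimeCallback);
  ros::spin();
  return 0;
}
```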
Code specific to motor encoder processing:
- MotorOdometry: Code running on the ATmega32U4 MCU. Communicates with ROS over native USB using the rosserial protocol.
- MotorTest: Test sketch for the MCU. Prints detected encoder revolutions via raw serial.
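On the MCU side, a rosserial_arduino sketch for reporting encoder inter-tick times could look roughly like the example below; the pin number and topic name are example values and not taken from the MotorOdometry code:

```
// Rough rosserial_arduino sketch: measure the time between encoder ticks in an
// interrupt and publish it to ROS. Pin and topic name are example values.
#include <ros.h>
#include <std_msgs/Float32.h>

ros::NodeHandle nh;
std_msgs::Float32 tick_time_msg;
ros::Publisher tick_pub("motor_ticks", &tick_time_msg);

volatile unsigned long last_tick_us = 0;
volatile unsigned long tick_interval_us = 0;
volatile bool new_tick = false;

// Interrupt service routine: store the time since the previous encoder edge.
void onEncoderTick()
{
  unsigned long now = micros();
  tick_interval_us = now - last_tick_us;
  last_tick_us = now;
  new_tick = true;
}

void setup()
{
  nh.initNode();
  nh.advertise(tick_pub);
  pinMode(7, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(7), onEncoderTick, RISING);
}

void loop()
{
  if (new_tick) {
    new_tick = false;
    tick_time_msg.data = tick_interval_us * 1e-6f;  // seconds between ticks
    tick_pub.publish(&tick_time_msg);
  }
  nh.spinOnce();
}
```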
For preparing the algorithm and for visualization, the Bézier curve was implemented in Matlab (a short standalone sketch of the curve evaluation follows the list below). To execute the Matlab functions:
- addpath( [matlab_code](/Matlab%20Code%20Konrad/) )
- read in the costmap ([example](/Matlab%20Code%20Konrad/Readme.md))
- alternatePath(Costmap, hight, width, resolution, minRadius)
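For reference, a cubic Bézier point of the kind used for the alternate paths can be evaluated as in the short C++ sketch below; this only illustrates the curve formula and is not a port of the Matlab code, and the control points are arbitrary example values:

```
// Evaluate a cubic Bézier curve B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3.
#include <cstdio>

struct Point { double x, y; };

Point cubicBezier(const Point& p0, const Point& p1,
                  const Point& p2, const Point& p3, double t)
{
  const double u = 1.0 - t;
  const double b0 = u * u * u;
  const double b1 = 3.0 * u * u * t;
  const double b2 = 3.0 * u * t * t;
  const double b3 = t * t * t;
  return { b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
           b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y };
}

int main()
{
  // Sample the curve between a start point, two control points and a goal point.
  Point p0{0, 0}, p1{1, 0}, p2{2, 1}, p3{3, 1};
  for (double t = 0.0; t <= 1.0; t += 0.1) {
    Point p = cubicBezier(p0, p1, p2, p3, t);
    std::printf("t=%.1f -> (%.2f, %.2f)\n", t, p.x, p.y);
  }
  return 0;
}
```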
The ownLocalPlanner package subscribes to the following topics:
- Global plan: subscribes to the global path which is published by move_base
- Local costmap: subscribes to the local costmap which is published by move_base. The current position of the car is at the center of this costmap.
- TF: a TransformListener reads the transform between "/map" and "/base_link" to get the global position of the car
rosrun ownlocalplanner ownlocalplanner
The global path, with the own local path substituted for the first 3 meters, is published as "/ownPath". Like the global path, it is a nav_msgs/Path message. The path can be visualized in RViz and can be used by the controller; to do so, change the subscription topic from "/move_base_node/TrajectoryPlannerROS/global_plan" to "/ownPath".
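Structurally, the node looks roughly like the sketch below. The global-plan topic, the published "/ownPath" topic and the 3 m cut-off are taken from the description above; the local costmap topic name is an assumption, and the actual local replanning logic is omitted:

```
// Structural sketch of an ownLocalPlanner-style node: subscribe to the global
// plan and local costmap, look up the car pose via tf, publish a path on /ownPath.
#include <cmath>
#include <ros/ros.h>
#include <nav_msgs/Path.h>
#include <nav_msgs/OccupancyGrid.h>
#include <tf/transform_listener.h>

static ros::Publisher own_path_pub;
static nav_msgs::OccupancyGrid latest_costmap;

void costmapCallback(const nav_msgs::OccupancyGrid::ConstPtr& msg)
{
  latest_costmap = *msg;  // the car is at the center of this local costmap
}

void globalPlanCallback(const nav_msgs::Path::ConstPtr& plan)
{
  nav_msgs::Path own_path;
  own_path.header = plan->header;

  double dist = 0.0;
  for (size_t i = 0; i < plan->poses.size(); ++i) {
    if (i > 0) {
      const double dx = plan->poses[i].pose.position.x - plan->poses[i - 1].pose.position.x;
      const double dy = plan->poses[i].pose.position.y - plan->poses[i - 1].pose.position.y;
      dist += std::sqrt(dx * dx + dy * dy);
    }
    if (dist < 3.0) {
      // Within the first 3 m the real node substitutes its own locally planned poses.
    }
    own_path.poses.push_back(plan->poses[i]);
  }
  own_path_pub.publish(own_path);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "ownlocalplanner_sketch");
  ros::NodeHandle nh;
  tf::TransformListener tf_listener;

  own_path_pub = nh.advertise<nav_msgs::Path>("/ownPath", 1);
  ros::Subscriber plan_sub =
      nh.subscribe("/move_base_node/TrajectoryPlannerROS/global_plan", 1, globalPlanCallback);
  ros::Subscriber costmap_sub =
      nh.subscribe("/move_base_node/local_costmap/costmap", 1, costmapCallback);  // assumed topic

  ros::Rate rate(10);
  while (ros::ok()) {
    // Global position of the car from the /map -> /base_link transform.
    tf::StampedTransform transform;
    try {
      tf_listener.lookupTransform("/map", "/base_link", ros::Time(0), transform);
    } catch (tf::TransformException& ex) {
      ROS_WARN_THROTTLE(5.0, "%s", ex.what());
    }
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```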
SBPL is installed from the repository:
git@github.com:MASKOR/maskor_navigation.git
The controller is launched with:
roslaunch tas_autonomous_control controller.launch
There are two nodes for the parking process.
- The "findpark"-node detects parking spots with template matching or feature extraction.
- The "parking"-node runs the actual parking procedure.
Before starting the parking spot detection, you have to acquire a map that includes at least one parking spot. After that, run:
roslaunch findpark findpark.launch
Once the car has reached the start position, run the "parking" node.
Note: For showcasing without the car, findpark.cpp is currently in test mode, which means that testMap.pgm is loaded instead of the real map. If needed, this can be changed in the code via the TEST flag.
For more detailed information on how the findpark node works, see the readme: /findpark/readme
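As a rough illustration of rotated template matching on a map image, an OpenCV-based sketch could look like the following; the file names and the angle step are example values and this is not the actual findpark implementation:

```
// Rotated template matching sketch: rotate the parking-spot template in small
// steps and keep the best normalized cross-correlation match on the map image.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main()
{
  cv::Mat map = cv::imread("testMap.pgm", cv::IMREAD_GRAYSCALE);
  cv::Mat templ = cv::imread("parkingTemplate.pgm", cv::IMREAD_GRAYSCALE);
  if (map.empty() || templ.empty()) return 1;

  double best_score = -1.0;
  double best_angle = 0.0;
  cv::Point best_loc;

  for (double angle = 0.0; angle < 360.0; angle += 5.0) {
    // Rotate the template around its center.
    cv::Point2f center(templ.cols / 2.0f, templ.rows / 2.0f);
    cv::Mat rot = cv::getRotationMatrix2D(center, angle, 1.0);
    cv::Mat rotated;
    cv::warpAffine(templ, rotated, rot, templ.size());

    // Match the rotated template against the map.
    cv::Mat result;
    cv::matchTemplate(map, rotated, result, cv::TM_CCOEFF_NORMED);

    double min_val, max_val;
    cv::Point min_loc, max_loc;
    cv::minMaxLoc(result, &min_val, &max_val, &min_loc, &max_loc);

    if (max_val > best_score) {
      best_score = max_val;
      best_angle = angle;
      best_loc = max_loc;
    }
  }

  std::printf("best match at (%d, %d), angle %.0f deg, score %.2f\n",
              best_loc.x, best_loc.y, best_angle, best_score);
  return 0;
}
```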
Once the car is at the right starting position (calculated by "findpark" or indicated by the marked spots on the floor), start:
rosrun parking parking
The parking node uses the raw data from the topics
- /scan_back
- /scan
These messages contain the laser range readings. The movement commands are published directly to the topic
- /servo
For more information on how the parking node works, see the readme: /parking/readme
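A bare-bones sketch of this interface is shown below; the /servo message type and command layout (x = throttle, y = steering) are assumptions for the example and should be checked against the actual message definition in the tas package, and the real parking maneuver logic is omitted:

```
// Bare-bones sketch of the parking node's interface: read the front and back
// laser scans and publish drive commands on /servo. The geometry_msgs/Vector3
// command layout used here is an assumption for illustration only.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <geometry_msgs/Vector3.h>

static ros::Publisher servo_pub;

// Closest obstacle behind the car, taken from the raw back scan.
void scanBackCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
  float min_range = scan->range_max;
  for (size_t i = 0; i < scan->ranges.size(); ++i) {
    const float r = scan->ranges[i];
    if (r > scan->range_min && r < min_range) min_range = r;
  }

  // Placeholder maneuver logic: reverse slowly until something is close behind.
  geometry_msgs::Vector3 cmd;
  cmd.x = (min_range > 0.4f) ? -0.2 : 0.0;  // assumed throttle command
  cmd.y = 0.0;                              // assumed steering command
  servo_pub.publish(cmd);
}

void scanFrontCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
  // The real node also evaluates the front scan while maneuvering.
  (void)scan;
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "parking_sketch");
  ros::NodeHandle nh;
  servo_pub = nh.advertise<geometry_msgs::Vector3>("/servo", 1);
  ros::Subscriber back_sub = nh.subscribe("/scan_back", 1, scanBackCallback);
  ros::Subscriber front_sub = nh.subscribe("/scan", 1, scanFrontCallback);
  ros::spin();
  return 0;
}
```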