Week #3-4 (12-03 & 19-03) | Understanding developed code, thesis roadmap presentation and first development steps.

The third week of work was primarily spent on understanding the code previously created by former students who worked on the ATLASCAR project, namely:

  • Visual perception unit, created by Diogo Correia as part of his master's thesis, more specifically the code that interfaces with the sensors, handles their calibration and positions their respective coordinate frames.
  • Inclination module, developed by Armindo Silva, to understand how it calculates the vehicle orientation components (roll and pitch).
  • Dynamic map reconstruction, created by Pedro Salvado in his master's thesis, which received the most thorough analysis, in order to understand and replicate how the local map reconstruction was achieved.
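As a side note on the inclination module, roll and pitch can be estimated from a static accelerometer reading by comparing the measured gravity vector with the sensor axes. The sketch below is only an illustration of that kind of computation, not Armindo Silva's actual code, and the axis conventions are an assumption:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a static accelerometer
    reading (ax, ay, az), in g units. Assumes z points up when level."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch
```

With the car level (ax = ay = 0, az = 1 g) both angles come out as zero; tilting the sensor sideways changes the roll, tilting it forward changes the pitch.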

This week also included a short presentation for the LAR members and advisers, in which we presented a rough task roadmap for the master's thesis, followed by a discussion with the advisers.

Code development began in the fourth week.

The Arduino code created by Armindo Silva was rebuilt as a ROS node that receives the sensor data over RS232 and publishes the orientation calculated from it.
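The core of such a node is turning each raw serial line into orientation values before publishing them. A minimal sketch of that parsing step, assuming a hypothetical comma-separated "roll,pitch" message format (the real module's protocol may differ), could look like this:

```python
def parse_orientation_line(line):
    """Parse one RS232 line of the form 'roll,pitch' (degrees).
    The comma-separated format is an assumption for illustration.
    Returns (roll, pitch) as floats, or None for a malformed/partial read."""
    try:
        roll_str, pitch_str = line.strip().split(",")
        return float(roll_str), float(pitch_str)
    except ValueError:
        return None
```

In the actual node this function would sit in the read loop (e.g. reading the serial port and publishing each parsed pair on a ROS topic); guarding against malformed lines matters because serial reads can return partial messages.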

In this first approach it was assumed that the road is always flat. This simplification allows us to ignore the spatial positioning of the vehicle in relation to the world frame, since calculating it would require IMU and odometry data that are not available at the moment. The efforts were therefore focused on the relation between the car and the road, leaving open the possibility of defining the relation between the road and the world in the future.

To do that, two more coordinate frames were added to the car “robot” model: one representing the ground (static) and one representing the centre car axis (whose transformation matrix is calculated from the data sent by the orientation module).
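The centre car axis transform can be built directly from the roll and pitch published by the orientation module. The sketch below shows one way to do it; the rotation order Ry(pitch)·Rx(roll) and the zero translation (flat-road assumption) are illustrative choices, not necessarily what the real model uses:

```python
import math

def car_axis_transform(roll, pitch):
    """4x4 homogeneous transform of the centre car axis relative to the
    static ground frame, from roll (about x) and pitch (about y).
    Rotation composed as Ry(pitch) @ Rx(roll); translation is zero
    under the flat-road simplification."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    return [
        [cp,   sp * sr, sp * cr, 0.0],
        [0.0,  cr,      -sr,     0.0],
        [-sp,  cp * sr, cp * cr, 0.0],
        [0.0,  0.0,     0.0,     1.0],
    ]
```

With zero roll and pitch this reduces to the identity, i.e. the car axis coincides with the ground frame.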

Following that, all the sensor frames were also changed to depend on the centre car axis (which moves according to the rotation of the vehicle), so that the measurements are expressed relative to a moving frame.
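Reparenting a sensor frame onto the moving car axis amounts to chaining two transforms: the moving ground→car-axis transform and the static car-axis→sensor mounting transform. A self-contained sketch with hypothetical values (a 5° pitch and a sensor mounted 1.0 m forward of the car axis) illustrates the composition:

```python
import math

def mat_mult(a, b):
    """Multiply two 4x4 homogeneous transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

pitch = math.radians(5.0)  # hypothetical car pitch
cp, sp = math.cos(pitch), math.sin(pitch)

# Moving frame: centre car axis pitched about y relative to the ground.
ground_T_car = [
    [cp,  0.0, sp,  0.0],
    [0.0, 1.0, 0.0, 0.0],
    [-sp, 0.0, cp,  0.0],
    [0.0, 0.0, 0.0, 1.0],
]

# Static mounting: sensor 1.0 m forward of the car axis (hypothetical).
car_T_sensor = [
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

ground_T_sensor = mat_mult(ground_T_car, car_T_sensor)
```

Because the static mounting transform is multiplied on the right, the sensor's ground-frame position changes automatically whenever the car axis rotates; this is exactly what tf does for a tree of parent/child frames in ROS.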

A ROS bag with the orientation information was created to test the system, and the results were quite good.


View of the car and sensor frames (with static STL model)

View of the car and sensor frames (without STL model)


As we can see, all the sensor frames move together with the centre car axis.

View of the car and all sensor frames motion (static STL model)

Lastly, a point cloud dataset was also acquired from the Sick LD-MRS sensor in order to verify its movement in relation to the movement of the car, but a small problem with the program led to some unexpected frame behaviour caused by the use of rosbag, which still needs to be corrected.

LIDAR sensor readings and representation in reference to the respective moving frame (/ld-mrs)

A GitHub repository was also created, where the developed code will be regularly updated: https://github.com/TiagoSMarques/Masters-Thesis_ATLASCAR2



Week #1-2 (26-02 & 5-03) | ROS Workshop, project research and preliminary report writing and discussion.

The last couple of weeks were spent mostly on the research and writing for the preliminary report. The research focused on gathering information about the state of the ATLASCAR2 project and the hardware and software solutions already implemented, especially the hardware to be used, the Sick LD-MRS, as well as on studying the thesis problem and the work needed to solve it. Naturally there were some oversights and a little confusion regarding the intended work and the main guidelines of the project, but after the report discussion with my adviser, professor Vítor Santos, those misconceptions were corrected and complemented, and my understanding of the overall task is now much clearer.

This week there was also a workshop on ROS (Robot Operating System) for the students working at LAR, given by professor Miguel Oliveira, in which we acquired the basic knowledge to work with the ROS environment, as well as some of the tools this system provides that will be very useful in our projects.

Throughout the workshop all the members developed players for a game in the ROS environment, as a way of learning how this system works and what its capabilities are. Here’s a video showcasing the final game that was played, in which each moving arrow represents a player; the objective is to hunt one team without being killed by the other, and the scoring is determined by the difference between kills and deaths for each player in a team.

In the end we gathered a good knowledge base of ROS and its tools: driver packages for the main sensors to be used, easy data flow between the nodes of a program, and tools to handle several data types such as point clouds and laser scans. These ready-made solutions allow us to focus more on solving the problems and less on the hardware and the reading of its data, and they are the main reason for choosing this environment.

Lastly, by the end of week #2 my colleague Ricardo Silva and I, with the valuable help of another colleague, Nuno Silva, were able to use the previous work done by Diogo Correia to communicate with and retrieve data from the sensors in ATLASCAR2, visualizing them in ROS's RViz tool.

In the following week(s) I will focus my efforts on the first step of the project, the creation of a ROS node for the integration of the vehicle rotation measurement module, as well as continue learning ROS syntax and take an online C++ course from “http://www.learncpp.com/cpp-tutorial” that will be useful throughout the project.