This repository was archived by the owner on Feb 27, 2025. It is now read-only.
# Execute ROS2 migration testing plan #387

Status: Closed. Labels: systems (related to building and automation — docker/github).

## Description
## Background
Before merging #350, we need to create and execute a thorough test plan to make sure all of our systems work in ROS2 as well as they do in ROS. All tests in the testing plan should be run both on a local setup and on the robot computer unless otherwise stated.
## Important Notes

This issue is missing testing plans for `offboard_comms`, `task_planning`, `execute`, and `gui`. Once these packages are migrated, they should be added to the testing plan below.
## Running ROS2 docker images

To run the updated docker images:

1. Comment `/build-artifacts` under Migrate to ROS2 #373. This triggers the build-artifacts workflow, which builds the docker images and makes them available as artifacts that can be downloaded from GitHub. Alternatively, the images can be built locally, but this is generally slower.
2. Navigate to Actions > build-artifacts and select the workflow run corresponding to your comment. The artifacts are listed at the bottom of the window. Download `onboard` and `landside`.
3. Navigate to the downloaded files on your machine. Note that the onboard artifact contains two images; we only need `amd64-onboard.tar.gz`.
4. Load the images with `docker load -i amd64-onboard.tar.gz` and `docker load -i amd64-landside.tar`. This creates images called `dukerobotics/robosub-ros:onboard-amd64` and `dukerobotics/robosub-ros:landside`.
5. Re-tag the images with the following sequence of commands:
   ```
   docker tag dukerobotics/robosub-ros:onboard-amd64 dukerobotics/robosub-ros2:onboard
   docker image rm dukerobotics/robosub-ros:onboard-amd64
   docker tag dukerobotics/robosub-ros:landside dukerobotics/robosub-ros2:landside
   docker image rm dukerobotics/robosub-ros:landside
   ```
   Note that if you had a `dukerobotics/robosub-ros:landside` image before, it will be untagged after step 4. If you want to retag it, use `docker image ls` to find its image ID, and run `docker tag <image_id> dukerobotics/robosub-ros:landside` after step 5. Doing this keeps both the original ROS1 and the new ROS2 images available on your machine, so you can swap between the two easily.
6. Change the image names in `docker-compose.yaml` so that the ROS2 images are referenced:
   ```yaml
   services:
     onboard:
       image: dukerobotics/robosub-ros2:onboard
     ...
     landside:
       image: dukerobotics/robosub-ros2:landside
   ```
7. Now, you can run `docker compose up -d` or `docker run` as normal to start the ROS2 containers!
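Steps 3–5 above can be sketched as a single script. This is a convenience wrapper, not part of the repository; it assumes the artifacts were downloaded into the current directory. `DRY_RUN=1` (the default here) prints each command instead of executing it, so you can review before running for real with `DRY_RUN=0`.

```shell
#!/usr/bin/env sh
# Load and re-tag the ROS2 images downloaded from the build-artifacts workflow.
# DRY_RUN=1 (default) prints commands; DRY_RUN=0 executes them.
DRY_RUN="${DRY_RUN:-1}"
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run docker load -i amd64-onboard.tar.gz
run docker load -i amd64-landside.tar
run docker tag dukerobotics/robosub-ros:onboard-amd64 dukerobotics/robosub-ros2:onboard
run docker image rm dukerobotics/robosub-ros:onboard-amd64
run docker tag dukerobotics/robosub-ros:landside dukerobotics/robosub-ros2:landside
run docker image rm dukerobotics/robosub-ros:landside
```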
## Running ROS2 code

1. Once inside the docker containers, run `./build.sh` to build the corresponding ROS2 workspaces.
2. Run `source ${COMPUTER_TYPE}/ros2_ws/install/setup.bash` to make our custom packages recognizable.
3. To run nodes: `ros2 run <package> <node> --ros-args -p <param_1_name>:=<param_1_value> -p <param_2_name>:=<param_2_value>`. Don't use a `.py` extension when calling the node.
4. To run launch files: `ros2 launch <package> <launch_file>.launch.py <arg_1_name>:=<arg_1_value>`. Note that the `.launch.py` extension is required; otherwise, your launch file won't be recognized.
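The steps above can be strung together into one session. The package, node, launch file, and parameter names below (`example_pkg`, `example_node`, `example.launch.py`, `rate`) are placeholders for illustration only — substitute the real names from this repository. The guard lets the snippet degrade gracefully on a machine without ROS2.

```shell
# Hypothetical build-source-run session; names are placeholders.
if command -v ros2 >/dev/null 2>&1; then
  ./build.sh                                          # build the ROS2 workspace
  . "${COMPUTER_TYPE}/ros2_ws/install/setup.bash"     # overlay our packages
  ros2 run example_pkg example_node --ros-args -p rate:=10
  ros2 launch example_pkg example.launch.py sim:=true
else
  echo "ros2 not on PATH; session shown for reference only"
fi
DEMO_DONE=1
```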
## Testing Plan
This testing plan encompasses all of the general functionality we need before we can safely migrate to ROS2 (along with our CI checks).
### Landside
#### camera_view

- Record a bag from the cameras using the ROS2 bag CLI. Add usage to the README.
- Run `bag_to_video` to convert the bag to an AVI file, and verify that the feed looks correct.
- Run `video_to_bag` to convert the AVI file back to a bag file. Note that the size should increase drastically due to frame padding; this is expected (use a small original bag file).
- View stereo and mono camera feeds from landside while `avt_camera` is publishing. Update the README to include the new commands for viewing feeds.
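For the first item, one plausible use of the ROS2 bag CLI is sketched below. The bag name `camera_test` is an example, and the topic is the one named later in this plan; both are assumptions, not repository conventions.

```shell
# Record, inspect, and replay a camera bag with the ROS2 bag CLI.
if command -v ros2 >/dev/null 2>&1; then
  ros2 bag record -o camera_test /camera/left/image_raw   # Ctrl-C to stop recording
  ros2 bag info camera_test                               # sanity-check message counts/topics
  ros2 bag play camera_test                               # replay, e.g. as input to bag_to_video
else
  echo "ros2 not on PATH; commands shown for reference only"
fi
DEMO_DONE=1
```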
#### joystick

- Connect the F310 joystick to the robot computer and run `F310.launch.py`. Verify that `joystick/raw` and `controls/desired_power` are receiving messages.
- Connect the Thrustmaster joystick to the robot computer and run `thrustmaster.launch.py`. Verify that `joystick/raw` and `controls/desired_power` are receiving messages.
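A quick way to verify that the two topics are receiving messages is with the `ros2 topic` CLI, run inside the container while the launch file is up. Leading slashes assume the topics live in the root namespace; each command runs until interrupted with Ctrl-C.

```shell
# Spot-check that the joystick topics are alive.
if command -v ros2 >/dev/null 2>&1; then
  ros2 topic hz /joystick/raw              # reports the measured publish rate
  ros2 topic echo /controls/desired_power  # prints incoming messages
else
  echo "ros2 not on PATH; commands shown for reference only"
fi
DEMO_DONE=1
```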
#### simulation

- Run `test_sim_comm.launch.py` in an empty scene and verify that there are no errors (the robot doesn't need to move in a square).
- Run `test_sim_comm.launch.py` in a scene with an object (e.g. a gate) and verify that `fake_cv_maker.py` works correctly.
### Onboard
#### acoustics

- Run `acoustics.launch.py` in simulation and use the ROS2 action CLI to generate some sample data. Verify that the results are as expected.
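The ROS2 action CLI step might look like the sketch below. The action name, type, and goal fields here are placeholders (the real ones are defined by the acoustics package); read the actual values off `ros2 action list -t` before sending a goal.

```shell
# Discover the acoustics action and send a sample goal (placeholder names).
if command -v ros2 >/dev/null 2>&1; then
  ros2 action list -t                      # shows each action with its type
  ros2 action send_goal /acoustics_action example_pkg/action/ExampleAction '{field: value}'
else
  echo "ros2 not on PATH; commands shown for reference only"
fi
DEMO_DONE=1
```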
#### avt_camera

- Connect the left and right Allied Vision cameras to the robot computer and run `mono_camera` on both. Verify that the cameras connect and that the corresponding topics (`camera/left/image_raw` and `camera/left/camera_info`) are being published to.
- Run `stereo_cameras.launch.py` and verify that all of the corresponding topics are being published.
#### controls

- Run `controls.launch.py transform:=true` to verify that controls can be tested without simulation.
- Run `controls.launch.py sim:=true` while the simulation is running. Then run `test_state_publisher` with pose, velocity, and power control and verify that there are no errors (movement will probably be pretty bad).
- Run `controls.launch.py` on the robot computer while `state.launch.py` is running. Verify that the robot moves.
#### cv

- Move a test model to the `models` folder and rebuild the package.
- Run `cv.launch.py` and `ros2 run cv test_images` and verify that the expected topics are published to without error. Note that `test_images` is currently configured to send images to the left camera topic.
#### data_pub

- Run `pub_dvl.launch.py` and verify that the computer connects to the DVL. Verify that the `dvl/raw` and `dvl/odom` topics are being published to. Make sure the data is reasonable and the publishing rate is adequate.
- Run `pub_imu.launch.py` and verify that the computer connects to the IMU. Verify that the `sensors/imu/imu` and `sensors/imu/mag` topics are being published to. Make sure the data is reasonable and the publishing rate is adequate.
- Run `pub_depth.launch.py` and verify that `offboard/pressure` is receiving values and `sensors/depth` is being published to. Make sure the data is reasonable and the publishing rate is adequate. This requires `offboard_comms` to be running to receive pressure sensor data from the Arduino.
- Run `pub_all.launch.py` to make sure all of the sensors work together.
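The "publishing rate is adequate" checks above can be done with `ros2 topic hz`, comparing the reported rate against the sensor's expected output rate. Leading slashes assume root-namespace topics; each command runs until Ctrl-C, so check them one at a time.

```shell
# Measure publish rates for the sensor topics named above.
if command -v ros2 >/dev/null 2>&1; then
  ros2 topic hz /dvl/raw          # compare against the DVL's expected rate
  ros2 topic hz /sensors/imu/imu  # compare against the IMU's expected rate
  ros2 topic hz /sensors/depth    # compare against the pressure sensor's expected rate
else
  echo "ros2 not on PATH; commands shown for reference only"
fi
DEMO_DONE=1
```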
#### sensor_fusion

- Run `fuse.launch.py` while the DVL and IMU are publishing. Verify that `/state` is published and has reasonable values.
- Verify that we don't need to publish `robot_description` to get tf2 transforms. Our old documentation says that this is needed, but I don't think this is the case anymore.
#### static_transforms

- Run `static_transforms.launch.py` and verify that the correct transform values are being published.
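One way to verify the published transform values is the standard `tf2_echo` tool from `tf2_ros`. The frame names below (`base_link`, `dvl_link`) are illustrative assumptions — use the frames actually published by `static_transforms`.

```shell
# Print the transform between two frames (placeholder frame names).
if command -v ros2 >/dev/null 2>&1; then
  ros2 run tf2_ros tf2_echo base_link dvl_link   # prints translation/rotation until Ctrl-C
else
  echo "ros2 not on PATH; command shown for reference only"
fi
DEMO_DONE=1
```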
#### system_utils

- Run `system_info` and verify that the correct system usage messages are published.
- Run `remote_launch` and verify that the `start_node` and `stop_node` services are created. Use the `ros2 service` CLI to start and stop a test node and a test launch file.
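The service check can start with discovery, since the exact service paths and request types are defined by the package. The `/start_node` path below is an assumption; read the real path and type from the list output before using `ros2 service call`.

```shell
# Confirm the remote_launch services exist and look up their types.
if command -v ros2 >/dev/null 2>&1; then
  ros2 service list -t | grep -E 'start_node|stop_node'   # services with their types
  ros2 service type /start_node                           # type to pass to `ros2 service call`
else
  echo "ros2 not on PATH; commands shown for reference only"
fi
DEMO_DONE=1
```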