Running integration tests with mock hardware #1226
Base branch: main
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff             @@
##             main    #1226       +/-   ##
==========================================
+ Coverage    3.59%   19.10%   +15.51%
==========================================
  Files          13       33       +20
  Lines         947     3428     +2481
  Branches      152      414      +262
==========================================
+ Hits           34      655      +621
- Misses        843     2739     +1896
+ Partials       70       34       -36
Force-pushed from 1c6e90d to 44cfe0b
Force-pushed from 0d259f8 to 1727bd1
Force-pushed from db57a95 to 72ac7ce
I think it is a good idea to split up the robot_driver test into individual tests.
However, I think it would be beneficial to be able to specify mock hardware outside of the test parametrization. That way, the mock hardware tests could also run when the integration-tests flag is not set, which would effectively allow running some tests on the buildfarm as well.
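As a rough sketch of that suggestion (not code from this PR): the mock-hardware tests could live in their own launch-test file that always enables mock hardware, so only tf_prefix is parametrized and the file does not need to be gated behind the integration-tests flag. Here, generate_mock_hardware_test_description is assumed to be importable from test_common, as in the diff below.

import pytest

import launch_testing

from test_common import generate_mock_hardware_test_description  # assumed location


@pytest.mark.launch_test
@launch_testing.parametrize("tf_prefix", ["", "my_ur_"])
def generate_test_description(tf_prefix):
    # Mock hardware is implied by this file, so no use_mock_hardware argument is
    # needed and the test can be registered even without the integration-tests flag.
    return generate_mock_hardware_test_description(tf_prefix=tf_prefix)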
…)" This reverts commit 341506a.
)" This reverts commit 9959221.
Added some launch arguments and some checks to avoid connecting to the dashboard interface when using mock hardware, as that will not work. Maybe not the most elegant solution, but it works for now. Currently the passthrough controller does not work at all with mock hardware and is bypassed in the test when mock hardware is used. test_trajectory_scaled_aborts_on_violation fails, as the mock hardware doesn't abort. test_set_io also fails, as the controller can't verify that a pin has been set.
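For context, the kind of guard described here could look roughly like the sketch below. The argument, package, and executable names are assumptions rather than the PR's actual launch code.

from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import UnlessCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    use_mock_hardware = LaunchConfiguration("use_mock_hardware")

    # Skip the dashboard client entirely when mock hardware is active,
    # since there is no dashboard server to connect to.
    dashboard_client = Node(
        package="ur_robot_driver",
        executable="dashboard_client",
        output="screen",
        condition=UnlessCondition(use_mock_hardware),
    )

    return LaunchDescription(
        [
            DeclareLaunchArgument("use_mock_hardware", default_value="false"),
            dashboard_client,
        ]
    )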
And into separate test cases. Moved the timeout for trajectory execution to the common file.
Force-pushed from ac702c4 to c488e21
@@ -364,10 +366,47 @@ def generate_dashboard_test_description():
    )


def generate_mock_hardware_test_description(
I think that doesn't require a use_mock_hardware parameter, right?
def generate_driver_test_description(
    tf_prefix="",
    initial_joint_controller="scaled_joint_trajectory_controller",
    controller_spawner_timeout=TIMEOUT_WAIT_SERVICE_INITIAL,
    use_mock_hardware="false",
Since you created a separate one, we don't need to add the parameter here.
@@ -300,7 +300,9 @@

      <gpio name="${tf_prefix}get_robot_software_version">
-       <state_interface name="get_version_major"/>
+       <state_interface name="get_version_major">
+         <param name="initial_value">1</param>
in the config controller we test that it is not equal to 0. With this change, that would always be true, right?
@launch_testing.parametrize("tf_prefix, use_mock_hardware", [("", "true"), ("my_ur_", "true")])
def generate_test_description(tf_prefix, use_mock_hardware):
I guess the use_mock_hardware parameter can be removed from that.
rclpy.init()
cls.node = Node("mock_hardware_test")
time.sleep(1)
cls.mock_hardware = use_mock_hardware == "true"
cls.mock_hardware = use_mock_hardware == "true"
def test_start_scaled_jtc_controller(self):
    self.assertTrue(
        self._controller_manager_interface.switch_controller(
            strictness=SwitchController.Request.BEST_EFFORT,
            activate_controllers=["scaled_joint_trajectory_controller"],
        ).ok
    )
When BEST_EFFORT is used, that will return ok as soon as the controller exists.
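If the test is meant to verify that the controller actually becomes active, a stricter variant along these lines might be what is wanted (a sketch reusing the helper shown above, not code from this PR):

def test_start_scaled_jtc_controller(self):
    # With STRICT, switch_controller only reports ok if activation actually succeeds,
    # instead of succeeding as soon as the controller merely exists.
    self.assertTrue(
        self._controller_manager_interface.switch_controller(
            strictness=SwitchController.Request.STRICT,
            activate_controllers=["scaled_joint_trajectory_controller"],
        ).ok
    )

Alternatively, the test could list the controllers afterwards and assert that scaled_joint_trajectory_controller reports the active state.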
def setUp(self):
    time.sleep(1)
    self.assertTrue(self._io_status_controller_interface.resend_robot_program().success)
I don't think that this makes a lot of sense here.
def test_trajectory_scaled(self, tf_prefix):
    """Test robot movement."""
    # Construct test trajectory
    test_trajectory = [
        (Duration(sec=6, nanosec=0), [0.0 for j in ROBOT_JOINTS]),
        (Duration(sec=6, nanosec=500000000), [-1.0 for j in ROBOT_JOINTS]),
    ]

    trajectory = JointTrajectory(
        joint_names=[tf_prefix + joint for joint in ROBOT_JOINTS],
        points=[
            JointTrajectoryPoint(positions=test_pos, time_from_start=test_time)
            for (test_time, test_pos) in test_trajectory
        ],
    )

    # Execute trajectory
    logging.info("Sending goal for robot to follow")
    goal_handle = self._scaled_follow_joint_trajectory.send_goal(trajectory=trajectory)
    self.assertTrue(goal_handle.accepted)

    # Verify execution
    result = self._scaled_follow_joint_trajectory.get_result(
        goal_handle,
        TIMEOUT_EXECUTE_TRAJECTORY,
    )
    self.assertEqual(result.error_code, FollowJointTrajectory.Result.SUCCESSFUL)
That test doesn't make a lot of sense, since the mock hardware isn't scaling.
        ).ok
    )

def test_trajectory(self, tf_prefix):
These are basically the same as the ones from the test_scaled_jtc.py test. I think we should share the code, then.
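One possible shape for the shared code, assuming a helper in test_common.py (the function name and signature below are illustrative, not from the PR): move the trajectory round-trip into a function that both test_scaled_jtc.py and the mock-hardware test call with their own action client and waypoints.

from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

# ROBOT_JOINTS and TIMEOUT_EXECUTE_TRAJECTORY are assumed to be the constants
# already defined in test_common.


def run_trajectory_test(test_case, follow_joint_trajectory_client, tf_prefix, test_trajectory):
    """Send a trajectory and assert that it is accepted and executed successfully."""
    trajectory = JointTrajectory(
        joint_names=[tf_prefix + joint for joint in ROBOT_JOINTS],
        points=[
            JointTrajectoryPoint(positions=positions, time_from_start=time_from_start)
            for (time_from_start, positions) in test_trajectory
        ],
    )

    goal_handle = follow_joint_trajectory_client.send_goal(trajectory=trajectory)
    test_case.assertTrue(goal_handle.accepted)

    result = follow_joint_trajectory_client.get_result(goal_handle, TIMEOUT_EXECUTE_TRAJECTORY)
    test_case.assertEqual(result.error_code, FollowJointTrajectory.Result.SUCCESSFUL)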