First of all, set up and run the example, as described in the Getting Started
documentation.

## Setting Up Your Own Robot

Now you can use the working example as a starting point. To use your own
robot, you will need its URDF, and you will need to modify some launch and
config files in [dbrt_example](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started/tree/master/dbrt_example).
The launch files should be self-explanatory and easy to adapt. You will need
to edit the file fusion_tracker_gpu.launch (or fusion_tracker_cpu.launch) to
use your own robot model instead of Apollo, for example as sketched below.
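
A minimal sketch of the relevant change, assuming your robot description
lives in a hypothetical my_robot_description package (package and file names
are placeholders, not part of dbrt):

```xml
<launch>
  <!-- Sketch only: load your robot's URDF instead of Apollo's.
       Package and file names below are placeholders. -->
  <param name="robot_description"
         textfile="$(find my_robot_description)/urdf/my_robot.urdf"/>

  <!-- ... the remaining nodes from the example launch file stay in place ... -->
</launch>
```

If your model is written in xacro, either generate a plain URDF from it first
or load it through the xacro tool instead of the textfile attribute.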

The main work will be to adapt the fusion_tracker_gpu.yaml
(or fusion_tracker_cpu.yaml) file to your robot. All the parameters of the
tracking algorithm are specified in this file, and they are robot-specific.
You will have to adapt the link and joint names to your robot. Furthermore,
you can specify which joints should be corrected using the depth images, how
aggressively they should be corrected, and whether you want to estimate an
offset between the true camera and the nominal camera in your robot model.
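
For orientation, the config maps joint names to uncertainty standard
deviations for the joint process and observation models. A hypothetical
sketch of such a mapping (the key names here are placeholders; consult the
example fusion_tracker_gpu.yaml in dbrt_example for the actual schema):

```yaml
# Hypothetical sketch only -- key and joint names are placeholders.
joint_transition:
  my_robot_joint_1: 0.02    # process model std. dev. (rad)
  my_robot_joint_2: 0.02
joint_observation:
  my_robot_joint_1: 0.005   # joint observation std. dev. (rad)
  my_robot_joint_2: 0.005
estimate_camera_offset: true  # estimate offset between true and nominal camera
```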
### URDF Camera Frame

Our algorithm assumes that the frame of the depth image (specified by the
camera_info topic) exists in your URDF robot model. You can check the camera
frame by running

```bash
rostopic echo /camera/depth/camera_info
```

If this frame does not exist in your robot URDF, you have to add such a
camera frame to the part of the robot where the camera is mounted. This
requires
connecting a camera link through a joint to another link of the robot. Take a
look at [head.urdf.xacro](https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started/blob/master/apollo_robot_model/models/head.urdf.xacro#L319).

The XTION camera link *XTION_RGB* is connected to the link *B_HEAD* through
the joint *XTION_JOINT*. The transformation between the camera and the robot
is not required to be very precise, since our algorithm can estimate an
offset. However, it must be accurate enough to provide a rough initial pose.
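
In URDF terms, such an attachment might look like the following minimal
sketch (the origin values are placeholders; a rough estimate of the mounting
pose is sufficient):

```xml
<!-- Sketch only: attach the camera frame to the link it is mounted on. -->
<joint name="XTION_JOINT" type="fixed">
  <parent link="B_HEAD"/>
  <child link="XTION_RGB"/>
  <origin xyz="0.0 0.1 0.1" rpy="0 0 0"/>  <!-- placeholder mounting pose -->
</joint>
<link name="XTION_RGB"/>
```

The joint type is *fixed* here because the camera does not move relative to
the link it is mounted on.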