- Follow the instructions in the Nvidia IsaacSim Installation Guide to:
  - Install Nvidia drivers compatible with IsaacSim and your Nvidia graphics card. If you already have Nvidia graphics drivers installed, you can check your driver version with the command `nvidia-smi`.
  - Install Docker Engine (not Docker Desktop!).
  - Download Nvidia's IsaacSim Docker image. You will need to generate an NGC API key to obtain the IsaacSim Docker image (a sketch of these Docker-side steps follows this list).
  - Install the Nvidia Container Toolkit.
  - Run the IsaacSim container as detailed in the tutorial above. Close any other compute-intensive applications before running the IsaacSim container for the first time; it can take a few minutes to load on the first launch. Close the IsaacSim container before you move on to the following instructions.
- Download the Omniverse Launcher (not the SDK) from Nvidia's website. To run the .AppImage file, `apt install libfuse2`, not `fuse`.
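
A minimal sketch of the Docker-side steps above, assuming Ubuntu with `apt` and that Nvidia's apt repository for the container toolkit is already configured as in the guide. The image tag is a placeholder to be taken from the NGC catalog, and the NGC API key is the one generated in the guide.

```bash
# Check the installed Nvidia driver version (requires drivers already installed)
nvidia-smi

# Install the Nvidia Container Toolkit and wire it into Docker Engine
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Log in to Nvidia's registry with your NGC API key, then pull the IsaacSim image.
# The username is literally "$oauthtoken"; paste the API key when prompted for a password.
# The image tag is a placeholder -- use the tag listed in the NGC catalog / installation guide.
docker login nvcr.io --username '$oauthtoken'
docker pull nvcr.io/nvidia/isaac-sim:<tag-from-ngc>
```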
- (first time only) Complete the steps in Nvidia Setup.
- (first time only) Run `docker compose -f docker-compose.isaacsim.yaml build`.
- Run `docker compose -f docker-compose.isaacsim.yaml up`. It can take 20-30 s for the IsaacSim server to start.
- Download the Nvidia Omniverse Launcher.
- Start the Omniverse Launcher by navigating to your downloads folder and running `chmod +x omniverse-launcher-linux.AppImage` and `./omniverse-launcher-linux.AppImage`. This will open the Omniverse Launcher.
- Download the Omniverse Streaming Client from the launcher's "Exchange" tab. Launch the streaming client from the launcher's "Library" tab. Enter the IP address of the IsaacSim container when prompted (127.0.0.1). This should open a view of an empty sim in IsaacSim.
- Press `Ctrl + O` to open `/isaac-sim/humble_ws/src/antworker_isaacsim_world/world/ant_drone_greenhouse.usd` and press the play button to start the simulation. (The full launch sequence is collected in the sketch after this list.)
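
The launch sequence above, collected for convenience. It assumes the compose commands are run from the repo root and that the AppImage was downloaded to `~/Downloads` under the name shown.

```bash
# First time only: build the IsaacSim compose service
docker compose -f docker-compose.isaacsim.yaml build

# Start the IsaacSim server (allow 20-30 s for it to come up)
docker compose -f docker-compose.isaacsim.yaml up

# In another terminal: make the Omniverse Launcher executable and start it
cd ~/Downloads
chmod +x omniverse-launcher-linux.AppImage
./omniverse-launcher-linux.AppImage

# In the Omniverse Streaming Client, connect to the IsaacSim container at 127.0.0.1,
# then in IsaacSim press Ctrl+O and open:
#   /isaac-sim/humble_ws/src/antworker_isaacsim_world/world/ant_drone_greenhouse.usd
# and press play.
```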
- Optional: Plug in a Microsoft Xbox Controller (Model 1914) via USB. The `X` button is the deadman switch for rotation-only movement; the `right bumper` is the deadman switch for linear/rotational movement. The `right joystick` commands movement.
- Run `docker compose -f docker-compose.teleop.yaml up` and use the teleop terminal or Xbox controller to move the robot (see the sketch after this list). Scale the velocity down to approximately 0.3 m/s and 0.6 rad/s; the simulated robot is unstable at higher velocities!
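
If you want to sanity-check the velocity limits without the controller, a single bounded command can be published by hand. This is a sketch only: the container name is a placeholder, and it assumes the robot listens for unstamped `geometry_msgs/Twist` messages on `/cmd_vel`, which may differ in this setup.

```bash
# Start the teleop stack
docker compose -f docker-compose.teleop.yaml up

# In another terminal, enter the teleop container (name is a placeholder)
docker exec -it <teleop-container-name> bash

# Publish one velocity command at the recommended limits (~0.3 m/s, 0.6 rad/s).
# Assumption: the robot subscribes to /cmd_vel with unstamped Twist messages.
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.3, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.6}}"
```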
- (first time only) Run `git submodule init` from the repo root.
- (first time only) Run `docker compose -f docker-compose.slam.yaml build`.
- Start the IsaacSim simulation as detailed above. You do not need to start the teleop file.
- Run `docker compose -f docker-compose.slam.yaml up`.
- RVIZ2 will open a view of the robot, including its base frame, laser scans, and the map being generated.
- Drive the robot around with the teleop terminal or Xbox controller.
- The map will expand as the robot drives around. The robot should stay well localized in the map, and the scans should stay aligned with the map.
- When you are happy with your map, run `docker exec -it linorobot2-slam bash` and `ros2 run nav2_map_server map_saver_cli -t /nav2/map -f /home/humble_ws/src/linorobot2_navigation/maps/greenhouse` (see the sketch after this list).
- Be sure the robot is within the map bounds or the map will not save.
- You can view your map in the maps folder.
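
The mapping workflow above, end to end, run from the repo root. The `git submodule update` after `init` is an assumed follow-up so the submodule contents are actually checked out.

```bash
# First time only: fetch submodules and build the SLAM stack
git submodule init
git submodule update          # assumed follow-up to check out the submodule contents
docker compose -f docker-compose.slam.yaml build

# With the IsaacSim simulation already running:
docker compose -f docker-compose.slam.yaml up

# Drive the robot around, then save the map from inside the SLAM container
docker exec -it linorobot2-slam bash
ros2 run nav2_map_server map_saver_cli -t /nav2/map \
  -f /home/humble_ws/src/linorobot2_navigation/maps/greenhouse
```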
- Install GIMP: `apt install gimp`
- Open GIMP and open the .pgm map image file generated by Nav2.
- Remove any dynamic obstacles that should not be in the static costmap with the `Eraser Tool`.
- Select `Tools -> Transform Tools -> Rotate` and rotate the image until the pipe rail lengths look horizontal (an optional command-line alternative to this step is sketched after this list). Rotation will introduce blur. The edges of a rotated pipe rail should not slant up or down from one row of pixels to the next; they should be perfectly level.
- Select `Image -> Canvas Size` and change the canvas to fit the rotated image.
- Select `Filters -> Enhance -> Sharpen`. Change the "Amount" parameter until the pipe rails show up black. The edges of a rotated and sharpened pipe rail should not slant up or down from one row of pixels to the next; they should be perfectly level.
- Use the `Pencil Tool` with white ink to remove any undesired dark pixels introduced by the sharpening.
- Press `Ctrl + Shift + E` and export the new map to .pgm format (use the default export settings).
- Press `Ctrl + Shift + E` and export the new map to .png format if you want to use it in the RMF traffic editor (use the default export settings).
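
Not part of the documented workflow, but if you prefer to batch the rotation, ImageMagick can approximate the GIMP rotate step from the command line. The file names and angle below are illustrative; the sharpening and pixel clean-up are still done in GIMP as described above.

```bash
# Optional ImageMagick alternative to the GIMP rotate step.
# -background white fills the corners exposed by the rotation;
# the angle (degrees) is an example -- pick the one that levels your pipe rails.
convert greenhouse.pgm -background white -rotate 1.5 greenhouse_rotated.pgm
```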
- Follow this tutorial to create the .pgm and .yaml files that represent a keepout mask.
- For compatibility with the RMF traffic editor, set the map origin in the yaml file to `[0.0, y, 0.0]`, where `y = -1 * resolution * map height in px` (you can find the map height in the .pgm file's properties). A worked example follows this list.
- Edit `filter_mask_server.yaml -> filter_mask_server -> yaml_filename` to point to the yaml file that lists your keepout filter .pgm image.
- Run ___-compose.yaml. The keepout filter can be viewed in Foxglove or RVIZ2.
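
A hypothetical worked example of the origin formula above. The file name and the 0.05 m/px, 400 px numbers are illustrative; read your own values from the map .yaml and the .pgm properties.

```bash
# Hypothetical numbers: resolution = 0.05 m/px, map height = 400 px
#   y = -1 * 0.05 * 400 = -20.0
# so the origin line in the keepout .yaml becomes:
#   origin: [0.0, -20.0, 0.0]

# One way to read the map's pixel size (ImageMagick, if installed); GIMP's
# image properties dialog shows the same width x height.
identify greenhouse_keepout.pgm    # prints e.g. "greenhouse_keepout.pgm PGM 500x400 ..."
```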
https://github.com/open-rmf/rmf_demos/tree/humble
RMF traffic: If a vertex is not connected to a lane, IT WILL NOT BE INCLUDED IN THE NAV GRAPH!!
Nav2 critic scale: https://robotics.stackexchange.com/questions/105749/dwb-planner-in-nav2-does-not-properly-set-scale-for-critics
To get the RMF fleet manager to recognize a completed task, the robot state must have an empty path and its mode set to idle or charging (see the sketch below).
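
A way to sanity-check this from a terminal, assuming the fleet driver publishes `rmf_fleet_msgs/msg/RobotState` on a `/robot_state` topic (the topic name is an assumption; check `ros2 topic list` in your setup):

```bash
# Watch the RobotState the fleet manager sees: for a task to be marked complete,
# the "path" array should be empty and "mode" should be idle or charging.
ros2 topic echo /robot_state

# Show the RobotMode constants (MODE_IDLE, MODE_CHARGING, ...) rather than guessing values
ros2 interface show rmf_fleet_msgs/msg/RobotMode
```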
Forked from linorobot2.
- Run `xhost +local:docker` to allow RVIZ to display from Docker containers.
- (Optional) Plug in a Microsoft Xbox controller (Model 1914) to USB.
- Run `docker compose build`.
- Run `docker compose up`. It may take 1-2 minutes for Gazebo to load. A `teleop_twist` terminal window will open for you to drive the robot.
- If running NAV, set the robot's starting position in RVIZ2 with the "2D Pose Estimate" button. Spin the robot a few times until the scans and map line up in RVIZ2.
- Drive the robot using:
  - the Xbox controller (hold the right bumper and toggle the right joystick to move), or
  - `docker exec -it linorobot2-nav-slam bash` and `ros2 run teleop_twist_keyboard teleop_twist_keyboard` to control the robot via keyboard (see the sketch after this list).
- Gazebo will display a view of the robot and the world, including all the lidar scans.
- SLAM: RVIZ will open with the robot's view of the world and should begin auto-generating a map using SLAM.
- NAV: RVIZ will open the robot's view of the world with the pre-generated map.
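
The run sequence above, collected for convenience; run it from the repo root. The container and package names are the ones used in the steps above.

```bash
# Allow RVIZ/Gazebo GUIs running inside Docker to use the host X server
xhost +local:docker

# Build once, then bring the stack up (Gazebo may take 1-2 minutes to load)
docker compose build
docker compose up

# Optional: keyboard teleop from inside the container
docker exec -it linorobot2-nav-slam bash
ros2 run teleop_twist_keyboard teleop_twist_keyboard
```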