OneNote Links for my Personal Documentation

You can find the complete notes here.

Simulation of UR5

Using GUI

The following commands bring up Gazebo and RViz with the UR5 robot loaded:

roslaunch ur_gazebo ur5.launch limited:=true
roslaunch ur5_moveit_config ur5_moveit_planning_execution.launch sim:=true limited:=true
roslaunch ur5_moveit_config moveit_rviz.launch config:=true
roslaunch ur_robot_driver ur5_bringup.launch robot_ip:=127.0.0.1

For more information regarding the UR5 and its simulation, go to UR_ROS_tutorial, Universal_robot_github.
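
As a quick check that planning and execution are wired up, you can send a small motion goal from Python with moveit_commander. This is a minimal sketch: the group name "manipulator" is the usual default in ur5_moveit_config, but verify it for your setup.

import sys
import rospy
import moveit_commander

# Bring up the MoveIt commander on top of the running planning/execution stack.
moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("ur5_motion_sketch")
group = moveit_commander.MoveGroupCommander("manipulator")

# Nudge the first joint by 0.1 rad from the current state, then plan and execute.
joints = group.get_current_joint_values()
joints[0] += 0.1
group.go(joints, wait=True)
group.stop()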

[Demo: UR5 using GUI]

Simulation of Staubli RX160

The following command brings up Gazebo and RViz with the Staubli RX160 robot loaded:

roslaunch staubli_rx160_moveit_config demo_gazebo.launch

I made changes to the moveit_config package, since there are no Gazebo launch files prewritten in the original Staubli repository.

Note 1: If you want to make your own changes to the moveit_config package from scratch, follow this tutorial, github (even though the video is in Malayalam, you can understand the sequence of steps perfectly).

[Demo: Staubli using GUI]

Simulation of Intel RealSense depth camera D435

The following command brings up Gazebo and RViz with the RealSense camera loaded:

roslaunch realsense2_description view_d435_model_rviz_gazebo.launch

Note 1: You need to add 3D objects in Gazebo, which are available in the menu bar at the top. In RViz you can then see the depth image and the camera's perception of the scene.

Note 2: The Gazebo plugin is not readily available in the IntelRealSense repo, which took me to another repo by PAL Robotics. For the complete setup I have been following this tutorial, modified_realsense_description2, gazebo_plugin. The last two repositories should be enough to simulate the necessary stuff.

[Demo: Intel RealSense]

When RealSense hardware is available:

roslaunch realsense2_camera rs_rgbd.launch
rviz
roslaunch realsense2_camera opensource_tracking.launch
rviz
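
To verify the live streams once rs_rgbd.launch is up, a small rospy subscriber can log incoming depth frames. A minimal sketch; the topic name below is a common realsense2_camera default, so confirm it with rostopic list on your machine.

import rospy
from sensor_msgs.msg import Image

def on_depth(msg):
    # Log the size and encoding of each incoming depth frame.
    rospy.loginfo("depth frame: %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)

rospy.init_node("realsense_check")
rospy.Subscriber("/camera/depth/image_rect_raw", Image, on_depth)
rospy.spin()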

RealSense SLAM tutorial

For the complete SLAM setup and working process, go to github.

Things to remember while doing SLAM:

  • Don't forget to turn on the MapGraph display; otherwise you will only see the current point cloud rather than a complete picture of the room.
  • Check the quality value printed in the terminal; if it drops to zero, move at a slower velocity, and if that doesn't help, restart.
  • Use the pcl_viewer command to check the entire SLAM output at the end (a Python alternative is sketched after this list).
  • The rosbag file can grow to several GB; delete it when your work is done.
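
If you prefer Python over pcl_viewer, a minimal sketch using open3d can load and display the exported cloud. The file name here is hypothetical; use whatever your SLAM export produced.

import open3d as o3d

# Load the point cloud exported from the SLAM session (hypothetical file name).
pcd = o3d.io.read_point_cloud("slam_output.pcd")
o3d.visualization.draw_geometries([pcd])  # opens an interactive viewer window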

Getting a 3D mesh out of point clouds: colab. Inside the Colab file you have both the Python code to capture data and store it in .npy format, and a basic visualization to make a 3D mesh out of the cloud data you have.
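
A minimal sketch of that meshing step, assuming the .npy file holds an N x 3 array of XYZ points (file name hypothetical), using open3d's Poisson reconstruction:

import numpy as np
import open3d as o3d

# Load the captured points (hypothetical file name; expects an N x 3 XYZ array).
pts = np.load("cloud.npy")
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(pts)

# Poisson reconstruction needs per-point normals.
pcd.estimate_normals()
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
o3d.visualization.draw_geometries([mesh])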

[Demo: RealSense SLAM]

Useful Info

Working with Anaconda and ROS together.

When Anaconda is installed along with ROS, always give the Anaconda path explicitly rather than adding it to your shell startup, since we need ROS more often than conda.

Open a terminal and type:

export PATH="/home/saisriteja/anaconda3/bin:$PATH"

and then launch Anaconda Navigator by typing:

anaconda-navigator

Set the Robot in Free Drive to Get Data

You can connect urx and ur_rtde at the same time: to collect data from the robot I used ur_rtde, and I used IPython to set the robot in free drive using urx. Code for entering free drive:

import urx

# Connect to the robot over its IP address and enable free-drive mode
# so the arm can be moved by hand.
robot = urx.Robot("192.16.101.225")
robot.set_freedrive(True)

I used this while collecting data here.
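
For the data-collection side, here is a minimal sketch using the ur_rtde Python bindings (rtde_receive), run alongside the free-drive session above. The sampling rate and recorded fields are assumptions; adjust them to your needs.

import time
from rtde_receive import RTDEReceiveInterface

# Connect a receive-only RTDE session; this can run while urx holds free drive.
rtde_r = RTDEReceiveInterface("192.16.101.225")

samples = []
for _ in range(100):                      # record roughly 10 s at 10 Hz (assumed rate)
    samples.append(rtde_r.getActualQ())   # current joint positions in radians
    time.sleep(0.1)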