First of all, let me note that this project is based on the [ROS Robotic Arm Introduction Tutorial] by a Bilibili UP主 (content creator), and reproduces his earlier [Robotic Arm Vision Grabbing from Theory to Practice].
That series presents staged results and shared techniques from his graduate work, and all data and code are open source, so I, Pengpeng, reproduce it here. The hardware I use differs from his: the UP主 uses a UR5, while my laboratory uses a UR3. The relevant UR3 CB3.12 materials are listed below:
https://www.universal-robots.cn/cb3/
[UR3 system upgrade to CB3.12 with URcap1.05]
Hardware support
serial number | name | function |
---|---|---|
1 | UR3 robotic arm | executes all actions |
2 | D435i depth camera | the executor's eyes |
3 | Ubuntu deep learning environment | the executor's thinking brain |
4 | plane grasping platform | experimental operating environment |
5 | 6x6 black-and-white checkerboard calibration board | eye-to-hand calibration (camera mounted outside the hand) |
6 | 3x3x3 cm 3D-printed cube | object to be grasped |
Code support
[ROS robotic arm entry tutorial] code, version 3.0:
link: https://pan.baidu.com/s/1KFLQXVWShG5KfroCd6eM0A=8888
extraction code: 8888
[ROS robotic arm entry tutorial - Xiao Wu] PPT:
link: https://pan.baidu.com/s/18ierKnf8OJPPvGn8uOf1hw=8888
extraction code: 8888
1. Overview
[Autolabor Primary Tutorial] Getting Started with ROS Robots
Robot Operating System ROS Quick Start Tutorial
2. Introduction to ROS
2.1 Introduction to ROS
ROS official website: http://wiki.ros.org/
autolabor tutorial: http://www.autolabor.com.cn/book/ROSTutorials/
MoveIt API: https://moveit.ros.org/documentation/source-code-api/
MoveIt tutorials (melodic): http://docs.ros.org/en/melodic/api/moveit_tutorials/html/index.html
2.2 ROS robot package
ros-melodic: git clone https://github.com/ros-industrial/universal_robot.git
ros-noetic: https://blog.csdn.net/Dawn_yc/article/details/114791755
3. Robot URDF modeling
3.1 Introduction to URDF
URDF is the robot model description format in ROS; it describes the robot's rigid-body appearance, physical properties, joint types, and so on.
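As a minimal illustration (the robot, link, and joint names here are invented for the example, not taken from the UR packages), a URDF file lists links and the joints that connect them:

```xml
<?xml version="1.0"?>
<robot name="mini_arm">
  <link name="base_link">
    <visual>
      <geometry><cylinder radius="0.05" length="0.1"/></geometry>
    </visual>
  </link>
  <link name="upper_arm"/>
  <!-- a single revolute joint connecting the two links -->
  <joint name="shoulder" type="revolute">
    <parent link="base_link"/>
    <child link="upper_arm"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
    <limit lower="-3.14" upper="3.14" effort="10" velocity="1.0"/>
  </joint>
</robot>
```

Saving this as mini_arm.urdf and running the commands in the next section checks and visualizes it.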
3.2 View model
$ check_urdf xxx.urdf (check the xxx.urdf file for errors)
$ urdf_to_graphiz xxx.urdf (generate a PDF from the URDF file for visualization)
$ roslaunch ur_description view_ur5.launch (display the UR5 model)
3.3 Configure and save Rviz
roslaunch ur_description view_ur5_with_gripper.launch (display the ur5 + robotiq85 model)
roslaunch ur_description view_ur5.launch
rviz
The RViz configuration is saved at universal_robot/ur_description/cfg/2222.rviz:
<?xml version="1.0"?>
<launch>
  <include file="$(find ur_description)/launch/ur5_upload.launch"/>
  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
    <param name="use_gui" value="true"/>
  </node>
  <!-- the executable is "robot_state_publisher"; the old "state_publisher" name is deprecated -->
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher"/>
  <node name="rviz" pkg="rviz" type="rviz" args="-d $(find ur_description)/cfg/2222.rviz" required="true"/>
</launch>
4. Moveit! core function introduction and Rviz control
4.1 Introduction to Moveit!
MoveIt! is a software toolset for robotic arms that integrates several functions:
- Kinematics
- Motion Planning
- Collision Checking
- 3D Perception
- Manipulation
4.2 Robot ROS package architecture
rosrun moveit_setup_assistant moveit_setup_assistant
1. Load the URDF model.
2. Generate the self-collision matrix (check which pairs of links can collide).
3. Select the kinematics solver.
4. Select the reference link and the end-effector link.
5. Configure the named poses: "home" (lying flat, all zeros) and "up" (0, -1.5708, 0, -1.5708, 0, 0).
6. Configure the end gripper.
7. Fill in the author information.
8. Finally, generate the configuration package (an existing model can also be imported directly).
4.3 Rviz control
After the MoveIt configuration there is a corresponding RViz demo:
roslaunch ur5_moveit_config demo.launch
Set a random target position and click Plan & Execute; the arm moves to the corresponding position.
Add obstacles.
Gripper demo:
roslaunch ur5_gripper_moveit_config demo.launch
5. Gazebo simulation or control of a real robot
Terminal emulator (supports split panes):
sudo apt install terminator
5.1 Connection between Moveit and controller
Usage:
# Control the robot in Gazebo simulation
roslaunch ur_gazebo ur5.launch
roslaunch ur5_moveit_config ur5_moveit_planning_execution.launch sim:=true
roslaunch ur5_moveit_config moveit_rviz.launch config:=true
# Control a real UR5 robot
One-click start:
roslaunch ur_planning start_ur5.launch
Start separately:
roslaunch ur_modern_driver ur5_bringup.launch
roslaunch ur5_moveit_config ur5_moveit_planning_execution.launch
roslaunch ur5_moveit_config moveit_rviz.launch config:=true
# RViz for visualization only
roslaunch ur5_moveit_config demo.launch
rosrun ur_planning moveitServer.py
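Under the hood, the MoveIt-to-controller connection goes through a FollowJointTrajectory action. The sketch below shows roughly what the controllers.yaml of a ur5_moveit_config looks like; treat it as an assumption and check your own package for the exact controller name and action namespace:

```yaml
controller_list:
  - name: ""
    action_ns: follow_joint_trajectory   # action served by the UR driver
    type: FollowJointTrajectory
    joints:
      - shoulder_pan_joint
      - shoulder_lift_joint
      - elbow_joint
      - wrist_1_joint
      - wrist_2_joint
      - wrist_3_joint
```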
5.2 Gazebo simulation
5.3 Control the real UR5 robot
One-click start:
roslaunch ur_planning start_ur5.launch
Start separately:
roslaunch ur_modern_driver ur5_bringup.launch
roslaunch ur5_moveit_config ur5_moveit_planning_execution.launch
roslaunch ur5_moveit_config moveit_rviz.launch config:=true
6. Moveit basics (python)
6.1 Moveit! Review
6.2 Joint space movement
Install moveit_commander for Python:
sudo apt-get update
sudo apt-get install ros-melodic-moveit-commander
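The methods below live in a wrapper class around MoveIt's Python commander (as in moveitServer.py). The original class definition is not shown in this section, so here is a minimal sketch of the scaffolding they assume; the node name, the group name "manipulator", and the frame "base_link" are assumptions to adapt to your own MoveIt configuration:

```python
import sys
import rospy
import moveit_commander
import numpy as np
from geometry_msgs.msg import PoseStamped
from tf.transformations import quaternion_from_euler

class MoveItServer(object):
    def __init__(self):
        # Initialize the MoveIt commander and the ROS node
        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node('moveit_server')
        # Planning group name is an assumption; check your SRDF
        self.arm = moveit_commander.MoveGroupCommander('manipulator')
        self.end_effector_link = self.arm.get_end_effector_link()
        self.reference_frame = 'base_link'
        self.arm.set_pose_reference_frame(self.reference_frame)
        self.arm.allow_replanning(True)
        self.arm.set_goal_position_tolerance(0.001)
        self.arm.set_goal_orientation_tolerance(0.01)
```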
# Joint-space motion: takes 6 joint angles (unit: rad)
# a, v: max acceleration / velocity scaling factors
def move_j(self, joint_configuration=None, a=1, v=1):
    # Target described by the six joint positions (unit: rad)
    if joint_configuration is None:
        joint_configuration = [0, -1.5707, 0, -1.5707, 0, 0]
    self.arm.set_max_acceleration_scaling_factor(a)
    self.arm.set_max_velocity_scaling_factor(v)
    self.arm.set_joint_value_target(joint_configuration)
    rospy.loginfo("move_j:" + str(joint_configuration))
    self.arm.go()
    rospy.sleep(1)
# Pose-space motion: takes the tool pose as x, y, z, R, P, Y
def move_p(self, tool_configuration=None, a=1, v=1):
    if tool_configuration is None:
        tool_configuration = [0.3, 0, 0.3, 0, -np.pi/2, 0]
    self.arm.set_max_acceleration_scaling_factor(a)
    self.arm.set_max_velocity_scaling_factor(v)
    target_pose = PoseStamped()
    target_pose.header.frame_id = self.reference_frame
    target_pose.header.stamp = rospy.Time.now()
    target_pose.pose.position.x = tool_configuration[0]
    target_pose.pose.position.y = tool_configuration[1]
    target_pose.pose.position.z = tool_configuration[2]
    q = quaternion_from_euler(tool_configuration[3], tool_configuration[4], tool_configuration[5])
    target_pose.pose.orientation.x = q[0]
    target_pose.pose.orientation.y = q[1]
    target_pose.pose.orientation.z = q[2]
    target_pose.pose.orientation.w = q[3]
    self.arm.set_start_state_to_current_state()
    self.arm.set_pose_target(target_pose, self.end_effector_link)
    rospy.loginfo("move_p:" + str(tool_configuration))
    traj = self.arm.plan()  # note: on Noetic, plan() returns a tuple; take its trajectory element
    self.arm.execute(traj)
    rospy.sleep(1)
6.3 Cartesian Space Motion
# Cartesian straight-line motion: takes (x, y, z, R, P, Y, x2, y2, z2, R2, ...)
# Executes a single waypoint by default; several waypoints may be passed in
def move_l(self, tool_configuration, waypoints_number=1, a=0.5, v=0.5):
    if tool_configuration is None:
        tool_configuration = [0.3, 0, 0.3, 0, -np.pi/2, 0]
    self.arm.set_max_acceleration_scaling_factor(a)
    self.arm.set_max_velocity_scaling_factor(v)
    # Build the waypoint list
    waypoints = []
    for i in range(waypoints_number):
        target_pose = PoseStamped()
        target_pose.header.frame_id = self.reference_frame
        target_pose.header.stamp = rospy.Time.now()
        target_pose.pose.position.x = tool_configuration[6*i+0]
        target_pose.pose.position.y = tool_configuration[6*i+1]
        target_pose.pose.position.z = tool_configuration[6*i+2]
        q = quaternion_from_euler(tool_configuration[6*i+3], tool_configuration[6*i+4], tool_configuration[6*i+5])
        target_pose.pose.orientation.x = q[0]
        target_pose.pose.orientation.y = q[1]
        target_pose.pose.orientation.z = q[2]
        target_pose.pose.orientation.w = q[3]
        waypoints.append(target_pose.pose)
    rospy.loginfo("move_l:" + str(tool_configuration))
    fraction = 0.0   # fraction of the path successfully planned
    maxtries = 100   # maximum number of planning attempts
    attempts = 0     # attempts made so far
    # Use the arm's current state as the start state
    self.arm.set_start_state_to_current_state()
    # Try to plan a Cartesian path passing through all waypoints in order
    while fraction < 1.0 and attempts < maxtries:
        (plan, fraction) = self.arm.compute_cartesian_path(
            waypoints,  # list of waypoint poses
            0.001,      # eef_step: end-effector step size
            0.00,       # jump_threshold: 0 disables jump detection
            True)       # avoid_collisions
        attempts += 1
    if fraction == 1.0:
        rospy.loginfo("Path computed successfully. Moving the arm.")
        self.arm.execute(plan)
        rospy.loginfo("Path execution complete.")
    else:
        rospy.loginfo(
            "Path planning failed with only " + str(fraction) +
            " success after " + str(maxtries) + " attempts.")
    rospy.sleep(1)
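A hypothetical call with two waypoints (the coordinates are made up), assuming the wrapper class sketched earlier is instantiated as server:

```python
# Hold the tool orientation fixed and move 10 cm straight up in z
server.move_l([0.3, 0.0, 0.3, 0, -np.pi/2, 0,
               0.3, 0.0, 0.4, 0, -np.pi/2, 0], waypoints_number=2)
```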
6.4 Interacting with the environment
# Add a table below the arm so that planning is restricted to the upper
# half-space, avoiding collisions with the desk and other objects below
def set_scene(self):
    ## set table
    self.scene = PlanningSceneInterface()
    self.scene_pub = rospy.Publisher('planning_scene', PlanningScene, queue_size=5)
    self.colors = dict()
    rospy.sleep(1)
    table_id = 'table'
    self.scene.remove_world_object(table_id)
    rospy.sleep(1)
    table_size = [2, 2, 0.01]
    table_pose = PoseStamped()
    table_pose.header.frame_id = self.reference_frame
    table_pose.pose.position.x = 0.0
    table_pose.pose.position.y = 0.0
    table_pose.pose.position.z = -table_size[2]/2 - 0.02
    table_pose.pose.orientation.w = 1.0
    self.scene.add_box(table_id, table_pose, table_size)
    self.setColor(table_id, 0.5, 0.5, 0.5, 1.0)
    self.sendColors()
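setColor and sendColors are helper methods not shown above. A sketch following the usual MoveIt planning-scene color pattern (the exact bodies in the tutorial code may differ):

```python
from moveit_msgs.msg import PlanningScene, ObjectColor

def setColor(self, name, r, g, b, a=0.9):
    # Cache an ObjectColor message for the named collision object
    color = ObjectColor()
    color.id = name
    color.color.r = r
    color.color.g = g
    color.color.b = b
    color.color.a = a
    self.colors[name] = color

def sendColors(self):
    # Publish all cached colors as a diff to the planning scene
    p = PlanningScene()
    p.is_diff = True
    for color in self.colors.values():
        p.object_colors.append(color)
    self.scene_pub.publish(p)
```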
7. Moveit basics (C++): path planning with constraints
8. Motion Planning - How to choose the planning algorithm in Moveit?
8.1 Why path planning is needed
- Obstacle avoidance: avoid collisions with static objects near the arm, such as tables, and with dynamic objects, such as (suddenly approaching) people.
- Tasks with requirements on the motion path: kinematic or dynamic constraints, e.g. welding, or carrying a cup filled with water.
Path planning classification:
- Search-based: Dijkstra, A*, Anytime A*, ARA*, D*
- Sampling-based: PRM, RRT, RRT-Connect, RRT*, Kinodynamic-RRT* (respects dynamics), Anytime RRT*, Informed RRT*
- Intelligent algorithms: genetic algorithms, ant colony algorithms, etc.
Sampling-based path planning algorithms:
- Background: probabilistic completeness, i.e. a solution will be found as planning time goes to infinity.
- PRM: 1. randomly sample n points to form a graph; 2. search over the graph.
- RRT: rapidly-exploring random tree; non-optimal solutions. Nearest-neighbor queries can use a kd-tree or similar data structure to find the closest tree node quickly (a toy sketch follows this list).
- RRT-Connect: bidirectionally extended RRT that grows trees from the start and the goal simultaneously, so the search is relatively fast; the default algorithm in ROS MoveIt.
- RRT*: two improvements over RRT — it changes the rule for connecting new nodes to the tree, and it "prunes" the search tree, bringing the result closer to the true optimal path.
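To make the RRT idea concrete, here is a toy 2D sketch: an obstacle-free 10x10 world with all parameters invented. Real planners add collision checks and kd-tree neighbor queries; this only shows the sample / nearest / steer / extend loop:

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def rrt(start, goal, step=0.5, max_iter=2000, goal_tol=0.5):
    nodes, parent = [start], {0: None}
    for _ in range(max_iter):
        sample = (random.uniform(0, 10), random.uniform(0, 10))
        # Nearest neighbor by linear scan (a kd-tree speeds this up)
        i = min(range(len(nodes)), key=lambda k: dist(nodes[k], sample))
        d = dist(nodes[i], sample)
        if d == 0:
            continue
        nx, ny = nodes[i]
        # Steer one step from the nearest node toward the sample
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if dist(new, goal) < goal_tol:
            path, k = [], len(nodes) - 1
            while k is not None:   # trace parents back to the start
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None  # no path found within max_iter

print(rrt((0.0, 0.0), (9.0, 9.0)))
```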
8.2 How to choose a path planning algorithm
- Rviz
- Moveit_setup_assistant
- In the program (see the sketch below)
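From code, the planner can be chosen on the move group. A short sketch; the planner id string is an assumption and must match one of the ids listed in your package's ompl_planning.yaml:

```python
# Pick a sampling-based planner from OMPL and bound the planning time
self.arm.set_planner_id("RRTConnect")
self.arm.set_planning_time(5.0)  # seconds allowed for planning
```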
9. Visual obstacle avoidance
9.1 Introduction
9.2 Official demo
- Only a point cloud / depth map input is needed to generate an octomap grid:
$ roslaunch moveit_tutorials obstacle_avoidance_demo.launch
- Cylindrical obstacles can be generated from the grid map:
$ roslaunch moveit_tutorials detect_and_add_cylinder_collision_object_demo.launch
(Note: close the previous command before running this one)
ROS packages:
https://github.com/ros-planning/moveit_tutorials
https://github.com/ros-planning/panda_moveit_config
9.3 Hands-on practice
roslaunch ur5_moveit_config demo_with_obstacle_avoidance.launch
Main steps:
1. Start the camera ROS driver:
$ sudo apt-get install ros-melodic-realsense2-camera
$ sudo apt-get install ros-melodic-realsense2-description
$ roslaunch realsense2_camera demo_pointcloud.launch
2. Modify the MoveIt configuration (see the example YAML after this list):
(1) set point_cloud_topic in xxx_moveit_config/config/sensors_kinect_pointcloud.yaml to the point cloud topic published by your camera, or
(2) set image_topic in xxx_moveit_config/config/sensors_kinect_depthmap.yaml to the depth image topic published by your camera.
3. (Optional) Modify the visual obstacle-avoidance parameters, such as the grid resolution octomap_resolution.
4. (Optional) Modify the camera parameters, such as resolution and update frequency.
5. Publish the pose of the camera relative to the robot base frame:
<node pkg="tf2_ros" type="static_transform_publisher"
      name="camera_to_robot" args="0.72032458 -0.21936176 0.90925255
      0.98422218 0.0969083 -0.04779138 0.14011233 camera_color_optical_frame
      world" />
roslaunch ur5_moveit_config demo_with_obstacle_avoidance.launch
Reference: https://blog.csdn.net/ssw_1990/article/details/104053041
10. ROS and Deep Learning
10.1 Introduction
ROS-Moveit!
ROS-melodic and earlier versions of Moveit! default to Python 2.7.
ROS-noetic and ROS 2 support Python 3.
Deep learning
Deep learning uses Python 3, and the environment is generally configured inside a virtual environment.
How do we integrate deep learning into ROS?
10.2 Hands-on practice
Coexistence of ROS and Anaconda:
1. One-click installation of ROS:
$ wget http://fishros.com/install -O fishros && bash fishros
2. Install Anaconda:
https://blog.csdn.net/KIK9973/article/details/118772450
(or search for the Anaconda installation tutorial matching your Ubuntu version)
3. Create a virtual environment in Anaconda:
$ conda create -n YOUR_ENV_NAME python=3.7
4. Install the ROS support libraries in the virtual environment:
$ conda activate YOUR_ENV_NAME
$ pip install rospkg
Note: whether or not you are in a virtual environment, once rospkg is installed you can import ROS packages such as rospy and std_msgs, so topic and service communication both work.
5. Encapsulate the MoveIt functions as a server (Lecture 6 - Moveit basics (python) / ROS service communication):
$ rosrun ur_planning moveitServer.py
6. In the virtual environment, write a client script that talks to the MoveIt server through service communication, as sketched below.
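A hypothetical client sketch for step 6, run inside the conda environment (Python 3). The service name /grasp and the std_srvs/Trigger type are placeholders; match them to whatever moveitServer.py actually advertises:

```python
import rospy
from std_srvs.srv import Trigger  # placeholder type; match the server's .srv

def request_grasp():
    rospy.wait_for_service('/grasp')               # assumed service name
    grasp = rospy.ServiceProxy('/grasp', Trigger)
    resp = grasp()                                 # call the MoveIt server
    print(resp.success, resp.message)

if __name__ == '__main__':
    rospy.init_node('grasp_client')
    request_grasp()
```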
11. 6-DOF visual grasping based on ROS-Moveit
11.1 Visual grasping
- Planar grasping
  - Cornell dataset: 1035 RGB-D images, 8019 labels
  - Jacquard dataset: 54K RGB-D images, 1.1 million labels
  - Classic planar grasping algorithms: GGCNN[1], GRCNN[2], Swin-Transformer[3]
- Six-degree-of-freedom grasping
  - GraspNet-1Billion dataset
11.2 System framework design
Motion planning:
- Function: reach the position and orientation specified by the grasping algorithm.
- Path planning: plan a collision-free trajectory from the current position to the desired position.
Hand-eye calibration (see the sketch below):
- Function: obtain the transform between the robot coordinate system and the camera coordinate system.
- Effect: errors within ±5 mm are currently achievable.
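For context, hand-eye calibration is classically posed as the equation AX = XB; the notation below is the usual eye-to-hand convention, stated here as an assumption about this setup rather than the tutorial's exact derivation:

```latex
% A_i : relative robot motion between calibration shots i and i+1
%       (base-to-end-effector transforms, inverted for a fixed camera)
% B_i : relative motion of the checkerboard as seen in the camera frame
% X   : fixed unknown hand-eye transform, here camera-to-robot-base
A_i X = X B_i \qquad \forall i
```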
11.3 Grasping experiment
Graspness[6] single-object grasping experiment
List of grasped items (11 objects in total): 1. apple 2. banana 3. flashlight 4. baseball 5. shoe brush 6. badminton box 7. radio 8. down-jacket cleaner 9. white-sneaker cleaner 10. USB flash drive 11. remote control
11.3 Grasping experiment
Graspness[6] continuous grasping experiment in complex scenes
- Conclusion: compared with other algorithms, the grasp pose predictions of graspness[6] are more accurate; grasp poses can also be detected for most small objects and for objects with the same color as the background, so its robustness is stronger.
- Existing problems:
  - For objects that are too "flat" (low in height), it is difficult to detect a grasp pose.
  - During grasping, collisions with other objects may occur.