Paper notes 2019-06-20

1, A Practical Visual Servo Control for an Unmanned Aerial Vehicle
Practical visual servo control for a UAV
https://ieeexplore.ieee.org/document/4481181
Abstract: An image-based visual servo control is presented for an unmanned aerial vehicle (UAV) capable of stationary or quasi-stationary flight, with the camera mounted on board the vehicle. The target considered consists of a finite set of stationary and disjoint points lying in a plane. Control of the position and orientation dynamics is decoupled using a visual error based on spherical centroid data, along with estimates of the linear velocity and the gravitational direction obtained from the image features and an embedded inertial measurement unit. The visual error compensates for poor conditioning of the image Jacobian matrix by introducing a non-homogeneous gain term adapted to the visual sensitivity of the error measurements. A nonlinear controller that ensures exponential convergence of the considered system is derived for the full dynamics using control Lyapunov function design. Experimental results on a quadrotor UAV developed by the French Atomic Energy Commission demonstrate the robustness and performance of the proposed control strategy.
Following the introduction, Section II describes the basic equations of motion of the quadrotor UAV. Section III describes the proposed choice of image features. Section IV provides the control design for the translational motion. Section V extends the control to the full system dynamics. Section VI presents the experimental results (Fig. 1) obtained on the quadrotor. Finally, Section VII provides some concluding remarks.
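As a rough illustration of the spherical-centroid feature the paper builds its visual error on, the sketch below back-projects observed image points onto the unit sphere and averages them. This is only a minimal sketch: the intrinsics, the point coordinates, and the uniform weighting are my own placeholder assumptions, not the paper's implementation.

```python
import numpy as np

# Assumed pinhole intrinsics (illustrative values, not from the paper).
K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])

def spherical_centroid(pixels, K):
    """Project pixel coordinates onto the unit sphere and average them.

    pixels: (N, 2) array of observed target points in the image.
    Returns the (unnormalized) spherical centroid, the kind of visual
    feature on which the position error is based.
    """
    pts_h = np.hstack([pixels, np.ones((len(pixels), 1))])  # homogeneous pixels
    rays = (np.linalg.inv(K) @ pts_h.T).T                   # back-projected rays
    on_sphere = rays / np.linalg.norm(rays, axis=1, keepdims=True)
    return on_sphere.mean(axis=0)

# Example: four coplanar target points seen in the image.
pixels = np.array([[300.0, 220.0], [340.0, 220.0],
                   [340.0, 260.0], [300.0, 260.0]])
print("spherical centroid feature:", spherical_centroid(pixels, K))
```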
2, Geometric tracking control of a quadrotor UAV on SE(3)
https://ieeexplore.ieee.org/document/5717652
Abstract: This paper provides new results for the tracking control of a quadrotor unmanned aerial vehicle (UAV). The UAV has four input degrees of freedom, namely the magnitudes of the four rotor thrusts, which are used to control the six translational and rotational degrees of freedom and to achieve asymptotic tracking of four outputs: the three position variables of the vehicle's center of mass and the direction of one body-fixed axis. A globally defined model of the quadrotor UAV rigid-body dynamics is introduced as a basis for the analysis. A nonlinear tracking controller is developed on the special Euclidean group SE(3) and is shown to have almost global closed-loop properties. Several numerical examples are given, including one in which the quadrotor recovers from being initially inverted.

Conclusion
We proposed a globally defined dynamic model of a quadrotor UAV and developed a geometric tracking controller directly on the special Euclidean group. It is coordinate-free, thereby avoiding the singularities of Euler angles and the ambiguity of quaternions in representing attitude. The controller exhibits exponential stability when the initial attitude error is less than 90°, and almost global exponential attractiveness when the initial attitude error is less than 180°. These properties are illustrated by numerical examples.
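The coordinate-free attitude errors used in this line of work are commonly written as e_R = ½(R_dᵀR − RᵀR_d)^∨ and e_Ω = Ω − RᵀR_dΩ_d. Below is a minimal numpy sketch of these two quantities; the rotation matrices and angular velocities are made-up example values, not results from the paper.

```python
import numpy as np

def vee(S):
    """Map a skew-symmetric matrix to its 3-vector."""
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def attitude_errors(R, Rd, Omega, Omega_d):
    """Geometric attitude and angular-velocity tracking errors:
    e_R = 0.5 * vee(Rd^T R - R^T Rd),  e_Omega = Omega - R^T Rd Omega_d.
    """
    e_R = 0.5 * vee(Rd.T @ R - R.T @ Rd)
    e_Omega = Omega - R.T @ Rd @ Omega_d
    return e_R, e_Omega

# Purely illustrative values: 0.3 rad yaw offset, small spin about z.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])          # current attitude
Rd = np.eye(3)                            # desired attitude
Omega = np.array([0.0, 0.0, 0.1])         # body angular velocity
Omega_d = np.zeros(3)                     # desired angular velocity
print(attitude_errors(R, Rd, Omega, Omega_d))
```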
3, 3D Vision for Mobile Robot Manipulator on Detecting and Tracking Target
Three-dimensional vision for a mobile robot manipulator to detect and track targets - 2015
https://ieeexplore.ieee.org/document/7364605
Abstract: In this study, a stereo vision system is applied to a mobile robot manipulator to track and grasp objects of different kinds, producing a robotic system able to perform complex tasks in real environments. **A stereo camera configured for position-based visual servoing generates a 3D map of the object for the manipulator. The study covers mobile robot control, stereo vision algorithms, the transformation functions, and inverse kinematics,** making it a fairly comprehensive treatment. The work integrates computer vision into the mobile robot's hand: the target is located relative to the base using two cameras, and the inverse kinematics of a 4-DOF elbow manipulator is solved. The tracking vision system successfully measures the distance; when the object lies within the manipulator's workspace, the computer solves the inverse kinematics to find the best joint angles for the arm to reach the target. Experimental results demonstrate the capability of the proposed control method.
In this paper, a position-based visual servoing (PBVS) method is proposed for a mobile robot manipulator using stereo vision. The stereo camera output is used to detect and track the target, and a vision system is built to recover its three-dimensional coordinates. After the relationship between the target and the robot is determined, the inverse kinematics is computed and the robot reaches for and grasps the target.
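The elbow inverse kinematics mentioned above reduces, in the planar two-link case, to the standard law-of-cosines solution. The sketch below is a generic two-link version with assumed link lengths, not the paper's 4-DOF solution.

```python
import numpy as np

def two_link_ik(x, y, l1=0.30, l2=0.25):
    """Planar 2-link elbow inverse kinematics (elbow-down branch).

    Link lengths are illustrative placeholders. Returns the shoulder
    and elbow angles that place the end effector at (x, y), or None
    if the point is outside the workspace.
    """
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        return None                       # target out of reach
    q2 = np.arccos(c2)                    # elbow angle
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2),
                                       l1 + l2 * np.cos(q2))
    return q1, q2

print(two_link_ik(0.35, 0.20))
```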
Detection pipeline: color detection by thresholding - connected-component labeling - distance measurement - least-squares fitting to reduce the error.
3D pose estimation: object detection and distance measurement stages.
In the experiment, a target is placed at a random position in front of the robot; the mobile robot moves slowly, detects the object with the stereo camera, and grasps it. The target is placed at three different positions and the end-effector trajectory is recorded.
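A rough sketch of the detection and distance-measurement steps (color thresholding, connected-component labeling, disparity-based distance) is given below. The HSV thresholds, baseline, and focal length are placeholder assumptions, not values from the paper.

```python
import cv2
import numpy as np

# Placeholder stereo parameters (assumed, not from the paper).
BASELINE_M = 0.06       # distance between the two cameras [m]
FOCAL_PX = 700.0        # focal length [pixels]

def detect_colored_target(bgr, lower=(40, 80, 80), upper=(80, 255, 255)):
    """Threshold a color range in HSV and return the centroid of the
    largest connected component, or None if nothing is found."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])  # skip background
    return centroids[largest]                              # (u, v) centroid

def depth_from_disparity(u_left, u_right):
    """Triangulate the distance from the horizontal pixel disparity."""
    disparity = u_left - u_right
    return BASELINE_M * FOCAL_PX / disparity if disparity > 0 else None
```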

In this paper the authors use open-loop control of the robot, so the precision is not high: in 10 trials the robot grasped the object 6 times, an accuracy of 60%. The end-effector trajectory of the arm is shown in the figure.
4, Augmented image-based visual servoing of a manipulator using an acceleration command - 2014
Augmented Image-Based Visual Servoing of a Manipulator Using Acceleration Command
https://ieeexplore.ieee.org/document/6712119
Abstract: This paper presents a new image-based visual servoing (IBVS) controller, called augmented IBVS, for 6-DOF manipulators. The main idea of the controller is that it produces an acceleration as the control command. A proportional-derivative controller is developed to provide this control command to the robot. The controller yields smoother and more nearly linear feature trajectories in image space and reduces the risk of features leaving the field of view. The developed control method also improves the camera trajectory in three-dimensional space. The stability of the method is studied thoroughly using Lyapunov methods and the theory of perturbed systems. Experimental tests on a 6-DOF robotic system verify the effectiveness of the proposed controller.
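A crude sketch of the core idea, a PD law on the image-feature error whose output is treated as an acceleration command and mapped through the pseudoinverse of the interaction matrix, is shown below. The gains, feature values, and interaction-matrix form are generic textbook IBVS quantities, not the paper's augmented formulation, and the mapping neglects the time derivative of the interaction matrix.

```python
import numpy as np

def interaction_matrix(u, v, Z, f=1.0):
    """Standard IBVS interaction (image Jacobian) matrix for one
    normalized image point (u, v) at depth Z."""
    return np.array([
        [-f / Z, 0.0, u / Z, u * v / f, -(f + u * u / f), v],
        [0.0, -f / Z, v / Z, f + v * v / f, -u * v / f, -u],
    ])

def acceleration_command(s, s_dot, s_ref, L, Kp=4.0, Kd=3.0):
    """PD law on the feature error, producing a camera acceleration
    command (6-vector) via the pseudoinverse of the stacked
    interaction matrix L (L-dot terms neglected)."""
    e = s - s_ref
    a_feat = -Kp * e - Kd * s_dot          # desired feature acceleration
    return np.linalg.pinv(L) @ a_feat      # map to Cartesian acceleration

# Example with two feature points observed at 1 m depth.
pts = [(0.1, 0.05), (-0.08, 0.12)]
L = np.vstack([interaction_matrix(u, v, Z=1.0) for u, v in pts])
s = np.array([0.1, 0.05, -0.08, 0.12])     # current features
s_ref = np.zeros(4)                        # desired features
s_dot = np.zeros(4)                        # feature velocities
print(acceleration_command(s, s_dot, s_ref, L))
```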
5, https://ieeexplore.ieee.org/document/7097745
Fuzzy adaptive visual tracking control for robots with actuator dead-zone constraints and unmodeled dynamics - 2015

Abstract: This paper focuses on fuzzy adaptive visual tracking control for uncalibrated image-based visual servoing robot systems with actuator dead-zone constraints and unmodeled dynamics. Without prior knowledge of the robot's nonlinear dynamics or the external disturbances, a fuzzy logic system is used to approximate the unmodeled disturbances. Compared with conventional fuzzy logic systems, the total number of fuzzy rules can be significantly reduced by using the recursive Newton-Euler method. By describing the slope of the dead-zone region with fuzzy sets of given bound k̅, a new fuzzy adaptive controller is constructed that eliminates the harmful effects of the actuator dead-zone constraint. A Lyapunov stability analysis is given for the visual feedback control problem with unknown actuator dynamics and dead-zone constraints. The results demonstrate the visual tracking performance and the boundedness of the proposed closed-loop control system.

A. Problem statement and vision system description
In this paper, a marker with feature points is attached to the robot end-effector, and the motion of the feature points is observed with a fixed perspective-projection camera. Figure 1 shows the structure of the eye-to-hand (ETH) IBVS manipulator system. The camera intrinsic parameters and the homogeneous transformation matrix between the camera and the robot are assumed to be unknown. In addition, unknown robot dynamics and external disturbances are taken into account. Finally, the problem of an uncertain actuator dead-zone constraint on the control input is studied. For the uncalibrated IBVS manipulator system described above, the control objectives are as follows.
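The actuator dead zone referred to throughout this item is usually modeled as a piecewise-linear map with unknown slopes and breakpoints. A minimal sketch of that generic model (with made-up parameters, not the paper's fuzzy parameterization) is below.

```python
def dead_zone(v, m_r=1.0, b_r=0.2, m_l=1.0, b_l=-0.2):
    """Piecewise-linear actuator dead-zone model: the output is zero
    inside (b_l, b_r) and linear with slopes m_l / m_r outside.
    All parameters here are illustrative placeholders."""
    if v >= b_r:
        return m_r * (v - b_r)
    if v <= b_l:
        return m_l * (v - b_l)
    return 0.0

# The commanded input v is distorted before reaching the joint:
for v in (-0.5, -0.1, 0.0, 0.15, 0.6):
    print(v, "->", dead_zone(v))
```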
The experimental setup consists of a Mitsubishi industrial robot, a CR1D-721 controller, a custom PC, a CCD camera, and a teach pendant. The Mitsubishi industrial robot has six degrees of freedom, AC servo motors with absolute encoders for position detection, torque input on all six axes, and an RS-232C/USB interface. The robot's mass is 37 kg, and the upper-arm and forearm lengths are 245 mm and 275 mm, respectively. The maximum composite speed of the robot joints is 5500 mm/s.
6, Robust Pose Estimation from a Planar Target
Robust pose estimation from a planar target - 2006
https://ieeexplore.ieee.org/document/1717461
Abstract: In theory, the pose of a calibrated camera can be uniquely determined from a minimum of four coplanar but non-collinear points. In practice, there are many applications in which the camera pose is tracked from a planar target, and there are a number of recent pose estimation algorithms that perform this task in real time, but all of them suffer from pose ambiguities. In this paper we investigate the pose ambiguity for planar targets viewed by a perspective camera. We show that pose ambiguities, corresponding to two distinct local minima of the error function, exist even for cases with a wide-angle lens and a close-range target. We give a comprehensive interpretation of the two minima and derive an analytical solution that locates the second minimum. Based on this solution, we develop a new algorithm for unique and robust pose estimation from a planar target. In the experimental evaluation, this algorithm outperforms four state-of-the-art pose estimation algorithms.
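The two-minima ambiguity the paper analyzes can be observed with OpenCV's IPPE-based planar PnP, which returns both candidate poses with their reprojection errors. The sketch below assumes OpenCV 4.x and uses made-up intrinsics and point coordinates; it is an illustration of the ambiguity, not the paper's algorithm.

```python
import cv2
import numpy as np

# Assumed intrinsics and a small square planar target (illustrative).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
obj = np.array([[-0.05, -0.05, 0.0], [0.05, -0.05, 0.0],
                [0.05, 0.05, 0.0], [-0.05, 0.05, 0.0]], dtype=np.float32)
img = np.array([[300.0, 210.0], [350.0, 215.0],
                [348.0, 268.0], [298.0, 262.0]], dtype=np.float32)

# IPPE for planar targets returns up to two pose hypotheses,
# corresponding to the two local minima of the reprojection error.
n, rvecs, tvecs, errs = cv2.solvePnPGeneric(
    obj, img, K, None, flags=cv2.SOLVEPNP_IPPE)
for rvec, tvec, err in zip(rvecs, tvecs, errs):
    print("reprojection error:", err, "tvec:", tvec.ravel())
```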
7, Low-cost vision-based 6-DOF MAV localization using IR beacons - 2013
Low-Cost Vision-Based 6-DOF MAV Localization Using IR Beacons
https://ieeexplore.ieee.org/document/6584225
Abstract: Autonomous operation of micro aerial vehicles (MAVs) is a challenging research area that has attracted much interest in recent years. In particular, accurate localization of the MAV is a problem that matters most during tasks requiring high precision, such as indoor flight and landing. Many existing solutions are either inaccurate, heavy, or expensive. We propose a low-cost vision-based solution to the MAV localization problem. An infrared tracking sensor with on-board visual processing detects infrared markers, and a point-based pose estimation algorithm provides 6-DOF localization at a high rate. The system's performance is compared against an inertial measurement unit (IMU) and against external stereo-vision measurements used as ground truth. We show that the system produces accurate 6-DOF estimates at low cost, with low weight and low computational expense, and that these estimates can be used directly for control. Our solution therefore provides a viable approach to MAV localization.
8, Augmented marker tracking for peri-acetabular osteotomy surgery - 2017
Augmented Marker Tracking for Peri-acetabular Osteotomy Surgery
https://ieeexplore.ieee.org/document/8036979
Abstract: We developed and validated a small, easy-to-use, and cost-effective hybrid navigation system based on an augmented marker for peri-acetabular osteotomy (PAO) surgery. The hybrid system consists of a tracking unit placed directly on the patient's pelvis, an augmented marker with an integrated IMU attached to the patient's acetabular fragment, and a host computer. The tracking unit sends a live video stream of the marker to the host computer, where the marker pose is estimated. The augmented marker's IMU-based pose estimate is also sent to the host computer, where sensor fusion is used to compute the final marker pose estimate. The host computer then tracks the orientation of the acetabular fragment during the peri-acetabular osteotomy. Anatomical registration is performed with a previously developed registration device, and a Kalman-filter-based sensor fusion completes the system. A plastic-bone study was conducted to compare the proposed system against a navigation system based on an optical tracking system; the mean absolute differences in anteversion and inclination were 1.63 degrees and 1.55 degrees, respectively. The results show that our system can accurately measure the orientation of the acetabular fragment.
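The camera/IMU orientation fusion described here can be illustrated with a scalar Kalman-style update on one orientation angle: the IMU gyro propagates the angle between frames and the marker-based estimate corrects it. This is a generic one-dimensional sketch with assumed noise values, not the system's actual filter.

```python
import numpy as np

class TiltFusion:
    """1-D Kalman filter fusing a gyro-integrated angle (prediction)
    with marker-based angle measurements (correction)."""

    def __init__(self, q=1e-4, r=1e-2):
        self.angle = 0.0   # fused angle estimate [rad]
        self.P = 1.0       # estimate variance
        self.q = q         # process noise (gyro drift), assumed value
        self.r = r         # measurement noise (marker pose), assumed value

    def predict(self, gyro_rate, dt):
        self.angle += gyro_rate * dt
        self.P += self.q

    def update(self, marker_angle):
        K = self.P / (self.P + self.r)                # Kalman gain
        self.angle += K * (marker_angle - self.angle)
        self.P *= (1.0 - K)
        return self.angle

# Example: a slow rotation observed by both sensors.
f = TiltFusion()
for k in range(5):
    f.predict(gyro_rate=0.02, dt=0.05)
    fused = f.update(marker_angle=0.001 * k + np.random.normal(0, 0.01))
    print(round(fused, 4))
```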


Origin: blog.csdn.net/weixin_42598288/article/details/92996841