Hello ViSP community!

I've recently been working through an image-based visual servoing scheme with a UR3e (potentially a very important piece of information for my issue) and an Intel RealSense D435i. I followed the ViSP tutorials for the camera intrinsic and extrinsic parameters, then connected the robot via Ethernet. All of these steps were implemented successfully; I will leave some details below. The problem appears when I run either of the following commands.

With a constant gain:

$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot

or with task sequencing, which according to the tutorial is supposed to start the loop with zero velocity:

$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --task_sequencing

The program begins by warning me that the robot will move and that I should stay close to the emergency stop (which proved to be vital). The robot then moves to random positions that endanger both the camera and the robot, and each time I have to stop it before the visual servoing begins. When I approach the ArUco tags (while the robot is stopped), they are successfully detected and I can see the trajectory plots as shown in the tutorial, but obviously the robot cannot move at that point.

My question is the following:

- Why does this happen? Is it because the tutorial uses a UR5e while I use a UR3e, so that the trajectories being sent turn out to be incompatible with my robot? How can I fix this, since I can't change the code (or should I consider that option)?

Relevant information includes the camera parameters (camera.xml):

Camera name: Camera
Image size: 640 x 480
Model: perspectiveProjWithDistortion
px = 605.146728515625
py = 604.79150390625
u0 = 325.53253173828125
v0 = 244.95083618164062
kud = -0
kdu = 0

and the homogeneous transformation ur_eMc:

0.9999746296 -0.006893637974 0.0017938755 -0.03641733936
0.006886289246 0.9999680025 0.004070994597 0.04819959709
-0.001821882063 -0.004058538169 0.9999901045 0.03156053558
0 0 0 1

Any help will be welcomed!
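Edit: for completeness, here is a minimal sketch of how I read these parameters back in ViSP to double-check them (the file name and camera name match my camera.xml above):

#include <cstdlib>
#include <iostream>

#include <visp3/core/vpCameraParameters.h>
#include <visp3/core/vpXmlParserCamera.h>

int main()
{
  vpCameraParameters cam;
  vpXmlParserCamera parser;

  // Read the "Camera" entry for a 640x480 image with the distortion model
  if (parser.parse(cam, "camera.xml", "Camera",
                   vpCameraParameters::perspectiveProjWithDistortion,
                   640, 480) != vpXmlParserCamera::SEQUENCE_OK) {
    std::cout << "Cannot parse camera.xml" << std::endl;
    return EXIT_FAILURE;
  }
  cam.printParameters(); // Should print the px, py, u0, v0, kud, kdu values above
  return EXIT_SUCCESS;
}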
Camera intrinsic parameters and the eMc transformation seem consistent to me.
The first thing to do is to test the joint velocity controller.
If it is working, you can continue by testing the Cartesian velocity controller in the end-effector frame.
Both are implemented in ViSP in the vpRobotUniversalRobots class.
To this end, there are several tests in the modules/robot/ folder that you can use and adapt (see here).
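To give you an idea, here is a minimal sketch of what such a joint velocity test does (assuming ViSP is built with the ur_rtde 3rd party; the IP address 192.168.0.100 is a placeholder for your robot's):

#include <cstdlib>
#include <iostream>

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMath.h>
#include <visp3/core/vpTime.h>
#include <visp3/robot/vpRobotUniversalRobots.h>

int main()
{
#if defined(VISP_HAVE_UR_RTDE)
  try {
    vpRobotUniversalRobots robot("192.168.0.100"); // Placeholder IP, set yours

    // Switch the controller to joint velocity control
    robot.setRobotState(vpRobot::STATE_VELOCITY_CONTROL);

    // Apply a small velocity on the last joint for 2 seconds
    vpColVector qdot(6, 0.0);
    qdot[5] = vpMath::rad(5.0); // 5 deg/s

    double t_start = vpTime::measureTimeMs();
    while (vpTime::measureTimeMs() - t_start < 2000.0) {
      robot.setVelocity(vpRobot::JOINT_STATE, qdot);
      vpTime::wait(10.0);
    }

    robot.setRobotState(vpRobot::STATE_STOP);
  }
  catch (const vpException &e) {
    std::cout << "Catch exception: " << e.getStringMessage() << std::endl;
  }
#endif
  return EXIT_SUCCESS;
}

Once this behaves safely, the same pattern with vpRobot::END_EFFECTOR_FRAME exercises the Cartesian velocity controller.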
Thanks a lot! At least I now know that the calibration is not the problem. I tried testUniversalRobotsJointVelocity; here is what I get:
"Move to joint position: 0 -1.570796327 1.570796327 -1.570796327 -1.570796327 0
Apply joint vel in a loop for 4 sec: 0 0 0 0 -0.3490658504 0.3490658504
Change the control mode from stop to velocity control"
and for testUniversalRobotsCartVelocity I get the same response.
At this point I hit the emergency stop button to protect the RealSense mounted on the robot. So yes, it works, but I guess the test positions are very unsafe for the UR3, and the problem is that I can't open the .cpp files to edit the positions in the code on my Ubuntu 20.04 computer. I tried another strategy: using eMc directly, multiplying it with the camera-frame positions of the detected object (detected by YOLO and Mask R-CNN) to obtain, technically, the equivalent positions in the end-effector frame. This still doesn't work; I hit singularities all the time. (I run this program in Python, sending the results directly to the UR3 as URScript over Ethernet.)
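For reference, here is a minimal sketch in ViSP/C++ of the transformation my Python program computes, using the calibrated eMc above (the object position in the camera frame is made up for illustration):

#include <cstdlib>
#include <iostream>

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpHomogeneousMatrix.h>

int main()
{
  // Calibrated eMc from above: pose of the camera frame in the end-effector frame
  double data[16] = { 0.9999746296,   -0.006893637974, 0.0017938755,   -0.03641733936,
                      0.006886289246,  0.9999680025,   0.004070994597,  0.04819959709,
                     -0.001821882063, -0.004058538169, 0.9999901045,    0.03156053558,
                      0.0,             0.0,            0.0,             1.0 };
  vpHomogeneousMatrix eMc;
  for (unsigned int i = 0; i < 4; ++i)
    for (unsigned int j = 0; j < 4; ++j)
      eMc[i][j] = data[i * 4 + j];

  // Made-up object position in the camera frame (meters), e.g. from the detector
  vpColVector p_c(4);
  p_c[0] = 0.05; p_c[1] = -0.02; p_c[2] = 0.30; p_c[3] = 1.0; // homogeneous coords

  // Same point expressed in the end-effector frame: p_e = eMc * p_c
  vpColVector p_e = eMc * p_c;
  std::cout << "Point in end-effector frame: " << p_e.t() << std::endl;
  return EXIT_SUCCESS;
}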