Editor: 新机器视觉 (New Machine Vision)
1. Overview
Modern robots carry a variety of sensors, and calibrating those sensors against each other is an important prerequisite for a robot to perceive its environment. Calibration here means determining the coordinate transformation between sensors. Because the sensors involved differ from case to case, there does not seem to be a single universal method.
Hand-eye calibration is a classic method for calibrating a camera against a robot arm, but the same idea also applies to other sensors, for example LiDAR-to-camera calibration in autonomous driving, as in the University of Tokyo paper "LiDAR and Camera Calibration using Motion Estimated by Sensor Fusion Odometry".
Hand-eye calibration boils down to a single core equation,
$$AX = XB,$$
where $X$ is the coordinate transformation between the hand (the robot end-effector) and the eye (the camera). The derivation of this equation is given below for the two typical ways the camera is mounted.
Let Base denote the robot base (which can be regarded as the world frame), End the robot end-effector, Camera the camera, and Object the calibration board.
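Throughout the derivations below, ${}^{A}T_{B}$ denotes the $4\times 4$ homogeneous transformation that expresses frame $B$ in frame $A$; it composes and inverts in the standard way:
$$
{}^{A}T_{B}=\begin{bmatrix} {}^{A}R_{B} & {}^{A}t_{B} \\ \mathbf{0}^{T} & 1 \end{bmatrix},\qquad
{}^{A}T_{C}={}^{A}T_{B}\,{}^{B}T_{C},\qquad
\big({}^{A}T_{B}\big)^{-1}={}^{B}T_{A}.
$$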
- Eye-In-Hand
In the Eye-In-Hand configuration the camera is mounted on the robot arm. The quantity to be solved for is the transformation between End and Camera, i.e. $X={}^{End}T_{Camera}$. In this configuration two transformations remain constant:
- the transformation between the camera and the end-effector never changes, i.e. ${}^{End}T_{Camera}$ is constant;
- the transformation between the calibration board and the robot base never changes, i.e. ${}^{Base}T_{Object}$ is constant.
Expanding ${}^{Base}T_{Object}={}^{Base}T_{End}\,{}^{End}T_{Camera}\,{}^{Camera}T_{Object}$ for the arm poses before and after one motion, and equating the two expressions, gives
$${}^{Base}T_{End_1}\;{}^{End}T_{Camera}\;{}^{Camera_1}T_{Object}={}^{Base}T_{End_2}\;{}^{End}T_{Camera}\;{}^{Camera_2}T_{Object}.$$
Writing
$$A=\big({}^{Base}T_{End_1}\big)^{-1}\,{}^{Base}T_{End_2},\qquad B={}^{Camera_1}T_{Object}\,\big({}^{Camera_2}T_{Object}\big)^{-1},\qquad X={}^{End}T_{Camera},$$
we obtain
$$AX=XB.$$
In other words, if the motion of the end-effector between the two poses ($A$) and the corresponding motion observed by the camera ($B$) can both be computed, then the end-effector-to-camera transformation $X$ can be solved for.
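For concreteness, here is a minimal sketch (not from the original article) of forming one such $(A,B)$ motion pair, assuming the poses are available as 4x4 cv::Mat matrices:

// Sketch only: build one (A, B) motion pair for the eye-in-hand case.
// T_base_end1, T_base_end2 : end-effector pose in the base frame at poses 1 and 2 (4x4, CV_64FC1)
// T_cam_obj1,  T_cam_obj2  : calibration-board pose in the camera frame at poses 1 and 2 (4x4, CV_64FC1)
#include <opencv2/opencv.hpp>

void eyeInHandMotionPair(const cv::Mat& T_base_end1, const cv::Mat& T_base_end2,
                         const cv::Mat& T_cam_obj1,  const cv::Mat& T_cam_obj2,
                         cv::Mat& A, cv::Mat& B)
{
    A = T_base_end1.inv() * T_base_end2;   // relative motion of the end-effector
    B = T_cam_obj1 * T_cam_obj2.inv();     // corresponding motion seen by the camera
}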
- Eye-To-Hand
In the Eye-To-Hand configuration the camera is mounted at a fixed position and the calibration board is held by the robot hand. The quantity to be solved for is the transformation between Base and Camera, i.e. $X={}^{Base}T_{Camera}$. In this configuration two transformations remain constant:
- the transformation between the camera and the robot base never changes, i.e. ${}^{Base}T_{Camera}$ is constant;
- the transformation between the calibration board and the end-effector never changes, i.e. ${}^{End}T_{Object}$ is constant.
Expanding ${}^{End}T_{Object}=\big({}^{Base}T_{End}\big)^{-1}\,{}^{Base}T_{Camera}\,{}^{Camera}T_{Object}$ for the arm poses before and after one motion, and equating the two expressions, gives
$$\big({}^{Base}T_{End_1}\big)^{-1}\,{}^{Base}T_{Camera}\,{}^{Camera}T_{Object_1}=\big({}^{Base}T_{End_2}\big)^{-1}\,{}^{Base}T_{Camera}\,{}^{Camera}T_{Object_2}.$$
Writing
$$A={}^{Base}T_{End_2}\,\big({}^{Base}T_{End_1}\big)^{-1},\qquad B={}^{Camera}T_{Object_2}\,\big({}^{Camera}T_{Object_1}\big)^{-1},\qquad X={}^{Base}T_{Camera},$$
we again obtain
$$AX=XB.$$
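Analogously (again only a sketch, not from the original article), the motion pair for the eye-to-hand case is formed with the inverses on the opposite sides:

// Sketch only: build one (A, B) motion pair for the eye-to-hand case.
// Note the inverses sit on the opposite sides compared to eye-in-hand.
#include <opencv2/opencv.hpp>

void eyeToHandMotionPair(const cv::Mat& T_base_end1, const cv::Mat& T_base_end2,
                         const cv::Mat& T_cam_obj1,  const cv::Mat& T_cam_obj2,
                         cv::Mat& A, cv::Mat& B)
{
    A = T_base_end2 * T_base_end1.inv();   // relative motion of the end-effector
    B = T_cam_obj2 * T_cam_obj1.inv();     // corresponding motion of the board in the camera
}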
This article focuses on the Tsai-Lenz method for the classic hand-eye calibration problem, following the paper "A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration", and provides a C++ implementation based on OpenCV.
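(For completeness: recent OpenCV releases, 4.1 and later, already ship a built-in hand-eye solver, cv::calibrateHandEye, with a CALIB_HAND_EYE_TSAI option. A minimal sketch of calling it is shown below, assuming the rotation and translation parts of the gripper and target poses have already been collected into vectors; the rest of this article implements the method from scratch.)

// Sketch only: OpenCV's built-in hand-eye solver (requires OpenCV >= 4.1).
// R_gripper2base / t_gripper2base : end-effector poses in the robot base frame
// R_target2cam   / t_target2cam   : calibration-board poses in the camera frame
#include <opencv2/opencv.hpp>
#include <vector>

cv::Mat handEyeWithOpenCV(const std::vector<cv::Mat>& R_gripper2base,
                          const std::vector<cv::Mat>& t_gripper2base,
                          const std::vector<cv::Mat>& R_target2cam,
                          const std::vector<cv::Mat>& t_target2cam)
{
    cv::Mat R_cam2gripper, t_cam2gripper;
    cv::calibrateHandEye(R_gripper2base, t_gripper2base,
                         R_target2cam, t_target2cam,
                         R_cam2gripper, t_cam2gripper,
                         cv::CALIB_HAND_EYE_TSAI);
    cv::Mat X = cv::Mat::eye(4, 4, CV_64FC1);   // assemble X = [R t; 0 1]
    R_cam2gripper.copyTo(X(cv::Rect(0, 0, 3, 3)));
    t_cam2gripper.copyTo(X(cv::Rect(3, 0, 1, 3)));
    return X;
}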
2. The Eye-in-Hand Hand-Eye Calibration Problem
In robot calibration and measurement, robot hand-eye coordination, and robot-assisted measurement, one needs to know the relationship between the coordinate frame of the robot end-effector (gripper) and the coordinate frame of a sensor, for example a camera mounted on the robot hand that measures the position and orientation of targets in 3D space. Determining this transformation is what the robotics community calls hand-eye calibration.
In the hand-eye calibration system, let $H_{gij}$ denote the homogeneous transformation describing the relative pose between the end-effector frames at two robot poses $i$ and $j$; $H_{cij}$ the homogeneous transformation describing the relative pose between the corresponding camera frames; and $H_{cg}$ the homogeneous transformation describing the pose of the camera relative to the end-effector.
After composing the coordinate transformations, $H_{gij}$, $H_{cij}$ and $H_{cg}$ satisfy the $AX=XB$ relation described above:
$$H_{gij}\,H_{cg}=H_{cg}\,H_{cij}.$$
Expanding this equation yields the basic equations of hand-eye calibration:
$$R_{gij}\,R_{cg}=R_{cg}\,R_{cij},$$
$$(R_{gij}-I)\,T_{cg}=R_{cg}\,T_{cij}-T_{gij}.$$
The hand-eye calibration problem is therefore reduced to solving this system of equations for $R_{cg}$ and $T_{cg}$, which is done below following the Tsai paper.
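For reference, multiplying out the $4\times 4$ blocks shows where these two equations come from:
$$
H_{gij}H_{cg}=\begin{bmatrix} R_{gij} & T_{gij}\\ 0 & 1\end{bmatrix}\begin{bmatrix} R_{cg} & T_{cg}\\ 0 & 1\end{bmatrix}
=\begin{bmatrix} R_{gij}R_{cg} & R_{gij}T_{cg}+T_{gij}\\ 0 & 1\end{bmatrix},
$$
$$
H_{cg}H_{cij}=\begin{bmatrix} R_{cg} & T_{cg}\\ 0 & 1\end{bmatrix}\begin{bmatrix} R_{cij} & T_{cij}\\ 0 & 1\end{bmatrix}
=\begin{bmatrix} R_{cg}R_{cij} & R_{cg}T_{cij}+T_{cg}\\ 0 & 1\end{bmatrix};
$$
equating the rotation blocks gives the first equation, and equating the translation blocks, $R_{gij}T_{cg}+T_{gij}=R_{cg}T_{cij}+T_{cg}$, and rearranging gives the second.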
3. The "Two-Step" Hand-Eye Calibration Method
The basic equations are usually solved with a "two-step" approach: first solve the rotation equation (the first of the basic equations) for $R_{cg}$, then substitute it into the translation equation (the second) to solve for $T_{cg}$. The Tsai paper introduces the rotation-axis/rotation-angle representation to solve this system; the detailed derivation can be found in the original paper, so only the computational steps are summarized here:
Step 1: Use the Rodrigues transform to convert the rotation matrices $R_{gij}$ and $R_{cij}$ into rotation vectors $r_{gij}$ and $r_{cij}$, with rotation angles $\theta_{gij}=\lVert r_{gij}\rVert$ and $\theta_{cij}=\lVert r_{cij}\rVert$.
Step 2: Normalize the rotation vectors: $\bar r_{gij}=r_{gij}/\theta_{gij}$, $\bar r_{cij}=r_{cij}/\theta_{cij}$.
Step 3: Represent the pose change with modified Rodrigues parameters: $P_{gij}=2\sin(\theta_{gij}/2)\,\bar r_{gij}$, $P_{cij}=2\sin(\theta_{cij}/2)\,\bar r_{cij}$.
Step 4: Compute the initial rotation vector $P'_{cg}$ as the least-squares solution of
$$\operatorname{skew}(P_{gij}+P_{cij})\,P'_{cg}=P_{cij}-P_{gij},$$
where skew denotes the skew-symmetric (cross-product) operator: for a 3-vector $V=[v_x;v_y;v_z]$, its skew-symmetric matrix is
$$\operatorname{skew}(V)=\begin{bmatrix}0&-v_z&v_y\\ v_z&0&-v_x\\ -v_y&v_x&0\end{bmatrix}.$$
Step 5: Compute the rotation vector $P_{cg}$:
$$P_{cg}=\frac{2P'_{cg}}{\sqrt{1+\lVert P'_{cg}\rVert^{2}}}.$$
Step 6: Compute the rotation matrix $R_{cg}$:
$$R_{cg}=\Big(1-\tfrac{\lVert P_{cg}\rVert^{2}}{2}\Big)I+\tfrac{1}{2}\Big(P_{cg}P_{cg}^{T}+\sqrt{4-\lVert P_{cg}\rVert^{2}}\,\operatorname{skew}(P_{cg})\Big).$$
Step 7: Compute the translation vector $T_{cg}$ as the least-squares solution of
$$(R_{gij}-I)\,T_{cg}=R_{cg}\,T_{cij}-T_{gij}.$$
(With several robot motions, the systems of Step 4 and Step 7 are stacked over all motion pairs $ij$ and solved by least squares.)
4. MATLAB Source Code
Based on the computational steps above, the MATLAB implementation is as follows:
Code 1: the tsai function (solves AX = XB)
function X = tsai(A,B)
% Calculates the least squares solution of
% AX = XB
%
% A New Technique for Fully Autonomous
% and Efficient 3D Robotics Hand/Eye Calibration
% Lenz Tsai
%
% Mili Shah
% July 2014
[m,n] = size(A); n = n/4;
S = zeros(3*n,3);
v = zeros(3*n,1);
%Calculate best rotation R
for i = 1:n
A1 = logm(A(1:3,4*i-3:4*i-1));
B1 = logm(B(1:3,4*i-3:4*i-1));
a = [A1(3,2) A1(1,3) A1(2,1)]'; a = a/norm(a);
b = [B1(3,2) B1(1,3) B1(2,1)]'; b = b/norm(b);
S(3*i-2:3*i,:) = skew(a+b);
v(3*i-2:3*i,:) = a-b;
end
x = S\v;
theta = 2*atan(norm(x));
x = x/norm(x);
R = (eye(3)*cos(theta) + sin(theta)*skew(x) + (1-cos(theta))*x*x')';
%Calculate best translation t
C = zeros(3*n,3);
d = zeros(3*n,1);
I = eye(3);
for i = 1:n
C(3*i-2:3*i,:) = I - A(1:3,4*i-3:4*i-1);
d(3*i-2:3*i,:) = A(1:3,4*i)-R*B(1:3,4*i);
end
t = C\d;
%Put everything together to form X
X = [R t;0 0 0 1];
Code 2: the skew function (builds the skew-symmetric matrix of a vector)
function Sk = skew( x )
%SKEW Return the 3x3 skew-symmetric (cross-product) matrix of the 3-vector x.
Sk = [0,-x(3),x(2);x(3),0,-x(1);-x(2),x(1),0];
end
Code 3: script computing the hand-eye matrix
clc
clear
close all
%%%%%%%%%%%%%%%%%%%%%%% T6 matrix parameters %%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%% Transformation of the robot end-effector w.r.t. the robot base frame at pose 1
Pose1=[1141.243,-15.261,-97.721,178.91,0.47,92.37];
Px = Pose1(1);
Py = Pose1(2);
Pz = Pose1(3);
rota = Pose1(4)*pi/180;
rotb = Pose1(5)*pi/180;
rotc = Pose1(6)*pi/180;
Rx = [1 0 0; 0 cos(rota) -sin(rota); 0 sin(rota) cos(rota)];
Ry = [cos(rotb) 0 sin(rotb); 0 1 0; -sin(rotb) 0 cos(rotb)];
Rz = [cos(rotc) -sin(rotc) 0; sin(rotc) cos(rotc) 0; 0 0 1];
R1 = Rz*Ry*Rx;
T1= [Px Py Pz]';
%%%%%%%%%% Transformation of the robot end-effector w.r.t. the robot base frame at pose 2
Pose2=[1103.946,-163.910,-107.673,-160.90,-0.14,-91.62];
Px = Pose2(1);
Py = Pose2(2);
Pz = Pose2(3);
rota = Pose2(4)*pi/180;
rotb = Pose2(5)*pi/180;
rotc = Pose2(6)*pi/180;
Rx = [1 0 0; 0 cos(rota) -sin(rota); 0 sin(rota) cos(rota)];
Ry = [cos(rotb) 0 sin(rotb); 0 1 0; -sin(rotb) 0 cos(rotb)];
Rz = [cos(rotc) -sin(rotc) 0; sin(rotc) cos(rotc) 0; 0 0 1];
R2 = Rz*Ry*Rx;
T2= [Px Py Pz]';
%%%%%%%%%% Transformation of the robot end-effector w.r.t. the robot base frame at pose 3
Pose3=[1073.714,2.669,-142.448,-142.86,0.84,-178.55];
Px = Pose3(1);
Py = Pose3(2);
Pz = Pose3(3);
rota = Pose3(4)*pi/180;
rotb = Pose3(5)*pi/180;
rotc = Pose3(6)*pi/180;
Rx = [1 0 0; 0 cos(rota) -sin(rota); 0 sin(rota) cos(rota)];
Ry = [cos(rotb) 0 sin(rotb); 0 1 0; -sin(rotb) 0 cos(rotb)];
Rz = [cos(rotc) -sin(rotc) 0; sin(rotc) cos(rotc) 0; 0 0 1];
R3 = Rz*Ry*Rx;
T3= [Px Py Pz]';
%%%%%%%%% Transformations of the robot end-effector w.r.t. the base frame at poses 1, 2 and 3
T61=[R1 T1;0 0 0 1] ;
T62=[R2 T2;0 0 0 1];
T63=[R3 T3;0 0 0 1];
%%%%%% Camera extrinsic matrices (planar target expressed in the camera frame) %%%%%%%%
Extrinsic1=[0.051678,-0.998634,0.007660,21.747985;
-0.998617,-0.051600,0.010060,27.391246;
-0.009651,-0.008169,-0.999920,319.071378]; %%% 3x4 matrix
Extrinsic2=[0.014949,0.999738,0.017361,-35.869608
0.949779,-0.019626,0.312304,-20.701811
0.312563,0.011821,-0.949823,306.463155];
Extrinsic3=[0.999176,0.039246,0.010343,-26.361812
0.025037,-0.796606,0.603980,20.533884
0.031943,-0.603223,-0.796933,318.110756];
%%%%%%%
TC1=[Extrinsic1; 0 0 0 1];
TC2=[Extrinsic2; 0 0 0 1];
TC3=[Extrinsic3; 0 0 0 1];
TL1=inv(T61)*T62;
TL2=inv(T62)*T63;
TR1=TC1*inv(TC2);
TR2=TC2*inv(TC3);
A=[TL1,TL2]
B=[TR1,TR2]
X= tsai(A,B)
The results are:
A =
-0.9976 0.0676 -0.0173 -146.8929 0.0535 -0.7980 0.6003 -165.7422
-0.0697 -0.9488 0.3082 -43.6165 0.9483 0.2289 0.2197 44.2528
0.0044 0.3087 0.9512 10.3295 -0.3127 0.5575 0.7690 21.0485
0 0 0 1.0000 0 0 0 1.0000
B =
-0.9975 0.0711 -0.0029 -11.6622 0.0544 -0.7855 -0.6164 177.7842
-0.0663 -0.9443 -0.3223 104.2345 0.9515 0.2280 -0.2067 65.4536
-0.0257 -0.3213 0.9466 21.3907 0.3029 -0.5753 0.7598 84.5619
0 0 0 1.0000 0 0 0 1.0000
X =
-0.9998 0.0187 -0.0076 -78.8694
-0.0187 -0.9998 -0.0073 14.2711
-0.0078 -0.0071 0.9999 -124.6709
0 0 0 1.0000
5. C++ Source Code
Based on the OpenCV 2.0 open-source library, a C++ program implementing the Tsai hand-eye calibration method was written; its functions are listed below:
Code 1: the Tsai_HandEye function (solves AX = XB)
Mat skew(Mat A);  // forward declaration; skew is defined in Code 2 below
void Tsai_HandEye(Mat Hcg, vector<Mat> Hgij, vector<Mat> Hcij)
{
CV_Assert(Hgij.size() == Hcij.size());
int nStatus = Hgij.size();
Mat Rgij(3, 3, CV_64FC1);
Mat Rcij(3, 3, CV_64FC1);
Mat rgij(3, 1, CV_64FC1);
Mat rcij(3, 1, CV_64FC1);
double theta_gij;
double theta_cij;
Mat rngij(3, 1, CV_64FC1);
Mat rncij(3, 1, CV_64FC1);
Mat Pgij(3, 1, CV_64FC1);
Mat Pcij(3, 1, CV_64FC1);
Mat tempA(3, 3, CV_64FC1);
Mat tempb(3, 1, CV_64FC1);
Mat A;
Mat b;
Mat pinA;
Mat Pcg_prime(3, 1, CV_64FC1);
Mat Pcg(3, 1, CV_64FC1);
Mat PcgTrs(1, 3, CV_64FC1);
Mat Rcg(3, 3, CV_64FC1);
Mat eyeM = Mat::eye(3, 3, CV_64FC1);
Mat Tgij(3, 1, CV_64FC1);
Mat Tcij(3, 1, CV_64FC1);
Mat tempAA(3, 3, CV_64FC1);
Mat tempbb(3, 1, CV_64FC1);
Mat AA;
Mat bb;
Mat pinAA;
Mat Tcg(3, 1, CV_64FC1);
for (int i = 0; i < nStatus; i++)
{
Hgij[i](Rect(0, 0, 3, 3)).copyTo(Rgij);
Hcij[i](Rect(0, 0, 3, 3)).copyTo(Rcij);
Rodrigues(Rgij, rgij);
Rodrigues(Rcij, rcij);
theta_gij = norm(rgij);
theta_cij = norm(rcij);
rngij = rgij / theta_gij;
rncij = rcij / theta_cij;
Pgij = 2 * sin(theta_gij / 2)*rngij;
Pcij = 2 * sin(theta_cij / 2)*rncij;
tempA = skew(Pgij + Pcij);
tempb = Pcij - Pgij;
A.push_back(tempA);
b.push_back(tempb);
}
//Compute rotation
invert(A, pinA, DECOMP_SVD);
Pcg_prime = pinA * b;
Pcg = 2 * Pcg_prime / sqrt(1 + norm(Pcg_prime) * norm(Pcg_prime));
PcgTrs = Pcg.t();
Rcg = (1 - norm(Pcg) * norm(Pcg) / 2) * eyeM + 0.5 * (Pcg * PcgTrs + sqrt(4 - norm(Pcg)*norm(Pcg))*skew(Pcg));
//Compute translation
for (int i = 0; i < nStatus; i++)
{
Hgij[i](Rect(0, 0, 3, 3)).copyTo(Rgij);
Hcij[i](Rect(0, 0, 3, 3)).copyTo(Rcij);
Hgij[i](Rect(3, 0, 1, 3)).copyTo(Tgij);
Hcij[i](Rect(3, 0, 1, 3)).copyTo(Tcij);
tempAA = Rgij - eyeM;
tempbb = Rcg * Tcij - Tgij;
AA.push_back(tempAA);
bb.push_back(tempbb);
}
invert(AA, pinAA, DECOMP_SVD);
Tcg = pinAA * bb;
Rcg.copyTo(Hcg(Rect(0, 0, 3, 3)));
Tcg.copyTo(Hcg(Rect(3, 0, 1, 3)));
Hcg.at<double>(3, 0) = 0.0;
Hcg.at<double>(3, 1) = 0.0;
Hcg.at<double>(3, 2) = 0.0;
Hcg.at<double>(3, 3) = 1.0;
}
Code 2: the skew function (converts a 3x1 vector into its 3x3 skew-symmetric matrix)
Mat skew(Mat A)
{
CV_Assert(A.cols == 1 && A.rows == 3);
Mat B(3, 3, CV_64FC1);
B.at<double>(0, 0) = 0.0;
B.at<double>(0, 1) = -A.at<double>(2, 0);
B.at<double>(0, 2) = A.at<double>(1, 0);
B.at<double>(1, 0) = A.at<double>(2, 0);
B.at<double>(1, 1) = 0.0;
B.at<double>(1, 2) = -A.at<double>(0, 0);
B.at<double>(2, 0) = -A.at<double>(1, 0);
B.at<double>(2, 1) = A.at<double>(0, 0);
B.at<double>(2, 2) = 0.0;
return B;
}
Code 3: main program computing the hand-eye matrix
#include <opencv2/opencv.hpp> // header file
#include <stdio.h>
#include <iostream>
#include <vector>
using namespace std;
using namespace cv; // use the cv namespace
int main()
{
// End-effector motions A1, A2 (the matrix A computed by the MATLAB script above)
double a1[4][4] = { -0.9976,  0.0676, -0.0173, -146.8929,
                    -0.0697, -0.9488,  0.3082,  -43.6165,
                     0.0044,  0.3087,  0.9512,   10.3295,
                     0,       0,       0,         1.0000 };
double a2[4][4] = {  0.0535, -0.7980,  0.6003, -165.7422,
                     0.9483,  0.2289,  0.2197,   44.2528,
                    -0.3127,  0.5575,  0.7690,   21.0485,
                     0,       0,       0,         1 };
// Camera motions B1, B2 (the matrix B computed by the MATLAB script above)
double b1[4][4] = { -0.9975,  0.0711, -0.0029,  -11.6622,
                    -0.0663, -0.9443, -0.3223,  104.2345,
                    -0.0257, -0.3213,  0.9466,   21.3907,
                     0,       0,       0,         1 };
double b2[4][4] = {  0.0544, -0.7855, -0.6164,  177.7842,
                     0.9515,  0.2280, -0.2067,   65.4536,
                     0.3029, -0.5753,  0.7598,   84.5619,
                     0,       0,       0,         1 };
Mat A1(4, 4, CV_64FC1, a1);
Mat A2(4, 4, CV_64FC1, a2);
Mat B1(4, 4, CV_64FC1, b1);
Mat B2(4, 4, CV_64FC1, b2);
vector<Mat> Hgij;
vector<Mat> Hcij;
Hgij.push_back(A1);
Hgij.push_back(A2);
Hcij.push_back(B1);
Hcij.push_back(B2);
Mat Hcg1(4, 4, CV_64FC1);
Tsai_HandEye(Hcg1, Hgij, Hcij);
cout << "Hcg1 = " << endl << Hcg1 << endl; // print the resulting hand-eye matrix
return 0;
}
The program prints the resulting hand-eye matrix Hcg1, which should agree closely with the MATLAB result X shown above.
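As a quick sanity check (a sketch, assuming the Hgij/Hcij vectors and namespaces from Code 3 are in scope), any solution X can be verified by measuring how well it satisfies AX = XB for every motion pair:

// Sketch: worst-case residual of the hand-eye equation A*X - X*B over all motion pairs.
// A small value (relative to the translation magnitudes) indicates a consistent solution.
double handEyeResidual(const Mat& X, const vector<Mat>& Hgij, const vector<Mat>& Hcij)
{
    double worst = 0.0;
    for (size_t i = 0; i < Hgij.size(); i++)
    {
        Mat err = Hgij[i] * X - X * Hcij[i];
        worst = std::max(worst, norm(err, NORM_INF));
    }
    return worst;
}

Calling handEyeResidual(Hcg1, Hgij, Hcij) at the end of main gives a single number summarizing how consistently the computed hand-eye matrix explains both motion sequences.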
References:
Calibration and Registration Techniques for Robotics, http://math.loyola.edu/~mili/Calibration/index.html
OpenCV implementation of the classic Tsai-Lenz hand-eye calibration algorithm (经典手眼标定算法之Tsai-Lenz的OpenCV实现), https://blog.csdn.net/YunlinWang/article/details/51622143?depth_1-utm_source=distribute.pc_relevant.none-task-blog-BlogCommendFromBaidu-7&utm_source=distribute.pc_relevant.none-task-blog-BlogCommendFromBaidu-7
The principle of hand-eye calibration AX = XB (手眼标定AX=XB原理), https://zhuanlan.zhihu.com/p/103749589
This article is shared for academic purposes only; if it infringes any rights, please contact us to have it removed.