3D Point Cloud Registration and Stitching (MATLAB)
We use MATLAB to stitch together multiple point clouds and build a 3D reconstruction of the scene with the iterative closest point (ICP) algorithm. The data used here are real indoor scans captured with a Kinect camera.
Registering two point clouds
% Load a sequence of Kinect point clouds stored as a cell array of pointCloud objects.
dataFile = fullfile(toolboxdir('vision'), 'visiondata', 'livingRoom.mat');
load(dataFile);
% Extract two consecutive point clouds and use the first one as the reference.
ptCloudRef = livingRoomData{1};
ptCloudCurrent = livingRoomData{2};
The registration result depends strongly on how the data is denoised and on the initial ICP settings, so improving the denoising step directly improves the registration accuracy. Here we denoise and reduce the data by downsampling with a box grid filter: the point cloud is divided into small grid cells and the points inside each cell are averaged into a single point. This produces a much smaller data set, which reduces the computational cost while retaining enough detail for accurate registration.
gridSize = 0.1;
fixed = pcdownsample(ptCloudRef, 'gridAverage', gridSize);
moving = pcdownsample(ptCloudCurrent, 'gridAverage', gridSize);
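To make concrete what the 'gridAverage' option does, here is a minimal sketch of the same idea in plain MATLAB. It assumes pts is an N-by-3 matrix of XYZ coordinates (for example taken from a point cloud's Location property); the sketch is illustrative only and is not part of the original example.
% Illustration only: a hand-rolled version of the box-grid ("gridAverage") filter.
% pts is assumed to be an N-by-3 matrix of XYZ coordinates.
pts = pts(all(isfinite(pts), 2), :);              % drop invalid (NaN) Kinect points
cellIdx = floor(pts ./ gridSize);                 % grid cell index of every point
[~, ~, ic] = unique(cellIdx, 'rows');             % group points by occupied cell
avgPts = [accumarray(ic, pts(:,1), [], @mean), ...
          accumarray(ic, pts(:,2), [], @mean), ...
          accumarray(ic, pts(:,3), [], @mean)];   % one averaged point per cell
ptSketch = pointCloud(avgPts);                    % comparable to the pcdownsample result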
We register the two downsampled point clouds with ICP and then fuse the overlapping regions. The algorithm estimates a rigid transformation (rotation plus translation) that aligns the moving cloud with the fixed one.
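For intuition about what ICP does internally, the following is a minimal sketch of a single point-to-point ICP iteration; the built-in registration used in this example solves a point-to-plane variant with extrapolation instead. P and Q are assumed to be N-by-3 and M-by-3 matrices of moving and fixed points, and pdist2 requires the Statistics and Machine Learning Toolbox.
% Sketch of a single point-to-point ICP iteration (illustration only).
D = pdist2(P, Q);                            % all pairwise distances
[~, nn] = min(D, [], 2);                     % nearest fixed point for each moving point
Qc = Q(nn, :);                               % current correspondences
muP = mean(P, 1);  muQ = mean(Qc, 1);        % centroids
[U, ~, V] = svd((P - muP)' * (Qc - muQ));    % cross-covariance decomposition
R = V * diag([1 1 sign(det(V*U'))]) * U';    % best-fit rotation (Kabsch solution)
t = muQ - muP * R';                          % best-fit translation
P = P * R' + t;                              % update and repeat until convergence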
The registration is computed with the ICP algorithm using a point-to-plane distance metric and extrapolation enabled; the resulting transformation is stored in tform and then applied to the original (full-resolution) current point cloud.
tform = pcregrigid(moving, fixed, 'Metric', 'pointToPlane', 'Extrapolate', true);
% (newer MATLAB releases provide pcregistericp with the same options)
ptCloudAligned = pctransform(ptCloudCurrent, tform);
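As a quick quality check (a sketch, not part of the original walkthrough), pcregrigid can also return the registered cloud and the root-mean-square error of the final alignment, which is useful when tuning gridSize or the ICP options:
% Optional sanity check on the registration quality (illustration only).
[tform, movingReg, rmse] = pcregrigid(moving, fixed, ...
    'Metric', 'pointToPlane', 'Extrapolate', true);
fprintf('ICP point-to-plane RMSE: %.4f m\n', rmse);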
After applying the transformation, we merge the aligned point cloud with the reference cloud to build up the world scene; overlapping points are fused on a box grid of size mergeSize (in meters).
mergeSize = 0.015;
ptCloudScene = pcmerge(ptCloudRef, ptCloudAligned, mergeSize);
% Visualize the input images.
figure
subplot(2,2,1)
imshow(ptCloudRef.Color)
title('First input image','Color','w')
drawnow
subplot(2,2,3)
imshow(ptCloudCurrent.Color)
title('Second input image','Color','w')
drawnow
% Visualize the world scene.
subplot(2,2,[2,4])
pcshow(ptCloudScene, 'VerticalAxis','Y', 'VerticalAxisDir', 'Down')
title('Initial world scene')
xlabel('X (m)')
ylabel('Y (m)')
zlabel('Z (m)')
drawnow
Stitching a sequence of point clouds
% Store the transformation object that accumulates the transformation.
accumTform = tform;
figure
hAxes = pcshow(ptCloudScene, 'VerticalAxis','Y', 'VerticalAxisDir', 'Down');
title('Updated world scene')
% Set the axes property for faster rendering
hAxes.CameraViewAngleMode = 'auto';
hScatter = hAxes.Children;
for i = 3:length(livingRoomData)
ptCloudCurrent = livingRoomData{i};
% Use previous moving point cloud as reference.
fixed = moving;
moving = pcdownsample(ptCloudCurrent, 'gridAverage', gridSize);
% Apply ICP registration: align the moving (downsampled current) cloud to the
% fixed (previous) cloud with the same point-to-plane metric as above.
tform = pcregrigid(moving, fixed, 'Metric', 'pointToPlane', 'Extrapolate', true);
% Transform the current point cloud to the reference coordinate system
% defined by the first point cloud.
accumTform = affine3d(tform.T * accumTform.T);
ptCloudAligned = pctransform(ptCloudCurrent, accumTform);
% Update the world scene.
ptCloudScene = pcmerge(ptCloudScene, ptCloudAligned, mergeSize);
% Visualize the world scene.
hScatter.XData = ptCloudScene.Location(:,1);
hScatter.YData = ptCloudScene.Location(:,2);
hScatter.ZData = ptCloudScene.Location(:,3);
hScatter.CData = ptCloudScene.Color;
drawnow('limitrate')
end
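The multiplication order in accumTform follows MATLAB's row-vector (post-multiply) convention: a point in the current frame is first mapped into the previous frame by tform and then into the global frame by the accumulated transform. The short sketch below checks this, with prevAccumTform standing in (hypothetically) for the value of accumTform before the update inside the loop.
% Sketch: verify the composition order (prevAccumTform is a hypothetical
% copy of accumTform taken before the update inside the loop).
p  = [0.5 0.2 1.0];                               % arbitrary test point, current frame
p1 = transformPointsForward(tform, p);            % current frame -> previous frame
p0 = transformPointsForward(prevAccumTform, p1);  % previous frame -> global frame
composed = affine3d(tform.T * prevAccumTform.T);  % the same mapping as one transform
assert(max(abs(p0 - transformPointsForward(composed, p))) < 1e-10);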
% During the recording, the Kinect was pointing downward. To visualize the
% result more easily, let's transform the data so that the ground plane is
% parallel to the X-Z plane.
angle = -pi/10;
A = [1,0,0,0;...
0, cos(angle), sin(angle), 0; ...
0, -sin(angle), cos(angle), 0; ...
0 0 0 1];
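Note that A is written for the post-multiply (row-vector) convention used by affine3d and pctransform, which is why its 3-by-3 block looks like the transpose of the textbook column-vector rotation about the X axis; a point p transforms as p*A, as this small sketch illustrates.
% Sketch: with the row-vector convention, a point one meter up the Y axis
% maps to [0 cos(angle) sin(angle)] after the tilt correction.
p = [0 1 0 1];          % homogeneous row vector
pOut = p * A;           % equals [0, cos(angle), sin(angle), 1]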
ptCloudScene = pctransform(ptCloudScene, affine3d(A));
pcshow(ptCloudScene, 'VerticalAxis','Y', 'VerticalAxisDir', 'Down', ...
'Parent', hAxes)
title('Updated world scene')
xlabel('X (m)')
ylabel('Y (m)')
zlabel('Z (m)')
