Improving quality of mesh generated from point cloud
Posted: Wed Jul 14, 2021 1:35 pm
by n.dafinski
Hello, I am pretty new to CloudCompare in general. I have a few point clouds of two rooms generated from a lidar scanner using a couple of different techniques. One of the point clouds is pretty sparse (600k points) and the other one pretty dense (18 million points), but they are both scans of exactly the same place, and I want to make the best possible mesh from them. For both clouds I followed the same workflow: computing normals with the octree radius set automatically, and then running Poisson Surface Reconstruction, but the results I get are pretty underwhelming.

I suppose there are some methods to "fix" the mesh after I've actually created it so it looks better, but I'm not sure what they are, so I'm posting here in the hope that you can point me in the right direction (or maybe there is just something wrong with the point clouds in general). I will attach some pictures of the process for both clouds and could also provide the point clouds if necessary.
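For reference, here is a minimal sketch of that same workflow (radius-based normals, Poisson reconstruction, then trimming the poorly supported vertices that Poisson extrapolates into empty space) done in Python with Open3D instead of the CloudCompare GUI. The file names, search radius, octree depth and 5% density cut-off are placeholder assumptions to tune per cloud, not values from the posts above.

    import numpy as np
    import open3d as o3d

    # Placeholder file name; any format Open3D reads (PLY, PCD, XYZ...) works.
    pcd = o3d.io.read_point_cloud("room_scan.ply")

    # Radius-based normal estimation, analogous to the octree-radius normals
    # computed in CloudCompare.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.10, max_nn=30))
    pcd.orient_normals_consistent_tangent_plane(30)  # make orientations coherent

    # Poisson surface reconstruction; 'densities' records how well each output
    # vertex is supported by the input points.
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=9)

    # "Fixing" the mesh afterwards: drop the least-supported vertices, which
    # removes the bubble-like surfaces Poisson tends to create in empty space.
    densities = np.asarray(densities)
    mesh.remove_vertices_by_mask(densities < np.quantile(densities, 0.05))
    o3d.io.write_triangle_mesh("room_mesh.ply", mesh)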
Re: Improving quality of mesh generated from point cloud
Posted: Thu Jul 15, 2021 9:48 pm
by daniel
Looking at the PoissonRecon surface, I think there's a big issue with the normals. It's pretty hard to get clean normals on such a cloud. Are there 2 rooms in this cloud? (this is even trickier).
Re: Improving quality of mesh generated from point cloud
Posted: Fri Jul 16, 2021 6:17 am
by n.dafinski
daniel wrote: ↑Thu Jul 15, 2021 9:48 pm
Looking at the PoissonRecon surface, I think there's a big issue with the normals. It's pretty hard to get clean normals on such a cloud. Are there 2 rooms in this cloud? (this is even trickier).
Hi Daniel, thanks for the reply. Yes, these are two rooms that are right next to each other. Most of my scans will be indoor building scans or scans of outdoor urban environments; do you have any tips on how best to compute normals in those cases to get better-quality meshes? Also, how important is point cloud density when creating meshes? Thanks again.
Re: Improving quality of mesh generated from point cloud
Posted: Fri Jul 16, 2021 4:23 pm
by WargodHernandez
One way to help the normal calculation would be to segment each room into an individual cloud and then compute the normals with a preferred orientation of Barycenter, which aligns the normals towards the center of the room or away from it, depending on whether you pick + or - Barycenter.
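If it helps, a small sketch of that barycenter-style orientation for one segmented room, again in Python with Open3D (the file name and search radius are assumptions; in CloudCompare the equivalent is the +/- Barycenter preferred orientation when computing normals):

    import numpy as np
    import open3d as o3d

    # One room segmented out of the full scan (placeholder file name).
    pcd = o3d.io.read_point_cloud("room_A.ply")
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.10, max_nn=30))

    # Point every normal at the room's barycenter: for a room scanned from the
    # inside, walls, floor and ceiling should face the interior.
    barycenter = np.asarray(pcd.points).mean(axis=0)
    pcd.orient_normals_towards_camera_location(barycenter)

    # Flip them if the outward orientation works better for your reconstruction:
    # pcd.normals = o3d.utility.Vector3dVector(-np.asarray(pcd.normals))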
Re: Improving quality of mesh generated from point cloud
Posted: Tue Jul 20, 2021 11:41 am
by daniel
And to reply to your other question: for building a mesh, the density is not important (it does, however, help a lot when computing robust normals ;).
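One place where the density does matter in practice is when choosing the neighbourhood radius for normal estimation. A sketch of deriving that radius from the cloud's own point spacing, so the sparse and the dense cloud each get a sensible value (the 3x factor is just a common rule of thumb, not a CloudCompare default, and the file name is a placeholder):

    import numpy as np
    import open3d as o3d

    pcd = o3d.io.read_point_cloud("sparse_room.ply")  # placeholder name

    # Estimate the average point spacing and scale the normal radius from it.
    nn_dist = np.asarray(pcd.compute_nearest_neighbor_distance())
    radius = 3.0 * float(nn_dist.mean())
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=30))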
Re: Improving quality of mesh generated from point cloud
Posted: Tue Jul 20, 2021 1:04 pm
by n.dafinski
Okay, thank you guys. I will play around with generating the normals and see if I can get better results, maybe by splitting the scan into smaller chunks first.