
Cloud to mesh - am I expecting too much from mesh?

Posted: Mon Aug 26, 2024 7:36 pm
by senmay
Hello,

I have a point cloud containing grating floors with railings, and I'm trying to create a nice-looking mesh, but after a few hours of playing around with the options I can't get a satisfactory result. I'm quite new to CloudCompare, and I was wondering if I'm missing something. Could it be related to improperly computed normals (local surface = triangulation, orientation using a minimum spanning tree with knn=8)? I've tried different settings, but none give a satisfying result, and beyond octree depth 11 the difference is barely noticeable. Is it possible to achieve better results in this case? I was thinking about segmenting the different objects and working from there, but I'd prefer to avoid that if possible.
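
For reference, here is roughly the same normals step expressed with the Open3D Python library (just an illustration of what I'm doing in the CloudCompare GUI, not CloudCompare's own API; the file name is a placeholder):

Code:

    import open3d as o3d

    # Load the cloud (placeholder file name).
    pcd = o3d.io.read_point_cloud("grating_floor.ply")

    # Estimate normals from the 8 nearest neighbours...
    pcd.estimate_normals(search_param=o3d.geometry.KDTreeSearchParamKNN(knn=8))

    # ...then propagate a consistent orientation over a minimum spanning tree.
    pcd.orient_normals_consistent_tangent_plane(k=8)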

Attachments:

1. Point cloud
2. Point cloud with normals
3. Mesh: normals computed as above (local surface = triangulation, orientation using a minimum spanning tree with knn=8), reconstructed with PoissonRecon at octree depth = 12 (default advanced options) and filtered using the output density scalar field.
Cheers.

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Tue Aug 27, 2024 8:35 am
by daniel
Yes, it's all about the normals. Computing properly oriented normals on railings or other small features can be really challenging: you would need a lot of points on 'both' sides. You could maybe achieve that by segmenting the cloud into several pieces so that you can force the right normal orientation for each of them, but that can be very tedious.

The best way, generally, is either to import these normals or to use a structured file format with 'scan grids' (such as Faro or some E57 files), in which case CC will be able to resolve the normal orientation robustly.
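
As a side note, if you want to check whether an E57 export actually carries per-scan sensor poses, here is a quick sketch using the pye57 Python package (not part of CloudCompare; the file name is a placeholder):

Code:

    import pye57

    e57 = pye57.E57("scans.e57")  # placeholder file name
    for i in range(e57.scan_count):
        header = e57.get_header(i)
        # Sensor pose of each scan: position (translation) and orientation.
        print(i, header.point_count, header.translation, header.rotation_matrix)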

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Tue Aug 27, 2024 2:04 pm
by jedfrechette
For an object like that I’d say your mesh actually looks pretty good.

Something like that is going to be difficult to get a clean mesh from. In addition to good normals, you'll also need an extremely dense point cloud that captures the entire surface with no gaps, which is hard to get with structures like the railing that introduce lots of occlusions. The mesh floor grating is going to be even worse: a tripod lidar scanner won't capture enough detail on the individual grid elements to mesh them as anything other than a rough surface.

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Tue Aug 27, 2024 5:07 pm
by senmay
Thank you for your answers!

I'm still unsure about how normals are imported from scanner exports. I used a Trimble X7 scanner and exported the data as .tdx and .rcp files. Can these exports already include pre-calculated normals? Unfortunately, I don’t have access to Trimble’s native software (RealWorks) to work with these files, so I’m wondering if there is a way to solve the issue with normals. I tried exporting the point cloud from Trimble Business Center as a structured .e57, but after importing it into CloudCompare, the normals and mesh quality appear to be significantly worse.

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Wed Aug 28, 2024 2:22 pm
by daniel
I see that you have the scanner positions. Do you also have the individual scans (i.e. multiple clouds)? In that case, you can also use the sensor position to properly orient the normals.

If in doubt, you can still send me the file (to admin@cloudcompare.org) so that I can check on my side.

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Thu Aug 29, 2024 3:40 pm
by daniel
Ok, so I got the data, and while it's very good on paper (individual E57 scans with sensor positions and scan grids), the scans are filled with (0, 0, 0) points that completely mess things up: each has around 50 million (0, 0, 0) points out of 60 million. These probably correspond to all the directions in which the laser pulse didn't return. Sadly, these points are not even flagged as 'invalid' in the E57 file (as they should be)...

So first, you have to remove these points (with 'Tools > Other > Remove duplicate points' - keep the default parameter).
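
If you'd rather pre-filter an ASCII export yourself before loading it, here is a minimal numpy sketch (the file name and the X Y Z R G B Intensity column layout are assumptions based on your export):

Code:

    import numpy as np

    # Columns: X Y Z R G B Intensity (placeholder file name).
    data = np.loadtxt("scan_01.xyz")

    # Keep only the rows whose XYZ is not exactly (0, 0, 0).
    mask = ~np.all(data[:, :3] == 0.0, axis=1)
    np.savetxt("scan_01_clean.xyz", data[mask])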

Sadly, this will leave a lot of holes in the scan grid, which makes the grid unusable. That's a shame, because computing normals with scan grids is super efficient... But you can still use the sensor position instead; it's mostly just longer in terms of computation time.

You can use the following parameters (on each cleaned individual scan). The important thing is to not use the scan grid:
[Attachment: compute_normals_example.jpg]
Of course, if you have a lot of memory, you can load all the clouds together and call all these tools on the 8 scans at once.
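
Outside the GUI, the same idea (plain neighbour-based normals on each cleaned scan, oriented using the sensor position instead of the scan grid) can be sketched with Open3D; the file name and sensor coordinates are placeholders:

Code:

    import numpy as np
    import open3d as o3d

    # One cleaned individual scan (placeholder file name).
    pcd = o3d.io.read_point_cloud("scan_01_clean.ply")

    # Plain KNN normal estimation; no scan grid required.
    pcd.estimate_normals(search_param=o3d.geometry.KDTreeSearchParamKNN(knn=8))

    # Flip every normal so it points towards the scanner position
    # (placeholder coordinates, e.g. taken from the E57 scan header).
    pcd.orient_normals_towards_camera_location(camera_location=np.array([1.2, -3.4, 1.7]))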

It should give you quite clean normals:
[Attachment: better_normals.JPG]
Finally, you can merge the cleaned clouds (with their normals) and compute the mesh with PoissonRecon.

However, you'll need a very high octree depth to reach this level of accuracy, and you'll have to play with the resulting mesh 'density'. Still, I see that not all the bars are reconstructed properly (probably because there aren't enough points on one side):
[Attachment: mesh_closeup.JPG]
(I only reconstructed a small portion of the cloud here)
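
For reference, this final step can also be sketched with Open3D (assuming a merged cloud that already has oriented normals; the depth and the 5% density cut-off are placeholder values):

Code:

    import numpy as np
    import open3d as o3d

    pcd = o3d.io.read_point_cloud("merged_with_normals.ply")  # placeholder file name

    # Screened Poisson reconstruction at a high octree depth.
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=12)

    # 'densities' measures how well each vertex is supported by input points;
    # dropping the weakest vertices mimics filtering the Poisson density scalar field.
    densities = np.asarray(densities)
    mesh.remove_vertices_by_mask(densities < np.quantile(densities, 0.05))

    o3d.io.write_triangle_mesh("grating_mesh.ply", mesh)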

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Fri Aug 30, 2024 5:00 pm
by senmay
Ok, the quality of the mesh is indeed better thanks to the methods you suggested. Regarding those bad points: I don't get how a point can have an intensity but no XYZ. Doesn't an intensity value mean the laser pulse returned to the scanner, or is my understanding wrong?

X        Y        Z        R   G   B   Intensity
0.00000  0.00000  0.00000  34  37  46  0.250984

Thanks for the help!

Re: Cloud to mesh - am I expecting too much from mesh?

Posted: Tue Sep 03, 2024 11:53 am
by daniel
Well, that would be a question for the scanner manufacturer ;)

Maybe it's a confidence issue (but I agree it's weird that they keep half of the information).