[Done] Data Cleanup: Duplicate Point Detection

Posted: Thu Jan 30, 2014 3:10 pm
by andrewmcunliffe
Hi Daniel

Thanks for making your program available, it really is great. I'm working with TLS-derived clouds of vegetation and it's refreshing to be able to view the data so easily.

Would it be possible to implement a function to detect and remove duplicate points (perhaps within a user specified tolerance) to support data clean-up?

Many Thanks,
Andy

Re: Data Cleanup: Duplicate Point Detection

Posted: Thu Jan 30, 2014 3:42 pm
by daniel
Indeed there should be a dedicated method to do this.

Meanwhile, you can do it indirectly by computing the cloud 'density' (Tools > Other > Density). For duplicated points, the density is theoretically infinite (in practice CC caps it at a maximum value). So you'll just have to remove the points with the highest density values (with "Edit > Scalar Fields > Filter by value").
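A minimal sketch of the idea behind this workaround, using NumPy and scipy's cKDTree (an assumption for illustration only; CloudCompare's density tool is a GUI feature and does not work this way internally). A point's local density can be approximated by its neighbour count within a tiny radius; filtering out the high-count points then removes the coincident ones. Note that, as pointed out in the follow-up post, this drops *every* coincident copy rather than keeping one:

```python
import numpy as np
from scipy.spatial import cKDTree

def density_filter(points, radius=1e-6):
    """Keep only points whose neighbour count within `radius` is 1
    (i.e. the point itself -- no coincident duplicates nearby)."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) for p in points])
    # points with counts > 1 sit on top of another point: drop them all
    return points[counts == 1]

cloud = np.array([[0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0],   # exact duplicate
                  [1.0, 2.0, 3.0]])
print(density_filter(cloud))  # both coincident points are removed
```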

Re: Data Cleanup: Duplicate Point Detection

Posted: Fri Jan 31, 2014 11:56 am
by daniel
Hum, I realize that I answered a bit too quickly: if you apply the alternative method I've just proposed, you'll remove all the points lying at the same place (i.e. every coincident copy, instead of keeping one of them).

I've just added a proper method to do this. You can test it with the latest online 'beta' release (http://www.cloudcompare.org/release). You'll find it in "Tools > Other > Remove duplicate points"

Re: Data Cleanup: Duplicate Point Detection

Posted: Wed Jun 04, 2014 9:56 am
by andrewmcunliffe
Thanks for adding this functionality.

I've got a query about the default 'Min distance' setting though. I appreciate that CC is unitless, but my experience is that most users are working with clouds in meters; consequently, a default value of 0.000000000001 m seems a little excessive. Unless you specifically want the tool to remove no points by default in most applications, perhaps a default setting of 0.001 (i.e., often 1 mm) would be more relevant, given the limitations of the tools used to acquire many of these datasets (beam divergence etc.).

I'd be interested to know your thoughts on this.
Cheers,
Andy

Re: Data Cleanup: Duplicate Point Detection

Posted: Wed Jun 04, 2014 6:56 pm
by daniel
On my side we work mainly in millimeters (which makes the issue you mention here even worse ;).

In fact this tool was intended to remove "real" duplicate points (whose coordinates differ only slightly due to numerical errors). Such points are typically encountered in STL files for instance. To remove points that are merely too close to each other, the "Subsample" method is better suited (it's almost the same in fact).
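To illustrate the distinction, here is a rough sketch of tolerance-based duplicate removal that keeps one point per group of near-coincident points (unlike the density workaround above, which drops every copy). The use of scipy's cKDTree and the function name are assumptions for illustration; this is not CloudCompare's actual implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_duplicates(points, min_distance=1e-12):
    """Keep the first point of each group of points closer than
    `min_distance` to each other; drop the later copies."""
    tree = cKDTree(points)
    keep = np.ones(len(points), dtype=bool)
    for i, p in enumerate(points):
        if not keep[i]:
            continue  # already marked as a duplicate of an earlier point
        for j in tree.query_ball_point(p, min_distance):
            if j > i:
                keep[j] = False  # later near-coincident copy: drop it
    return points[keep]

cloud = np.array([[0.0, 0.0, 0.0],
                  [0.0, 0.0, 1e-15],  # numerical-noise duplicate
                  [1.0, 2.0, 3.0]])
print(remove_duplicates(cloud))  # keeps the first copy plus the distinct point
```

With a much larger `min_distance` this same loop behaves like a crude spatial subsampling, which is why the two tools are "almost the same".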

But the simplest solution would be to make CC "remember" the last input value, so that everyone can set it once and for all.

Re: Data Cleanup: Duplicate Point Detection

Posted: Tue Jun 10, 2014 12:32 pm
by daniel
Ok that's done (will be effective in the next release).