qCANUPO (classifier files, etc.)
Posted: Sun Apr 20, 2014 8:51 am
by daniel
I'm starting a new thread so as to let people talk about the CANUPO classification plugin. In particular, it would be good to share classifier files (.prm) this way.
Re: qCANUPO (classifier files, etc.)
Posted: Sun Apr 20, 2014 8:57 am
by daniel
The first source of classifiers can be found on the CANUPO authors' website (
http://www.geosciences.univ-rennes1.fr/ ... rticle1284).
You'll find various classifiers for vegetation and rocks (debris, gravel, etc.).
Re: qCANUPO (classifier files, etc.)
Posted: Sun Apr 27, 2014 10:33 am
by Aarie
Hey Daniel, Dimitri and Nicolas,
congratulations on implementing CANUPO in CloudCompare!
I've used CANUPO quite successfully in the past for classifying stone and vegetation from an archaeological site.
Thanks for your work and making it available to the public.
Cheers,
Arie
Re: qCANUPO (classifier files, etc.)
Posted: Sun Apr 27, 2014 12:38 pm
by Dimitri
Hi Arie,
glad it helped!
Did you create your own classifier or use an existing one?
Dimitri
Re: qCANUPO (classifier files, etc.)
Posted: Mon Apr 28, 2014 8:25 am
by BrunaGarcia
Thanks for this implementation, that's nice! I've had a lot of difficulties with this tool when using it from the command prompt.
I'm a very new user of all these tools (like CloudCompare, MeshLab, etc.) and I'm not used to manipulating certain values.
Can anyone explain the concept of scales to me? I'm sorry, but I don't really understand it (even with the examples from the tutorial, etc.).
Thank you once again, this new version of CC is really awesome.
(I'm trying to create my own classifiers.)
Re: qCANUPO (classifier files, etc.)
Posted: Mon Apr 28, 2014 9:10 am
by Dimitri
Hi BrunaGarcia,
Have you read the paper describing the multi-scale classification? It's probably the best way to start and understand how it works.
Can you send me a list of the other questions you have about CANUPO? I'll try to compile them and create a FAQ. I'll also make a tutorial video in the coming days/weeks.
About the scales: check figure 2 in the paper
http://arxiv.org/pdf/1107.0550.pdf. One scale is the diameter of the ball centered on the point that is being classified. When you want to create a classifier, you specify a series of scales that corresponds to the distances around a point that will be used to characterize the local geometry of the cloud.
For instance, think of the difference between a flat wall and a vegetation bush. If your point cloud has a typical point spacing of about 1 cm, then at a scale of 5 cm the wall appears 2D (i.e. a plane), while vegetation is made of 2D objects (leaves) and 1D objects (lines = stems). At this small scale, leaves could be mistaken for a wall, and vice versa, so you need to use a larger scale. At a scale of 50 cm, the wall is still a plane and thus a 2D structure, while vegetation now appears as a bunch of points spread out in 3D. This is a scale at which the wall and the vegetation are very different. Now, because it is not necessarily easy to manually pick the scales at which objects are the most different, and because various scales can help in the classification, you can specify a range of scales (in that case from ~5 cm to 50 cm with 5 cm intervals) and let qCANUPO's training find the best combination of scales that will allow the distinction between vegetation and wall.
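To make the intuition above concrete, here is a minimal sketch of how the 1D/2D/3D character of a neighborhood can be estimated at a single scale, via PCA of the points inside the ball and the barycentric dimensionality proportions described in the paper. This is illustrative Python only, not the actual CANUPO code; the function name and the NumPy-based approach are our own.

```python
import numpy as np

def dimensionality_at_scale(cloud, center, scale):
    """Estimate how 1D / 2D / 3D the neighborhood of `center` looks at a
    given scale (ball *diameter*), via PCA of the neighborhood.
    `cloud` is an (N, 3) array. Illustrative sketch, not the CANUPO code."""
    # keep only the points within the ball of diameter `scale`
    dist = np.linalg.norm(cloud - center, axis=1)
    nb = cloud[dist <= scale / 2.0]
    if len(nb) < 3:
        return None  # not enough neighbors at this scale
    # eigenvalues of the neighborhood covariance, largest first, normalized
    evals = np.sort(np.linalg.eigvalsh(np.cov(nb.T)))[::-1]
    evals = evals / evals.sum()
    # barycentric proportions of 1D / 2D / 3D behavior (they sum to 1)
    p1 = evals[0] - evals[1]          # line-like (stems)
    p2 = 2.0 * (evals[1] - evals[2])  # plane-like (wall, single leaf)
    p3 = 3.0 * evals[2]               # volume-like (bush at a large scale)
    return p1, p2, p3
```

A wall-like neighborhood gives p2 close to 1, a stem-like one p1 close to 1, and a bush at a large scale p3 close to 1; the classifier combines these proportions over the whole series of scales you specify.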
Tip: do not use scales that are much larger than the type of objects you want to classify, because:
1. the larger the maximum scale, the longer the computation time (and it increases non-linearly)
2. your classifier will lose spatial resolution
This is why the qCANUPO training plugin allows you to dynamically remove the largest scales to see if the classification result is still OK (i.e. blue and red points are well separated by the line). It's a neat way to optimize the resolution and computation time of a classifier.
Hope this helps
Dimitri
Re: qCANUPO (classifier files, etc.)
Posted: Mon Apr 28, 2014 12:34 pm
by BrunaGarcia
That works! I'd never seen this file, so it's much better than the other one I have (maybe not the complete one). Thanks a lot :)
I'm going to take notes of every question I have and I'll send them to you later so you can create a FAQ.
Bruna.
Re: qCANUPO (classifier files, etc.)
Posted: Mon May 05, 2014 9:13 am
by Aarie
Hi Dimitri,
I've used my own classifiers. I also tried the existing ones but got slightly better results with the ones I made myself.
The results were quite impressive. I'm sure that I'll be using this pretty frequently, especially after the integration into CC.
So, thanks again.
Cheers!
Re: qCANUPO (classifier files, etc.)
Posted: Tue May 13, 2014 11:03 pm
by p01ntsurf3r
Hi Dimitri, I am having great success with CANUPO. I am classifying various TLS point clouds of Hawaiian beaches, fish ponds, and archaeological sites, as well as doing forest canopy research, so I'll be contributing some classifiers for various types of vegetation and structural features.
I have a few questions about how to optimize my processing time. Firstly, I've noticed that if I use your CC plugin, all my CPU cores get utilized, whereas if I use your original tool, I get staggered resource utilization, usually one or two cores at any given moment. I'm curious whether this indicates I should abandon your original tool and use CC. Also, CC itself has no issues processing large clouds consisting of 50 million+ points, but when I use CANUPO through CC on anything over 4 million points it seems to crash.
I'm currently running the command-line tool on one computer concurrently with CC on a separate, identical machine. Both are dual 8-core 3.4 GHz Xeons with 128 GB RAM and a 1 TB Samsung 840 EVO. Both of these machines crush pretty much everything, so it will be interesting to see the results. I was just hoping you could shed some insight on the performance differences and limitations between your tool and the CC plugin.
Re: qCANUPO (classifier files, etc.)
Posted: Mon May 19, 2014 9:54 am
by daniel
Hi,
While waiting for an answer from Dimitri himself, I can add some technical information about this issue. The descriptors computed for each core point take a lot of memory (especially if you use a lot of different scales): for each core point, we store (2*NS+1) values (with NS = number of scales). Therefore it's generally a good idea (and, if I recall what Dimitri explained to me correctly, it's not necessary anyway) to not use too many core points. Less than one million should be more than enough.
However, independently of what I've said above, the plugin shouldn't crash ;). I'll take a look at this asap!
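To put the (2*NS+1) figure above in perspective, here is a quick back-of-the-envelope helper. The 4-bytes-per-value (single-precision float) default is our assumption, not something stated in the thread.

```python
def canupo_descriptor_memory(num_core_points, num_scales, bytes_per_value=4):
    """Rough memory footprint (in bytes) of the CANUPO descriptors,
    based on the (2*NS + 1) values stored per core point mentioned
    above. The 4-byte-per-value default is an assumption."""
    return num_core_points * (2 * num_scales + 1) * bytes_per_value

# e.g. 1 million core points with 10 scales:
# 1_000_000 * 21 * 4 bytes = 84_000_000 bytes, i.e. about 84 MB
```

So even a million core points with a ten-scale classifier stays under ~100 MB for the descriptors themselves, which is consistent with the advice above that fewer than a million core points should be more than enough.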