Calculating mean value and standard deviation from Command Line Mode
Posted: Tue Nov 17, 2020 8:03 pm
Hi :-)
I'm trying to write a script that can calculate the mean value and standard deviation of the scalar field in a cloud. Afterwards, using -FILTER_SF, all points whose scalar field values deviate from the mean by more than 3 times the standard deviation (on either side) would be extracted to one cloud, and the remaining points to another.
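For reference, this is roughly the computation I have in mind, sketched here outside of CloudCompare with Python/numpy (the file names "cloud.asc"/"cloud_inside.asc"/"cloud_outside.asc", the assumption that the scalar field is the last column of an ASCII cloud, and the scalar field index passed to -SET_ACTIVE_SF are just placeholders for illustration):

import numpy as np

# Assumption: ASCII cloud with columns x y z sf (scalar field last)
pts = np.loadtxt("cloud.asc")
sf = pts[:, -1]

# Mean and standard deviation of the scalar field
mean = sf.mean()
std = sf.std()
lo, hi = mean - 3.0 * std, mean + 3.0 * std

# Split into points within +/- 3 sigma and the outliers
inside = pts[(sf >= lo) & (sf <= hi)]
outside = pts[(sf < lo) | (sf > hi)]
np.savetxt("cloud_inside.asc", inside)
np.savetxt("cloud_outside.asc", outside)

# Or hand the computed bounds to CloudCompare's command line filter
# (index 0 for -SET_ACTIVE_SF is only an example)
print(f"CloudCompare -O cloud.asc -SET_ACTIVE_SF 0 -FILTER_SF {lo} {hi}")

But I'd prefer to do the whole thing inside CloudCompare itself if possible.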
Is this possible in command line mode? Or is it only possible if you use the C++ library?
Best regards, Michael