
Calculating mean value and standard deviation from Command Line Mode

Posted: Tue Nov 17, 2020 8:03 pm
by michaeladamsen
Hi :-)

I'm trying to write a script that can calculate the mean value and standard deviation of the scalar field in a cloud. Afterwards, using -FILTER_SF, all points whose scalar field values deviate from the mean by more than 3 times the standard deviation (on either side) would be extracted to one cloud, and the remaining points to another.

Is this possible in command line mode? Or is it only possible if you use the C++ library?

Best regards, Michael
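
[Editor's note: for reference, the statistic described above can be computed outside CloudCompare. The sketch below is a generic, self-contained C++ illustration and does not use the CloudCompare/CCCoreLib API; the sample values are placeholders. It computes the mean and standard deviation of a list of scalar field values and splits the point indices into those within +/- 3 sigma of the mean and those beyond it.]

```cpp
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
    // Example scalar field values; in practice these would come from an
    // exported ASCII cloud or from the CloudCompare C++ API.
    std::vector<double> sf = { 1.0, 1.2, 0.9, 1.1, 5.0, 1.05, 0.95 };

    // Mean
    double sum = 0.0;
    for (double v : sf)
        sum += v;
    const double mean = sum / sf.size();

    // Standard deviation (population formula)
    double sqSum = 0.0;
    for (double v : sf)
        sqSum += (v - mean) * (v - mean);
    const double stdDev = std::sqrt(sqSum / sf.size());

    // Split indices at +/- 3 sigma around the mean
    const double lower = mean - 3.0 * stdDev;
    const double upper = mean + 3.0 * stdDev;
    std::vector<std::size_t> inliers, outliers;
    for (std::size_t i = 0; i < sf.size(); ++i)
        (sf[i] < lower || sf[i] > upper ? outliers : inliers).push_back(i);

    std::cout << "mean = " << mean << ", sigma = " << stdDev << '\n'
              << "keep range: [" << lower << ", " << upper << "]\n"
              << inliers.size() << " points within 3 sigma, "
              << outliers.size() << " points beyond\n";
    return 0;
}
```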

Re: Calculating mean value and standard deviation from Command Line Mode

Posted: Wed Nov 18, 2020 5:48 pm
by daniel
I don't think this is possible in command line mode indeed... And if one wanted to improve the code, one would need to add a specific option to the 'Filter by SF' command to filter with +/- N sigmas I guess.
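
[Editor's note: until such an option exists, one possible workaround, sketched here under the assumption that -FILTER_SF accepts an explicit minimum and maximum value, is to compute the mean and sigma externally (as in the snippet above) and derive the two bounds by hand. The helper below is hypothetical and the example values are placeholders; note that this would only keep the points inside the range, not also produce the "beyond 3 sigma" cloud.]

```cpp
#include <iostream>

// Hypothetical helper: converts a mean, a standard deviation and a sigma
// multiplier N into the [min, max] pair that a 'Filter by SF' style command
// would need. The command line shown is only a sketch, not a documented
// +/- N sigma feature.
int main()
{
    const double mean  = 1.03;  // example values, e.g. from the sketch above
    const double sigma = 1.40;
    const double n     = 3.0;

    const double minVal = mean - n * sigma;
    const double maxVal = mean + n * sigma;

    std::cout << "Suggested call:\n"
              << "CloudCompare -SILENT -O cloud.bin -FILTER_SF "
              << minVal << ' ' << maxVal << '\n';
    return 0;
}
```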