Hi everyone,
I use CloudCompare for Geology / Geophysics work and I'm starting a PhD to generate 3D Digital Elevation Models from satellite images (among many other things). I already have a point cloud of a field, and this point cloud is about 800,000,000 points. So yeah, the PLY file is pretty huge, about 10 GB!
My question is: which configuration would be good enough to view my point cloud without freezes while navigating in it? Are a GTX 1080 Ti and 32 GB of memory with an i7 at ~3.1 GHz enough? Or do you think I need a better graphics card and/or 64 GB of memory?
Thanks in advance guys! :)
Computer configuration for high density Cloud Points
Re: Computer configuration for high density Cloud Points
Wow, such a big cloud is indeed huge, especially at display time. I don't think the graphics card can hold all the points, so you won't get the benefit of full GPU acceleration... Normally the LOD (Level of Detail) mechanism should help you, but you have to wait for it to be computed, and that may take a while (have you experienced it already? With points appearing progressively when you move the cloud?).
Daniel, CloudCompare admin
Re: Computer configuration for high density Cloud Points
Yes, I've already seen it with smaller point clouds :) It's a really great feature, by the way!
The problem, as you said, is the loading... In all cases CC has to read all the points and then display them. So in your opinion the graphics card will not help? The GTX 1080 Ti has 11 GB, maybe it can hold the whole point cloud? (9-10 GB approximately.) Has anyone ever tried that?
If I apply a Global Shift to my data (for example, so the x-coordinates start at 0 instead of 7,000,000), is that really useful? I think I can save at least 6 digits on every x-coordinate and 4 on every y-coordinate. And if I save the point cloud after the Global Shift in CloudCompare's .bin format, will the next load be faster?
Thanks for your answer, by the way! :)
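(A quick aside on why that shift matters for display precision: GPUs render in 32-bit floats, which carry roughly 7 significant digits. The snippet below is a minimal sketch, assuming NumPy; the 7,000,000 m easting is just the example value from above.)

```python
import numpy as np

# A UTM-like easting around 7,000,000 m with millimetre-level detail.
x_large = 7_000_000.123

# The GPU works in 32-bit floats: the large offset eats the mantissa.
as_float32 = np.float32(x_large)
print(as_float32 - 7_000_000.0)   # 0.0 -- the fractional part is gone

# After a global shift of -7,000,000 the same value keeps its detail.
x_shifted = np.float32(x_large - 7_000_000.0)
print(x_shifted)                  # ~0.123, sub-metre precision preserved
```

So the shift protects the fine structure of the coordinates once they are truncated to 32 bits for rendering, independently of any loading-speed question.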
Re: Computer configuration for high density Cloud Points
Ah, with 11 GB it may indeed fit into the GPU memory :D. You can check this in the console (there's a message about the percentage of points that could be loaded into "VBOs").
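(A rough back-of-the-envelope check; the per-point layout here is an assumption for illustration, not CloudCompare's actual VBO format:)

```python
# Rough VBO size estimate for 800 million points.
# Layout is assumed (3 x float32 for XYZ, 3 x uint8 for colour),
# not taken from CloudCompare's source.
n_points = 800_000_000
bytes_xyz = 3 * 4            # x, y, z as 32-bit floats
bytes_rgb = 3 * 1            # optional 8-bit colour per channel

gb = lambda b: b / 1024**3
print(f"XYZ only    : {gb(n_points * bytes_xyz):.1f} GB")                # ~8.9 GB
print(f"XYZ + colour: {gb(n_points * (bytes_xyz + bytes_rgb)):.1f} GB")  # ~11.2 GB
```

Under those assumptions, bare XYZ just fits in 11 GB, but adding per-point colour already pushes past it, so the console message is worth checking.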
But anyway, CC will still render all the points at each frame (as long as the LOD mechanism is not active), which will still take very long... Especially if you have normals (in that case, make sure to disable them, otherwise the rendering will be horribly slow). In the same spirit, if you display the cloud with a scalar field, make sure the shader dedicated to scalar-field display is enabled (see the Display Options).
In all cases you have to shift the large coordinates to avoid display issues (OpenGL drivers don't handle large coordinates well), but it won't make loading faster by itself, since the truncation from 64 bits to 32 bits happens anyway. However, if you save your data as BIN files, or as binary PLY for instance, loading should be faster than with other formats (especially ASCII ones).
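(To give a feel for the binary-vs-ASCII gap: the sketch below uses plain NumPy I/O on a small random cloud, not CloudCompare's BIN or PLY readers, so the absolute numbers are only illustrative.)

```python
import os
import tempfile
import time

import numpy as np

# Tiny stand-in cloud; a real 800M-point cloud scales these timings up.
pts = np.random.rand(1_000_000, 3).astype(np.float32)

with tempfile.TemporaryDirectory() as d:
    t0 = time.perf_counter()
    np.savetxt(os.path.join(d, "cloud.txt"), pts)   # ASCII: format every value as text
    t_ascii = time.perf_counter() - t0

    t0 = time.perf_counter()
    pts.tofile(os.path.join(d, "cloud.bin"))        # binary: one raw memory dump
    t_bin = time.perf_counter() - t0

print(f"ASCII write : {t_ascii:.3f} s")
print(f"binary write: {t_bin:.3f} s")   # typically far faster than ASCII
```

The same asymmetry holds on the read side: a binary loader can map values straight into memory, while an ASCII loader must parse every number, which is why binary PLY or BIN should reload much faster than text formats.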
Daniel, CloudCompare admin