Computer Build Question
Posted: Tue Jun 30, 2020 9:17 pm
First, I simply would like to say that this truly is an amazing tool. Thanks for going the GNU route!
I’m primarily interested in using CloudCompare to combine many laser scans so I can perform a historical analysis of the overall dataset. My issue is that the computer I initially had access to was barely capable of handling a single one of these datasets, let alone multiple. Rough numbers: millions of data points per scan, collected roughly hourly, nearly 24/7.
I’ve done some basic searches of the forum and have discovered the following so far...
CPU - I’ve seen it mentioned that, because the calculations are relatively simple, clock speed isn’t as important as the overall number of cores. Would 8 cores / 16 threads be sufficient? A coworker pointed me towards the 18-core Intel i9-9980XE (Skylake-X), although that is quite an expensive route without my understanding more about the benefit side of the cost/benefit. Thoughts?
RAM - the more the better, right? I recall someone processing satellite lidar was running 128 GB. It seems like starting with 64 GB, while leaving room to upgrade to 128 GB, would be a good move? (See my rough back-of-envelope sketch after this list.)
GPU - another key piece of hardware, and onboard memory is clearly important. A coworker spec’d a GIGABYTE GeForce RTX 2080 with 8 GB, but also found an ASUS ROG Strix GeForce RTX 2080 with 11 GB. Again, thoughts?
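To put rough numbers on the RAM question, here is the back-of-envelope sketch I mentioned above (Python, just arithmetic). The per-point byte counts and scan sizes are assumptions on my part, not CloudCompare's documented internals, so please correct me if the real storage is different:

# Rough estimate of RAM needed to hold many scans in memory at once.
# ASSUMPTIONS (mine, not CloudCompare's documented internals):
#   - XYZ stored as 32-bit floats          -> 12 bytes/point
#   - a couple of extra scalar fields      ->  8 bytes/point
#   - container/overhead safety factor     ->  1.5x

POINTS_PER_SCAN = 5_000_000   # "millions of points per scan" (assumed value)
SCANS_PER_DAY = 24            # roughly hourly, nearly 24/7
BYTES_PER_POINT = 12 + 8      # xyz + scalar fields (assumed)
SAFETY_FACTOR = 1.5

def ram_needed_gb(days):
    """Estimated RAM (GB) to hold `days` worth of scans at the same time."""
    points = POINTS_PER_SCAN * SCANS_PER_DAY * days
    return points * BYTES_PER_POINT * SAFETY_FACTOR / 1e9

for days in (1, 7, 30):
    print(f"{days:3d} day(s) of scans: ~{ram_needed_gb(days):6.1f} GB")

With those assumptions, a single day of scans is only a few GB, but a month held in memory at once lands right around the 64-128 GB range, which is why I’m leaning towards leaving headroom to upgrade.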
My current conundrum is twofold: I don’t yet have a computer capable of handling the datasets I have access to, and I don’t understand how much of an impact each of these components will make. I do think I could easily approach the maximum point limit per cloud (something like 4 billion, right?), and I’d love to avoid a ton of statistical subsampling.
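On the point-count limit, here is the same kind of arithmetic for how quickly a merged cloud would approach that ceiling. I’m assuming the limit is roughly 2^32 ≈ 4.29 billion points per cloud based on what I’ve read on the forum, and the scan size is again an assumed placeholder:

# How fast a merged cloud approaches the ~4 billion point ceiling.
# ASSUMPTIONS: limit of 2**32 points per cloud (from my forum reading),
# and ~5 million points per scan at roughly one scan per hour.

POINTS_PER_SCAN = 5_000_000
SCANS_PER_DAY = 24
MAX_POINTS = 2**32            # ~4.29 billion (assumed limit)

points_per_day = POINTS_PER_SCAN * SCANS_PER_DAY
days_to_limit = MAX_POINTS / points_per_day
print(f"~{points_per_day / 1e6:.0f} million points per day")
print(f"~{days_to_limit:.0f} days of scans to reach ~{MAX_POINTS / 1e9:.2f} billion points")

So with scans of that size I’d hit the ceiling after roughly a month of continuous collection, which is why I’d love to keep subsampling to a minimum if the hardware can carry it.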
Any help/guidance would be greatly appreciated, and I hope to become a contributor to the community some day!
Kind regards,
CF