I'm looking for an open-source 3D viewer that can visualize city-wide point cloud data. CloudCompare states that it was designed to handle huge point clouds (typically more than 10 million points, and up to 120 million points with 2 GB of memory). However, a city-wide dataset is typically much larger than 2 GB; my sample data, covering several city blocks, is about 20 GB.
Is there any workaround for the 2 GB memory limit? And are there plans to increase it?
Regards,
Juwon
Re: Big file - 2GB memory limit
Hi,
In fact, the '2 GB' here is not a hard limit, it's just an indication: you can load roughly 120 million points with 2 GB of memory. With the 64-bit version and plenty of RAM installed on your computer, you can load a lot more points (I've heard of users loading and processing about 450 million points).
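To put rough numbers on that (a back-of-the-envelope sketch, not an official CloudCompare figure): 120 million points in 2 GB works out to roughly 17-18 bytes of RAM per point, so you can estimate how much memory a dataset of a given size will need. The file size and the assumed on-disk point format below are illustrative only:

# Rough memory estimate for loading a large point cloud.
# The 120 M points / 2 GB ratio is the figure quoted above; the file size
# and the assumed bytes per point on disk are illustrative assumptions.

GIB = 1024 ** 3

# ~120 million points per 2 GB of RAM -> bytes of RAM per point
bytes_per_point_in_ram = 2 * GIB / 120e6            # about 17.9 bytes

# Assumed: a 20 GB file of uncompressed XYZ points stored as three 8-byte doubles
file_size_bytes = 20 * GIB
assumed_bytes_per_point_on_disk = 3 * 8             # x, y, z as float64
points_in_file = file_size_bytes / assumed_bytes_per_point_on_disk

ram_needed = points_in_file * bytes_per_point_in_ram
print(f"Points in file (assumed format): {points_in_file / 1e6:.0f} M")
print(f"Estimated RAM to load them all:  {ram_needed / GIB:.1f} GiB")

# How many points would fit in, say, 32 GiB of RAM under the same ratio?
installed_ram = 32 * GIB
print(f"Points that fit in 32 GiB:       {installed_ram / bytes_per_point_in_ram / 1e6:.0f} M")

So a machine with enough RAM (and the 64-bit build) can in principle hold a dataset of that order, even if displaying it smoothly is another matter.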
If you are only interested in visualization, the issue for you is that CC has no particular mechanism for displaying that many points efficiently, so the framerate will be very low. You should look at Oliver Kreylos' Lidar Viewer, for instance: http://idav.ucdavis.edu/~okreylos/ResDev/LiDAR/. Watch the movie, it's quite impressive.
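If a decimated preview is enough for your purposes, one possible workaround (just a sketch of the idea, not something discussed or tested here) is to thin the cloud to a point budget the viewer can render interactively before opening it. The snippet below assumes the data is in LAS/LAZ files and uses the laspy library's chunked reader; the file names and the 1-in-20 ratio are placeholders:

import numpy as np
import laspy

KEEP_EVERY = 20  # keep 1 point in 20 -- illustrative decimation ratio

# Paths are illustrative; adjust to your own data.
with laspy.open("city_blocks.las") as reader:
    with laspy.open("city_blocks_preview.las", mode="w", header=reader.header) as writer:
        # Stream the file in chunks so the full 20 GB never sits in memory at once.
        for points in reader.chunk_iterator(2_000_000):
            mask = np.arange(len(points)) % KEEP_EVERY == 0
            writer.write_points(points[mask])

The resulting preview file can then be loaded and rotated at a reasonable framerate, while the full-resolution data stays on disk for processing.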
Daniel, CloudCompare admin