Big file - 2GB memory limit

goskycam
Posts: 1
Joined: Mon Jan 27, 2014 6:08 pm

Big file - 2GB memory limit

Post by goskycam »

I'm looking for an open-source 3D viewer that can visualize city-wide point cloud data. CloudCompare states that it was meant to deal with huge point clouds (typically more than 10 million points, and up to 120 million with 2 GB of memory). However, a typical city-wide dataset is much larger than 2 GB; my sample data, covering several city blocks, is about 20 GB.

Is there any workaround to the 2 GB memory limit? Also, is there a plan to increase the memory limit?

Regards,
Juwon
daniel
Site Admin
Posts: 7707
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Big file - 2GB memory limit

Post by daniel »

Hi,

In fact the '2 GB' here is not a limit, it's just an indication: you can load roughly 120 million points with 2 GB of memory. With the 64-bit version and a lot of memory installed on your computer you can load far more points (I've heard of users loading and processing about 450 million points).
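As a back-of-envelope sketch of that rule of thumb (the bytes-per-point figure below is my own estimate, not taken from CloudCompare's source code): 120 million points per 2 GB works out to roughly 18 bytes per point, which is consistent with three float32 coordinates (12 bytes) plus some per-point overhead.

```python
# Rough RAM estimate from the "120 M points per 2 GB" rule of thumb.
# The 18 bytes/point figure is an assumption derived from that ratio
# (2 * 1024**3 / 120e6 ~= 18), not a value from CloudCompare itself.
def estimated_memory_gb(n_points, bytes_per_point=18):
    """Approximate memory (GB) needed to hold n_points in RAM."""
    return n_points * bytes_per_point / 1024**3

# 120 M points -> about 2 GB, matching the figure quoted above
print(round(estimated_memory_gb(120_000_000), 1))  # -> 2.0
# 450 M points -> roughly 7.5 GB, plausible on a 64-bit machine with enough RAM
print(round(estimated_memory_gb(450_000_000), 1))  # -> 7.5
```

So the practical ceiling is your installed RAM (on a 64-bit build), not a hard-coded 2 GB limit; a 20 GB dataset would need correspondingly more memory to load in full.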

If you are interested in visualization only, the issue for you is that CC has no particular mechanism to display that many points efficiently, so the framerate will be very low. You should look at Oliver Kreylos' Lidar Viewer instead: http://idav.ucdavis.edu/~okreylos/ResDev/LiDAR/. Watch the movie, it's quite impressive.
Daniel, CloudCompare admin