First load of Shapefile layer is really slow

Topics: General Topics
Oct 23, 2006 at 9:17 PM

I have a shapefile which contains about 200,000 points and is around 30 MB. Strangely, its index is around 60 MB. When this layer is first loaded it is extremely slow. When I look at Task Manager, the CPU goes to about 100% and the aspnetwp process climbs to about 800 MB of memory. After the CPU returns to normal the layer is displayed very quickly; however, aspnetwp still holds about 400 MB of memory permanently. This occurs even if an index file was already created, or if I choose not to load the index in the layer constructor.

Does it load and store every point in memory in order to speed up further requests? Are there any recommendations for shapefiles like this one? Should it be split in half?
Coordinator
Oct 23, 2006 at 10:28 PM
Hi,

SharpMap only loads the features on demand, but I think
you are showing the whole extent at first.
Does that make sense with 200,000 points?

Better to show a smaller extent:
// ...
myMap.ZoomToBox(new SharpMap.Geometries.BoundingBox(10, 10, 20, 20));
// ...
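
For reference, a minimal end-to-end sketch of that, assuming the SharpMap 0.9 API (ShapeFile provider, VectorLayer, ZoomToBox and GetMap); the path and extent values are just placeholders:

SharpMap.Map myMap = new SharpMap.Map(new System.Drawing.Size(600, 400));

SharpMap.Layers.VectorLayer pointLayer = new SharpMap.Layers.VectorLayer("Points");
// second argument = true uses a file-based spatial index instead of
// rebuilding the index in memory on every application start
pointLayer.DataSource = new SharpMap.Data.Providers.ShapeFile(@"C:\data\points.shp", true);
myMap.Layers.Add(pointLayer);

// Zoom to a smaller extent instead of the full bounding box of the layer
myMap.ZoomToBox(new SharpMap.Geometries.BoundingBox(10, 10, 20, 20));

System.Drawing.Image img = myMap.GetMap();

That way only the features falling inside the box are requested from the provider when the map is rendered.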

BR
/Christian
Coordinator
Oct 24, 2006 at 4:27 AM
For point data the index will be larger than the data itself: each point is 16 bytes (two doubles), but the bounding box stored in the index for each feature is two points of data, i.e. 32 bytes, or twice as large.
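
As a rough illustration of that per-feature arithmetic (just a sketch of the size comparison; the real index adds tree-node overhead on top of this):

using System;

class IndexSizeNote
{
    static void Main()
    {
        // A point record is two 8-byte doubles (X, Y); the bounding box the
        // index keeps per feature is two such points (min and max corner).
        const int bytesPerPoint = 2 * sizeof(double); // 16 bytes
        const int bytesPerBox = 2 * bytesPerPoint;    // 32 bytes, twice the point itself
        Console.WriteLine("point: {0} bytes, bounding box: {1} bytes",
                          bytesPerPoint, bytesPerBox);
    }
}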

In general, with indexing, you trade memory for speed, so of course it will be larger.

I'm working on allowing filtering on the index so that it is only partially constructed, instead of being built over the entire layer. This will be in the 2.0 release. Until then, you have to constrain the extents, as Christian mentioned.
Developer
Oct 24, 2006 at 5:25 AM
There are different kinds of indexes that are better suited to point data than a quad-based index.
Coordinator
Oct 24, 2006 at 5:36 AM
The Guttman R-tree in SharpMap 2.0 alpha is pretty good for points and polygons. When the post-optimization work is added, it will be even better.