SharpMap v2 rendering speed for very large maps

Topics: General Topics, SharpMap v2.0
Jul 30, 2009 at 3:53 PM

Hi. I’m *very* new to SharpMap. Does anyone have any ballpark figures for SharpMap v2’s ability to render and use very large maps – for instance a street-level vector map of the UK? In a typical format, this could be around 0.5 GB.

With different zoom levels you’d have different types/levels of features visible – e.g. only motorways would be visible when you’re zoomed really far out, etc. Thus you wouldn’t expect to have more than perhaps 1000 objects on-screen at any one time (although for long roads you’d expect a single road object to be made up of hundreds or thousands of individual line segments). To complicate matters, your approx. 1000 on-screen objects would be sampled from a database/dataset containing millions of objects.

Can SharpMap cope with this amount of data and how long would it typically take to render something like this? In the ideal scenario rendering speeds would need to be something like 1/10th of a second…

Thanks very much

Coordinator
Jul 31, 2009 at 11:38 AM

Hi gwyn,

SharpMap can cope with any amount of data, but it could do with some performance tuning overall. As long as you are careful with the cutoff boundaries for each layer, you should get acceptable performance, and improving performance is a high priority. Also, I am not sure whether by "large" you mean lots of data or a large image buffer. If you mean the image buffer, you are currently limited by GDI+ (google "max image size gdi+"); however, we are working on switching to a managed version of AGG, which should circumvent this issue.

hth jd
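For illustration, the per-layer cutoff idea looks roughly like the sketch below. It uses the classic SharpMap 0.9-style API (Map, VectorLayer, ShapeFile, MinVisible/MaxVisible); the v2 class names may differ, and the layer names, file paths and the 25 km threshold are placeholders, assuming data in a metric projection.

```csharp
using System.Drawing;
using SharpMap;
using SharpMap.Layers;
using SharpMap.Data.Providers;

public static class UkRoadMapSketch
{
    public static Map Build()
    {
        var map = new Map(new Size(800, 600));

        // Motorways: few enough features to render at every zoom level.
        var motorways = new VectorLayer("Motorways")
        {
            // Hypothetical path; second argument enables a file-based spatial index.
            DataSource = new ShapeFile(@"C:\data\uk_motorways.shp", true)
        };
        map.Layers.Add(motorways);

        // Minor roads: millions of features, so only draw them when the view
        // is narrower than ~25 km (map units, metres here). This is the
        // "cutoff boundary" that keeps the on-screen object count low.
        var minorRoads = new VectorLayer("MinorRoads")
        {
            DataSource = new ShapeFile(@"C:\data\uk_minor_roads.shp", true),
            MaxVisible = 25000
        };
        map.Layers.Add(minorRoads);

        map.ZoomToExtents();
        return map;
    }
}
```

With MaxVisible set this way, the minor-road layer is skipped entirely at national scale and only the motorway layer's provider is queried, which is what keeps the rendered object count near the ~1000 mark mentioned above.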