Please add your project and your name to indicate your interest in actively participating in the Benchmark.
- Project Name, responsible lead, ideas (feel free to start a new web site)
- FOSS4G, Arnulf Christl: Help ETL test data for the server benchmark
- MapProxy performance hit, Ivansanchez: Once WMS servers are configured, run the benchmark through MapProxy to measure the performance hit (or even gain?)
- Your Project, Your Name: "We will optimize our server to render tiles for amazing cartography"
- MapServer, Michael Smith: Provide hardware for all teams and data/mapping support for MapServer, plus running the test runs for MapServer
Please specify your requirements for a benchmark test and how you can actively contribute to the process. This can include:
- ETL test data (please be creative, currently we aim for Ordnance Survey Great Britain OpenSpace and OpenStreetMap)
- Server and desktop(?) hardware, platforms
- Server storage
- For the WPS session, nothing is needed
- anything else
Benchmarking Session at FOSS4G
The following text has been submitted as a regular presentation, just to get a foot in the door. It would probably make sense to hold it in the plenary at the end of the conference.
In recent years FOSS4G has set the scene for performance shootouts between some of the most powerful and renowned mapping software around, both open source and proprietary. Each year the teams gave their best and unanimously agreed that they learned a lot and were able to improve the software considerably. So far the goal has been to accelerate the process of grabbing a geometry, rendering it, and pushing it out. This has been optimized to a degree that makes differences hard to distinguish; we can honestly say that the contestants are really, really fast.
The next level of speed can only be achieved by fine-tuning data stores and kernels and twisting virtual arms at a very low level. But this would at the same time defeat the point of comparability. So what to do? The Benchmarking team will find an answer by the time FOSS4G starts, or rather ends, so we have three more days to excel at speeding things up.
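The raw-throughput measurement the contest is built around (request a map, wait for the rendered image, record the time) can be sketched in a few lines. The endpoint URL, layer name, and bounding box below are hypothetical placeholders, not part of any official benchmark harness:

```python
"""Minimal sketch of timing WMS GetMap requests (hypothetical endpoint)."""
import time
import urllib.parse
import urllib.request


def build_getmap_url(base_url, layer, bbox, size=(256, 256)):
    """Assemble a WMS 1.1.1 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
    }
    return base_url + "?" + urllib.parse.urlencode(params)


def time_requests(urls):
    """Fetch each URL and return per-request wall-clock times in seconds."""
    timings = []
    for url in urls:
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings.append(time.perf_counter() - start)
    return timings


if __name__ == "__main__":
    # Hypothetical local server and layer name -- replace with a real endpoint.
    url = build_getmap_url("http://localhost:8080/wms", "topp:states",
                           (-180.0, -90.0, 180.0, 90.0))
    print(url)
```

A real run would replay thousands of such requests with varying bounding boxes and report throughput percentiles rather than single timings.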
But like everything in this world, we do not want to stagnate, wither, and die, but evolve. Therefore the 2013 edition of the benchmark proposes to evaluate three aspects:
- The well known dedicated map server contest
- As before, on comparable platforms
- plus (potentially) a session where experts tune kernel parameters and everything else they can wrap their heads around, to the highest level the team is capable of or interested in, including readable documentation of what they did
- Ease of use / Flexibility
- How easy is it to set up a web map service (tile rendering engine)? Again including documentation
- How easy is it to create a map with a Desktop GIS package using the Open Data published by the Ordnance Survey, plus an OSM overlay (probably the pubs) and a randomly chosen CSV file with some coordinates and attributes? In 2006 TU Delft ran a GML relay contest in which software development teams / vendors received a GML file and had to load, modify, and save it, then pass it on to the next contestant, again reporting on what worked and what didn't. This exercise could follow a similar pattern.
- Using the OGC SLD/SE styles as published by the Ordnance Survey as a starting point
- Then going into the nitty-gritty details and enhancements, possibly with some interested cartographers from the ICA
We will probably not be able to do all of this, but it will at least highlight the complexity of the exercise and give some hints as to which parameters are relevant when choosing a mapping platform from scratch or as a replacement for an existing solution.
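One step of the GML relay mentioned above (load a GML document, modify a feature, save it, hand it on) can be illustrated with the standard library alone. The tiny feature collection below is an illustrative stand-in, not real Ordnance Survey or OSM data:

```python
"""Sketch of one GML relay step: load, modify, save (illustrative data only)."""
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML_NS)

SAMPLE = """<gml:FeatureCollection xmlns:gml="http://www.opengis.net/gml">
  <gml:featureMember>
    <pub>
      <name>The Crown</name>
      <gml:Point><gml:coordinates>-0.1276,51.5072</gml:coordinates></gml:Point>
    </pub>
  </gml:featureMember>
</gml:FeatureCollection>"""


def relay_step(gml_text, old_name, new_name):
    """Parse a GML document, rename one feature, and serialize it back."""
    root = ET.fromstring(gml_text)
    for name_el in root.iter("name"):
        if name_el.text == old_name:
            name_el.text = new_name
    return ET.tostring(root, encoding="unicode")


if __name__ == "__main__":
    print(relay_step(SAMPLE, "The Crown", "The Crown (renamed)"))
```

The point of the relay is exactly what this toy version glosses over: whether geometry, attributes, and schema survive a round trip through each contestant's own loader and writer.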
Next year the benchmark aims to be more inclusive: besides OGC WMS, it will also benchmark 1) the usability and 2) the speed of Desktop GIS packages, as well as the performance of WPS or whatever else you believe is relevant and interesting.
Task 1: Usability
- style and
- analyze a given set of data (Ordnance Survey OpenSpace combined with other UK alphanumeric OpenData).
Record the steps required to get the results.
The aim is to see how easy it is to answer a (few) simple GIS questions using regular data.
Task 2: Speed and quality
- analyze a given large set of data (e.g. North Carolina sample dataset).
Record the time required to get the results and check their accuracy.
The aim is to see how the GIS packages perform on bigger data.
- Shape files (Ordnance Survey)
- CSV with alphanumeric data (e.g. population or election results)
- North Carolina data in common GIS formats (vector, raster) - see here or OSGeo Live