Benchmarking 2011


Basic Premise

Following up on last year's exercise, the performance shoot-out presentation at FOSS4G2011 will test how long each Web mapping server takes to generate a map image from a common set of spatial data on a common platform. Each Web mapping server will serve the same data through the WMS standard, exposing exactly the same set of layers. A JMeter load test will be run on the testing box to measure response times and throughput for those layers.
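In essence, each JMeter thread repeatedly issues WMS GetMap requests and records how long the server takes to answer. The sketch below illustrates that measurement for a single request using only the Python standard library; the host name, layer name and bounding box are placeholders, not values from the actual test plans.

  # Minimal sketch of what the benchmark measures: time one WMS GetMap request.
  # Host, layer and extent are placeholders; the real values come from the
  # JMeter test plans kept in svn.
  import time
  import urllib.parse
  import urllib.request

  params = urllib.parse.urlencode({
      "SERVICE": "WMS",
      "VERSION": "1.1.1",
      "REQUEST": "GetMap",
      "LAYERS": "example_layer",      # placeholder layer name
      "SRS": "EPSG:4326",
      "BBOX": "-180,-90,180,90",      # placeholder extent
      "WIDTH": "1024",
      "HEIGHT": "512",
      "FORMAT": "image/png",
  })
  url = "http://benchmark-server.example.org/wms?" + params  # placeholder host

  start = time.time()
  with urllib.request.urlopen(url) as response:
      status = response.status
      body = response.read()
  elapsed = time.time() - start
  print("HTTP %d, %d bytes in %.3f seconds" % (status, len(body), elapsed))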

Potential Participants

Mapping Server        | Development Team Leader             | Contacted      | Confirmed | Comment
Cadcorp GeognoSIS     | Martin Daly                         | Dec 9th, 2010  | Yes       |
Constellation-SDI     | Adrian Custer, Cédric Briançon      |                |           |
Erdas Apollo          | Dimitri Monie                       |                |           |
ESRI ArcServer        | Satish Sankaran                     |                |           |
GeoServer             | Andrea Aime                         |                |           |
MapGuide Open Source  | TBD (contacted mapguide-internals)  | Jan 12th, 2011 |           | Response from Jason Birch: Unable to participate
Mapnik                | Dane Springmeyer                    | Dec 8th, 2010  | Yes       |
MapServer             | Jeff McKenna, Michael Smith         | Dec 8th, 2010  | Yes       |
Oracle MapViewer      | LJ Qian                             |                | Yes       |
QGIS mapserver        | Marco Hugentobler                   |                |           |
SPOT imagery          | Jean-Francois (Jeff) Faudi          | Dec 12th, 2010 |           |
XtraServer (http://www.interactive-instruments.de/index.php?id=107&L=1) | | | |

Timeline

  • January 1st, 2011: begin contacting all mapping servers
  • March 1st, 2011: commitments due from all interested mapping servers
  • March 2nd, 2011: exercise begins (and weekly meetings start)
  • August 1st, 2011: final testing begins
  • September 1st, 2011: no further testing
  • September 2nd, 2011: final results due from all teams
  • September 12-16, 2011: present results at FOSS4G2011

Rules of Engagement (unofficial, copied from 2010)

  1. All parties must contribute any changes that they make to their software for this exercise back to their community. Note that the changes don't have to be contributed before the conference, just within a reasonable period of time.
  2. Comparisons will be made using the best available version of each software package, be it a formal release or a development version.
  3. Two tests will be run: one 'baseline' test with the data in its raw format (with spatial indexes), and another 'best effort' test where 'the sky is the limit' for what changes you may make to the data (change format, generalize, etc.).
  4. Teams must document all steps they took to manipulate the data/server for both the 'baseline' and 'best effort' tests. If a team does not document the steps on this wiki, that team's test results will not be used.
  5. Data formats to be used will be shapefiles for vectors and uncompressed GeoTIFFs for rasters.
  6. WMS output formats to be used will be png8 and png24 where possible (see the example request after this list).
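For illustration only, a png8 run differs from a png24 run solely in the FORMAT parameter of the GetMap request. The exact MIME type for 8-bit PNG varies between servers (some accept image/png8, for example), and the host, layer and extent below are placeholders:

  http://benchmark-server.example.org/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS=example_layer&SRS=EPSG:4326&BBOX=-180,-90,180,90&WIDTH=1024&HEIGHT=512&FORMAT=image/png8
  http://benchmark-server.example.org/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS=example_layer&SRS=EPSG:4326&BBOX=-180,-90,180,90&WIDTH=1024&HEIGHT=512&FORMAT=image/png

The second request would typically produce 24-bit output, though the exact MIME type for the png24 run may also be server-specific.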


Documenting Server Details and Differences

It is the responsibility of each team to document their setup with regard to data.

Please keep your notes either on a wiki subpage or in svn (or both).

Generally, teams keep their setup notes on the wiki and put stylesheets and scripts in svn.

A good organization is to create a wiki subpage such as: http://wiki.osgeo.org/wiki/Benchmarking_2011/Mapnik_notes

and a matching directory in svn named {servername}/{year}, such as: http://svn.osgeo.org/osgeo/foss4g/benchmarking/mapnik/2011
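As a purely illustrative example (assuming commit access, and using Mapnik's path from above), such a directory can be created directly in the repository with the svn client:

  svn mkdir -m "Add Mapnik 2011 benchmarking directory" http://svn.osgeo.org/osgeo/foss4g/benchmarking/mapnik/2011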

Datasets

SVN

The project files (minus data) are stored in Subversion (http://svn.osgeo.org/osgeo/foss4g/benchmarking/).
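To get a local working copy of the project files, a plain checkout of that URL is sufficient, for example:

  svn checkout http://svn.osgeo.org/osgeo/foss4g/benchmarking/ benchmarking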

Hardware

windows_wms_bm (windows server)

  • System Type: Dell PowerEdge R410
  • Ship Date: 7/7/2010
  • Processor: Intel® Xeon® E5630, 2.53 GHz, 12M Cache, Turbo, HT, 1066 MHz Max Mem
  • Memory: 8 GB (4x2 GB), 1333 MHz Dual Ranked RDIMMs for 1 Processor, Optimized
  • Disk: 2 TB, 7.2K RPM SATA
  • OS: Windows Server, 64-bit

linux_wms_bm (linux server)

  • System Type: Dell PowerEdge R410
  • Ship Date: 7/7/2010
  • Processor: Intel® Xeon® E5630, 2.53 GHz, 12M Cache, Turbo, HT, 1066 MHz Max Mem
  • Memory: 8 GB (4x2 GB), 1333 MHz Dual Ranked RDIMMs for 1 Processor, Optimized
  • Disk: 2 TB, 7.2K RPM SATA
  • OS: CentOS 5.5, x86-64

Communication

Coordination/communication is primarily via the Benchmarking mailing list: http://lists.osgeo.org/mailman/listinfo/benchmarking

Weekly meetings will occur via IRC chat in the #foss4g channel on irc.freenode.net.

Next IRC meeting

External Related Links