Benchmarking 2011


Basic Premise

Following up on last year's exercise, the performance shoot-out presentation at FOSS4G2011 will test how long each Web mapping server takes to generate a map image from a common set of spatial data on a common platform. Each Web mapping server will serve the data through the WMS standard, exposing exactly the same set of LAYERS. A JMeter load will be run on the testing box to measure various aspects of those layers.
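
In practice, each benchmark request is a standard WMS GetMap call. A request of the following general form (the layer name, bounding box, and image size here are purely illustrative, not the actual test parameters) is issued against each server:

    http://hostname:port/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS=osm&STYLES=&SRS=EPSG:4326&BBOX=-105.5,39.0,-105.0,39.375&WIDTH=800&HEIGHT=600&FORMAT=image/png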

Communication

Coordination/communication is primarily via the Benchmarking mailing list: http://lists.osgeo.org/mailman/listinfo/benchmarking

Weekly meetings will occur through IRC chat in the #foss4g channel on irc.freenode.net (you can use webchat to connect in a browser)

Next IRC Meeting

  • Wed August 3rd, 2011 @ 14:00:00 UTC
    • Provisional Agenda:
      • Status of OSM vector styling from all teams
      • Discuss rasters to be used
      • Discuss testing methodology

Previous IRC Meetings


Participants

Server Teams:

Mapping Server | Development Team Leader | Confirmed
Cadcorp GeognoSIS | Martin Daly | Yes
Constellation-SDI | Martin Desruisseaux, Johann Sorel | Yes
GeoServer | Andrea Aime, Gabriel Roldan | Yes
Mapnik | Dane Springmeyer | Yes
MapServer | Jeff McKenna, Michael Smith | Yes
QGIS Server | Pirmin Kalberer, Marco Hugentobler | Yes

Data Teams:

Data Package | Team Leader | Comment
Imposm | Oliver Tonnhofer | tool to be used to import OSM data
SPOT imagery | Jean-Francois (Jeff) Faudi |

Not Participating:

Mapping Server | Development Team Leader | Comment
Erdas Apollo | Dimitri Monie | Response from Luc Donea: Unable to participate
ESRI ArcServer | Satish Sankaran | discussing internally
MapGuide Open Source | TBD (contacted mapguide-internals) | Response from Jason Birch: Unable to participate

Timeline

  • January 1st, 2011: begin contacting all mapping servers
  • March 1st, 2011: commitments due from all interested mapping servers
  • March 2nd, 2011: exercise begins (with weekly meetings)
  • June 1st, 2011: final testing begins (no more changes to data/styles/hardware, but changes to software are allowed)
  • September 1st, 2011: no further testing
  • September 2nd, 2011: final results due from all teams
  • September 12-16, 2011: results presented at FOSS4G2011

Rules of Engagement (DRAFT)

  1. All parties must contribute any changes they make to their software for this exercise back to their community. Note that the changes don't have to be contributed before the conference, just within a reasonable period of time.
  2. Comparisons will be made of the best available version of the software, be it a formal release or a development version.
  3. One test will be run: a 'baseline' test with the data in any format desired, but teams cannot generalize or change the data's resolution from its raw values.
  4. Teams must document all steps they took to manipulate the data/server (such as spatial indexes created, etc.). If a team does not document these steps on this wiki, that team's test results will not be used.
  5. WMS output formats will be png8 and png24 where possible.

Documenting Server Setup

It is the responsibility of each team to document their setup: configuration details, setup notes, and any differences from other servers in how data is accessed/indexed.

Notes for each server:

Stylesheets and scripts are stored in SVN in a directory named wms/{year}/{servername}.

Testing Tool

JMeter is used since it can read a list of bounding boxes from a CSV file. The machine that applies the load has been updated to JMeter 2.5 (the latest stable release at the time of the benchmark): http://www.reverse.net/pub/apache//jakarta/jmeter/binaries/jakarta-jmeter-2.5.zip

JMeter can be run from the command line like this:

    /home/jmeterusr/jakarta-jmeter-2.5/bin/jmeter
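
For unattended benchmark runs, JMeter can also be started in non-GUI mode; the test-plan and result-file names below are placeholders rather than the actual benchmark files:

    /home/jmeterusr/jakarta-jmeter-2.5/bin/jmeter -n -t wms_test_plan.jmx -l results.jtl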

Server Ports

Each server has access to a range of ports for testing:

  • MapServer: 8080 - 8089
  • Mapnik: 8090 - 8099
  • Cadcorp: 4326 - 4335
  • Constellation: 8100 - 8109
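
For a quick sanity check that an instance is up on its assigned port, a single WMS request can be sent directly to it; the hostname and service path below are placeholders:

    curl -s -o capabilities.xml "http://linux_wms_bm:8080/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities"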

Datasets

OSM Vectors

The CSV files of request bounding boxes were created with:

./run_wms_request.sh -count 2200 -region -109 37 -102 41 -minsize 64 64 -maxsize 1024 768 -minres "2.5e-06" -maxres 0.000755 -srs 4326 -srs2 3857 -filter_within colorado.shp
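
The run_wms_request.sh generator itself is not reproduced here; the Python sketch below only illustrates the kind of sampling such a tool performs (random image sizes and resolutions within the bounds above, with the bounding box derived from them). The output column layout, delimiter, and file name are assumptions, and the colorado.shp filtering step is omitted:

    import csv
    import random

    # Bounds taken from the run_wms_request.sh call above (Colorado region, EPSG:4326).
    MINX, MINY, MAXX, MAXY = -109.0, 37.0, -102.0, 41.0
    MIN_W, MIN_H = 64, 64
    MAX_W, MAX_H = 1024, 768
    MIN_RES, MAX_RES = 2.5e-06, 0.000755   # degrees per pixel

    def random_request():
        # Pick a random image size and resolution, then derive a bbox that fits the region.
        width = random.randint(MIN_W, MAX_W)
        height = random.randint(MIN_H, MAX_H)
        res = random.uniform(MIN_RES, MAX_RES)
        span_x, span_y = width * res, height * res
        x0 = random.uniform(MINX, MAXX - span_x)
        y0 = random.uniform(MINY, MAXY - span_y)
        return [x0, y0, x0 + span_x, y0 + span_y, width, height]

    # Column layout (minx;miny;maxx;maxy;width;height) is an assumption for illustration only.
    with open("requests_4326.csv", "w", newline="") as out:
        writer = csv.writer(out, delimiter=";")
        for _ in range(2200):
            writer.writerow(random_request())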

Sample styling:

DEM Hillshading

Sample styling:


DEM Hillshading + OSM Vectors

OSM Vector Style changes:

 LandN layers turned off (they provide background color)
 LanduseN layers set to 70% opacity

JMeter Results

There are two scripts for summarizing and plotting the JMeter results from the benchmarking runs:

1). /opt/scripts/2011/jmeter/charts/plotter.py takes 3 arguments:

 1). The JMeter output file
 2). A string to use as the chart title
 3). 'normal' or 'seed', to determine how many thread loops are plotted

Example usage:

/opt/scripts/2011/jmeter/charts/plotter.py jmeter_summary_vector-3857-linux-fcgi.txt "MapServer Linux Fast CGI 3857" normal

/opt/scripts/2011/jmeter/charts/plotter.py jmeter_summary_vector_hill_seed_fcgi_3857.txt "MapServer Linux Seed Hillshade Fast CGI 3857" seed

2). /opt/scripts/2011/jmeter/charts/summarizer.py

This produces the summary tables from the JMeter results.

Example usage:

/opt/scripts/2011/jmeter/charts/summarizer.py jmeter_summary_vector-3857-linux-cgi.txt > ms_vector-3857-linux-cgi.txt
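
The actual summarizer.py lives in SVN and is not reproduced here. As a rough illustration of the aggregation involved, the sketch below reads a JMeter CSV result file and prints per-label sample counts, average response time, and throughput; it assumes JMeter's default CSV columns (timeStamp, elapsed, label, ...), which may differ from the layout of the benchmark's summary files:

    import csv
    from collections import defaultdict

    def summarize(results_path):
        # Aggregate a JMeter CSV result file by sample label.
        elapsed = defaultdict(list)   # label -> response times in ms
        stamps = defaultdict(list)    # label -> request timestamps in ms since epoch
        with open(results_path, newline="") as f:
            for row in csv.DictReader(f):
                elapsed[row["label"]].append(int(row["elapsed"]))
                stamps[row["label"]].append(int(row["timeStamp"]))
        for label, times in elapsed.items():
            # Fall back to 1 s to avoid division by zero when only one sample exists.
            duration_s = (max(stamps[label]) - min(stamps[label])) / 1000.0 or 1.0
            avg_ms = sum(times) / float(len(times))
            print("%s: %d samples, avg %.1f ms, %.2f req/s"
                  % (label, len(times), avg_ms, len(times) / duration_s))

    summarize("results.jtl")   # placeholder file name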


SVN

  • Data are stored only on the server at '/benchmarking/wms/2011/data/' (not in SVN)

Hardware

Contact Michael Smith (User:msmitherdc) with any questions about this hardware or for login credentials.

windows_wms_bm (windows server)

  • System Type: Dell PowerEdge R410
  • Ship Date: 7/7/2010
  • Processor: Intel® Xeon® E5630, 2.53 GHz, 12M cache, Turbo, HT, 1066 MHz max memory
  • Memory: 8 GB (4x2 GB), 1333 MHz dual-ranked RDIMMs for 1 processor, optimized
  • Storage: 2 TB 7.2K RPM SATA
  • OS: Windows Server, 64-bit

linux_wms_bm (linux server)

  • System Type: Dell PowerEdge R410
  • Ship Date: 7/7/2010
  • Processor: Intel® Xeon® E5630, 2.53 GHz, 12M cache, Turbo, HT, 1066 MHz max memory
  • Memory: 8 GB (4x2 GB), 1333 MHz dual-ranked RDIMMs for 1 processor, optimized
  • Storage: 2 TB 7.2K RPM SATA
  • OS: CentOS 5.5, x86-64

External Related Links