Benchmarking 2011
Basic Premise
Following up on the 2010 exercise, the performance shoot-out presentation at FOSS4G 2011 tested how long each Web mapping server takes to generate a map image from a common set of spatial data on a common platform. Each server published exactly the same set of layers through the WMS standard, and a JMeter load generator running on a dedicated testing box measured the response times for those layers.
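As a sketch only (assuming Python 3), the snippet below builds the kind of WMS GetMap request that was timed; the host, port, layer name and styles are placeholders, not the actual benchmark configuration.

# Minimal sketch (assuming Python 3) of a WMS 1.1.1 GetMap request of the
# kind timed in the benchmark. Host, port, LAYERS and STYLES are hypothetical.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "osm_vectors",   # placeholder layer name
    "STYLES": "",
    "SRS": "EPSG:4326",        # the tests used EPSG:4326 and EPSG:3857
    "BBOX": "-109,37,-102,41", # the Colorado test region
    "WIDTH": "1024",
    "HEIGHT": "768",
    "FORMAT": "image/png",     # the exact png8/png24 media type varies by server
}
print("http://localhost:8080/wms?" + urlencode(params))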
Communication
Coordination/communication is primarily via the Benchmarking mailing list: http://lists.osgeo.org/mailman/listinfo/benchmarking
Weekly meetings will occur through IRC chat in the #foss4g channel on irc.freenode.net (you can use webchat to connect in a browser)
Next IRC Meeting
- Wed August 3rd, 2011 @ 14:00:00 UTC
- Provisional Agenda:
  - Status of OSM vector styling from all teams
  - Discuss rasters to be used
  - Discuss testing methodology
Previous IRC Meetings
- Meeting Log (Benchmarking_2011/MeetingLog)
Participants
Server Teams:
Mapping Server | Development Team Leader | Confirmed |
Cadcorp GeognoSIS | Martin Daly | Yes |
Constellation-SDI | Martin Desruisseaux, Johann Sorel | Yes |
GeoServer | Andrea Aime, Gabriel Roldan | Yes |
Mapnik | Dane Springmeyer | Yes |
MapServer | Jeff McKenna, Michael Smith | Yes |
QGIS Server | Pirmin Kalberer, Marco Hugentobler | Yes |
Data Teams:
Data Package | Team Leader | Comment |
Imposm | Oliver Tonnhofer | tool to be used to import OSM data |
SPOT imagery | Jean-Francois (Jeff) Faudi |
Not Participating:
Mapping Server | Development Team Leader | Comment |
Erdas Apollo | Dimitri Monie | Response from Luc Donea: Unable to participate |
ESRI ArcServer | Satish Sankaran | discussing internally |
MapGuide Open Source | TBD (contacted mapguide-internals) | Response from Jason Birch: Unable to participate |
Timeline
January 1st, 2011 | begin contacting all mapping servers |
March 1st, 2011 | commitment due by all interested mapping servers |
March 2nd, 2011 | exercise begins (and weekly meetings) |
June 1st, 2011 | final testing begins (no more changes to data/styles/hardware, but changes to software are allowed) |
September 1st, 2011 | no further testing |
September 2nd, 2011 | final results due from all teams |
September 12-16, 2011 | present results at FOSS4G2011 |
Rules of Engagement (DRAFT)
- All parties must contribute any changes that they make to their software for this exercise, back to their community. Note that the changes don't have to be contributed before the conference, just in a reasonable period of time.
- Comparisons will be made of the best available version of the software, be it a formal release or a development version.
- One test will be run: a 'baseline' test with the data in any format desired, but teams cannot generalize or change the data's resolution from its raw values.
- Teams must document all steps they did to manipulate the data/server (such as spatial indexes created etc.). If a team does not document the steps on this wiki then that team's test results will not be used.
- The WMS output formats used will be png8 and png24 where possible.
Documenting Server Setup
It is the responsibility of each team to document its setup with regard to configuration details, setup notes, and differences from other servers in how data is accessed/indexed.
Notes for each server are kept on per-server wiki pages (for example the Cadcorp, MapServer, and QGIS Server notes pages).
Stylesheets and scripts are stored in SVN in a directory named wms/{year}/{servername} (for example wms/2011/cadcorp, wms/2011/mapserver, and wms/2011/qgis under http://svn.osgeo.org/osgeo/foss4g/benchmarking/).
Testing Tool
JMeter is used since it can read a list of bounding boxes from a CSV file. The machine that applies the load has been updated to JMeter 2.5 (the latest stable release at the time of the benchmark): http://www.reverse.net/pub/apache//jakarta/jmeter/binaries/jakarta-jmeter-2.5.zip
JMeter can be run from the command line with:
/home/jmeterusr/jakarta-jmeter-2.5/bin/jmeter
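To make the CSV-driven approach concrete, the snippet below is a rough stand-in for what the test plan does: read bbox and image-size columns from such a CSV and time one GetMap request per row. The endpoint, layer name and CSV column order are assumptions for illustration only.

# Rough stand-in (not the actual JMeter test plan) for the CSV-driven loop:
# one timed GetMap request per CSV row. The endpoint, layer name and CSV
# column order are assumptions made for illustration only.
import csv
import time
from urllib.request import urlopen

TEMPLATE = ("http://localhost:8080/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
            "&LAYERS=osm_vectors&STYLES=&SRS=EPSG:4326&FORMAT=image/png"
            "&BBOX={bbox}&WIDTH={w}&HEIGHT={h}")

with open("vector-4326.csv") as f:          # hypothetical CSV file name
    for minx, miny, maxx, maxy, w, h in csv.reader(f):
        url = TEMPLATE.format(bbox=",".join((minx, miny, maxx, maxy)), w=w, h=h)
        start = time.time()
        urlopen(url).read()                 # fetch and discard the map image
        print(f"{w}x{h}: {(time.time() - start) * 1000:.0f} ms")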
Server Ports
Each server has access to a range of ports for testing:
Mapserver - 8080 - 8089
Mapnik - 8090 - 8099
Cadcorp - 4326 - 4335
Constellation - 8100 - 8109
GeoServer - 8110 - 8119
Datasets
OSM Vectors
CSV files created with:
./run_wms_request.sh -count 2200 -region -109 37 -102 41 -minsize 64 64 -maxsize 1024 768 -minres "2.5e-06" -maxres 0.000755 -srs 4326 -srs2 3857 -filter_within colorado.shp
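The flags above generate 2200 random request windows inside the -109 37 -102 41 region, with image sizes between 64x64 and 1024x768 and resolutions between 2.5e-06 and 0.000755, in EPSG:4326 and EPSG:3857, filtered to fall within colorado.shp. A hedged Python sketch of that kind of generator (not the actual run_wms_request.sh) follows:

# Hedged sketch of a bounding-box generator along the lines of
# run_wms_request.sh: random image sizes and resolutions, each window placed
# at a random position inside the region. The real script also handles the
# second SRS (EPSG:3857) and the -filter_within colorado.shp step, omitted here.
import csv
import random

MINX, MINY, MAXX, MAXY = -109.0, 37.0, -102.0, 41.0  # -region
MIN_W, MIN_H, MAX_W, MAX_H = 64, 64, 1024, 768       # -minsize / -maxsize
MIN_RES, MAX_RES = 2.5e-06, 0.000755                 # -minres / -maxres (deg/px)

with open("vector-4326.csv", "w", newline="") as f:  # hypothetical output name
    writer = csv.writer(f)
    for _ in range(2200):                            # -count
        w = random.randint(MIN_W, MAX_W)
        h = random.randint(MIN_H, MAX_H)
        res = random.uniform(MIN_RES, MAX_RES)
        x0 = random.uniform(MINX, MAXX - w * res)
        y0 = random.uniform(MINY, MAXY - h * res)
        writer.writerow([x0, y0, x0 + w * res, y0 + h * res, w, h])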
Sample styling:
DEM Hillshading
Sample styling:
DEM Hillshading + OSM Vectors
OSM Vector Style changes:
- LandN layers turned off (they provide the background color)
- LanduseN layers set to 70% opacity
Running JMeter
JMeter is run over VNC (port 5901) on the JMeter server.
JMeter Results
There are two scripts for summarizing and plotting the JMeter results from benchmarking:
1) /opt/scripts/2011/jmeter/charts/plotter.py, which takes 3 arguments:
- the JMeter output file
- a string to put in the title of the chart
- "normal" or "seed", which determines how many thread loops are used
Example usage:
/opt/scripts/2011/jmeter/charts/plotter.py jmeter_summary_vector-3857-linux-fcgi.txt "MapServer Linux Fast CGI 3857" normal
/opt/scripts/2011/jmeter/charts/plotter.py jmeter_summary_vector_hill_seed_fcgi_3857.txt "MapServer Linux Seed Hillshade Fast CGI 3857" seed
2) /opt/scripts/2011/jmeter/charts/summarizer.py
This produces the summary tables from the JMeter results.
Example usage:
/opt/scripts/2011/jmeter/charts/summarizer.py jmeter_summary_vector-3857-linux-cgi.txt > ms_vector-3857-linux-cgi.txt
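For reference, a minimal hypothetical sketch of what such a summarizer does is shown below; it assumes a simple whitespace-delimited input with a label and an elapsed-milliseconds column per line, which may differ from the actual jmeter_summary_* format.

# Minimal, hypothetical summarizer sketch: reads "label elapsed_ms" lines and
# prints count, mean and max latency per label. The real
# /opt/scripts/2011/jmeter/charts/summarizer.py and its input format may differ.
import sys
from collections import defaultdict

samples = defaultdict(list)
with open(sys.argv[1]) as f:
    for line in f:
        parts = line.split()
        if len(parts) >= 2:                 # assumes a numeric second column
            samples[parts[0]].append(float(parts[1]))

print(f"{'label':<32}{'count':>8}{'mean ms':>10}{'max ms':>10}")
for label, times in sorted(samples.items()):
    print(f"{label:<32}{len(times):>8}{sum(times) / len(times):>10.1f}{max(times):>10.1f}")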
SVN
- The project files (minus data) are stored in Subversion (http://svn.osgeo.org/osgeo/foss4g/benchmarking/wms/2011/). If you need commit access, contact jmckenna on IRC.
- Data are stored only on the server at '/benchmarking/wms/2011/data/'
- Existing committers can add new OSGeoID users through this page: https://www.osgeo.org/cgi-bin/auth/ldap_group.py?group=osgeo
Hardware
Contact Michael Smith (User:msmitherdc) with any questions about this hardware or for login credentials.
windows_wms_bm (windows server)
- System Type: Dell PowerEdge R410
- Ship Date: 7/7/2010
- Processor: Intel® Xeon® E5630 2.53 GHz, 12M Cache, Turbo, HT, 1066MHz Max Mem
- 8GB Memory (4x2GB), 1333MHz Dual Ranked RDIMMs for 1 Processor, Optimized
- 2TB 7.2K RPM SATA
- OS: Windows Server, 64-bit
linux_wms_bm (linux server)
- System Type: Dell PowerEdge R410
- Ship Date: 7/7/2010
- Processor: Intel® Xeon® E5630 2.53 GHz, 12M Cache, Turbo, HT, 1066MHz Max Mem
- 8GB Memory (4x2GB), 1333MHz Dual Ranked RDIMMs for 1 Processor, Optimized
- 2TB 7.2K RPM SATA
- OS: CentOS 5.5 x86-64
Final Results
- Final presentation slides (contains charts and results)
- Raw results are stored in the team's folder
External Related Links
- FOSS4G2010 Benchmarking Presentation
- FOSS4G2009 Benchmarking Presentation
- FOSS4G WMS Benchmark
- WMSTester - a WMS testing tool (not from OSGeo)