OSGeo Vision for UN-GGIM 2
Background
In 2006, OSGeo set out to support and promote Open Source geospatial software development. The world at that time was being changed rapidly by the unprecedented success of other major software initiatives using an Open Source model, notably GNU, Linux, BSD Unix, Apache and Mozilla. The ability of software developers to contribute to common source code, and to form functional communities focused on rapid, state-of-the-art improvements to crucial infrastructure, had changed the rules of the game. Private sector companies, academia, government and the military were taking notice of the huge success of these projects and beginning to change their thinking about how to benefit from this model. In particular, IBM's strategic decision to support Linux over its own proprietary operating systems marked a watershed moment: the adoption of an Open Source development and governance model for software by a Fortune 500 corporation.
The application of standardized licensing to source code and to new contributions had proven critical to long-term stability in very large infrastructure software projects. From its inception, OSGeo applied rigorous, documented criteria to projects that had previously been somewhat ad hoc.
OSGeo was initially formed around several of the best-of-breed Open Source geospatial software projects of the day: MapServer, GDAL/OGR, GRASS and GeoTools. The initial roster of projects, together with subsequent additions, covers the spectrum of server-side data storage, plug-in libraries for data transformation and format conversion, presentation and editing of data on the web, and traditional desktop GIS analysis. OSGeo projects now also include those for mobile devices.
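To illustrate the "plug-in library" role in practice, the short sketch below uses the GDAL Python bindings to convert a raster dataset from one format to another. It is only a minimal sketch: the file names are placeholders, and it assumes a GDAL build recent enough (2.1 or later) to provide gdal.Translate.

```python
# Minimal sketch: raster format conversion with the GDAL Python bindings.
# Assumes GDAL >= 2.1; "input.img" and "output.tif" are placeholder file names.
from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of silently returning None

src = gdal.Open("input.img")                       # open any GDAL-readable raster
gdal.Translate("output.tif", src, format="GTiff")  # write it back out as GeoTIFF
src = None                                         # close the source dataset
```

A similar call (gdal.VectorTranslate) handles vector formats, which is one reason GDAL/OGR is used as a data-access layer by many other OSGeo projects.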
Very quickly, OSGeo participants realized that geospatial data itself was both crucial to the software efforts and might, in an Open Source environment, achieve benefits similar to those seen for software. OpenStreetMap had been founded in 2004, following the massive success of Wikipedia, founded in 2001, and both were expanding rapidly with unexpectedly high quality results. OSGeo started thinking about how the role of government in geodata might fit with that of the independent Open Source software development world.
It is a truism that science itself is an open source project. The success of the scientific method and of research communities over the last several hundred years may well be a sign that Open Source software development is here to stay, but modern conflicts in science over funding and publishing may also indicate difficulties that lie ahead.
Current Development
OSGeo is a world-wide organization that operates on the Internet and through local chapters and interest groups. OSGeo maintains common hosting and communications infrastructure, including:
- more than 150 active mailing lists
- 25 project development instances using Trac
- more than 400,000 unique visitors per month
- more than 700 active source code committers
- a combined source code base of more than 100 million lines of code
OSGeo Local Chapters operate in more than 25 countries around the world. It is safe to say there are participants in every industrialized nation, and good representation from a number of smaller, developing countries.
OSGeo facilitates a growing, thriving, international Open Source software development network of networks, with rigorous review criteria and adaptive community models built in. The crucible for the next generation of major advances in geospatial software is the combination of shared and growing source code libraries, a rigorous legal framework, and supported community infrastructure. OSGeo sets out to enable and grow all three.
Data Access
Highly competitive entities are in some cases committing significant resources to Open Source software development projects; the case with data, however, is far less clear. It is arguable that a role of government is to provide reliable, comprehensive data infrastructures as a basis for good governance and economic growth, as seen in the United States and the European Union, but this is not universally practiced. In some cases it falls to disruptive, skunk-works projects to take on the challenge of common data infrastructure.
In the case of emergency response, participatory, cooperative models making extensive use of Open Source software and data have out-performed previous emergency response efforts in at least two very high-profile cases: the Haiti earthquake and Hurricane Katrina. Emergency response and civil defense agencies have widely adopted these as case studies in the future of disaster mobilization.
Access to data is, in modern terms, impossible without software. Software development is the key to the evolution of the acquisition, management, dissemination and analysis of data. A powerful and sometimes difficult evolution is under way, as the roles of government, the private sector and academia with respect to geospatial data and software are blurred and redefined again and again. Arguably, government is a laggard in the evolution of data access, while the private sector races forward and cooperative, communitarian models flourish on the web.
Future Scenarios
Since the founding of OSGeo, the Open Source software model has grown exponentially by every metric. The role of Open Source geospatial software in the future, whether five years, ten years or beyond, is unquestionable.
Some Future Developments Over the Next Five Years
- there will be an increasing shift towards using Open Source in software infrastructure
- "cloud" infrastructure deployment will increase
- storage will vastly increase in capacity, while decreasing in physical size and power usage
- advances in hardware, algorithms and software libraries will increase the ability to deal with big data
- sensor networks will become increasingly common and effective, both purpose-deployed and de facto, arising from ubiquitous networking
- statistical methods applied to big data will broaden and deepen understanding, and further enable machine learning methods
- crowd-sourcing and other ad-hoc methods will be cross-checked against data collected by traditional, structured methods, increasing quality
- privacy will continue to be a battle on all sides