Crisis Mapping: RDTN.org


Presentation by Uncorked Studios on their project RDTN.org, "a collective voice helping others stay informed."

RDTN.org maps radiation info from Japan and beyond, aggregating data from a number of sources. A KML feed is planned.
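For context, a KML feed of point readings like these can be quite small: one Placemark per reading. Below is a minimal sketch in Python; the stations, values, and field names are illustrative only, not RDTN.org's actual schema.

```python
# Minimal sketch: emit one KML Placemark per radiation reading.
# The readings and field names here are illustrative, not RDTN.org's schema.
readings = [
    {"name": "Station A (Tokyo)", "lat": 35.6895, "lon": 139.6917, "usv_per_h": 0.12},
    {"name": "Station B (Sendai)", "lat": 38.2682, "lon": 140.8694, "usv_per_h": 0.31},
]

placemarks = []
for r in readings:
    placemarks.append(
        "  <Placemark>\n"
        f"    <name>{r['name']}</name>\n"
        f"    <description>{r['usv_per_h']} uSv/h</description>\n"
        # KML coordinates are longitude,latitude,altitude
        f"    <Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point>\n"
        "  </Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
    " <Document>\n" + "\n".join(placemarks) + "\n </Document>\n</kml>\n"
)

with open("readings.kml", "w") as f:
    f.write(kml)
```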

The site rolled out just 72 hours after the original "back of envelope" sketch.

How it happened

1) made assumptions

2) built a trust network

3) launched early, iterated often

The assumptions

1) people will buy Geiger counters

2) people will trust us

3) radiation units are simple

4) people will understand info on site

5) no one else is trying to solve the same problem

6) they are qualified to do this

Crisis mode made it happen

The presentation stressed that many of these assumptions were, in fact, wrong. For example, radiation units are not simple, nor were Geiger counters widely available on short notice, so the team needed to find alternative data sources.

But the sense of urgency created momentum that allowed the developers to overcome the faulty initial assumptions and to push on to create something of value.

Operating in crisis mode also made it possible to build a network of contacts much more quickly. One of those connections led to Pachube (pronounced "patch-bay"). Pachube has a network of sensors in place around the world, so it became a source of data.

In ordinary circumstances, Pachube would have been seen as a competitor that might have derailed the whole project. ("It's been done already, let's just skip it!") In crisis mode, Pachube became a valued collaborator.

Coping with data sources

Pachube provides an API to allow direct access to their data.
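As a sketch of what pulling such a feed involves, the snippet below requests one feed as JSON and prints its datastreams. It assumes Pachube's v2-style REST endpoint and X-PachubeApiKey header as documented at the time; the feed ID, API key, and response field names are placeholders/assumptions.

```python
# Sketch of pulling one sensor feed from Pachube's REST API.
# Assumes the v2 JSON endpoint and X-PachubeApiKey header in use at the
# time; FEED_ID and API_KEY are placeholders.
import json
import urllib.request

FEED_ID = "12345"          # placeholder feed id
API_KEY = "YOUR_API_KEY"   # placeholder key

req = urllib.request.Request(
    f"http://api.pachube.com/v2/feeds/{FEED_ID}.json",
    headers={"X-PachubeApiKey": API_KEY},
)
with urllib.request.urlopen(req) as resp:
    feed = json.load(resp)

# Each datastream carries a current value and timestamp (assumed layout).
for stream in feed.get("datastreams", []):
    print(stream.get("id"), stream.get("current_value"), stream.get("at"))
```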

But at this point, most agencies offer no such API. Much of the data on RDTN.org has to be scraped from government or commercial web sites. Scraping is necessary because these agencies make data available as polished reports (e.g., daily PDFs or screenshots) that are easy to view but difficult to harvest as data sources.
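The sketch below shows the kind of scraping involved when a source at least publishes an HTML page: fetch the page and pull numeric values out of its table cells. The URL and table layout are hypothetical; sources that only publish PDFs or screenshots need a PDF text extraction or OCR step instead.

```python
# Sketch of scraping readings from an agency web page.
# The URL and table structure are hypothetical.
import re
import urllib.request
from html.parser import HTMLParser

class CellCollector(HTMLParser):
    """Collect the text of every table cell on the page."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self.in_cell = True
            self.cells.append("")

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.cells[-1] += data.strip()

url = "http://example.gov/radiation/daily-report.html"  # hypothetical
with urllib.request.urlopen(url) as resp:
    html = resp.read().decode("utf-8", errors="replace")

parser = CellCollector()
parser.feed(html)

# Keep only cells that look like numeric readings (e.g. "0.12").
readings = [c for c in parser.cells if re.fullmatch(r"\d+(\.\d+)?", c)]
print(readings)
```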

An obvious site to turn to in an environmental crisis would be the US EPA. But the site offers only general information and no direct access to sensor data or hard numbers.

Data.gov is supposed to act as a data clearinghouse, but it almost makes the situation worse by raising expectations: the site showcases the data on its front page and promises downloadable data and feeds, yet all the links lead back to the same frustrating EPA interface.

To be fair, the sharing of data via feeds is still in its infancy, and as one attendee commented, GIS data was in the same situation only a few years ago.

At least one interesting example (from Germany) involves a contributing source that uses crowd-sourcing not to contribute original data, but to transform (scrape) it out of PDFs into a usable form: volunteers read the PDF reports and re-enter the data in a machine-readable format.
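One way such a workflow might keep volunteer-entered data consistent is a simple validation pass over the transcribed rows before accepting them. A small sketch, assuming a hypothetical CSV layout with a station name, a timestamp, and a dose rate in uSv/h:

```python
# Sketch: validate volunteer-transcribed rows before accepting them.
# The CSV layout (station, timestamp, usv_per_h) is hypothetical.
import csv
from datetime import datetime

def valid_row(row):
    """Accept a row only if the timestamp parses and the value is a
    non-negative number."""
    try:
        datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M")
        value = float(row["usv_per_h"])
    except (KeyError, ValueError):
        return False
    return value >= 0

with open("transcribed.csv", newline="") as f:
    rows = list(csv.DictReader(f))

accepted = [r for r in rows if valid_row(r)]
print(f"accepted {len(accepted)} of {len(rows)} rows")
```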


Return to 2011 Unconference Sessions.