Talk:ARRA Broadband Mapping

The broadband mapping initiatives are going to produce large amounts of data that are valuable to multiple levels of government, businesses, academia, and citizens. They must ensure that the data has the broadest possible accessibility and that there are affordable tools for usability. [ajturner 31aug09]

The goals should be 1) open data, 2) open source software, 3) fame/$$$ for OSGeo [roughly paraphrased from bitner, recently]

--- I think that most of these paragraphs are better as about one sentence each. I think we should get back to an outline or sketch phase and reprioritize what goes in. [01Sep09 12:00 PST] [Brian M Hamlin, OSGeo California - maplabs a-t light42 d com]

[apologies.. after a spate of good writing, some clumsy and ill-considered lines came out. Back to more constructive language. -03Sep09 bh] [On further consideration, I am sure that the paper needs to be far more "just state it as it is" and far less "political". There are political things to say, but OSGeo is not the one to say them. OSGeo IMHO is in the extreme of technical competency and merit-based software. A position paper ought to be strongly in the same vein. -04Sep09 bh]

--

IMHO there should be two pages: one is policy, the second simply a technical block diagram of a suggested workflow and network. Bitner, would you be willing to contribute to the block diagram? ("a substantial recommendation on the structure and flow of the data in the repositories") For ideas, this was a highly successful and modern multi-stakeholder system: http://www.marinemap.org/marinemap/ http://en.oreilly.com/where2009/public/schedule/detail/7219

-

Some Policy points:


 * vendor interop / open standards
 * committed to transparency
 * achievable with proven means

Misc Notes:

As a dense policy document, this work may benefit from a formal "enthymeme/claim/reason/warrant/backing" structure. [03Sep09 bh]

The rise of Open Source Software is virtually synonymous with the building of the Internet - common, standards-based infrastructure. The success of OSS is nowhere more strikingly embodied than in Apache: in the midst of great uncertainty and a fierce turf war over control of the web server, the critical component of the new "WWW", Apache, warts and all, built on open standards and in open source, rose to become the vendor-neutral infrastructure upon which rapid, reliable progress was made. The Open Web is a strength, given the right tools to search and navigate it. In much the same way, an Open Data infrastructure provides the vendor-neutral framework upon which both commercial and public-good progress is made.

-

The ability to map, and to work interactively on the web with that mapped data, is NEW. Paper is not the only output anymore; live stats are possible, and the trajectory is toward more and more capability. "Why map with computers" is an old topic, not worth anything. The new topic is: agile, web-accessible, standards-based map data that can be sliced and diced to fit the needs of a multitude of stakeholders - ajturner's GeoWeb 2.0 talk come to life. (With that said, a balanced argument is required; an RSS or Twitter feed of changes has "wow", but is a lot less important than data verifiability, an audit trail, security checkpoints, etc. Agility - filling new niches - is an OSS strength; the Bazaar as opposed to the Cathedral, http://www.catb.org/~esr/writings/cathedral-bazaar/)
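
To make "sliced and diced to fit the needs of a multitude of stakeholders" concrete: with standards-based data such as GeoJSON, any consumer can filter the shared dataset down to their own view. A minimal sketch - the features, property names, and coordinates here are invented for illustration:

```python
# A tiny GeoJSON FeatureCollection; the features and properties are hypothetical.
fc = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "properties": {"kind": "school", "served": False},
         "geometry": {"type": "Point", "coordinates": [-122.3, 37.8]}},
        {"type": "Feature",
         "properties": {"kind": "home", "served": True},
         "geometry": {"type": "Point", "coordinates": [-122.4, 37.7]}},
    ],
}

def slice_features(fc, **wanted):
    """Return a new FeatureCollection keeping features whose properties match."""
    keep = [f for f in fc["features"]
            if all(f["properties"].get(k) == v for k, v in wanted.items())]
    return {"type": "FeatureCollection", "features": keep}

# A policy advocate's slice: unserved schools only.
unserved_schools = slice_features(fc, kind="school", served=False)
print(len(unserved_schools["features"]))  # 1
```

The point is that each stakeholder writes their own slice against the same open dataset, rather than waiting for a vendor to produce a custom report.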

Any solution needs to change well over time, and to reflect time well in the data. Infrastructure growth doesn't happen overnight; this is a long effort. So the ability to see *progress* - revisions, changes over time - is very useful, both for assessment and for accountability. Also, as mentioned in the first ramble, any solution must be able to evolve. The deployment criteria call for an almost unheard-of deployment timetable; OSS is strong there, while previous vendor solutions are top-heavy and cumbersome. 1 year => .1 release cycle...
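
The revisions-over-time and audit-trail points above boil down to an append-only store: never overwrite a reported value, just add a new dated row. A minimal sketch using stdlib SQLite; the table, columns, and sample values are all hypothetical:

```python
import sqlite3

# In-memory DB for illustration; a real deployment would use PostGIS or similar.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE coverage_revisions (
        block_id    TEXT NOT NULL,   -- census block (or other unit of measure)
        max_mbps    REAL NOT NULL,   -- reported downstream speed
        source      TEXT NOT NULL,   -- who reported it (the audit trail)
        recorded_at TEXT NOT NULL    -- ISO date of the revision
    )
""")

def record(block_id, max_mbps, source, recorded_at):
    """Append-only: every change is a new row, nothing is overwritten."""
    conn.execute("INSERT INTO coverage_revisions VALUES (?, ?, ?, ?)",
                 (block_id, max_mbps, source, recorded_at))

# Two revisions of the same block show progress over time.
record("060855046011000", 1.5, "provider-filing", "2009-01-15")
record("060855046011000", 6.0, "provider-filing", "2009-07-15")

history = conn.execute(
    "SELECT max_mbps, recorded_at FROM coverage_revisions "
    "WHERE block_id = ? ORDER BY recorded_at",
    ("060855046011000",)).fetchall()
print(history)  # the full audit trail, oldest first
```

Because nothing is deleted, the same table answers both the assessment question ("is coverage improving?") and the accountability question ("who said what, and when?").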

--

What to map? In the New Zealand example, they mapped both Supply and Demand. Starting from there, things like schools, health care, and business vs. home start to have a frame around them. http://broadbandmap.govt.nz/map/

Getting the broadband providers to reveal their data is a huge political problem, and not yet solved; every manner of obstacle may be thrown in the tracks of the effort. Mapping the demand side, however, will snowball: if there is something to get, every org, school, health facility, constituency, and whatever else will come out of the woodwork to be "on the map." Both sides have their challenges. Without claiming to have all the answers, the paper ought to make an attempt at showing how the challenges of each side can be addressed with an agile, evolving OSS core and an Open Data policy - third-party verifiable on the one hand, and fast, effective aggregation on the other, perhaps.

-

Publicly verifiable information trumps competitive card-sharking - e.g. the "sunshine rule."

For the best possible result, the plan needs to navigate boldly toward: expanding service coverage outside of the highest-margin areas; documenting infrastructure; respecting security concerns while firmly standing by "open by default; public access" data; and readily actionable recommendations.

Our Open Data position boosts economic potential and competitiveness, creates a level playing field in markets, and is a boon to those who are currently underserved; it advocates wise allocation of resources, breaks the gridlock of high-speed adoption evident in the US, and serves those in policy who want to see change and results.

In numerous cases, private companies have done excellent work in providing real results. While others wring hands and government holds meetings, companies have put working structures in place. That has to be respected.

Why point out shortcomings of current market structures? Revision: state problem areas in a positive stance. Be proactive; it's not criticism... [03Sep09 bh]

This paper is presented to a sophisticated audience that knows both sides of the arguments already!! By realizing the STRENGTHS of the OSGeo position, you don't have to blather on about basics. [drastic condensation of language] [statements like this need to be more formally supported] => A verifiable, auditable data policy is right in principle, consistent with our political heritage of Federal data owned by the people, and more effective in practice, breaking barriers of inefficiency and stagnation.

---

What does OSGeo bring UNIQUELY? What can OSGeo say out loud that others can't say?

-

In this paper there is an informational component - as in, hey, this exists - and there is a deeper situational component - as in, push the apple cart the way it wants to go, due to forces larger than OSGeo: the wind and the lay of the land, politically and economically. OSGeo has to tap into that to really make an impact here.

This is a chance to speak, and that chance to speak is IMPORTANT. Don't waste your words.

-- From the Webinar: "Broadband penetration has stagnated." "Broadband is essential to compete in the world economy." Also, urban and rural problems are vastly different, and both matter. "Inefficiency has plagued the process.. we can no longer afford that"

For the accountants and bankers, money talks. What is the cost/benefit proposition here? The centerpiece of the solution is OSS software that is proven, and costs less for the taxpayer.

Industry-standard software products are branches: valued, tested branches. The proposal's core infrastructure relies on interoperating Linux and OSS, for provable reasons, and with provable cost benefit.

-- --

[02Sep09]

FCC Webinar - Benchmarks

AT&T Testimony: How to measure the availability of broadband? By zip code has been criticized as "too vague", whereas by address is inaccurate.

What is speed? An index of performance components... What if you poll the users instead of measuring?
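
One way to read "an index of performance components" is a weighted score over normalized measurements. A hedged sketch; the component names, values, and weights below are invented for illustration, not from the webinar:

```python
def performance_index(components, weights):
    """Weighted average of normalized performance components (each in 0..1)."""
    assert set(components) == set(weights), "every component needs a weight"
    total = sum(weights.values())
    return sum(components[k] * weights[k] for k in components) / total

# Hypothetical measurements, each normalized against a policy benchmark
# (e.g. measured downstream speed divided by the target speed, capped at 1).
measured = {"downstream": 0.6, "upstream": 0.3, "latency": 0.8}
weights  = {"downstream": 0.5, "upstream": 0.3, "latency": 0.2}
print(performance_index(measured, weights))  # a single comparable score
```

The benchmark debate then becomes a debate about which components to include and how to weight them - the arithmetic itself stays stable even as policy goals shift.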

Price - many different price measures, each with its own purpose and usefulness. Cable television and digital data overlap.

Benchmarks measure what? Customer preferences change over time. It's desirable to have benchmarks that are stable over time, yet it's inevitable that policy goals will change with the day, and benchmarks will change to measure the current goals. With this in mind, benchmarks should be broad and stable.

- Public Advocacy - Tools and Methods

Form 477 - asset mapping - Consumer Assessment Survey - online crowdsourcing - Cii Participation and Town Halls

-- Harold Feld - Legal Director, Public Knowledge

"A difference between benchmarks and goals".. the statute says there are metrics and benchmarks - "the stuff we have to measure the progress of the National Broadband Plan." Goals are amazingly broad; broadband touches every aspect of our lives, and we need to know how the implementation affects many aspects of our civic life. There is a hell of a lot of information..

Last Mile measures - and also the Middle Mile

Dynamic vs Static measures - very spirited and insightful comments about live, modern measurements, including references to privacy concerns.

Advice to FCC - dont try to do this alone!

--- National Academy

Who are the stakeholders? Consumers - Service Providers - Application/Content Developers - Policy Advocates - at least

Without common terms, the debate is confused and prolonged.

What are the performance indicators? Discussion of applications and performance (previous committee work referenced).

--- -- Q&A

Price is relative to region. The census block is a reasonable unit of measure.
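
If the census block is the unit of measure, address-level reports get rolled up to their block before comparison across regions. A minimal sketch using stdlib tools; the 15-digit block IDs and prices are made up:

```python
from collections import defaultdict
from statistics import median

# Hypothetical address-level price reports, keyed by census block ID.
reports = [
    ("060855046011000", 45.00),
    ("060855046011000", 39.99),
    ("060855046012000", 60.00),
]

by_block = defaultdict(list)
for block_id, price in reports:
    by_block[block_id].append(price)

# Median price per census block smooths out individual outliers
# while keeping the figure regionally comparable.
block_median = {b: median(p) for b, p in by_block.items()}
print(block_median)
```

Publishing only the per-block aggregate also sidesteps some of the privacy concerns around address-level data.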

Not long ago, in order to get DSL you often had to be a business subscriber; now that's common. So the service tier of yesterday is not the service tier of tomorrow.

H Feld - you will collect far more information than you can use. Academics will love it, but what is useful? What guides the implementation of the goals?

What we need is suggestions on where to go from here. We don't have the perfect answer.

Gathering Data: Credibility and Reliability. Can we believe that outsiders will do the thankless task over the long haul? Can we believe that some sources of information will not skew towards certain vendors?

H Feld - An increasing portion of data gathering will be the machine-to-machine component. We need to bring in more people who study informatics, rather than all of us specialists, who each have a particular bias.

100 Mb to 100 million homes by 2020... That still puts us about 8 years behind S. Korea (!) but is still a long-term goal for the US.

Maybe we should give up on FCC 477...

(The number one result for "US National Broadband Map" is http://blog.fortiusone.com/2009/06/09/creating-the-national-broadband-map-for-35-million-instead-of-350-million/ )