How the FCC is Creating Better Open Data

    by Ian Cairns
    November 23, 2010

    As part of our [TileMill](http://mediashift.org/idealab/2010/08/tilemill-custom-maps-to-help-with-data-dumps-hyper-local215.html) project, we’ve been talking about our goal of making open government data more actionable by making it easier to turn GIS data into custom maps. We’re focused on building better tools so people can use those maps to tell better stories online, but another important part of this process is getting good access to quality data in the first place. What does it look like to open up data effectively, so that it’s not just available but actually useful to the public?

    1. FCC Setting a Good Example

    The [Federal Communications Commission (FCC)](http://reboot.fcc.gov/developer) provides one good example, demonstrating how an iterative approach to releasing data leads to better quality. While many government agencies and other organizations with large volumes of data simply post everything they have and let developers figure out the rest, the FCC has taken a different path. They have been building applications with their own data, [creating APIs](http://reboot.fcc.gov/blog?entryId=727240) based on their own needs as they build, and releasing those APIs to the public [to help further vet the usefulness](http://reboot.fcc.gov/blog?entryId=872119) of both the APIs and the data. This iterative process of actually [“eating their own dogfood”](http://www.investopedia.com/terms/e/eatyourowndogfood.asp), using the data and APIs they have created, makes the data they release markedly more useful. Instead of just posting files, the FCC is taking the time to understand how its data is used so that others can leverage it more effectively.

    We’re big supporters of this approach. After working on data visualization projects with open data sets, one of the most practical lessons we’ve learned is how often holes in data quality or completeness go unnoticed until someone actually tries to visualize the data. The sooner data providers find these holes, the sooner they’ll see their data leveraged by others to create greater impact. There’s no better way to discover, and then fix, issues like this than by actually working with the data.
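    The kind of hole we mean is easy to surface programmatically before anyone tries to map the data. Below is a minimal sketch of a completeness check over a CSV; the dataset, column names, and values are hypothetical, invented purely for illustration:

    ```python
    import csv
    import io

    # Hypothetical sample of an open data CSV; the fields and values
    # are illustrative only, not real FCC data.
    raw = """county,broadband_providers,median_speed_mbps
    Adams,3,12.5
    Baker,,4.0
    Clark,5,
    """

    def find_holes(text):
        """Return {column: count of empty values} for a CSV,
        flagging gaps that would break a visualization."""
        rows = csv.DictReader(io.StringIO(text))
        holes = {}
        for row in rows:
            for col, val in row.items():
                if val is None or val.strip() == "":
                    holes[col] = holes.get(col, 0) + 1
        return holes

    print(find_holes(raw))  # {'broadband_providers': 1, 'median_speed_mbps': 1}
    ```

    Running a check like this before publication flags exactly the gaps a mapping or charting tool would otherwise trip over.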

    2. Video

    As part of their effort to engage the developer community and gather feedback on their API releases, the FCC recently hosted an [“Open Developer Day”](http://reboot.fcc.gov/blog?entryId=951121). After the event, my colleague Eric Gundersen discussed the FCC’s “dogfooding” with [Alex Howard](http://twitter.com/digiphile) of O’Reilly Media. Check out the video below or [read Alex’s blog post](http://gov20.govfresh.com/fcc-hosts-developer-day-focused-on-open-government-innovation/) for more details.
