SEANetMap: Status 2015-05-18

Table of Contents

1 Introduction

This is a status report for the SEANetMap project.

Three-point summary:

  • The latest code is live at:
  • The City's requirements are about one third met.
  • There are currently 25 open issues and 15 closed issues, which, like the requirements, can be read as roughly one third complete.

2 John Tigue's availability

Note, to be extremely clear, again: Tigue has been working, and continues to work, on the project daily.

SEANetMap's GitHub activity graph tracks code commits but does not track wiki commits nor other forms of document writing. I am currently past all the start-up costs of initializing an open-source project, which involved the implicit foundational work of basic program management, documentation, and setting up software development process/infrastructure. For example, deployment is currently down to typing about 20 key-presses.

As can be seen in the graph, I am now spending the majority of my time on this project writing code, rather than on the necessary but less fun work that someone had to step up and address. The point is that any "about one third done" summary does not account for the large amount of foundational work that had to be done before the requirements could be worked on at all.

I am currently freelancing on another paying project at the same time. In late April and early May, I had a window between paying gigs, so I was able to give SEANetMap 100% of my attention. Although I can no longer afford to give SEANetMap all of my attention, this does not mean I have gone AWOL on the project. The intent of my earlier sprint was to set up this current situation: with the foundation laid, I can follow through and keep plugging away.

3 Context

SEANetMap is a joint project between M-Lab and Code for Seattle. The goal is to develop open-source software which provides municipal situational-awareness tools for broadband network performance. Citizens can run network performance tests and then compare their results to historical and current results from around the area.

Initially, the project will be rolled out in Seattle, but a stated goal is to make the codebase reusable for deployment in other localities. Seattle, as the initial deployment, has the opportunity to be seen as the home of a significant open-source community producing powerful tools for the current techno-political context. This would bring the city some prestige, as well as the raw utility of having such technology available for Seattle's own benefit.

4 Status of City's requirements

There are, depending on how one counts, 9 or 10 requirements, and, depending on how one counts, 3 to 5 of them have been met. So the requirements are about one third met.

The following is each requirement broken out with a brief status statement:

  • [X] CityReq1 (User submits their broadband experience) #34
    • Currently the ndt-javascript-client is in the UI and working, but no record of tests is being kept in SEANetMap.
  • [ ] CityReq1.1 (Auto-populated elements) #35
    • No UI, no database
  • [ ] CityReq1.2 (End-user populated elements (opt-in)) #36
    • No UI, no database
  • [ ] CityReq2 (Data is summarized by census tract) #37
    • [This Req has no text, it simply rolls up 2 sub-issues for 2 CityReqs]
  • [X] CityReq2.1 Sample size must be > 4 (privacy concerns) #38
    • Done.
  • [X] CityReq2.2 Traffic light color scheme indicates net quality #39
    • Correct color scheme, will need to revisit when have breakdown by provider
  • [ ] CityReq3 (User views the broadband map) #40
    • [This Req has no text, it simply rolls up 3 sub-issues for 3 CityReqs]
  • [ ] CityReq3.1 Traffic light colored choropleth on political boundaries #41
    • Yes: by City Council District. No: by US Census tracts
  • [ ] CityReq3.2 Explore stats and provider options, given an area #42
    • 0% complete, 0 of 3 "more info" charts and maps
  • [ ] CityReq3.3 City-level summary charts #43
    • 0% complete, 0 of 3 charts
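
CityReq2.1 (sample size > 4) and CityReq2.2 (traffic-light colors) can be sketched as one small aggregation step. This is a hypothetical illustration, not the project's actual code; the function names, field names, and the 10/25 Mbps thresholds are all assumptions:

```javascript
// Hypothetical sketch of CityReq2.1 (sample size must be > 4) and
// CityReq2.2 (traffic-light colors). Names and thresholds are
// illustrative assumptions, not SEANetMap's actual values.
const MIN_SAMPLE_SIZE = 5; // "> 4" means at least 5 tests per tract

// Map a median download speed (Mbps) to a traffic-light color.
function trafficLightColor(mbps) {
  if (mbps >= 25) return 'green';
  if (mbps >= 10) return 'yellow';
  return 'red';
}

// Drop tracts with too few tests (privacy), then attach a color
// for the choropleth.
function summarizeTracts(tracts) {
  return tracts
    .filter(t => t.sampleSize >= MIN_SAMPLE_SIZE)
    .map(t => Object.assign({}, t, {
      color: trafficLightColor(t.medianDownloadMbps),
    }));
}

// Example with made-up tract data:
const summarized = summarizeTracts([
  { tract: '53033000100', sampleSize: 12, medianDownloadMbps: 42.0 },
  { tract: '53033000200', sampleSize: 3, medianDownloadMbps: 5.1 }, // suppressed
]);
```

The >= 5 filter implements "sample size must be > 4": any tract falling below it is dropped before the choropleth is drawn.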

5 Issues

In general, according to the issue tracker, things are moving along nicely: 25 open issues, 15 closed.

Here is a summary of issues that should currently be addressed, broken down by who should probably be addressing them. (Sections sorted alphabetically; within each section, highest priority issues first.)

5.1 Brett Miller

  • No issues identified.

I am not sure what Brett is up to. I have heard by email that he is doing something involving M-Lab's BigQuery account, but I have no more information than that. His GitHub repo, seattle-internet-analysis, shows very low activity, but a lone coder does not need to sync with others, so that may not signify anything. It would be great to get his work folded into the main repo, but that has not started.

5.2 City of Seattle

5.3 John Tigue

5.4 M-Labs

5.4.1 bq2geojson

  • A refactor into 3 modules would be most useful (as described in the architecture overview document)
    • Main module: provides programmatic use, rather than only a command-line interface
    • Enquirer module: gets CSV files from somewhere
    • MapMaper module: converts CSVs to TopoJSON aggregate reports
  • More command line options (e.g. a way to set var dirs, polygonType, polygonFiles, etc) will be needed for production
    • without polygonFiles, cannot change which map to aggregate by
    • All of the top-level var properties should be rolled into the main config object, which gets passed into the main module.
    • Capacity to request, e.g., 2014-04 to 2015-04, i.e. to cross calendar-year boundaries
  • README could use an update (e.g. no mention of TopoJSON and how to set that)
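
As a sketch of the config-object idea above, here is one hypothetical shape for the main module's entry point. Every property name and default here is an illustrative assumption drawn from the bullets above, not bq2geojson's current interface:

```javascript
// Hypothetical config object for a programmatic bq2geojson entry point.
// Property names and defaults are assumptions, not the module's API.
const DEFAULTS = {
  csvDir: './csv',              // where the Enquirer step writes CSVs
  topojsonDir: './topojson',    // where the MapMaper step writes output
  polygonType: 'census_tracts', // which boundaries to aggregate by
  polygonFiles: [],             // boundary files; needed to change maps
  startMonth: '2014-04',        // range may cross a calendar-year boundary
  endMonth: '2015-04',
};

// Merge caller options over defaults; the real main module would then
// pass this single object through to the Enquirer and MapMaper steps.
function makeConfig(options) {
  return Object.assign({}, DEFAULTS, options || {});
}

const config = makeConfig({ polygonType: 'council_districts' });
```

With this shape, the command-line wrapper becomes a thin layer that parses flags into the same config object the programmatic callers use.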

6 Chris Ritzo's proposal of 2015-05-15

(The following points are extracted from an email from Chris Ritzo.)

Here's my sense of what you need for an interim solution:

  • M-Lab Seattle data on a map, binned by council districts
  • Clear instructions on updating data as time goes on
  • Map can be placed in an iframe on your website
  • Seattle branded M-Lab NDT test

This will get the city an interim solution that lets them release a first version of the map and test as a public engagement tool. We would then work with the city and with local contributors to build additional features.

The additional features I think that you want are:

  • Seattle version of the NDT test uses HTML5 geo-location to store more accurate geo and test results in a non-mlab database.
  • Seattle map uses either/both M-Lab data and locally stored data
  • Map allows binning by census track/block
  • Provide some way of separating results by ISP

6.1 John Tigue's response

The above has three parts. In the following I address each separately.

6.1.1 First list: "an interim solution"

The items in the first list (let's call it "an interim solution") are all done.

6.1.2 Middle block: "good enough for 1.0"

"This will get the city an interim solution that lets them release a first version of the map and test as a public engagement tool. We would then work with the city and with local contributors to build additional features."

If there is some schedule crisis requiring us to get something out now, well, OK I guess. But there's really not much there. The state of the code is not embarrassing, but it is not particularly engaging. Everything about it is inferior to existing offerings, except maybe the map, which could use some more refinement and features.

If this were my product, I would not waste the single opportunity for a first impression, especially tied to a city announcement. The code has hardly been tested, and even if it does work, I would expect the response to be, "So what? Maybe someday something will come of it. For now: moving on. Next." If the idea is to "test as a public engagement tool" I would not expect much back.

Please do not take this the wrong way: I think this is cool tech which can affect the real world (which is why I am working on it for free; I definitely drank the Kool-Aid). What is going on under the hood is pretty damn spiffy. I just do not expect a naive user to be super impressed with what we have currently.

If we had time, it would be nice to quietly roll out to friendly beta testers. At last week's Citizens Tech Advisory Board meeting, I engaged the people involved with Upgrade Seattle (a.k.a. Equitable Broadband), and they liked the idea of being beta testers. This would give us real users, who are friendlies. We really should test before going live to the broad public. Even small usability testing would be nice. And basic browser/platform combinations should be checked, etc.

6.1.3 Second list: "next round after 1.0"

"The additional features I think that you want are…" Here are some comments point-by-point:

  • [ ] "Seattle version of the NDT test uses HTML5 geo-location to store more accurate geo and test results in a non-mlab database."
    • Getting geo-location info is literally a one-liner. The issue is where that info goes. The idea of Seattle deploying an NDT test server is interesting; it could store the HTML5 geo info.
  • [ ] "Seattle map uses either/both M-Lab data and locally stored data"
    • This too gets to the heart of the data architecture, and I am working on a doc that explores the options. I will add Chris's idea of a City of Seattle local NDT server.
  • [ ] "Map allows binning by census track/block"
    • This is almost done, and I would like to get it into version 1.0, as it would close 2 CityReqs, i.e. we would be past half done on the requirements.
  • [ ] "Provide some way of separating results by ISP"
    • As documented in the arch overview, section "Lookup service provider based on client IP", identifying the ISP/provider is literally a one-liner in node:
      • var org = maxmind.getIsp('');
    • The issue is what to do with that info: where is it stored? Then there have to be charts, as per CityReq3.3
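
To make the two storage questions above concrete, here is a hypothetical sketch of the record a local (non-M-Lab) data store might keep per test run. The field names and example values are assumptions, not a settled schema; in practice the coordinates would come from the browser's HTML5 geolocation API and the ISP name from a server-side maxmind lookup like the one-liner quoted above:

```javascript
// Hypothetical record shape for a local SEANetMap data store.
// Field names are illustrative assumptions, not a settled schema.
function buildTestRecord(ndtResult, coords, ispName) {
  return {
    timestamp: ndtResult.timestamp,       // when the NDT test ran
    downloadMbps: ndtResult.downloadMbps, // measured throughput
    uploadMbps: ndtResult.uploadMbps,
    lat: coords.latitude,                 // from HTML5 geolocation (browser side)
    lon: coords.longitude,
    isp: ispName,                         // e.g. from a maxmind lookup (server side)
  };
}

// Example with made-up values:
const record = buildTestRecord(
  { timestamp: '2015-05-18T11:43:00Z', downloadMbps: 31.2, uploadMbps: 5.6 },
  { latitude: 47.6062, longitude: -122.3321 }, // downtown Seattle
  'Example ISP'
);
```

A record like this would feed both the census-tract binning and the per-ISP charts of CityReq3.3, which is why settling the data-store plan unblocks several requirements at once.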

My point is that, ignoring the US Census map, the items in this section all circle around the fact that SEANetMap needs a data-store architectural plan. I have multiple options thought out and will release a document within the next 48 hours.

Author: John Tigue

Created: 2015-05-18 Mon 11:43

Emacs 24.2.1 (Org mode 8.2.10)