Loomio
Thu 28 Sep 2017 3:06AM

Does the dashboard measure the right things?

Paul Stone · Public · Seen by 403

For open data release to be of quality and sustainable, maturity in data management and governance is important. That is why the Open Data Maturity Model was used as a basis for surveying agencies. Was this a good idea?

Shaun McGirr Sun 1 Oct 2017 10:19PM

I think using an existing framework is a great idea, and it measures the right things, but I do not think it measures them at the right level. Government-as-a-whole is not an actor that makes decisions or implements policies: this is done by individual ministries/departments/agencies according to their own priorities. It would be useful to see differences in maturity across agencies, to know who is leading the charge across the various maturity areas.

Jocelyn Morrison Tue 3 Oct 2017 10:27PM

Thanks for your feedback Shaun. We have been thinking about whether the dashboard should show results for government as a whole, or at an agency level. I like your idea of showing those who are leading the charge, and we could show that in relation to the range of responses (without naming all agencies). We were concerned that showing results for those with lower levels of maturity may lead to them being misconstrued as poor performers rather than being at a particular maturity level. This is because the dashboard doesn't reflect the challenges agencies may face, or the steps they are taking to improve their maturity. Is this a valid concern?

Stuart Yeates Tue 3 Oct 2017 8:11AM

I agree that reusing a question set from elsewhere has great advantages, but I think that adding a few local questions reflecting local priorities could be useful, including by highlighting some of the ways that open data can help meet those priorities.

For example, there's "Uses open technical standards, code lists, and identifiers", but it would also be useful to have "Uses national, Australasian or international standards, code lists, and identifiers."

I also suspect that there needs to be a treaty mention in here somewhere.

Jocelyn Morrison Tue 3 Oct 2017 10:32PM

Providing a local context to the questions is definitely something we want to do, so thanks for your suggestions Stuart. The survey questions and dashboard are very much prototypes, so any ideas for things to include are most welcome!

Cam Findlay Wed 4 Oct 2017 8:28PM

It would be good to test the dashboard design for accessibility: are the colours and icons in use enough for people with colour blindness to interpret?

On the spider diagram: I'm a fan of these for comparing two points in time across a number of entities. Sometimes you can find patterns you can build a response to, e.g. agencies with low licensing and low skills-and-knowledge scores but high data management might need some sort of data or open data 101 e-learning or training workshops, to see if those areas can be lifted.
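That kind of pattern-spotting could be sketched as a simple rule over per-agency focus-area scores. The agency names, focus areas, and thresholds below are all hypothetical, purely to illustrate the idea:

```python
# Hypothetical maturity scores (0-5) per agency, per focus area.
scores = {
    "Agency A": {"Licensing": 1, "Skills and knowledge": 1, "Data management": 4},
    "Agency B": {"Licensing": 4, "Skills and knowledge": 3, "Data management": 4},
    "Agency C": {"Licensing": 2, "Skills and knowledge": 1, "Data management": 5},
}

def training_candidates(scores, low=2, high=4):
    """Flag agencies scoring low on licensing and skills but high on
    data management -- the 'open data 101 training' pattern above."""
    return [
        agency
        for agency, s in scores.items()
        if s["Licensing"] <= low
        and s["Skills and knowledge"] <= low
        and s["Data management"] >= high
    ]

print(training_candidates(scores))  # -> ['Agency A', 'Agency C']
```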

Jocelyn Morrison Wed 4 Oct 2017 10:49PM

Appreciate your feedback Cam. We're planning to use all the feedback and suggestions to redesign the dashboard and create a web-based version. Checking accessibility and using web-safe colours is definitely part of the build.

We'll also be using the dashboard to show changes over time, so suggestions on how that might be done are gratefully received!

Cam Findlay Wed 4 Oct 2017 11:42PM

In that case it might be good for the main dashboard to show a total count of the "Yes" and "No" responses on each item. That way you can show movement over time.

At the moment it says "Mostly Yes", but how much is "Mostly"? 80%? 50%?
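Replacing a fuzzy "Mostly Yes" with exact counts could look something like this; the responses and the 80%/50% cut-offs are illustrative only, not the survey's actual rules:

```python
from collections import Counter

def summarise(responses):
    """Tally Yes/No answers for one survey item and derive a label,
    keeping the raw counts so movement over time stays visible."""
    counts = Counter(responses)
    yes, no = counts.get("Yes", 0), counts.get("No", 0)
    pct_yes = 100 * yes / (yes + no)
    if pct_yes >= 80:
        label = "Mostly Yes"
    elif pct_yes >= 50:
        label = "Mixed"
    else:
        label = "Mostly No"
    return {"yes": yes, "no": no, "pct_yes": round(pct_yes), "label": label}

item_2017 = ["Yes"] * 9 + ["No"]
print(summarise(item_2017))  # -> {'yes': 9, 'no': 1, 'pct_yes': 90, 'label': 'Mostly Yes'}
```

Showing the counts alongside the label answers the "how much is Mostly?" question directly, and two snapshots of the same counts give the movement over time.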

Scott Miller Sat 21 Oct 2017 3:35PM

Hello Paul et al.,
I think the dashboard is an excellent first iteration of an all-of-government (AOG) snapshot. However, I struggle to see how this data will be used with much purpose.

That is:

  • How does one (whether that is Stats NZ, a CSO or someone else) take this work forward?
  • Like the current OGP NAPs, where is the expectation of stretch between existing and future practice (across all three columns of activity)?
  • As a member of the public, what confidence is there that a governance process is being applied to this data?
  • Can some simple stats be provided, like how many central/local/CRI agencies were invited to complete this survey, and how many did complete it?
  • What is the split between central/local/CRI? Are there distinct differences between the three agency types?
  • As a CSO, how can I create change as a result of this analysis? It feels like there is too much space between 'what is' and 'what is possible' next.
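The simple participation stats asked for above could be reported per sector; the invitation and completion numbers below are made up purely for illustration:

```python
# Hypothetical invitation/completion counts per agency type.
survey = {
    "Central": {"invited": 30, "completed": 24},
    "Local": {"invited": 15, "completed": 9},
    "CRI": {"invited": 7, "completed": 5},
}

def response_rates(survey):
    """Completion rate (as a whole percentage) per sector, plus overall."""
    rates = {
        sector: round(100 * s["completed"] / s["invited"])
        for sector, s in survey.items()
    }
    invited = sum(s["invited"] for s in survey.values())
    completed = sum(s["completed"] for s in survey.values())
    rates["Overall"] = round(100 * completed / invited)
    return rates

print(response_rates(survey))
# -> {'Central': 80, 'Local': 60, 'CRI': 71, 'Overall': 73}
```

Publishing these alongside the dashboard would answer both the "how many were invited/completed" and the "split between central/local/CRI" questions at once.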

Some of the great elements:
* The use of focus area colour, icons, and layout are stunning
* The spider chart does an excellent job of referencing the relative positions of the six focus areas
* Like another commenter, it would be good to track progress (6 monthly perhaps?)