Lessons in Tools Development

I should say that one of the most challenging tasks I experienced in this research was developing the tools.  I think the challenge comes from the fact that we are dealing with a new phenomenon (i.e. open data) in a sensitive context (i.e. local government).  Also, because the approach hinges on the qualitative side, precision in the wording of the questions, and the manner in which the questions are asked, are key considerations.

We started tools development as early as February 2013, when we discussed the research’s key questions.  We analysed the sets of questions and the possible sources of information.  A question can be broken down into different information needs, and therefore requires different sets of information sources.  The first set of tools was drafted in April but was only finalized a month later, prior to the scheduled test run.  The test run offered opportunities for revision, and the review afterwards gave us even more insights for developing the tool further, to the point that I told the team we needed to put the tool to rest.

From the experience, I can identify at least three lessons learned. They are not novel, but I still find it worthwhile to share them here.

1. Clarity of the questions is important.  Each question should pass the self-test: if you find it difficult to answer yourself, then expect the same when you run it with your real respondents.

2.  It is important to discuss the tools with others.  Co-development, which I define here as developing the tools with your co-researchers, is critical, as it makes the tools more robust: they benefit from the honest and straightforward comments of your peers.

3.  A test run is critical, and the test site should be chosen so that it has the potential to represent the actual case sites.  In our case, because we are studying provinces, we tested the tools in a province and ran them the way we expected them to be run in our case study sites.  The test run provided us an opportunity to test our assumptions, 40% of which were wrong. :)


Tools development is critical, because the answers one gets in a research project depend entirely on the questions asked.  I do not regret that our tools development exercise took us almost 4 months.  While I am not saying that the current form of our tools is perfect, what I can say is that our tools, as they look now, are a hundred times better than the first draft we had.