Policy and technology: a "longue durée" view

Random thoughts on policy for technology and on technology for policy

July 2008

Help needed: looking for best practices in Europe for government expense disclosure

Jennifer Bell of VisibleGovernment.ca (Canada) asks, in a comment to the previous post, about best practices in government expense disclosure (e.g. travel expenses of members of parliament, down to the level of scanned receipts!).
I can’t recall anything offhand, but I suppose the UK has some projects on this.
Please let her (and me!) know about any examples, ideally in a comment here.
Thanks

is egov2.0 just for mature democracies?

I have been invited to give a speech in South Africa. They asked for my tutorial from the Lisbon conference, but I will make some changes. In particular, I have written several times about how the new vision of egov2.0 is very much suited to the Anglo-Saxon context (UK, US, New Zealand…). A former colleague at the EC went so far as to say that it was ONLY relevant there.
My perception is that web2.0 is very relevant for developing countries as well. In particular, the key impact I see is on fighting corruption. We have seen examples in Romania, Bulgaria and Hungary of web2.0 tools being used to encourage citizens to report corruption by public officials.
What other impacts can web2.0 have in young democracies and/or developing countries?
Probably the people at the World Bank PSD have interesting views on that.

A first try on context aware applications

One of my main concerns about current applications for collaborative governance and participation is their high cost of engagement. They are still designed for people who are already deeply interested. In a way, we can compare these applications to the classic game platforms, such as the Xbox and PlayStation, which are designed for hard-core gamers, while newer platforms such as the Wii and iPhone games aim to involve the casual player. We need the Wii of participation!

To be fair, web2.0 applications have already made huge steps in making e-participation tools easier and more usable, but work still needs to be done to engage “the second wave of users” (as Lee Bryant put it in a recent workshop at IPTS).
Therefore, research is needed to make participation easier, less costly, more interesting and more relevant.

CONTEXT AWARENESS could significantly reduce the costs of participation and facilitate it. We should not expect people to suddenly become participative citizens because participation is a good in itself, but because it is practically useful in each specific situation (the long tail of participation). In terms of the previously mentioned “double dividend”, this could both dramatically increase the rate of participation and provide interesting technological developments, as context awareness is very much an ICT innovation field at this stage (for some, it is one of the features of web3.0).

Context can have very different determining variables. Depending on my current situation (where I am, what I am doing, what I am reading etc.), I should be offered information on specific decisions being taken that could affect me.
1. For example, LOCATION-BASED participation services could be provided, so that when entering a public space (a park, a library…) I am informed of the changes that have been proposed there (e.g. a proposal to reduce the number of art books in the library). I could give my view on the spot, both qualitative (suggestions) and quantitative (voting). I could also see what other users have suggested and commented about the service I am using.
2. Another way to ensure context awareness is through TASTE-SHARING tools. Just as Amazon builds on users’ data to suggest new books (“customers who bought this also bought…”), e-participation tools such as Debategraph could suggest policy issues: “citizens who engaged in this discussion were also interested in…”. This makes it possible to leverage users’ intelligence to ensure better relevance (and lower costs) of participation. Technologically, this requires algorithms that build patterns out of users’ behaviour.
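As a sketch of the kind of pattern-building algorithm this would need, here is a minimal item-based co-engagement recommender. All names and data are invented for illustration; a real system would work on actual participation logs:

```python
from collections import defaultdict

# Hypothetical record of which citizens engaged in which policy discussions.
engagements = {
    "alice": {"park-renovation", "library-budget", "bike-lanes"},
    "bob":   {"park-renovation", "bike-lanes"},
    "carol": {"library-budget", "school-meals"},
    "dave":  {"bike-lanes", "school-meals"},
}

def co_engagement_counts(engagements):
    """Count how often each pair of discussions shares a participant."""
    counts = defaultdict(lambda: defaultdict(int))
    for topics in engagements.values():
        for a in topics:
            for b in topics:
                if a != b:
                    counts[a][b] += 1
    return counts

def suggest(topic, engagements, n=3):
    """'Citizens who engaged in this discussion were also interested in...'"""
    counts = co_engagement_counts(engagements)
    ranked = sorted(counts[topic].items(), key=lambda kv: -kv[1])
    return [t for t, _ in ranked[:n]]

print(suggest("park-renovation", engagements))
# → ['bike-lanes', 'library-budget']
```

Amazon-style recommenders add normalization and scaling tricks on top, but co-occurrence counting like this is the core idea.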

Certainly, these tools require better-structured data to ensure interoperability, in one way or another, so that information such as tastes, choices and location is made available across platforms. A specific pet idea of mine is to develop microformats for ratings of public services, so that citizens’ ratings are made available across platforms.
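To make the microformat idea concrete: hReview is an existing microformat for reviews, and a public-service rating could reuse its class names so that any hReview-aware aggregator can pick it up across platforms. A minimal sketch (the function and the service data are invented):

```python
def rating_to_hreview(service, reviewer, rating, comment):
    """Render a citizen's public-service rating as hReview-style HTML.

    The class names follow the hReview microformat; the data is illustrative.
    """
    return (
        '<div class="hreview">\n'
        f'  <span class="item"><span class="fn">{service}</span></span>\n'
        f'  rated <span class="rating">{rating}</span>/5\n'
        f'  by <span class="reviewer">{reviewer}</span>:\n'
        f'  <p class="description">{comment}</p>\n'
        '</div>'
    )

print(rating_to_hreview("City Library", "a.citizen", 4,
                        "Good opening hours, but too few art books."))
```

Because the class names are shared, the rating travels with the page itself rather than staying locked inside one platform’s database.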

Does this make any sense to you?

Technorati tag: egov2research

starting the conversation on future egov2.0 applications: the state of the art

The way I approach the problem of envisaging innovative egov2.0 applications is to look for incremental innovation.
a) I start by looking at the best projects I know of that leverage individuals’ collaboration for public goals.
b) Then I try to identify the “next steps” that could make these services better.
c) Often, it turns out these are just minor improvements which do not need research, but “just” code. Therefore, I filter the results on the basis of their innovativeness.

This is not a scientific method; it’s a made-up way to give some structure to the process. I am sure it can be improved.
But most of all, I know nobody can do this alone. It needs an open discussion. We need to involve technology experts from different fields together with public policy experts, and engage in OPEN discussions. Each of us has their own view and, because of our different fields of expertise, we are often unable to talk to each other and advance the debate. That’s the reason for this debate, which hopefully you can follow not only on this blog but on meta-aggregators such as the Technorati tag feed for egov2research (which is, of course, still empty at the time of writing).

So, here are some of the best projects I know for engaging the wider public in public governance:
• Collaboration in patent review: Peer-to-Patent, not only because it uses input from individual citizens, but most importantly because of its collaborative filtering tools
• Debategraph, which enables the tree-shaped structuring of a large-scale conversation and links each claim to its supporting evidence
• Tools for enlarging the policy debate to a wider public: the OFCOM discussion on regulation of the BBC; the discussion tool used for the GNU Free Documentation License
• The BBC white spectrum tool, used to visualize comments in a sensitive debate held on the blog
• Farmsubsidy, which now geographically locates each recipient of agricultural subsidies (starting with Sweden)
• Gapminder, and similar tools to play with public data and make it more meaningful
• All of mySociety’s projects, though at the moment I am particularly interested in FixMyStreet and PlanningAlerts

Overall, the steady stream of newly launched applications clearly shows the huge innovation potential, and the opportunity to invest in this field.

Investing in these tools therefore has a high potential impact in two directions (the so-called “double dividend”):
– on the quality of debate and the level of citizens’ engagement;
– on the innovativeness of the ICT applications developed: for example, the Gapminder software was built to analyze UN demographic data, and was then bought by Google to improve its Google Spreadsheets product.

In the next posts, I will try to imagine what these applications could look like in five years’ time if research is carried out.

Technorati tag: egov2research

Let’s continue walking the talk: calling for open collaboration to imagine innovative applications for egov2.0

As I blogged some time ago, I have been involved in the EC consultation process on the future work programme for ICT for governance and policy-making. You can find my presentation in a previous post.
I also have concerns that today’s innovation model challenges the traditional approach to ICT research policy, especially when it addresses applied research (such as eGov, eHealth, eBusiness…). The traditional approach is based on large-scale, multi-year projects, while the “2.0” innovation model calls instead for smaller, shorter projects, flexibly rearranged over their duration, or for a totally novel approach such as prize-based competitions – as pointed out in another post here.

But these are long-term structural changes to the way research policy is organized. For the moment, we have to accept the existing approach to research funding and deal with the CONTENT of the research.
A big effort has been made by the EC to put ICT in the public sector back on the research agenda; it is now time to spell out better what kind of next-step applications could be developed, in order to:
– make people outside the “egov2.0 circle” aware of why this is an important field of ICT innovation;
– reach a better understanding among ourselves about what the next steps could be;
– raise the level of the debate and get better projects and better value for money through EU funding.

I therefore make an OPEN CALL FOR COLLABORATION. Let’s start imagining what innovative solutions could be developed as the next steps of eGovernment in fields such as collaborative governance, policy simulation and visualization. For a fuller description of the scope, see here.

As this is in the context of research policy, we should not look at implementing existing best practices, but at developing the next steps of these solutions. For example, I very much like the approach of Roc Fages, but it is done using available solutions, while here we should try to imagine the next steps.

I will start doing this on this blog. I hope you will join me in this effort, either by:
– co-authoring on this blog;
– commenting on the posts;
– writing about this elsewhere and using the Technorati tag EGOV2RESEARCH.

This collaboration is ongoing, with no precise deadline. However, for our effort to be heard, most of the work should be done BY NOVEMBER 2008.

Disclaimer: this is not an official EC effort; it’s voluntary and informal. However, I am sure that if we come up with good content, it will be listened to.

Technorati tags: egov2research

Review of EU research policy in the field of ICT: my view

I am reading the High-level expert review of the EU research programme in the field of ICT.
Very interesting. There are many comments to make.
1) It is an expert report (the IPSE DIXIT approach), so there is little need to present evidence, and little evidence is presented. Indeed, there is a nice effort to precisely and punctually link the conclusions to the evidence in the annex, but if you actually read the annex, the evidence is not very conclusive.
2) My impression is that there is better evidence for the negative comments than for the positive ones. For example, the positive impact on SMEs is based on interviewees’ personal opinions, while there is plenty of objective evidence that the programme failed to involve the most innovative SMEs (only 22% received funding).
3) The main problem is the method. Surveying participants is not, IMHO, a good way to obtain solid evidence, especially as the participants have an interest in these funding instruments being enhanced…
4) Some worrying evidence is hidden in the annex: for example, the patenting ratio of funded projects is lower than that of the ICT industry in general.
5) The key recommendations are not particularly original: venture capital, public procurement, public-private partnerships, less red tape. I think that if you looked back ten years, the recommendations would have been the same. In particular, I worked on venture capital policies 10 years ago, and the evidence pointed to a lack of good proposals to finance in Europe rather than to a lack of VC supply. When Loïc Le Meur moved to Silicon Valley, the key reason was the quality of human resources there, not VC. However, it is interesting that yet another report points to the importance of framework conditions rather than specific policy measures.
6) Some very interesting recommendations point to the need for greater flexibility in funding research. I totally agree with that, as recent web2.0-like developments show that innovations are developed through iterative trial and error, rather than through large-scale, long-term projects. I have a suggestion: we have seen how PRIZES have become a popular way to finance innovation, in both the private sector (e.g. Google Android applications) and the public sector (e.g. the DARPA all-terrain robotic vehicle contest). See also this great article by Kelman. This is consistent with the report’s comments on “financing projects based on actual performance rather than promises and reputation”.

concrete policy measures to open public data: the example of the UK Data unlocking service

This is a very interesting action to open government data (found through the great PoI task force blog). An internal government office acts to make information held by other offices open.
“How it works:

  1. You describe the public sector information asset you want unlocked for re-use, and post a request to
    the service. We’ll check through your request and if it’s OK (e.g. not a request under access legislation) we will add it here. You need to know that the information is potentially accessible and that no Freedom of Information exemption applies.
  2. Others can see your request and support it, either by adding a comment or by voting. The more support a request has, the better the chances of unlocking the information you want to re-use.
  3. We’ll contact the public sector information holder and see what can be done to unlock the information for re-use. To keep things simple, if the problem relates to an issue specifically covered by the Re-use of Public Sector Information Regulations or the Information Fair Trader Scheme, we’ll treat it accordingly – so you won’t need to make a separate complaint.”
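The three steps quoted above could be modeled minimally as follows. This is only a sketch of the workflow; the class and field names are my own invention, not the service’s:

```python
from dataclasses import dataclass, field

@dataclass
class UnlockRequest:
    """A request to open a public-sector information asset for re-use."""
    asset: str                # step 1: describe the asset to be unlocked
    requester: str
    votes: int = 0            # step 2: others support the request by voting
    comments: list = field(default_factory=list)
    status: str = "open"      # step 3: updated once the holder is contacted

    def support(self, comment=None):
        """Add a vote, optionally with a supporting comment."""
        self.votes += 1
        if comment is not None:
            self.comments.append(comment)

    def resolve(self, outcome):
        """Record the outcome after the information holder has replied."""
        self.status = outcome

req = UnlockRequest(asset="agency statistics", requester="a.citizen")
req.support("Needed for a re-use project")
req.support()
print(req.votes, req.status)  # → 2 open
```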

See here for more details. Not only a great initiative – but explained in plain English.
Of course, that is happening in the UK.
