
Policy and technology: a "longue durée" view

Random thoughts on policy for technology and on technology for policy

Month: October 2010

The impact of open data: first evidence

So, over the last year or so we have seen a trend towards opening up public data. Data portals and initiatives have flourished at international, national and regional level. My perception is that this is becoming one of those typical technology trends: the benefits are not clear, but an open data portal is now a must-have for any institution that wants to appear progressive. According to some tweets, even the EU is going to launch a data.eu portal next year. As my partner Cristiano Codagnone writes, it’s a typical case of institutional isomorphism.
Now, what evidence do we have about impact?
Well, the US data.gov portal is the only one publishing metrics. And they’re quite good (25K downloads per week). Actually, all data portals should publish these numbers if they’re serious about transparency.

And of course we have had plenty of competitions to build apps based on open data. But it’s not clear whether anyone is using those applications. According to my conversations with civic hackers, not many people are.

Yet we do have clear evidence about one impact: there has been no strong backlash effect. Even when sensitive data like public spending were published, as with the COINS database in the UK, no big scandal emerged. And most certainly, we have not seen an increase in demagogic or conflictual discussions about those data.

So while we don’t yet have evidence to prove the benefits of opening up data, I think we can say that the risks have proved not to be excessive. And excessive risk is one of the main arguments AGAINST opening up data put forward by less innovative public administrations.

Next big thing: attention, silence, intimacy enablers


The web brought us great things, but we still need to learn how to deal with partial attention and information overload. I certainly feel I could be more productive than I am.
We need to learn to work in flows, to follow our creative stream and attention. We need IT tools that help us with that. We need non-IT tools as well.
We are in the same situation we were in after the birth of fast food: we became overweight and less healthy. This is why slow food was born.
I recently came across several tools (IT and non-IT) that help you focus and reflect, that are elegant and minimalist, and that respect your intimacy. I believe this is part of a larger trend towards a reflection on how we can promote intimacy, silence and attention in a web2 world characterised by total openness, information overload and continuous partial attention.

  • Tools like Instapaper.com and Pinboard.in try to help you read when you have time, especially combined with a Kindle.
  • The Pomodoro Technique helps you stay focused.
  • And the lack of multitasking on the iPad (as well as its simplicity) is often perceived as an advantage rather than a drawback.

And finally, at the Hub Brussels today Norma brought a great thing: a small puppy, red or green, that you put on your desk to show whether or not you want to chat. It’s like the Skype icon – actually, the Skype icon could be a very powerful tool to use across devices. We decided to call it the hubatar – see image above.

While I love co-working spaces, I believe we need to think about and create IT tools, non-IT tools and behaviours that account for attention, intimacy and silence management. I would like to start a reflection on this, possibly at the CoWorking Europe event or at the Hub Brussels.

So what does this mean for technology? Well, it means that the next big thing after web2 will be about attention, silence and intimacy management tools that keep you in the flow. They will be on the computer, on the web, on mobile, on any IT device (a big application will be in cars, to avoid accidents). Crucially, they will also be in non-IT things, like design and furniture. And it will also be a method, a culture.

I am sure this is not new; somebody has thought of this already.

What agent-based modeling and gov20 share: human centric and network savvy tools for dealing with complexity

When working on the roadmap on ICT for governance and policy modelling, I started out considering these two domains as somewhat juxtaposed, the outcome of complicated committee negotiations rather than a shared, well-defined identity. They seemed, at first sight, simply two domains that shared the fact of “not being dealt with by other programme priorities”.
This is quite typical of EU research negotiation, where internal discussions sometimes obscure the genuine importance of a theme. As one interviewee once told me, we’re working on a solution in search of a problem.
Yet I now realize how much these two domains share, and how much the actual culture of the people working on these themes is similar. There is a kind of anthropological similarity between the two communities.
Basically, both agent-based modellers and gov20 people share a refusal of traditional IT and economic thinking. On IT: they are against “IT automation”, where IT substitutes for humans through online services, decision support systems and knowledge management tools. Instead, they share a view of IT as a human-centric tool, in the sense of Engelbart and Ciborra.
In economics, they are against general equilibrium theory, which assumes that all individuals are rational, that market mechanisms are the most efficient tool to reflect preferences, and that future system behaviour can be induced from the past in a linear way.
So, the reality can be described by a 2x2 matrix:

IT
  • Traditional / reductionist / modern: IT automating. The more users, the less performant the application is. User needs are predictable ex ante.
  • Non-traditional / complex / post-modern: IT augmenting. Network effects: applications get better the more people use them. User needs are dynamically and iteratively integrated into development.

Economics
  • Traditional / reductionist / modern: general equilibrium theories. Humans are considered rational and average. Linear models can predict the future.
  • Non-traditional / complex / post-modern: agent-based modelling. Humans are considered diverse and subject to social influence. Non-linear effects are generated from agent interaction and system dynamics.
Instead, both of these communities share two principles: a) that humans are diverse, non-reducible and not substitutable by machines, and b) that complex, non-linear results can be generated by very simple network interactions.
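To make principle (b) concrete, here is a minimal, purely illustrative sketch of my own (a toy example, not taken from the roadmap or from any specific agent-based modelling toolkit; all names and parameters are made up): a simple threshold contagion model on a random network, where every agent follows the same trivially simple local rule, yet the system-level outcome tips non-linearly between almost no adoption and near-universal adoption.

```python
# Toy agent-based model: threshold contagion on a random network.
# Each agent adopts a behaviour once a given share of its neighbours has adopted it.
import random

random.seed(1)

N, K, THRESHOLD, SEEDS = 200, 6, 0.3, 5  # illustrative parameters, chosen arbitrarily

# Build a random undirected network: each agent is linked to K randomly chosen others.
neighbours = {i: set() for i in range(N)}
for i in range(N):
    for j in random.sample(range(N), K):
        if j != i:
            neighbours[i].add(j)
            neighbours[j].add(i)

# Seed a few early adopters.
adopted = set(random.sample(range(N), SEEDS))

# Local rule: adopt once the share of adopting neighbours reaches THRESHOLD.
changed = True
while changed:
    changed = False
    for i in range(N):
        if i in adopted or not neighbours[i]:
            continue
        share = sum(1 for j in neighbours[i] if j in adopted) / len(neighbours[i])
        if share >= THRESHOLD:
            adopted.add(i)
            changed = True

print(f"{len(adopted)} of {N} agents adopted")
```

Playing with THRESHOLD or SEEDS makes the point: there is no linear relationship between these inputs and the final adoption level, even though every agent follows the same three-line rule.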
In my next post, I will dig deeper into how these two principles are adopted by the two communities.

UPDATE: another principle they share is the notion of “emergence”. Both gov20 and agent-based modelling focus on using ICT to identify and capture emergent behaviour ex post, rather than codifying it ex ante.

Final chapter of the draft research roadmap: it’s about the how, not about the why

I just wrote the final chapter of the CROSSROAD roadmap and would welcome comments. Here it is:

The research challenges presented here also call into question whether existing research instruments are adequate to address them. It is increasingly recognised that the determining variables for successful ICT innovation do not lie in the domains and areas to be funded (the what), but in the nature of the mechanisms in place (the how). No matter how well formulated and agreed the research challenges are, they will not be met if we don’t design the appropriate funding instruments to support the research.

In particular, the research challenges presented above have features which are not always fully compatible with the FP7 type of research. They are:

– user-driven and demand-driven: the tools are developed by people who are directly involved and interested. In the field of computational science, it is the scientists who develop the models and the algorithms; in collaborative governance, it is citizens and civil society organisations who develop innovative applications, often as open source.

– highly multidisciplinary, with particular involvement of non-technological disciplines. Psychology, political science, art and design are fundamental components of research in fields such as visual analytics and serious games.

– not clearly divided between research and innovation: the agile development of these applications benefits from new re-compositions of existing tools as well as from the development of new functionalities. Market release (in beta) is not the end of the research process but a part of it. Rigid borders between what is research and what is not are simply meaningless in this context and likely to be counterproductive in terms of marketable innovation.

– serendipitous innovation: research in fast-developing, complex and demand-driven applied fields cannot be planned linearly, three years in advance, but must be adjusted iteratively in order to respond to new needs, unforeseen technological opportunities and market developments. How can the three-year funding model of a typical FP7 project be compatible with the one-weekend development time of a typical barcamp? In such a context, open and more flexible funding models should therefore be applied not only to basic research (as in the case of the European Research Council and Future and Emerging Technologies), but also to applied research.

These features are not “by definition” incompatible with current research programmes, but they are “de facto” only very marginally present. One particular challenge is that most of the research challenges (such as agent-based modelling, participatory sensing and visual analytics) build on research currently being developed in the context of the FET (Future and Emerging Technologies) area, which is itself designed somewhat differently, in terms of funding instruments, from other FP7 ICT research themes. Will the FET research community be able to thrive under a different model?

Deliverable T4.4 will directly look into the possible research instruments to be used, taking into account alternative models for research funding such as FET, ERC and prizes.

Policy-making in a complex world: what ICT research should be funded? Have your say.

Within the CROSSROAD project, we’re drafting the research roadmap for governance and policy modelling. We just presented the first draft at a workshop within the ICT2010 event, and I am personally very happy with how it went. Here is our presentation.

But because it was aimed at the research community, it was cryptic, so I would like to clarify what we are doing, why and how. So, what is it all about?
First, the context: we’re setting the roadmap within the ICT research programme, so we focus largely on technological research. In a nutshell, the EU wants to know what ICT research it should fund in order to improve policy-making in a complex world. So the overarching question is: is there a need for ICT research on public governance? Or is it just a matter of adopting existing tools developed in other contexts? Traditionally, government is seen as an innovative market only in the areas of defense and security, but I believe there is space for research – obviously applied – in this field as well. Google Moderator, Google video search and Gapminder are three examples of state-of-the-art tools first applied in the public governance context.

It’s a story of challenges and opportunities, and how to grasp them. The challenge is to govern a complex world, where instability is the norm, challenges are global and systemic, citizens are vocal. Traditional general-equilibrium models and eDemocracy tools fall short. The best short reference I can give is this blog post by Irving Wladawsky-Berger.
The opportunities are there: new agent-based models and simulations are able to account for tipping points and for the diversity and non-rationality of human behaviour. Collaborative tools lower the costs of collaboration and engagement in policy-making, and citizens are increasingly taking an active role in addressing public issues. Interactive visualisation allows us to augment the human capacity for pattern recognition.

The vision we have is that three combined trends are radically changing the way policies are designed, implemented and evaluated. More data are becoming available, through open government data, citizen-generated data (participatory sensing) and sensors (the Internet of Things, or Web Squared). More people are able to analyse the data and collaborate, thanks to the democratisation of software and the lowering costs of collaboration. Better modelling and simulation tools are available, based on agent-based modelling, system dynamics and related non-linear approaches. The combination of these three trends could lead to a paradigm shift in policy-making.

Yet these opportunities are far from being realized, and research is needed to realize them. We identified four grand challenges to be addressed in order to take full advantage of these opportunities, each articulated in specific research challenges. The goal is to create a shared understanding of the key issues in order to orient future research. The titles below will bring you directly to the uservoice platform we are using.

Grand Challenge 1
How to assist policy makers in taking evidence-based decisions in our complex, unpredictable world? Existing econometric models are unable to account for human behaviour and unexpected events. New policy modelling and simulation tools are fragmented, single-purpose and work only at the micro level. There is a need for robust, intuitive, reusable collaborative modelling tools that can be integrated into daily decision-making processes.

Grand Challenge 2: Data-powered collaborative intelligence and action
How can we make sure that increased transparency translates into actually more open and more effective policy-making? Current tools require high involvement and attention, and therefore engage only the most committed people. They are designed to facilitate conversations rather than action. There is a need for more intuitive collaborative tools that can engage less interested people too, maximising the impact of short attention spans and low engagement, as well as for ICT-based feedback mechanisms that encourage real action and behavioural change.


Grand Challenge 3
How to provide high-impact services to citizens, businesses and administrations in a way that allows for co-design, public-private collaboration, citizen interaction and service co-generation; that allows for one-stop, one-second service delivery at very low cost and administrative burden; and that enables completely new services through mash-ups and interoperability by design.

Grand Challenge 4
How to make ICT-enabled governance a rigorous scientific domain, by providing formal methods and tools: the systematic classification of problems and solutions and their description through formal languages, in an effort to make the diagnosis of problems and the prescription of solutions a deterministic process that allows building on top of existing knowledge.

In the links above, you will see the specific research challenges we identified. You can comment on them, vote, and add other challenges. I am quite sure we have missed important points – for example, William Heath added personal-data-management tools. We need your knowledge on the specific research issues – we’re not the experts on all of these themes.


It’s your chance to tell the EC what research should be funded. Deadline: 15th October!

A final important point. Of course, it’s not only about what research to support, but how. My impression is that EU funding is currently not optimised to capture innovation in this field, which is very much bottom-up, emergent, serendipitous, design-driven and multidisciplinary. So the next discussion will be about the how.
