
Policy and technology: a "longue durée" view

Random thoughts on policy for technology and on technology for policy

Month: January 2014

How can #egov support business growth? Come to learn from the best – Brussels Feb 6th

In a little more than a week, on February 6th, we’re hosting a seminar in Brussels that I am particularly looking forward to.

The basic question is how e-government services can support business growth, going beyond the traditional view of “cutting red tape”. It is hardly necessary to stress how important it is today to promote growth in Europe.

Our goal (in the context of a contract for the EC) is to identify the key lessons learnt from existing practice and to understand which bottlenecks need to be addressed.

To do so, we invited some of the most inspiring initiatives in Europe:

  • Jaana Lappi (Ministerial Advisor) and Benita Troberg (Project manager) from the Ministry of Employment and Economy (FINLAND) – for the case Enterprise Finland
  • Mihkel Tikk from the Department of State (ESTONIA) – for the case of the portal eesti.ee
  • Helle Schade-Sørensen from the Danish Agency for Digitisation (DENMARK) – for the case Nem Handel
  • Lidio Viérin and Guido Piovano, representatives from the single contact point for the Region Valle d’Aosta (ITALY)
  • Giulio Aimeri (Forum P.A.) for an overview of Italian best practices
  • Sergio Jerez, Municipality of Barcelona, on the joint strategy of open data for business startups
  • Stefan Fittner, project manager of the Business Service Portal (Austria)

The seminar is designed for maximum interaction, with input expected from all participants, not only from the speakers. We will have post-it sessions and live voting, and together we will develop key policy recommendations for the EC.

Participation is free, on a first come, first served basis. Register HERE.


The determinants of science 2.0 adoption #openscience #futurescience20

During the discussion I had at the European Astronomical Society, I realised how much the implications of science 2.0 vary across disciplines.
This made me sketch out a set of factors that deeply shape how science 2.0 deployment plays out in a specific scientific field:

  • the involvement of industry in funding research: where industry involvement is strong, IPR regimes are stricter and there is less willingness to share. Moreover, in disciplines with little industry involvement, such as astronomy, scientists have fewer opportunities to get rich and are more likely to be motivated purely by curiosity and passion. Hence, science 2.0 can be expected to have a greater impact where industry involvement is smaller
  • the kind of data sources: if data are mainly collected through large observatories, as in the case of astronomy, it is often those observatories that decide on data sharing, and it is certainly easier to have highly structured, high-quality, curated data shared through common repositories. In other fields where data gathering is fragmented, there are fewer central repositories, and data sharing is more costly and difficult
  • the public appeal: astronomy fascinates everyone, which makes it easier to run citizen science initiatives such as GalaxyZoo.
  • big vs small science: it is certainly more common for publications in big science to be reproducible than for those in small science.
  • applied vs basic research: related to the previous point, I am not sure how this plays out, but it is possible that basic research is more curiosity-driven and therefore more inclined towards openness.

This is obviously just an initial list. What do you think?

Science 2.0 in astronomy: can we please have “telescope citability”? #futurescience20

Just out of a great meeting with the European Astronomical Society. Very exciting, lots of insight.

I can’t report on all the content, so here is just one idea: it is difficult but fundamental to use alternative metrics. For instance, there should be ways to recognise scientists who build great scientific instruments: a kind of “telescope citability”. Some scientists are great at building instruments but less so at writing papers, and they don’t gain the recognition they deserve.

So it’s not only about data citability, or sharing codes: can we have telescope citability?

Indicators for data reuse: it’s not how many, it’s who #opendata

I am in Rolle, Switzerland, on beautiful Lake Geneva, getting ready for a speech on Science 2.0 to the European Astronomical Society. As usual, travelling makes me read, and think. In this case, a great paper on the reuse of scientific data: If We Share Data, Will Anyone Use Them?

One of the topics I am interested in is the reuse of open data. In the domain of open government, one of the key actions of the current EU eGov action plan concerns indicators for PSI reuse. This is critical: after many years of fighting to have open government data, we now need to show that they are actually being used and reused. Just as with online public services, there is a sense of disappointment with the low rate of open data reuse, typically measured by the number of dataset downloads or the number of users downloading datasets. Somehow there was an expectation that citizens would rush to play with government data once they became available.

In my opinion, this is a mistaken expectation. Citizens, by and large, are not interested in government data, and certainly not in directly manipulating them. What matters is not how many people download them, but what the few people who care do with them. It does not matter if spending data are downloaded by few people: what matters is that among those few, someone is building great apps and services, used by millions, generating social and economic benefits.

Based on the literature on eGovernment, we somehow expect UPTAKE indicators to anticipate IMPACT indicators: if few users download, you expect the impact to be low, and vice versa. But in reality, the success stories of open data happen when “data meet people”, when the right people come across the right data. When it comes to innovation, uptake is not a proxy for impact. What matters is not how many, but who. The number of downloads and the number of users should not be taken as headline indicators of the impact of open government.

The same is true in science. Publishing scientific data will not lead to thousands of scientists replicating the findings of other scientists. But we know from the Reinhart-Rogoff case that one student reusing the data can be enough to achieve a huge impact, in this case uncovering the mistaken evidence behind the most important economic decisions of our time.

An Open Strategy, in any domain, should not aim to generate massive participation, but to enable and facilitate the work of the few who actually care about the data. That’s design for serendipity.

Findability of the data is key, and this is why metadata and standards are crucial to realising the benefits of open data: they facilitate the serendipitous encounter of the right people with the right data.

Three more observatories reviewed: IPP, EEO and DAE #policyobs

The website Daeimplementation.eu aims to monitor the progress of the Digital Agenda for Europe. A dashboard presents the progress of the implementation of each action by each Member State. The data are uploaded directly by the Member States, which have to provide yes/no answers and the evidence to back up their statements. Based on the percentage of yes/no answers, and on the relative timing of each action, different colours are assigned.

This is certainly an appealing way to present complex topics in a single policy dashboard. It is also interesting that the data are provided directly by stakeholders. However, this refers to a set of specific actions under a common strategic framework, rather than to different, uncoordinated policy issues.

The European Employment Observatory guarantees the “provision of information, comparative research and evaluation on employment policies and labour market trends in 33 countries.”

It allows users to search and browse, by theme, year and country, the pieces of content produced ad hoc by the Observatory.

The Innovation policy platform is a project by the World Bank and the OECD. It allows users to browse innovation policy themes and statistics on a country basis.

The search/browse facility allows users to identify ad hoc summaries and relevant existing reports through its content management system.

“EU startups don’t think in terms of platform”: what does it mean?


At our workshop on Innovation policy in Brussels last October, the keynote by Wim De Waele was illuminating and thought provoking, even more than usual.

One of the most stimulating observations was that EU startups lack “platform thinking”.

What does this mean? Here I try to spell out some ideas, still very rough and unstructured.

In my opinion, platform thinking basically means that you don’t try to maximize value in the short term by capturing 100% of the revenues/benefits/market share. Instead, you try to enable other businesses to create value out of your business, driving mutual benefits. You design your market strategy so that it makes the pie bigger, not just to capture a greater share of the pie. You give up some short-term gains in view of long-term ones. You avoid pursuing a purely vertical integration strategy.

Is this the meaning of “platform thinking”?

And is it true that EU startups don’t have this?

Observing another observatory: Go-SPIN by Unesco #policyobs

 

In the context of my analysis of online policy observatories, an interesting example is the Go-SPIN (Global Observatory of Science, Technology and Innovation Policy Instruments) project by UNESCO, which aims to provide the following information services:

  • STI policies;
  • operational STI policy instruments;
  • STI legal frameworks;
  • STI national systems: organizational charts and STI priorities;
  • a data analysis software tool managing more than 300 time series of indicators: economic, social, educational, industrial, scientific, technological, on innovation, infrastructure, ICTs, etc.;
  • a database listing organizations that provide technical and financial co-operation on STI issues;
  • a web semantic text mining multilingual tool with different applications for selecting STI strategic priorities;
  • a digital library with more than 900 UNESCO documents on STI policies.

I am particularly interested in the semantic text mining tool, which, however, is yet to be completed. Do you know of any other policy observatory using semantic technologies?

Assessing the first online policy observatory: hspm.org/ #policyobs

Here we are, starting the assessment of the online policy observatories. Remember, I look forward to your suggestions regarding the best policy observatories you know in this Quora discussion.


The Health System and Policy Observatory by the WHO is certainly very interesting. The design is modern and appealing.

On the left, you have a common template describing the key features of the health system, with an indication of the latest update.

In the central column, you have two boxes which display information in chronological order: one is the “reform log” (nice name) and the other a plainer “health policy update”. The difference between the two is not very clear.

On the right, you have background meta-information.

What is more interesting, though, is the country comparison tool. It allows you to pick up to 8 countries and 2 of the topics listed in the left column, and then creates a PDF document out of this information.

However, the document is not very readable, as it simply collates the information one country after the other. Choosing just 2 countries and 2 topics, I obtained a 49-page document.

In other words, it does not make comparison easy or intuitive. It also lacks any form of interaction with the content.

In summary, here is what I think:

Strengths: intuitive design; robust information; good division between structured and chronological information; easy tool for picking the comparison.

Weaknesses: no way to interact/comment; the country comparison tool simply collates the information without enabling an effective comparison; all the background work is highly human-intensive (basically traditional drafting of reports).

 

What are the best online policy observatories? #policyobs

I’m now starting a new project where I have to design a policy observatory.

In this phase, I want to collect inspirational examples. Can you help me to identify some of the best policy observatories?

By policy observatory, I mean a website where public policies on a specific topic are shared and presented.

For instance, I like:

– the Google Constitute Project, so simple and neat in its presentation of global constitutions;

– the ERAWatch website, very rich in content about innovation policies;

– our own monitoring tool for the implementation of the Digital Agenda for Europe and the XML-based linkedpolicies tool.

What are your favourite policy observatories, regardless of the domain?

If you answer on Twitter, use the hashtag #policyobs.

UPDATE: I added a question on Quora; you can answer there.
