Distributed architectures for decentralised data governance (i.e. engineering new network architectures fit for regulatory purposes)

Fabrizio Sestini, 25 November 2016

These days, most public debates on the future of the Internet focus on techno-political questions, such as:

  • How to prevent the concentration of power in the hands of a few data factories operating at global scale?
  • How to create a level playing field enabling new (European) entrants to implement innovative approaches based on the big data accumulated by the incumbent platforms?
  • How to reap the benefits of big data aggregation (e.g. in terms of commercial and public services, global science advancements, better statistics), within a clear legal framework, respecting privacy and ownership of data?
  • How to guarantee security of transactions and identity of users, while at the same time preserving privacy and ownership of data, which is stored in clouds located across different international technical and legal frameworks?

So far, these issues have been addressed exclusively from a regulatory and policy perspective, for instance within the Digital Single Market or the e-Privacy Directive, because the concentration of power on the Internet is often seen as an unavoidable consequence of network effects.

But this asymmetric concentration of data and power in the hands of a few global aggregators is in fact a consequence of the extremely centralised architectures of the dominant data platforms, especially at the level of data governance. Besides regulatory and policy measures, which are difficult to conceive and put into practice across territorial and legal borders, I wonder whether technological solutions enabling intrinsically decentralised data governance could break the "rules of the game" that have determined the success of the current data incumbents.

This approach would exploit distributed architectures to enable fully decentralised storage and management of data, which is feasible with existing technologies: for instance, P2P and distributed ledger technologies already enable fully decentralised certification and security of transactions (be they monetary exchanges or data exchanges). The main technological challenge is to generalise these architectures to clouds, social networks and the IoT in a standardised and robust manner.
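To make the ledger idea concrete, here is a minimal, purely illustrative sketch of the core primitive behind distributed-ledger certification: a hash-chained log of transactions, where each record commits to its predecessor so that any later tampering with history is detectable by anyone re-verifying the chain. The record structure and field names are assumptions for illustration, not any particular ledger's format.

```python
import hashlib
import json

GENESIS = "0" * 64  # conventional hash for the first entry's predecessor

def record(prev_hash, payload):
    """Create a ledger entry cryptographically linked to the previous one."""
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash,
            "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute every link; return False if any entry was altered."""
    prev = GENESIS
    for entry in chain:
        if entry["prev"] != prev:
            return False
        body = json.dumps({"prev": entry["prev"], "payload": entry["payload"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

# A tiny chain of two data exchanges between hypothetical parties.
first = record(GENESIS, {"from": "alice", "to": "bob", "data": "dataset-A"})
chain = [first, record(first["hash"], {"from": "bob", "to": "carol", "data": "dataset-B"})]
assert verify(chain)

chain[0]["payload"]["data"] = "tampered"  # any participant can now detect this
assert not verify(chain)
```

A real deployment would add digital signatures and a consensus mechanism on top of this chaining, but the tamper-evidence property shown here is what removes the need for a single central certifying authority.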

The vision is to enable the emergence of a decentralised, privacy-by-design open innovation ecosystem, where new entrants can directly query users for their open data through standardised agreements (on the model of CC licences), in order to quickly implement innovative data-fuelled commercial or social services.
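Such standardised agreements would need to be machine-readable, so a service can check automatically whether a given use of a user's data falls within the terms granted. The sketch below is a hypothetical illustration of that idea; the field names ("purposes", "allow_commercial", "expiry") are invented here by analogy with CC licence dimensions, not taken from any existing standard.

```python
from datetime import date

def may_access(licence, request):
    """Check a service's request against the terms a user has granted."""
    if request["purpose"] not in licence["purposes"]:
        return False  # use not among the purposes the user agreed to
    if request["commercial"] and not licence["allow_commercial"]:
        return False  # commercial use was not granted
    if date.fromisoformat(request["date"]) > date.fromisoformat(licence["expiry"]):
        return False  # the agreement has expired
    return True

# A user grants non-commercial research/statistics use until 2030.
licence = {"purposes": ["research", "statistics"],
           "allow_commercial": False,
           "expiry": "2030-01-01"}

assert may_access(licence, {"purpose": "research",
                            "commercial": False, "date": "2026-05-01"})
assert not may_access(licence, {"purpose": "advertising",
                                "commercial": True, "date": "2026-05-01"})
```

Because every service evaluates the same standardised terms, a new entrant can aggregate data from many consenting users without negotiating bilateral contracts, which is what levels the playing field against incumbents.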

This was the subject of a first exploratory call, ICT11/b, published in 2016, which received 15 proposals; from these we selected the project Decode, which starts on 1 December. Given the high level of interest, we are again proposing this concept as a possible topic for the first calls of NGI. A workshop was held in Barcelona on 17/11, and comments and opinions, through this blog or the questionnaire, are most welcome.