A Shared Digital Europe

Generative Interoperability
14 March 2022

Building public and civic spaces online

Interoperability is one of the original design principles underpinning the internet, and largely responsible for its scale and unique properties. In recent years, it has also been increasingly seen as a policy measure that can introduce greater market competition and user choice. Important and contentious interoperability proposals are included in key European digital regulations, including the Digital Markets Act and the Data Act.

In this context, we propose to treat interoperability not just as a competition measure, but also as a policy principle that supports the creation of new ecosystems, with a stronger role of public and civic actors.

This report builds on our earlier work on a ‘Shared Digital Europe’, a new frame for digital policymaking in the EU, designed to support a more equitable and democratic digital environment that functions beyond market logic.

It is based on an investigation of the role that interoperability can play in transforming the digital environment so that public and civic spaces become more prominent online. The research led to a new conceptualization of the principle: generative interoperability, a positive norm that forms the foundation of an open online ecosystem and is part of a larger approach to building more democratic digital spaces.

Download the report

If you want to give people a seat, you need a table first
9 February 2022

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.

Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights of these conversations, one at a time.

A seat at the table

Acatech is an influential group in the German tech-policy landscape but, according to Passoth, it is also often very research- and industry-driven. Thinking back to the writing process, he characterizes part of his role as acting as a mediator who brings NGOs and civil society organizations on board, ‘because they make the important points but are often not well-represented in the tech-policy discussions’.

Participation and public accountability are recurring themes in the conversation with Passoth. Interoperability, he argues, is often presented as a technical quick fix to make things more open and collaborative. ‘But this is not just going to happen through the tech stack’. In practice, there are power plays around making systems interoperable, and policy and digital industry players will have a strong voice in deciding what kind of interoperability we get.

In other words, it is not the shape of a technology that enables interoperability, or even democracy, but its governance. Therefore, Passoth argues, it is crucial to give organized civic actors as well as ‘silent voices’ a seat at the table, and we need governance forms strong enough to make sure these actors keep their seats.

‘That’s also where cooperatives can come in’. A cooperative is just one example of how one might structure democratic governance, and one that, critics might object, does not scale well or is even too centralized. Perhaps that’s true, Passoth adds, but the practical advantage of a cooperative is that there is an entity, a place where people can effectively be included. If you want to give people a seat, in other words, you need a table first.

Too much faith in the agora

But even a solid table and well-structured civic participation might not be enough, Passoth warns. In the 70s and 80s, Internet governance was fairly transparent, participatory and thereby democratic. Later, this changed. For Passoth, the lesson we can learn from these ‘implemented blueprints for participation’, like internet standards, is that they can be hijacked. This happens, for example, when big companies and their engineers are overrepresented in efforts to reform the internet, or when one single corporation introduces parallel standards (like Huawei’s proposal of an alternative to the Internet Protocol).

What is interoperability?
Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned. There are many types of interoperability, like indifferent interoperability (think, for instance, of a car manufacturer that doesn’t care about what chargers are plugged into its cars’ standard cigarette lighters) or cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is the opposite of cooperative interoperability, when a technology is downright hostile to others trying to connect, called adversarial interoperability.
There has been a lot of talk about interoperability in activist and policy circles, but not a clear view of what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road.

‘It seems to me that there’s too much faith in the agora’, Passoth claims, arguing that it is not enough to create open and participatory processes. ‘The consensus seems to be that if we create participatory formats, then we will have democratic regimes. But democracy works not only through the agora, but also through its institutions, and representation does not only mean offering a place at the table, but actually creating the need for participation as well, even if it’s a burden sometimes’. So instead of an ‘open agora’, we need specific institutional setups that enable participation and help to deal with power imbalances.

Hard work

The Data Governance Act, proposed by the European Commission in 2020 and provisionally approved by the European Parliament and Council, threw ‘data cooperatives’ into the mix of possible strategies for a fairer internet. An interesting, but barely developed, proposal.

But treating promising governance structures like cooperatives as organizational fixes often ignores another important aspect: their implementation and maintenance is complicated, hard work, and often ‘messy’. Passoth: ‘Just look at the world of coops that we already have, in health, in food delivery; these realities are not so romantic, right?’ He has a ‘heart for cooperatives’ and for the idea of using them to govern data sharing. But at the same time, he argues, we must learn from the history of cooperatives, from real-life cases, and realize how complicated cooperative organizing is.

Interoperability’s image as a technical or organizational ‘quick fix’ for a fairer internet directly contradicts this ‘messy’ reality. Take discussions about what kind of interoperability is needed: about whose standards end up in the APIs, about what types of interfaces are used, or about which data formats are chosen. In this sense, interoperability means hard practical work, especially when you make a case for participation or even cooperative governance.

Tick-the-box-exercise

Could interoperability lead to unfavorable outcomes, we ask Passoth. He shares a common fear that it might, especially when it comes to control over the components that make up the digital sphere. That’s why he ‘always’ stresses that interoperability as a tech-political goal is a very limiting frame. ‘Take messaging. A check-the-interoperability-box exercise would mean that, if you send a message from WhatsApp, you can receive it in Telegram. But who actually gets to monitor the transmission, who has the data on that, who can collect the metadata? This is not defined, sometimes not even discussed’.

Thus, merely creating technical interoperability is just one step of the way, says Passoth. It might not immediately lead to unfavorable outcomes, but it does leave control over technologies to the players currently running them. Models of interoperability that stop at a shared standard or data format don’t touch the business model of the big players, keeping the power structures intact or even reinforcing them. ‘The question is: can we do more?’

If we have put too much faith in the ‘agora’ and in participation, what are the alternatives? Passoth points to institutions. ‘A solution can be to have certain types of transparency obligations or accountability obligations for industry, not only in terms of data use but also in terms of what kinds of formats and standards are used. Public institutions or agencies acting in the public interest can be charged with this responsibility’. Such new institutions, supported by public and civic actors, are crucial for ensuring and maintaining the balance of power in interoperable ecosystems.

Aik van Eemeren: "Don't Call It Alternative, Call It Normal"
13 January 2022

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.

Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights of these conversations, one at a time.

What is interoperability?

Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned.

There are many types of interoperability, like indifferent interoperability (think, for instance, of a car manufacturer that doesn’t care about what chargers are plugged into its cars’ standard cigarette lighters) or cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is the opposite of cooperative interoperability, when a technology is downright hostile to others trying to connect, called adversarial interoperability.

There has been a lot of talk about interoperability in activist and policy circles, but not a clear view of what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road.

Previously in this series we published conversations with Nathan Schneider, Mai Ishikawa Sutton, Jaromil, Geert-Jan Bogaerts and natacha roussel.

This time we chat with Aik van Eemeren, Public Tech Lead at the Chief Technology Office (CTO) of the Amsterdam municipality. The CTO team focuses on the question: how could and does Amsterdam contribute to the development of public technologies for society? The work has a wide range: from digital inequality to universal internet access. Van Eemeren describes his role as gathering knowledge and creating the framework required for the digital transition of the Amsterdam region.

Critical questions in the digital realm have slowly but surely become politicized in Amsterdam. The impact of technology on the quality of city life is becoming increasingly clear. The policy agenda Digital City, co-written by Van Eemeren, outlines the city’s digital ambitions over a four-year period (2019-2022) and builds on three fundamental principles of a Free, Inclusive and Creative Digital City. In addition, it lists concrete policy actions, many of them research or experiment-oriented. We’re in the agenda’s final year, so what are the lessons so far?

From experiment to scale

‘We have done many interesting pilots’, says Van Eemeren, ‘and one lesson is that it’s hard to scale them. There’s just not a whole tech industry working on public technologies’. So how do we scale? Larger public institutions, or coalitions and networks of institutions, can play an important role. There is, for example, the Regionale Ontwikkelingsmaatschappijen (ROM), regional public investment banks that invest in innovative (digital) economies, and the Open Agile Smart City (OASC) network, which brings together cities to support their ‘digital transformation journey’.

Ideas for scaling public technologies are also conceived in Amsterdam itself. Van Eemeren shares the idea of a ‘bit book’: a collection of current digital transformation experiments, initiatives and projects, and a search for a frame in which they all fit. Why is this important? ‘Because then we can better explain why we invest in this domain, to ourselves but also to private or public sector organizations who are willing to provide support’.

Too many standards = no standard

A related issue concerns standards. If there is a standard, technologies can become interoperable. For example, because we have a standard Internet Protocol (IP), computers are able to send and receive information across the network of networks we now call the Internet. But when different groups invent different standards, the idea goes to waste. Currently, according to Van Eemeren, ‘everybody’s making their own standards’. He describes ‘a competition for the winning standard’, where organizations like the World Economic Forum, the United Nations and the IEEE Internet Initiative all develop standards in the hope of universal adoption, which rarely happens.

So one crucial challenge is that there are too many standards. Amsterdam set an interesting example when it drafted standard public purchasing terms for algorithms and made them available to local governments throughout the Netherlands. This proved scalable, as other cities committed to using and developing the standard further. There is a downside, though: it takes a lot of time before everyone is on board. Van Eemeren adds: ‘We don’t need a lot of policy; what we need is someone to give direction and say: “It’s going to be these two or three platforms and open standards, and from now on everybody’s using them”’.

Think big, start small

In Europe the belief is growing that the European Union can play such a role. And there are reasons to justify this belief: think of the work-in-progress Data Act, the General Data Protection Regulation (GDPR), guidelines governing AI, and the research and innovation programme Horizon Europe, which comprises many calls for proposals on public tech-related subjects. The Brussels-based Gaia-X, too, is a prime example of how selecting one standard and applying it within a broader community might help bring about truly transformative shifts.

However, different digital mindsets exist within Europe. Southern Europe, and especially Italy, is a few steps ahead in terms of thinking critically about technology, using open standards and applying them universally. Cultural differences play a role. In Germany, for instance, Van Eemeren shares, standards are applied to everything, ‘even down to the colour of pencils used by customs officers’. In the Netherlands, by contrast, standardisation is culturally much more complicated.

But we can make progress, he continues, as long as we don’t bite off more than we can chew. This means breaking down the digital work field into workable parts and standardizing one piece of technology at a time. The Foundation for Public Code, for example, focuses specifically on open source code and assisting public organizations with ‘codebase stewardship’, allowing codebases to mature and organizations to collaborate.

Policy advice from a policymaker

Though not quite fitting the traditional box, Van Eemeren is a civil servant working for a local government, so he knows a thing or two about policy. In getting to open, interoperable digital spaces, where should we focus, policy-wise?

First of all, he confides, it is a matter of language and attitude. We are used to calling open, interoperable technologies ‘alternatives’. ‘But if we want to make open normal, that’s how we have to think. If we continue to label them as “alternatives”, we put ourselves out of business’.

Second, we lobby the wrong way. Lobbying, Van Eemeren is convinced, is ineffective when one drafts a comprehensive vision document and proposes it to policymakers wholesale. It works better when lobbying responds to existing plans and policies, ‘reacting’ by inserting clauses and proposing additions and tweaks. ‘There is upcoming regulation on who gets to issue our digital identity. We have to lobby on this specifically and propose alterations’.

But overall, lobbying is not the answer. Van Eemeren: ‘Talking to European parliamentarians and lobbying them has marginal impact. I would say the power of local action is more effective: creating examples, telling that story and subsequently building standards together’. By ‘together’ he refers to translocal collaboration between cities.

The power of example

Van Eemeren, on behalf of Amsterdam, works together with cities across Europe and the world, for instance in the Cities Coalition for Digital Rights, a club of cities that works together ‘to protect and uphold digital rights’. ‘The coalition says: this is what we all think is important when it comes to digital rights and technology. Currently we’re digging a little deeper into the meaning of the manifesto in order to get to a political agenda’.

But to do this, he continues, we need examples. Examples of what ‘secure online services’ or ‘locally-controlled digital infrastructures’, phrases from the manifesto, might look like. And this, he adds, is a complex and difficult process. ‘Can you name one successful example of a data commons in the Netherlands? If no one starts building it, we as a city can’t support it.’

Jaromil: "Interoperability gives the potential of freedom"
7 December 2021

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.

Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights that these conversations produced, one at a time.
Previously featured in this series on interoperability and digital public-civic ecosystems were Nathan Schneider, Geert-Jan Bogaerts, Mai Ishikawa Sutton and natacha roussel.

Jaromil is a free and open source software developer. Jaromil is a developer pseudonym; his real name is Denis Roio. He is the founder and CTO of dyne.org, a foundation that produces software not for profit but for its social impact. The work and projects at dyne.org have interoperability as a strong feature on several levels, Jaromil tells us.

When asked about the importance of interoperability, he speaks from his own experience of collaborating with video artists in Amsterdam. Many of them, he says, were using Adobe’s Flash Player as a format, which came free of cost but also without the source code. A few years ago Adobe discontinued the Flash Player and, as a result, many artists became unable to show their work. ‘In these situations the work should have been portable between different formats. Interoperability is important because it is about building bridges between systems that can be open.’ Software developers, he says, perhaps cannot offer solutions that set people’s creativity free from ‘closed’ technologies like the Flash Player, but having at least an interoperable layer gives ‘the potential of freedom’, as people can decide to take their projects and move somewhere else.

What is interoperability?
Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned. There are many types of interoperability, like indifferent interoperability (think, for instance, of a car manufacturer that doesn’t care about what chargers are plugged into its cars’ standard cigarette lighters) or cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is the opposite of cooperative interoperability, when a technology is downright hostile to others trying to connect, called adversarial interoperability.
There has been a lot of talk about interoperability in activist and policy circles, but not a clear view of what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road.

Do one thing and do it well

The best example of interoperable thinking and practice, according to Jaromil, is the UNIX philosophy and the operating systems it inspired. Key to this philosophy are three guidelines: do one thing and do it well; build on existing software; and don’t try to create big, complicated systems, but make simple, well-designed programmes work together. In other words, create instances of interoperability by linking different technologies together. Perhaps unexpectedly, this has an important ecological dimension, Jaromil reminds us. ‘Using this philosophy we offer solutions that work with existing infrastructures and therefore don’t demand new ones, nor new hardware’.
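As a toy illustration of that philosophy (our sketch, not Jaromil’s example), the Python snippet below composes three small, single-purpose UNIX tools, each doing one thing well and handing its output to the next:

```python
import subprocess

# Compose "ls | grep '\.txt$' | wc -l": three small programs chained
# through pipes instead of one big, complicated system.
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", r"\.txt$"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()  # let ls see a closed pipe if grep exits early
wc = subprocess.Popen(["wc", "-l"], stdin=grep.stdout, stdout=subprocess.PIPE)
grep.stdout.close()

count, _ = wc.communicate()
print(count.decode().strip(), "text files in the current directory")
```

Each of the three programs knows nothing about the others; the pipe is the interoperable layer between them.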

A practical example of the UNIX philosophy that Jaromil is keen to highlight is LaTeX, an open source document preparation system used for, among other things, mathematics, computer science and cybersecurity. LaTeX was designed as a modular system with interoperability at its core, which is why it has long been used to transform different data components into well-readable formats such as PDF.
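For readers who have never seen it, a minimal LaTeX source file looks something like this. The document class and packages are interchangeable modules, and the same plain-text source compiles to PDF on any TeX system:

```latex
\documentclass{article}   % reuse a well-designed, general document class
\usepackage{amsmath}      % plug in one module that does math typesetting well

\begin{document}
Plain-text source, typeset identically on any system:
\[ e^{i\pi} + 1 = 0 \]
\end{document}
```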

Jaromil is also the CTO of DECODE, a flagship project for the European Commission exploring and experimenting with data portability and ownership. As part of it, he and fellow developers work on a software project called Zenroom that allows people to automate operations on data using a human-like language (Zencode), without needing to know a technical programming language. Interoperability is a core feature of the project, which runs on any PC, on micro-chips, and even inside a browser web page: by means of a simple language it allows people from diverse disciplines to interact and co-design with it.

Time will tell

In the context of Zenroom, interoperability is studied on different levels. The first is software design. Zenroom is an ‘input-output machine’, Jaromil explains, that doesn’t open connections outside of its own execution, which makes it more secure. ‘It is interoperable in the sense that it needs someone to operate it in order to function, because it needs input to give a certain output; otherwise it just does nothing. It facilitates interaction and executes just what it is told, a design approach that lowers complexity even in the most difficult distributed computational setups.’
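A rough way to picture such an ‘input-output machine’ in code, as a purely conceptual Python stand-in of our own (not Zenroom’s actual interface): a single function that accepts a script and input data, opens no connections, and either returns output or does nothing:

```python
import json

def execute(script: str, data: dict) -> dict:
    """A toy 'input-output machine': deterministic, no sockets, no files,
    no side effects; it only maps input to output. Conceptual stand-in
    for the design Jaromil describes, not the real Zenroom API."""
    if script == "sum the values":
        return {"result": sum(data["values"])}
    return {}  # no recognized input: it "just does nothing"

print(json.dumps(execute("sum the values", {"values": [1, 2, 3]})))  # {"result": 6}
```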

The second level concerns language: as mentioned above, Zenroom can be programmed in a human-like language, so different people, from the tech-savvy to the less technically skilled, can work, or interoperate, with the software too.

Time is another level of interoperability, one that Jaromil thinks is often overlooked. He argues that developers and companies could do a better job of thinking about how to make software interoperable with the infrastructures of the future. Since it’s hard to predict the future, Jaromil also looks back. Zenroom is written in the venerable programming language C, created around fifty years ago and devised to run on the UNIX operating system. This makes the software interoperable with modern and less modern computers and operating systems, ‘even with 20 year old computers, and my claim is, time will tell, that it will still run 20 years from now’.

Zenroom is interoperable with different hardware platforms too. Jaromil explains that it can run on low-power chips, in an ordinary browser, ‘and everything in between. It occupies only 2 megabytes of RAM, which makes it extremely powerful and environmentally sustainable, but also portable’.

Connecting different systems

The portability of software or data relates directly to interoperability. If you want to move your data around, you need technologies and infrastructures that recognize the data and through this become interoperable. Even more so in ‘trustless environments’, says Jaromil, ‘like the member states of the European Union’. Dyne.org was selected by the European Commission as one of the organizations to run a pilot ‘authenticating’ the grades Erasmus exchange students obtain abroad, making them portable and usable in their home country. This requires a level of technical interoperability between the different systems that universities in different European countries use.

The universities example points to another way in which society could benefit from interoperability: maintaining and improving the social and cultural diversity of our societies. The universities can still maintain their own systems, but they can, in the near future, interoperate with each other. That is to say: we don’t necessarily need one-size-fits-all (as the GMAFIA desperately want us to believe), but we can have different services and technologies based on our diverse cultures, ideas and the purposes we see fit.

Who am I interoperating with?

Making systems like student grading technologies interoperable or, in the words of Jaromil, ‘opening them up’, is also risky. You can predict human behaviour a little bit, ‘but you never know exactly what people do with it’. A solution could be to govern interoperable systems, regulating user behaviour and protecting against, for instance, commercial capture (see Microsoft’s dubious relationship with the open source operating system GNU/Linux). Jaromil argues that you should not always govern interoperability, but that ‘interoperability is hard to maintain when entirely ungoverned’.

Some will abuse interoperability, especially when it comes to deceiving human perception, he continues. Take a protocol like SMTP. It’s open and interoperable and enables e-mail globally. But over time, abuses appeared within the system: first there was spam, and later phishing attacks became a common form of cybersecurity risk. Jaromil: ‘So the same open and interoperable system that your bank uses to communicate can be faked by people that want to take your money’. Openness brings risks of deception and abuse, and demands transparency about what or whom we are interoperating with.

Suffering together

The interview becomes a two-way street when Jaromil asks us: ‘What is the opposite of interoperability?’ When we struggle to come up with anything better than ‘Closed?’, he rescues us: ‘The opposite is incompatibility, when something doesn’t plug in.’ A small etymological detour follows. ‘The Italian word compatibile derives from a composite of the Latin words cum [meaning: with] and pathos [meaning: to suffer]’, so to suffer with, or to suffer together. Laughing, he adds: ‘If we agree that in some societies suffering is a form of work, compatible [or interoperable] means “does work with” and incompatible means “doesn’t work with”’.

Making a technology interoperable or incompatible can seem like a technical choice, but it often hides political policy choices that affect users. We need better policies, but which ones? We agree with Jaromil on a mixed approach consisting of public investment, better regulation and wide adoption of interoperable systems. He states: ‘If the public sector doesn’t take their role, their role will be lost’, and lobbying is a big obstacle to taking that role. But above all, he continues, we must have a clear common goal of having sovereignty over our digital infrastructures, and of not letting multinational corporations dominate the whole field. ‘This is our culture and society, we are communicating through computers!’

In Brussels, it is striking how many people are trying to come up with grand digital schemes that are supposed to replace the earlier, ‘worse’ grand schemes. Not Jaromil. The UNIX philosophy of doing one small thing well and connecting things one at a time shows through again: ‘I would never challenge a big system with another big system, but I would start opening bits of it and add better bits’.

natacha roussel: “Interoperable to whose benefit?”
4 November 2021

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.

Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights that these conversations produced, one at a time.
Previously featured in this series on interoperability and digital public-civic ecosystems were Nathan Schneider, Mai Ishikawa Sutton and Geert-Jan Bogaerts. This time we sit down with natacha roussel from the Brussels-based non-profit petites singularités, a founding member of IN COMMON. IN COMMON is a cooperative data library where civil society and solidarity economy actors can share, view, and map their data. For instance, a map presenting shared urban resources in Brussels was created through IN COMMON.

What does a map of Brussels have to do with interoperability? The point of IN COMMON, says roussel, is to ‘make citizen generated data interoperable across the different actors of the commons’. Citizens of Brussels, in this case, can share data about common services and spaces, such as free water taps and open maker spaces, and use the data set for purposes they see fit. They do not surrender the data to Google Maps, for instance, which would shield it from further use or mappings. “Interoperability is essential to decentralised and distributed actors of the commons”, she says, “so that they can get organised from their local point of anchorage towards a larger system communicating across borders”.

Explaining interoperability

roussel adds tremendously to our series on interoperability, not least because she knows how to explain how interoperability originally came about. When we ask her about positive examples of interoperability, she simply points to the Internet itself. “The Internet has very clear binding conditions for interoperability inscribed in the structure of its central TCP-IP protocol, which imposes the obligation not to discriminate between packets of data and to treat them equally on a first come, first transmitted basis. This is often called net neutrality.” No single party owns the ‘net’, enjoys transmission privileges or has the right to impose restrictions on who can transmit or what can be transmitted, which permits the existence of smaller and diversified providers. “Preserving net neutrality is essential”, she says, “and also a constant fight, as corporate monopolies constantly make moves that threaten to overthrow it.”
Another example is e-mail, roussel continues. Sending and receiving e-mail is based on a decentralised protocol called SMTP. The addressing model is standardised and interoperable: each e-mail address follows the loginname@example.net structure. Thanks to these interoperable protocols and standards, it doesn’t matter which email provider you’re with, or where you are in the world: your email can reach anyone with an internet connection. “However”, roussel points out, “the overwhelming presence of GMail threatens interoperability, with its own privatised spam management system that declares as spam, or censors, e-mail coming from small domains”.
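Because SMTP and the addressing format are open standards, any client can hand a message to any standards-compliant server. Python’s standard library shows this in a few lines; the hosts and addresses below are placeholders, not real servers:

```python
import smtplib
from email.message import EmailMessage

# loginname@example.net: the standardized, interoperable address format.
msg = EmailMessage()
msg["From"] = "alice@example.net"
msg["To"] = "bob@example.org"
msg["Subject"] = "Interoperability in practice"
msg.set_content("Handed to one provider, delivered by another.")

# Any server speaking the open SMTP protocol will do; this one is fictional.
with smtplib.SMTP("mail.example.net", 587) as server:
    server.starttls()
    server.send_message(msg)
```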

Interoperability and power imbalances

We ask roussel about the importance of interoperability in preserving the internet as an open and decentralised space. Here, the conversation shifts from the technical to the political: “It is important to understand how interoperable systems are organised, to whose benefit and who decides. Interoperability, like everything else, should take into account power imbalances in existing systems”.

There are many interoperability-related challenges, she continues. The first and foremost of these is that we live in a world where Big Tech dominates the Internet through ‘a worldwide quasi-monopoly’ over online exchanges, which directly violates the principle of interoperability. According to roussel, it is crucial that we allow enough space for diversity and move away from the current skewed power configurations. As an example, she adds: “I would certainly not want my community based decentralised networks to be interoperable with Facebook, because we would get flooded by their data and in a very short term would become invisible to each other”.

Furthermore, interoperability will not magically transform the power of the GMAFIA (Google, Microsoft, Amazon, Facebook, IBM, Apple). Connecting to their interoperable services and media will still require accepting their terms of service. Hence, it will result in a situation of ‘interoperability without interconnection’.

Also, from a data perspective, petites singularités points out that interoperability alone is limited: it could be strengthened by complementary data portability and open standards, through which people regain control of their personal data, and by a minimalist approach to authorization, which currently boils down to all-or-nothing. Why, after all, does a chat service need access to my contact list just to pass a message?

Fragile ecosystems

Throughout our research and the interviews with Nathan Schneider, Mai Sutton and others, our view on interoperability has transformed into an ‘ecosystem view’, where we look at the interoperable and decentralised nature of the system instead of individual services. Such a system is needed, but fragile, says roussel. “More than anything it involves a lot of human organisation to be maintained with equity for a long time”.

What is interoperability?
Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned. There are many types of interoperability, like indifferent interoperability (think, for instance, of a car manufacturer that doesn’t care about what chargers are plugged into its cars’ standard cigarette lighters) or cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is the opposite of cooperative interoperability, when a technology is downright hostile to others trying to connect, called adversarial interoperability.
There has been a lot of talk about interoperability in activist and policy circles, but not a clear view of what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road.

But there are some starting points for a public-civic system, which have, according to roussel, long been articulated well by digital rights defenders. The first requirement for such a system is Public Money Public Code, which advocates that publicly financed software be made publicly available under a Free Software licence. The second is to support, consult and contribute to existing grass-roots groups and initiatives. The third is democratic consultation about technology choices, with participation by all stakeholders. Finally, roussel continues, we need to “take back education in our hands: both the platforms that are used to transmit (open) knowledge and know-how, and the way we teach technology in school”.

Building resilient interoperable spaces

Interoperability could lead to ecosystems in which commercial, public and civic actors share data and services, but how do we make sure that (large) commercial actors do not dominate this space? A substantial risk exists that well-intentioned public-civic ecosystems or services fall victim to commercial capture. “How large is the risk of capture? Do we need additional market regulation to safeguard this?”, asks roussel. Or could we take an approach that moves beyond capitalist structures altogether?

IN COMMON’s attempt at doing this manifests in its participation in the DREAM project, funded by the European Commission through its Next Generation Internet (NGI) programme. Through DREAM, IN COMMON and partners like Open Engiadina and P2PColab aim to bring together ‘the best of the Social Web (easy UI, Linked Data) with the best of Peer-to-Peer networking architectures (end-to-end encryption, autonomy, replicability, lack of central control, censorship resistance, privacy-by-design and privacy-by-default)’. The goal of DREAM is to enable the convergence of distributed P2P networks and linked data models, and to embed them firmly within the social solidarity economy.

roussel concludes on a more general note with a piece of advice for moving out of capitalist reach: “From where I stand it feels that the only way to protect ourselves from capture is to stay busy with stuff that has no value for the market, for example resistance and radical care”.

Mai Sutton: “If you want an interoperable Internet, start with how organizations work”
21 October 2021

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.
Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights that these conversations produced, one at a time.
Previously featured in this series on interoperability and digital public-civic ecosystems were Nathan Schneider and Geert-Jan Bogaerts.

Decentralizing the Internet

A few weeks ago we sat down with Mai Ishikawa Sutton, organizer, writer and digital commons advocate. Much of their work deals with decentralization and exploring what that might actually entail in the digital domain. They have, for instance, been a contributor to the Internet Archive’s (IA) work on the DWeb, short for Decentralized Web. The main aim of this project, which launched in 2016 with the first DWeb Summit, was to create a space where those building and exploring ways to make a better web could meet, share strategies, and cross-pollinate. The IA’s DWeb work has continued to establish common ground among the diverse communities and contributors building new DWeb protocols and platforms, as the project fleshed out a set of overarching principles to guide the development of a decentralized version of the World Wide Web. Sutton: “The main question was: what does decentralization actually mean, meaningfully, from a social and institutional perspective?”

One side of the decentralization coin concerns protocols. DWeb protocols such as IPFS and Hypercore enable decentralized data storage, whereas today’s web leans heavily on huge central servers and its enabling protocol, HTTP. By contrast, if one adds a file to IPFS, the file is split into smaller parts and stored locally by peer-to-peer ‘nodes’ (connected computers). Basically, if I download an image added to IPFS, I receive the information from multiple nodes at the same time.
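A rough sketch of that idea in Python (purely conceptual and our own; real IPFS uses multihash content identifiers and a Merkle DAG rather than bare SHA-256 digests):

```python
import hashlib

def chunk_and_address(data: bytes, chunk_size: int = 256 * 1024) -> dict:
    """Split data into fixed-size blocks and address each block by the
    hash of its content, so any peer holding a block can serve it and
    any downloader can verify it."""
    blocks = {}
    for i in range(0, len(data), chunk_size):
        block = data[i:i + chunk_size]
        block_id = hashlib.sha256(block).hexdigest()  # stand-in for a real CID
        blocks[block_id] = block
    return blocks

blocks = chunk_and_address(b"an image on the distributed web " * 50_000)
print(f"{len(blocks)} blocks, each fetchable from whichever peer has it")
```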

Sutton has been creating DWeb content, accessible via protocols like IPFS, with the open source publishing tool Distributed Press. “The goal of Distributed Press is to essentially become a WordPress-like content management system which publishes to DWeb protocols”, they say. “By using these protocols we show the viability of publishing through the decentralized web, and in the process, invite more people to be involved in thinking about its purpose.”

More than protocols

The online magazine COMPOST (short for Commons Post), which Sutton cofounded, provides an example. The magazine features creative essays and personal stories about the internet, from its current form to the internet as a ‘digital commons’. It’s an active use case of Distributed Press, Sutton explains, meant to show what the DWeb can do and to provide an inspiring example of working with decentralized protocols. But the work on decentralization is about more than protocols, they say; it’s also about engaging communities and exercising shared governance, as COMPOST goes to show.

As a self-identified digital commoning project, Sutton and the team behind Distributed Press believe that interoperability at all layers of the network stack is critical to bringing about a more decentralized web, where many protocols, platforms, and networks flourish. "We want to see creators — journalists, artists, sex workers, and others — be able to reach their audience in the way they choose, on their own terms. The current models simply don't work for them, since the platforms they have to rely on, like Instagram or OnlyFans, don't give them a seat at the table. Interoperability is key for new tools like Distributed Press to be built. It's critical for many technical solutions to exist alongside each other”.

From niche to mainstream

Another central aim of the COMPOST project, Sutton continues, is to appeal to a wider audience. “We are trying to attract more creative people and artists to the decentralized web. It's still too much of an insider situation, where technologists are designing and building DWeb tools without other communities' input”. They also think artists need to be paid for their work, and that their needs should be prioritized as the DWeb is being created. In Sutton’s eyes, creative work online is often undervalued in comparison to technical work. “That equation is extremely broken. I really see this as a weakness of Facebook and Instagram and that’s why COMPOST believes in paying artists and creative writers a fair fee”. They think that prioritizing the creative part of the equation is crucial for an alternative civic ecosystem.

But what role could interoperability play in the bigger picture of a fairer and more secure Internet? According to Sutton, without interoperability, people stay locked into networks, platforms, and hardware that prevent new uses, or even new solutions, from emerging alongside existing technologies. Interoperable protocols allow special needs to be addressed, they say: for instance, allowing offline communities to access content on locally-hosted servers without being connected to the internet. Sutton believes that if content and identity were federated, the powerful network effects of big platforms would not be as much of a monopolistic threat. Through data interoperability, newer platforms would be able to enter the playing field, as it would be easier for individuals to switch to new products without losing all their data.

The concept of interoperability has been gaining traction in policy circles, but also, perhaps unexpectedly, within Big Tech itself. Google, Microsoft and Facebook, for example, launched a Consumer Data Portability initiative a few years ago, which would make their services explicitly ‘interoperable’ by allowing users to move their data between service providers.

Watered down

Usually, as in this case, the main argument behind interoperability is that it facilitates more open and fair competition by allowing competitors to connect and adjust to each other’s services and products. Does Sutton believe that’s the answer? “I think it’s an extremely watered down answer. I can’t imagine why Facebook would advocate for a model of interoperability that would actually allow new platforms to compete with them. I’m extremely cynical about Big Tech consortiums pushing for open protocols that would substantively challenge the grip they hold on people’s data and networks. To me it seems like they’re trying to get ahead of stronger policies that would force them to loosen that grip.”

At the same time, implementing other pro-competition measures would be helpful, they note. “It would be much easier to create more alternative systems if there wasn’t such a high cost to compete with services by Apple, Google or Facebook. For example, fixing or even eliminating laws that make it a crime to circumvent DRM (Digital Rights Management) would be a great start.”

Open Twitter?

Another interesting case is Twitter. Through a research project called Bluesky, Twitter is looking at using or creating an open protocol for micro-blogging; the company wants to become one stakeholder of many using such a protocol. Responding to critical comments from existing open protocol communities who felt overlooked (such as Mastodon, which builds on the ActivityPub protocol), Twitter produced an ecosystem report that reviews the strengths and weaknesses of existing open protocols.

Beyond competition, toward democratic models

How do we move beyond ‘competitive interoperability’ and towards some kind of integrated (eco)systems approach in which interoperability is but one of the norms? “Well”, Sutton begins, “it feels right to talk about the technical end goal, but thinking about the organizations that create this ecosystem is really important too”. The first thing to ask, they say, is: how do you create organizations that are resilient to private capture and flexible enough to deal with new challenges, responding to them with innovative, people-centric approaches?

Many initiatives, for instance, fail because of toxic leaders or cultures. Sutton notes that “even the most well-meaning projects can collapse, or just sputter along because the organization is mismanaged”. Thus it’s important to have explicit conversations about how people work together; think codes of conduct at the very least. Being aware of these dynamics is key, Sutton says: “If your aim is to create some kind of alternative interoperable system, the social and organizational aspects of the project need to be addressed seriously from the beginning”.

They take inspiration from more democratic and flexible organizations like cooperatives, and from New York City, where the local government cofunded the Worker Cooperative Business Development Initiative, providing a one million dollar grant to develop cooperatives such as service-worker or cleaner cooperatives. “I think about not just how new organizations can create these standards, but what it would look like for existing civic institutions to work together. Would there be some sort of clearing house, some sort of body that is able to negotiate what the needs are and be flexible enough to face them?”

Geert-Jan Bogaerts: “Don’t Look At Technology, Look At The Players”
11 October 2021

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle[1], through which they can together form a bigger ecosystem.

Over the course of the summer, we consulted friends and experts on the questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights of these conversations, one at a time.

First in this series was Nathan Schneider, who talked about open source and cooperativism, about collective action problems, and about how interoperability alone is not enough. Next up is Geert-Jan Bogaerts, head of Innovation and Digital Media at the Dutch public broadcaster VPRO, advocate and catalyst of open, interoperable systems and organizations (such as Matrix), and founder of the coalition Public Spaces. Public Spaces aims to make value-driven technology and online spaces the new normal for Dutch public institutions, the old normal being interacting with the audience through the services of tech giants like YouTube, Instagram or Google. The Public Spaces coalition so far consists of around 30 members: broadcasters, libraries, museums, schools, festivals and heritage organisations. Together, they explore how to work and act collectively, just as Nathan Schneider recommended in a previous blog in this series.
We ask Bogaerts about Matrix, which "basically is an alternative to Slack": an application that facilitates online teamwork and communication. Just like Slack, Matrix offers chat rooms, document and video sharing, and so on, but it differs in a significant way: it is designed as an interoperable space. This means that other groups, services or platforms can connect their technology to Matrix’s infrastructure, including chat applications like Telegram, Signal and WhatsApp.
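To give a flavour of what ‘connecting to Matrix’s infrastructure’ looks like for a developer, here is a minimal message send using matrix-nio, a community Python client library for Matrix; the homeserver, user and room identifiers below are placeholders:

```python
import asyncio
from nio import AsyncClient  # matrix-nio, a community Matrix client library

async def main():
    # All identifiers below are illustrative placeholders.
    client = AsyncClient("https://matrix.example.org", "@alice:example.org")
    await client.login("a-strong-password")
    await client.room_send(
        room_id="!team-room:example.org",
        message_type="m.room.message",  # the same event type any Matrix client understands
        content={"msgtype": "m.text", "body": "Hello over an open protocol"},
    )
    await client.close()

asyncio.run(main())
```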

It’s not the technology, it’s the players

Interoperability is not a new concept; the internet itself is an example of an interoperable system. When asked about the importance of interoperability, Bogaerts responds that “collaboration is not a matter of interoperable architectures, but first and foremost of the mentality of the players involved”. Although Matrix is technically interoperable with WhatsApp, Facebook, like other big tech companies, sets limits on the extent to which its technologies can function in conjunction with other platforms and their users.

The technologies are there, says Bogaerts, so no need to develop them from scratch. Does this mean that developing interoperable technologies should be off our list? “Well”, says Bogaerts, “let’s look at one of my current projects, which is about building a content discovery and distribution platform for independent makers. Mastodon already has a federated platform, PeerTube has an interoperable video distribution platform. Combining the two will get you a long way in terms of the right technologies.”

Creating the new YouTube

Most people discover and distribute media content on platforms like YouTube and Facebook. For many content makers, from amateurs to professional institutions, these are their gateway to an audience. In order to no longer rely on these extractive and opaque technologies, public broadcasters like the Dutch VPRO and the British BBC have developed and use their own platforms, called NPO Start and iPlayer, to publish media content and connect with their audiences.

Though a good step forward, these technologies have some serious downsides that need to be addressed. One, according to Bogaerts, is that they are exclusive: only the VPRO and the BBC can publish there. The second is related to the first: NPO Start and iPlayer are not well suited for communicating content to the audience, precisely because people are drawn to ‘platforms’ where a variety of media content and makers can be discovered. Hence the need for an interoperable platform that serves both needs.

Interoperability is important, but why?

So why is it important for a content discovery and distribution platform to be interoperable? Bogaerts: “First of all, because for content makers and the public, interoperability means retaining autonomy”. Extractive platforms usually apply some kind of vendor lock-in: moving content from one platform to another means losing the (meta)data that enables interaction with the public. With interoperability, users can swap platforms without sacrificing crucial data. Second, interoperability stands for and enables platform pluriformity. “I believe we are moving to an internet beyond one-size-fits-all”, says Bogaerts, “where makers, brands and personalities can steward their own platforms as long as these are interoperable and communicate with one another. This can bring together different makers, perspectives and audiences. That’s incredibly valuable”.

Moving from tech to context

Bogaerts agrees with Nathan Schneider that, in order to move to some sort of online public space, interoperability is not enough. A social, political and cultural context is needed that supports this kind of shift. A context that supports interoperable models and behaviour by all players, including commercial ones. What kind of policies could help bring about such a context?
“Flanking policies” is what Bogaerts calls them, denoting the facilitative role they might play in transforming media platforms. One flanking policy could be quality criteria for open standards, with formal enforcement of those criteria by an official regulator. Another could be changing the way publicly financed media organisations are evaluated. The current emphasis on the size of their audience - the reach of content - is not necessarily an incentive to build or invest in new kinds of content platforms.

Money, nook and cranny

So, what is holding public organizations back from adopting open and interoperable systems now? Flanking policy number three: changing the incentives. An infrastructure is needed that creates the right incentives for shifting towards open, interoperable systems. Such an infrastructure could, for example, support research and development of new media platforms. Or it could take the form of procurement procedures driven by public value.
Currently, Bogaerts argues, it’s a moral obligation, rather than a clear incentive, for public media organizations to change their ways and opt for open, interoperable systems. “The Public Spaces group does it because they think it’s important, but the money has to come from every nook and cranny.”


  1. What is interoperability? Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned. There are many types of interoperability. There is indifferent interoperability (think, for instance, of a car manufacturer that doesn't care what chargers are plugged into its cars' standard cigarette lighters) and cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is interoperability achieved against the will of a technology's owner, by connecting to a system that is downright hostile to others trying to connect, called adversarial interoperability. There has been a lot of talk about interoperability in activist and policy circles, but not a clear view on what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road. ↩︎

]]>
<![CDATA[Nathan Schneider: “Keep The Internet Public. The Market Will Thank Us Later”]]>https://shared-digital.eu/nathan-schneider-keep-the-internet-public-the-market-will-thank-us-later/6151952349816f127c4c047dMon, 27 Sep 2021 10:41:47 GMTNathan Schneider: “Keep The Internet Public. The Market Will Thank Us Later”

The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?

In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle[1], through which they can together form a bigger ecosystem.

Over the course of summer, we consulted friends and experts on the question: how do we get to public-civic spaces, what role does interoperability play? And what is holding governments and civil society back to make the shift? In this series we share the insights of these conversations one at a time.

First up in this series is Nathan Schneider, who helped kickstart the platform cooperativism movement and co-edited the book Ours to Hack and to Own: The Rise of Platform Cooperativism. Platform cooperativism relates to the commons in many ways. They both highlight shared ownership and advocate democratic governance. They both look to build alternatives that could replace profit-seeking, shareholder-driven short termism. And they both are critical of tech-optimists who believe technology itself can quick-fix our societies’ challenges.

Cooperativism and open source

Schneider was motivated to combine technology with cooperativism when he saw the potential complementarity of the two. According to him, open source software lacks governability and a business model, and the cooperative movement lacks a technical strategy and an innovation model besides venture capital. An example of such an open source-cooperative marriage is Social.coop, a Mastodon-based social media platform that Schneider co-founded and which is run as a cooperative.

Platform cooperativism, says Schneider, represents a kind of critique of open source, raising the issue of ownership and of how one governs technologies. A central problem with source code that is open to all, no strings attached, is that it is vulnerable to exploitation and capture by freeriders who 'fork' the source code without sharing the income generated by selling the software. These freeriders may very well be big companies. For example, the world's most popular mobile operating system, Google's Android, is based on the open source Linux kernel. Schneider explains that "the cooperative movement is a form of pushback against the way big companies have been the main financial beneficiaries of open source software development."

An open internet

The internet itself is a prime example of what interoperability might look like. In principle, anyone with a computer can connect to the internet, write code, start a website, build software and so forth. No company owns the internet; no one decides who can use it and what they can do with it. But this was not an inevitable outcome of history, says Schneider. "We could have had a privately owned internet, those options were on the table and being built too. We could have had a government-controlled internet. Russia had its option, Chile had its option." However, we ended up with an open and interoperable infrastructure. Why? Basically, because the public sector - the US military - paid the bill and later on had the political will to keep it public and open. "The level of interoperability we have was a result of the public nature of that investment and I think we forgot that. So much of the glory instead goes to venture capital-backed investment and companies."

Interoperability is not enough

Making things interoperable means creating the conditions under which everyone can use, connect to and build on certain systems or technologies. This is important. Interoperability makes the benefits of new technologies, or of improvements to existing ones, accessible to large groups of people, not just to a developer holding a license. But because those benefits cannot be captured exclusively, individual players (i.e. private companies) feel little incentive to contribute to maintaining such common goods - the classic free-rider problem. Take roads, says Schneider: no one wants to take care of them, but when they are maintained everybody benefits, including market actors.

There are, however, significant risks associated with interoperability. Risks regarding privacy, for instance: opening up technologies might, paradoxically, end up violating people's privacy rights. Facebook, for example, opened up its API (Application Programming Interface) to third-party developers - "an interoperability play", as Schneider calls it. Cambridge Analytica, a British political consulting firm, then harvested personal data through that opening and used it for political manipulation during the campaign that made Donald Trump president.

Collective action problems

Interoperability alone is not enough. We also need "a more balanced capital stack" where not all funding comes from venture capital, Schneider argues. We need more public investment and investors that are not only concerned with increasing shareholder value. But reforming the 'capital stack' makes little sense at the national level, he adds, because technology does not stop at borders. Internationally, or transnationally at the EU level, this challenge has a better chance of being addressed.

But the most pressing issue when it comes to creating public-civic online spaces, according to Schneider, is a collective action problem. What we need, he argues, is coalitions of entities and organizations that adopt open, interoperable systems simultaneously. When an organization switches from corporate software to open source, licensing costs drop but support costs increase, and the shift itself takes energy and money. "When the city of Munich adopted LibreOffice instead of Microsoft Office, it was just Munich doing it. If you have 100 cities of similar needs and similar size doing the same thing as Munich, we could massively reduce these costs by collaborating."

How many people use LibreOffice? How many civil servants know NextCloud, a substitute for Google’s cloud that the German government has adopted? Or Matrix, a secure chat protocol now being used by the French government? How many are swapping Facebook for open source Mastodon? When governments invest in tools like these, adopting them becomes easier for everyone. Schneider’s university lab uses LibreOffice, NextCloud, and Matrix, and so he sees direct benefit from those far-away public investments.

For now, the people using these open services and technologies belong to a tiny minority. That’s why, according to Schneider, there needs to be a more concerted strategy to invest resources in ways that lower the bar to entry for everyone. Governments can make these investments themselves, and they can encourage civil society organizations to do so as well. The more this happens, the more open technologies can become as accessible and user friendly as platforms like Google and YouTube, and what now seems ‘alternative’ can be the new ‘normal’.

Too often, the focus has been on getting individual users to adopt and learn to use open tools. But Schneider believes that, with proper investment, a market will emerge to provide a "middle layer" of services that make open tools easy to use and help manage and maintain them - including cooperative services, owned and governed by their users. For instance, he is a member of a cooperative, May First Movement Technology, that provides much of the open software he uses on a daily basis. Schneider says: "We don't want individuals to run their own software. I want to be able to trust the developers. They take a lot of weight off my back. I don't have to think about version updates. That middle layer enabling accountable collective management is crucial."


  1. What is interoperability? Interoperability is the technical ability to plug one product or service into another product or service. It is also one of the founding principles of the internet as it was originally envisioned. There are many types of interoperability. There is indifferent interoperability (think, for instance, of a car manufacturer that doesn't care what chargers are plugged into its cars' standard cigarette lighters) and cooperative interoperability, when a technology developer wants other people and companies to create add-ons that fit the technology (such as phone manufacturers opting for a standard 3.5mm headphone minijack). And there is interoperability achieved against the will of a technology's owner, by connecting to a system that is downright hostile to others trying to connect, called adversarial interoperability. There has been a lot of talk about interoperability in activist and policy circles, but not a clear view on what role it might play in developing digital spaces that are not dominated by huge for-profit corporations. There is also a focus on competitive interoperability (which regulates the big players) and not enough talk about generative interoperability, which sustains new ecosystems. You can read more about this in our background stories on interoperability and how we got to work with it: Interoperability 1: Policymaking is Worldbuilding and Interoperability 2: The Fork in the Road. ↩︎

]]>
<![CDATA[Introducing SDEPS - Shared Digital European Public Sphere]]>

In April this year the first international conference hosted by Public Spaces brought together a large number of organisations and people working on varying aspects of creating Digital Public Spaces in Europe. In the aftermath of this conference we have started working with a number of these organisations to set

]]>
https://shared-digital.eu/introducing-sdeps-shared-digital-european-public-sphere/60e31d4a49816f127c4c0425Mon, 05 Jul 2021 15:04:14 GMT

In April this year the first international conference hosted by Public Spaces brought together a large number of organisations and people working on varying aspects of creating Digital Public Spaces in Europe. In the aftermath of this conference we have started working with a number of these organisations to set up a European coalition committed to the idea of building alternatives to the existing online ecosystem, which is dominated by a handful of commercial players. This coalition for a Shared Digital European Public Sphere[1] (SDEPS for short) aims to bring together different types of organisations (institutions, civic initiatives, CSOs, technology projects) from different sectors (public service media, civil society media, education, cultural heritage institutions, commons projects, civic initiatives) with different ways of working (media production, technology development, advocacy), united in their determination to build a European Digital Public Sphere.

The coalition strives to provide a European coordination layer for efforts aimed at developing alternative online infrastructures and to become a common voice vis-à-vis EU policy makers. To guide this work the SDEPS coalition has recently adopted the following statement of principles:

What is the challenge we are trying to address?

Access to, and the use of, digital platforms is no longer an innovation - it is an essential resource for organisations both public and commercial. The European digital public arena is largely dominated by a small number of for-profit media platforms. Public service & community media, educational and academic institutions, cultural organisations and producers as well as civic initiatives have increasingly become dependent on their services in the absence of viable public alternatives. The result has been an enormous transfer of wealth from the public sector to these private actors' platforms, which in turn has allowed them to wield enormous power over the media landscape and public discourse, with little or no accountability. This imbalance and lack of a viable alternative is detrimental to the internet, to our democratic values, and to the health of our European societies.

Policy makers across Europe and society at large have started to realize that this status quo must be changed and that they can positively shape this landscape. Over the past few years we have seen an increasing willingness to regulate in the digital space with the express aim of upholding democratic values and individual rights, accompanied by a clear recognition of the central role that digital plays across society and in Europe's future. The European Union's ambition for a 'Digital Decade' is reflected, for example, in the priority given to digital under the Recovery and Resilience Funds, while the vision for the New European Bauhaus initiative seeks new approaches to technology that put inclusivity, accessibility and affordability at the heart of how we live and work.

These efforts to regulate and improve the digital space are very much welcome. However, we can, and indeed must, do more to ensure the digital values and sovereignty that Europe aspires to. Europe's digital vision must include an ambition for a European Digital Public Space, which offers and supports alternative digital public infrastructures built on democratic principles.

What do we want?

Europe’s digital ambition must reach beyond the role of global regulator to that of global leader if it is to realise its goal of digital sovereignty. It is not enough to aspire to an alternative to Big Tech, we must actively build it. A European Digital Public Space, built on democratic values and public digital infrastructures, can be the cornerstone of that alternative.

Public digital infrastructures will promote more sovereign societies and individuals through the democratisation of access, transparency and accountability, while shared standards and interoperability will allow knowledge and culture to flow, helping people to connect. Europe’s technological and civic communities already lead the way in developing the building blocks that will make this a reality. However, that must be accompanied by ambition, investment and a strategic approach at the political and European level.

Our pan-European coalition of civil society initiatives, public service and community media organisations, and cultural (heritage) institutions is committed to creating digital public spaces that are aligned with our shared values as Europeans. As part of our ambition to deliver real and enduring value to society and our communities we aim to put the European Public Digital Space on the EU policy agenda. Our goal is to secure the commitment and investment necessary to build the public digital infrastructure we need, and to support an interoperable ecosystem of public institutions and civic initiatives that can drive and spark it.

The time for Europe to invest into digital public infrastructure is now. The networks are in place, the ambition is clear. Now we only need the political will to create digital public spaces that offer a credible vision beyond narrow commercial interests to make this Europe's Digital Decade.

On Friday this week the SDEPS coalition will hold a first public meeting where the members of the coalition will give an introduction into the current and planned activities of the coalition. Join us from 1400h - 1500h CEST to learn more about our work and how you can contribute to building a European Digital Public Sphere. To participate please register by mailing us at participate@sdeps.eu and we will mail you a meeting link later this week.


  1. Note the overlap between the coalition and our vision for a Shared Digital European Public Space. ↩︎

]]>
<![CDATA[From generative interoperability to the Interoperable Public-Civic Ecosystem]]>In our Vision for a Shared Digital Europe we identify key principles towards a democratic digital space that is diverse and supports communities and public values.  These principles are: Enable Self-Determination, Cultivate the Commons, Decentralise Infrastructure and Empower Public Institutions.

Due to the domination of market logic and large

]]>
https://shared-digital.eu/from-generative-interoperability-to-the-interoperable-public-civic-ecosystem/60b9c7e549816f127c4c03ccFri, 04 Jun 2021 07:52:11 GMTIn our Vision for a Shared Digital Europe we identify key principles towards a democratic digital space that is diverse and supports communities and public values.  These principles are: Enable Self-Determination, Cultivate the Commons, Decentralise Infrastructure and Empower Public Institutions.

Due to the domination of market logic and large market players, our online environment has been transformed into a vast, surveilled shopping mall. This has displaced dynamics of cooperation, solidarity, peer-to-peer production and care, which are fundamental aspects of our societies and human needs. This marketisation of the digital society also has an enormous impact on our offline lives, determining how we communicate, work, shop, move from one place to another, go on holiday, etc.

In order to counter this we need to cultivate the commons and empower public institutions. The strategy to attain this should be based on a shared mission of creating a public civic ecosystem. Many things are needed to accomplish that, and interoperability is one essential element.

For this reason, we have been developing the concept of an Interoperable Public Civic Ecosystem (IPCE), through which the four principles of our vision can be brought to life. In this model, some of the key goals are achieved by introducing interoperability rules.

In this text, we take stock of the first, completed phase of our work and sketch out the model that we will be developing in phase two.

Phase one: Two types of interoperability

In the first phase of our work we have identified the need for a strong public and civic ecosystem online. We assume that such an ecosystem will have interoperability (between its constituent parts) as one of its core principles. This IPCE is our particular angle in describing what is more generally referred to as the digital public sphere (also called the public interest internet, and many other names).

We have undertaken a literature review in order to describe the value and impact of interoperability. Based on this, we are proposing that there are two different types of interoperability in the online space: Competitive interoperability and Generative interoperability.

Most of the current (policy) discussion about interoperability is focussed on what we are calling competitive interoperability. There are ongoing efforts in Europe to require (through regulation) dominant communication platforms to become interoperable to some extent. This is currently one of the main focus areas of digital rights organisations (as well as digital SMEs) in their advocacy efforts targeted at the Digital Services Act, Digital Markets Act and (to a lesser degree) the Data Governance Act.

We believe that in addition to these efforts we also need efforts to promote generative interoperability aimed at creating a strong public civic ecosystem online. We understand generative interoperability as something that goes beyond merely acting as a restraint on existing market dynamics. Instead we see it as a design principle that has the potential to build a more decentralised infrastructure that enables individual self-determination in the online environment.

Placed in the context of attempts to build a digital public sphere, the primary objective of generative interoperability would be to enable interactions between public institutions, civic and cooperative initiatives, and their audiences without the intermediation of the existing dominant platforms.

As such, generative interoperability will also facilitate a pluralist digital economy. A pluralist economy includes competitive markets with for-profit companies, but also the collaborative economy based on democratic and collective ownership (such as platform cooperatives), commons (such as Wikipedia) and the care economy (such as informal peer-to-peer networks). In this part of the economy, value is retained by producers and consumers and distributed equitably, instead of being extracted for profit. In short, generative interoperability will contribute to a regenerative economy.

Phase two: How to build with generative interoperability?

While regulation, in the form of imposing requirements on platforms above a certain size threshold, is the key instrument to advance (or enforce) competitive interoperability, generative interoperability requires a different approach. Leveraging generative interoperability will require supportive policies based on a mix of public investments, open standard setting and regulation that introduces interoperability requirements for publicly funded infrastructure.

We see the IPCE model as contributing directly to the idea of a Shared European Digital Public Space, embraced by a large coalition of broadcasters, public institutions, think tanks and civil society. Interoperability is a core principle, as it ensures that different infrastructures are connected together, forming a public sphere at European scale. This requires a mindset shift for European policy makers, who will need to see themselves as ecosystem-builders instead of mere market regulators. This means less of the outdated Digital Single Market perspective, and more of the New European Bauhaus vision. It also means a shift away from the current narrow focus on making European public administration services interoperable.

This requires the willingness to intervene by investing public funds into the creation and use of a public technology stack, one that is based on open standards that enable interoperability between users of this stack. Once a viable public stack exists, public institutions should be required to offer support for these open interoperable standards and any public funding for digital activities of public institutions or civic initiatives should be conditional on support for the relevant open standards.

In this phase of the project, our aim is to define principles that can be derived from the idea of generative interoperability and that should guide the design of the IPCE model. These include interoperability standards, necessary public investments into infrastructure, and required regulation and policies. We will also define key spaces in which interoperability should be introduced to seed the interoperable public-civic ecosystem.

]]>
<![CDATA[Interoperability with a Purpose]]>These days it seems that the relatively humble technical concept of interoperability is at the core of almost everyone's efforts to fix the internet. Interoperability is seen as a key step towards fixing the internet and building digital public infrastructures. At the same time interoperability has arrived on the

]]>
https://shared-digital.eu/interoperability-with-a-purpose/5fd68f0951c73038e7ab8c7dMon, 14 Dec 2020 10:00:00 GMTThese days it seems that the relatively humble technical concept of interoperability is at the core of almost everyone's efforts to fix the internet. Interoperability is seen as a key step towards fixing the internet and building digital public infrastructures. At the same time, interoperability has arrived on the policy agenda, where it is widely expected that the European Union's upcoming Digital Services Act will contain at least some interoperability measures for so-called gatekeeper platforms.

This is a lot of expectation riding on the shoulders of an abstract principle. It is not surprising that interoperability should play a key role in the architecture of a next generation internet that is designed to support public and civic values. After all, the “inter” in “internet” already references the fact that the internet made early computer networks interoperable with each other.  Interoperability has always been a key concept underpinning the development of the net and the current debate on interoperability can also be seen as an effort to reclaim a characteristic of the original internet.

Figure: Paul Baran's three network topologies: centralized, decentralized, distributed (1962). In the centralized topology there is no need for interoperability; in the decentralized topology, local nodes need to be interoperable with each other; in the distributed topology there is full interoperability. Over the past decades there has been a strong trend towards centralized platforms, away from the original decentralized architecture of the internet.

So why exactly is everyone proposing interoperability as a solution to the ills of the contemporary internet? The main reason is the structural dominance of a small number of extremely large platform intermediaries, which is increasingly understood as both socially and politically unhealthy.

In the discussion about interoperability, the most prominent cases are Facebook and messaging services. In the canonical case of Facebook, introducing interoperability would mean that a given user could connect with other Facebook users, and interact with them and with the content that they share or produce, without using Facebook itself, because other services could connect to the Facebook infrastructure. This is often coupled with the argument that requiring Facebook to be interoperable with other services would also create a more level playing field for competing services (which would gain access to Facebook's user base). In the same vein, proponents of interoperability argue for an ecosystem of messaging services in which messages can be exchanged across services.

At their core, these are arguments in favor of individual freedom of choice and of empowering competitors in the market. Interoperability is seen as an attempt to weaken the dominant position of platform intermediaries by countering the network effects that drive users into the arms of dominant platforms. We call this approach competitive interoperability. While this would clearly be a step in the right direction (and there is some hope that the upcoming Digital Markets Act will introduce some interoperability requirements for gatekeeper platforms), it is equally clear that competitive interoperability will not substantially change the nature of the online environment. Increasing choice between different services designed to extract value from our online activities may feel better than being forced to use a single service, but it does not guarantee that the exploitative relationship between service providers and their users will change. There is very little evidence on, and few attempts to predict, the effects of increased market competition on the control individual users have over their data. This is especially problematic in light of the fact that allowing users to take their connections from one service to another comes with a whole raft of largely unresolved personal data protection issues.

Towards an Interoperable Public Civic Ecosystem

Even though there are limits on this particular idea of interoperability, this does not mean that the concept has no use. Instead, it needs to be envisaged with a different purpose in mind: building a different kind of online environment that answers to the needs of public institutions and civic communities. If we see interoperability as a mechanism to build an online environment designed to support public and civic values then we first need to understand what we want this alternative to look like.

As we have outlined in our vision for a Shared Digital Europe we see the online environment as something that must be much more than a digital marketplace. If we want interoperability to contribute to this vision then interoperability must be understood as something that goes beyond merely acting as a restraint on existing market dynamics. We see interoperability as a design principle that has the potential to build a more decentralized infrastructure that enables individual self-determination in the online environment. We propose to call this type of interoperability generative interoperability.

In our view, the purpose of generative interoperability must be to enable what we call an Interoperable Public Civic Ecosystem. Such an ecosystem would provide an alternative digital public space that is supported by public institutions (public broadcasters, universities and other educational institutions, libraries and other cultural heritage institutions) and civic- and commons-based initiatives. An ecosystem that allows public institutions and civic initiatives to route around the gatekeepers of the commercial internet, without becoming disconnected from their audiences and communities and that allows interaction outside the sphere of digital platform capitalism.

In this context, interoperability should primarily be understood to enable interactions between public institutions, civic initiatives, and their audiences without the intermediation of the now dominant platforms. Seen in this light the purpose of interoperability is not to increase competition among platform intermediaries, but to contribute to building public infrastructures that lessen the dependence of our societies on these intermediaries. Instead of relying on commercial actors to provide infrastructures in the digital space that they can shape according to their business model needs, we must finally start to build public infrastructures that are designed to respond to civic and public values underpinning democratic societies. In building these infrastructures a strong commitment to universal interoperability based on open standards and protocols can serve as an insurance policy against re-centralization and the emergence of dominant intermediaries.

These public infrastructures do not emerge by themselves; they are the product of political and societal will. In the European Union, the political climate seems ripe for creating such a commitment. As evidenced by the current flurry of regulation aimed at the digital space, the European Commission has clearly embraced the view that Europe needs to set its own rules for the digital space. But if we want to see real systemic change, we must not limit ourselves to regulating private market actors (via competitive interoperability and other types of regulation); we must also invest in interoperable digital public infrastructures that empower public institutions and civil society. If European policymakers are serious about building the next generation internet, they will need to see themselves as ecosystem-builders instead of market regulators. Understanding interoperability as a generative principle will be an important step towards this objective.

Our work on interoperability is supported by the  Next Generation Internet Policy Experimentation Fund, which is part of the NGI Forward project hosted by NESTA and funded by the European Union.

]]>
<![CDATA[Mandated and generative interoperability]]>

Introduction

Interoperability is a technical term that describes different systems' ability to connect and communicate with each other. It is a principle that favors open rather than closed communication between systems, based on shared standards. This fundamental but also obscure technical concept and a foundational principle for the internet has

]]>
https://shared-digital.eu/mandated-and-generative-interoperability/5fd6906851c73038e7ab8ca5Mon, 14 Dec 2020 09:55:00 GMT

Introduction

Interoperability is a technical term that describes different systems' ability to connect and communicate with each other. It is a principle that favors open rather than closed communication between systems, based on shared standards. This fundamental but somewhat obscure technical concept, a foundational principle of the internet, has recently become a core concept in many proposals to fix the internet.

Interoperability is a principle hidden within many of today's established communication systems, such as telephone services or email. It's also a fundamental principle of the internet and the World Wide Web. It is credited as being essential to the ability of these technologies to be generative: to support innovation and broadly understood value creation. It has been a non-controversial concept for a long time, considered mainly by engineers defining technical rules of interoperability. Urs Gasser describes it as "central, and yet often invisible, to many parts of a highly interconnected modern society."[1]

Interoperability has been proposed as a regulatory principle for each new generation of technologies. In 2015, Urs Gasser argued for interoperability in the context of the Internet of Things. More recently, in 2019, Chris Marsden and Rob Nicholls made the same argument about AI.[2] Today, interoperability is increasingly seen as a regulatory tool to fight market dominance of the biggest online platforms[3].

At the same time, the online ecosystem has changed significantly over the last decade. One of the critical impact factors has been the growing dominance of a small set of commercial platforms over this ecosystem. As this happens, open communication - based on standards shared between different services - is replaced by closed communication, occurring within a single service and defined by a single commercial actor.

It should be noted that interoperability is not a matter of "all or nothing." There are varying degrees of interoperability that can be applied. The extent of interoperability policies should depend on an analysis of the system to which they are applied and the policy goals that are meant to be achieved. It is a matter of defining both what becomes interoperable and how.

In this document, we present a framework for thinking about two approaches to interoperability, which we call competitive interoperability and generative interoperability. We explain the foundational character of interoperability as a technical concept and trace the origins of these two approaches, tied to four key perspectives on interoperability: competition, innovation, public sphere, and data ownership. Finally, we present an overview of the benefits and costs associated with implementing this principle. We end by suggesting how the two approaches fit into Europe's current digital policies and propose our vision of an Interoperable Public Civic Ecosystem.

Competitive and generative interoperability

In the literature and expert debates on interoperability, two different visions and approaches to interoperability can be identified. In these two approaches, the same term - and the same technical principle - forms the basis for two very different policy approaches. We will call one of these approaches competitive interoperability and the other generative interoperability. In the next part, we will further describe essential perspectives on interoperability that align with one of the two approaches.

Types of interoperability

In today's policy debates, interoperability is often mentioned as a corrective measure for the deficiencies and pathologies related to the dominant online platforms and their influence on the online ecosystem. For example, it is an important mechanism in the European Commission's Digital Services Act. It is listed as a key regulatory measure in the report on "Online platforms and digital advertising", published in July 2020 by the UK's Competition & Markets Authority[4], as well as in the "Investigation of Competition in Digital Markets" by the Judiciary Committee of the United States Congress[5]. We will call this approach competitive interoperability, as it is strongly connected with a perspective focused on market competition. In this approach, the interoperability principle is translated into regulation that fixes markets.

Competitive interoperability is one of the regulatory options on the table in debates about regulating the dominant internet platforms. Its proponents argue that the alternative approach - regulating the behavior of the existing platforms without attempting to reshape the ecosystem in which they function - has the adverse effect of giving these commercial players even more power. In this approach, interoperability is the tool that forces open the "walled gardens" of today's online platforms. It introduces market competition and, as a result, "fix[es] the Internet by making Big Tech less central to its future"[6].

An alternative approach - which we call generative interoperability - sees it mainly as a positive principle: one that does not provide a fix, but establishes a positive norm. In this approach, the interoperability principle becomes foundational for an open online ecosystem (not necessarily a market). The concept of generativity, as proposed by Jonathan Zittrain[7], describes systems that "produce unanticipated change through unfiltered contributions from broad and varied audiences" by providing leverage, being adaptable, easy to access and master, and able to transfer changes to others.

There is a growing body of policy visions that sees the online ecosystem from this perspective. In 2019, we presented our report "A Vision for Shared Digital Europe"[8], where we argue that Europe needs to create a digital space that enables self-determination, cultivates the commons, empowers public institutions, and is built on decentralized infrastructure. In order for this to occur, European digital policies need to shift from a market focus to a vision of a digital society or digital public sphere built with internet technologies. From this perspective, an ecosystem that meets a technical standard of interoperability is one that also meets the societal values represented by the concept of the public sphere. A recent report titled "European Public Sphere. Towards Digital Sovereignty for Europe", published by the German National Academy of Science and Engineering (acatech)[9], offers a good formulation of this perspective, although interoperability is mentioned in it only briefly. A similar perspective is presented in "A Vision for the Future Internet", a working paper recently published by NGI Forward, which argues for a "more democratic, resilient, sustainable, trustworthy and inclusive internet by 2030"[10].

The origins of generative interoperability

Looking back, the World Wide Web can be seen as a template for a system based on the principle of generative interoperability. Tim Berners-Lee's original idea[11] was essentially that of a set of tools enabling information to flow freely between computers and servers, regardless of the technological stack used in a given instance. This principle was initially applied within CERN, the research institution where Berners-Lee was based. Soon afterward, it defined the World Wide Web. It was not used to "break open" any existing communication ecosystem but rather determined the success of a new, alternative ecosystem.

Key factors that enabled the World Wide Web to win over competing projects were the simplicity of the system and the openness of its standards, such as the HTTP protocol and the HTML markup language. The WWW also introduced the URI (Uniform Resource Identifier), meaning each piece of content received a unique, quasi-permanent address. These three factors combined were essential for ensuring that the WWW was an interoperable system.
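
The openness of those standards is still easy to demonstrate: any client that speaks plain HTTP can talk to any web server, whatever software either side runs. Below is a minimal sketch using only Python's standard library; example.com is a domain reserved for documentation purposes.

import http.client

# Any HTTP client can talk to any HTTP server: the open protocol, not the
# vendor, defines the contract. example.com is reserved for demonstrations.
conn = http.client.HTTPSConnection("example.com", timeout=10)
conn.request("GET", "/", headers={"User-Agent": "interop-demo"})
response = conn.getresponse()
print(response.status, response.reason)        # e.g. 200 OK
print(response.getheader("Content-Type"))      # the shared HTML media type
body = response.read()                         # the HTML document itself
conn.close()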

The interoperability debate has come a long way since the beginning of the World Wide Web. Initially, interoperability meant the ability of a variety of systems and actors to communicate among themselves (as a result of generative interoperability). It also meant a technological stack in which interoperability at the lower layers enabled services built on top of them to function in an open ecosystem. Today, the most relevant debate concerns interoperability among these new services, built on top of the WWW. Addressing the issue of "online walled gardens" and the ways in which they act as new gatekeepers to online communication (through competitive interoperability) is crucial for the further development of this ecosystem. At the same time, several reports have recently addressed anew the significance of public digital infrastructures, including Ethan Zuckerman's "The Case for Digital Public Infrastructure"[12] and Waag's Public Stack project. An infrastructural perspective, addressing the various layers of the internet's technological stack, is also prominent in the NGI Forward working paper "A Vision for the Future Internet".

Ultimately, the two approaches should be seen as complementary, despite having divergent goals and theories of change behind them. Both ultimately refer to the interoperable character of the early internet and the early Web, which has been largely lost today. "We need to take inspiration from what the Internet's early days looked like", declares the Electronic Frontier Foundation[13]. The first approach aims to break the dominance of commercial platforms, which have destroyed the decentralised character of the internet and contributed to a range of problematic societal developments. The latter focuses on securing a public interest-driven ecosystem in those spaces where these actors are not yet dominant. This second strategy has been elegantly expressed by Ursula von der Leyen in her political guidelines for the European Commission: "It may be too late to replicate hyperscalers, but it is not too late to achieve technological sovereignty in some critical technology areas"[14].

Definitions of interoperability

Interoperability is the ability of different systems to connect and communicate with each other. Urs Gasser defines interoperability as "the ability to transfer and render useful data and other information across systems, applications, or components"[15]. The Electronic Frontier Foundation defines it as "the extent to which one platform's infrastructure can work with others"[16]. Gasser notes that a more precise, one-size-fits-all definition is not possible. The extent and characteristics of interoperability depend on context and perspective. In order to acknowledge this, Gasser distinguishes four layers of interoperability: technological, data, human, institutional.

For many people, it is the exchange of data through technological means that comes to mind when they think about interop. But it turns out that the human and institutional aspects of interoperability are often just as – and sometimes even more – important than the technological aspects.

The technological layer described by Gasser consists primarily of the hardware and code underlying interoperable systems. The data layer is key for technological systems to understand each other and is built on standards and ontologies such as Linked Data and the Semantic Web. The human layer, in turn, is where the ability to understand and act on the data exchanged (e.g., a common language) is situated. Finally, the institutional layer concerns a society's ability to engage with an interoperable system (e.g., shared laws and a common understanding of regulation; this does not, however, demand complete homogeneity of legal systems).

As an example, he describes mobile payment systems. On the technological level, such a system is built by banks, devices, and payment platforms. On the data level, it relies on the protocols enabling transaction processing and the NFC capability of cards and readers. For humans, it is a simple and understandable system that closely resembles plastic card transactions.

In the case of the internet, the principle of interoperability is brought to life in different ways across the different layers of its technological stack, running from the physical infrastructure, through the protocol layer, to applications, data and content flows, and finally the social layer at the very top.

Another useful conceptualization of interoperability and related concepts comes from a recent report for the European Commission. Jacques Crémer, Yves-Alexandre de Montjoye, and Heike Schweitzer distinguish between four related concepts that define increasingly strong forms of interoperability[17]. First, they define data portability: "the ability of the data subject or machine user to port his or her data from service A to service B". In Europe, the right to data portability is provided by Article 20 of the GDPR. Crucially, data portability does not mean real-time access to data.

Next, they distinguish between protocol interoperability and data interoperability. The first term, protocol interoperability, refers to a more traditional understanding of interoperability as enabling two services to interconnect. These services are typically complementary and cannot substitute for each other; one example is a service operating on top of a platform that enables its functioning (as with programmes running on Microsoft's operating system). Crucially, this type of interoperability does not require the transfer of data, or does so only in a limited manner. Data interoperability, in turn, is defined as "real-time, potentially standardised, access for both the data subject/machine user and entities acting on his or her behalf". This type of interoperability allows not only complementary services but also the substitution of one service's functionalities by another's. Finally, full protocol interoperability requires two substitute services to interoperate fully, as is the case with mobile phone networks, email, or file formats. The authors order these types of broadly understood interoperability according to their strength as a regulatory mechanism, from data portability to full protocol interoperability.
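
The difference between the weakest and one of the stronger forms in this ordering can be made concrete in a few lines of code. The sketch below is purely illustrative and all function and endpoint names are hypothetical: data portability amounts to a one-off export that immediately goes stale, while data interoperability implies live, standardised access that a substitute service can build on.

import json

# Data portability (in the spirit of GDPR Article 20): a one-off export.
# The user walks away with a file; nothing stays in sync afterwards.
def export_user_data(user_record: dict, path: str) -> None:
    with open(path, "w", encoding="utf-8") as f:
        json.dump(user_record, f, indent=2)

# Data interoperability: real-time, standardised access. A competing
# service, acting on the user's behalf, reads the same data live over
# an agreed protocol, which is what makes substitution possible.
def fetch_user_data_live(api_client, user_id: str) -> dict:
    # 'api_client' stands in for any client of a standardised API;
    # the endpoint path is a hypothetical example.
    return api_client.get(f"/v1/users/{user_id}/data")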

To conclude, the principle of interoperability contains a technological aspect (protocols, technical requirements), has specific data requirements (formats, syntax, semantics), but crucially also has human and institutional dimensions. This institutional dimension includes the economic dimension of interoperability, as business models are a key factor that determines whether a given part of the online ecosystem is interoperable or not. Furthermore, interoperability mandates can differ in their strength as a regulatory mechanism, and therefore also have a differing impact. Finally, key questions concern the type of broadly understood resources that are made interoperable.

Key perspectives on interoperability

Interoperability is ultimately a simple, foundational principle for online communications. Below we present four key policy perspectives on interoperability, focusing on market competition, innovation, online public spaces, and data ownership. For each of them, the same technical principle becomes a means for achieving different goals. These four perspectives do not map neatly onto the two approaches that we defined previously, although the market competition perspective is central to competitive interoperability, and the public space perspective defines generative interoperability.

Interoperability and market competition

Arguments in favor of introducing greater interoperability of modern communication systems, and in particular of dominant online platforms, are most commonly framed in terms of market competition.

In 2002, Viktor Mayer-Schönberger brought attention to the issue of interoperable public service networks used in emergencies. In his paper "Emergency communications: the quest for interoperability in the US and Europe"[18], he described the three key characteristics of a technological solution allowing for interoperability between a variety of actors:

  • suitable technology (one that scales and accommodates many users simultaneously),
  • common standards (including protocols and procedures), and
  • funding that sustainably supports necessary infrastructure.

By comparing the EU TETRA network for public services with similar US projects, he highlighted the need for an uncomplicated set of common communication standards. He also praised the European approach for using public-private partnerships to develop the system.

In 2018, Viktor Mayer-Schönberger and Thomas Ramge proposed a "progressive data sharing mandate", based on the modeling of data-driven markets conducted by Jens Prüfer and Christoph Schottmüller[19]. The mandate would force dominant actors in a market to share a portion of the data they collect with their competitors, the amount shared being proportional to the company's market share.

Interoperability and innovation

Innovation is typically mentioned, alongside market competition, as a key positive outcome of greater interoperability. In a recent interview with MIT Technology Review, Viktor Mayer-Schönberger stated:

Innovation is moving at least partially away from human ingenuity toward data-driven machine learning. Those with access to the most data are going to be the most innovative, and because of feedback loops, they are becoming bigger and bigger, undermining competitiveness and innovation. So if we force those that have very large amounts of data to share parts of that data with others, we can reintroduce competitiveness and spread innovation. (...) You can break up a big company, but that does not address the root cause of concentration unless you change the underlying dynamic of data-driven innovation[20].

Interoperability has been studied by researchers from the Berkman Center for Internet and Society, especially in the context of driving innovation. In 2005, Urs Gasser and John Palfrey published a report in which they conclude that ICT interoperability needs to be defined within a specified context, subject, and topic[21]. In their work, they analyse three case studies (DRM-protected music, Digital ID, and mashups) indicating different levels of interoperability. They also list the available methods for building interoperable ecosystems, on both the state and the private sector side. Finally, they conclude that ICT interoperability should indeed be beneficial for innovation, but that "the picture is filled with nuance". They developed these ideas further in the 2012 book "Interoperability: The Promise and Perils of Highly Interconnected Systems"[22].

Interoperability as a foundational principle for online public spaces

There is a long history of defining the internet as a public space, going back to John Perry Barlow's "A Declaration of the Independence of Cyberspace"[23]. As early as 2006, Peter Dahlgren described the internet as a basis for some of democracy's communication spaces (an idea taken from Habermas) in the article "The Internet, Public Spheres, and Political Communication: Dispersion and Deliberation"[24]. On the one hand, he appreciated many characteristics of online spaces (e.g., the ability of the audience to interact), but he also warned against the drawbacks we observed a few years later: the blurring of journalism with non-journalism, the disengagement of citizens, and business models focused on short-term profits. His idea was to create "domains" (not in the technical sense of an internet domain, but in the more general political sense of a space or platform) for public advocacy, activists, journalism and politics, and to supplement them with civic forums. The paper describes a general idea but lacks any description of its implementation.

A case for interoperability has been made by Jonathan Zittrain, who for this purpose defined "generative" systems[25]. The generativity of a system is its ability to create new systems and to innovate; generative technologies are defined as having the capacity for leverage (becoming useful for services built on top of them), adaptability, ease of mastery, and accessibility. Zittrain argues that the original Web was a prime example of a generative system, but that it loses this trait as the Web matures and becomes more stable. Hence, he argues, the interoperable and open nature of PCs, networks, and "endpoints" (services built on top of the Web; today, we should include platforms in this category) is crucial for further development.

Another insightful and more recent paper comes from legal studies. In "Regulating informational infrastructure: Internet platforms as the new public utilities", K. Sabeel Rahman aims to set out the main routes for regulating online public spaces[26]. He describes the goals of such regulation:

  • fair access and treatment (non-discrimination, neutrality, common carriage, fair pricing, non-extractive terms of service),
  • protection of users (e.g., fiduciary duties with respect to user's data, protection against algorithmic nuisance)
  • accountability of platforms.

He then proceeds to describe three main approaches: managerial, self-governing managerial, and structuralist. The managerial approach focuses on regulatory oversight and the implementation of instruments such as algorithmic impact statements, which would require new statutory authorities or regulatory bodies. The self-governing managerial approach highlights the responsibility of the private sector, being oriented toward guarding public responsibility, standards, and accountability. Structuralist regulation, according to Rahman, would focus on limiting firms' business models and altering the dynamics of the market, e.g., through renewed antitrust regulation, a tax on data use, or the creation of "public options" as alternatives to a private stack (or the endorsement of publicly available "vanilla" versions of privately operated services). This option is viewed as incurring lower costs, but it should not be viewed as cost-free.

Interoperability and data governance

The fourth perspective focuses on the fact that interoperability enables flows of content and, in particular, of data. Managing these data flows is a key challenge for interoperable systems. This perspective therefore centers on data governance, often framed in terms of data ownership.

In 2009, Alex Pentland signaled the importance of data ownership as a means of enabling societally beneficial uses of data while at the same time protecting privacy, proposing a "New Deal on Data"[27]. Pentland argued that the basic tenets of ownership - the right to possess, use, and dispose of - should apply to personal data. He also argued that policies need to encourage the sharing of data for the common good. While he does not mention interoperability explicitly, his proposal clearly assumes interoperable data.

In 2012, Doc Searls proposed in his book "The Intention Economy"[28] a VRM (Vendor Relationship Management, a twist on CRM - Customer Relationship Management) system that addresses the tension between data ownership and privacy protection. He described VRM as a personal data hosting service that stores an individual's information and lets the user set the terms and conditions under which external services (e.g., platforms) may access it.
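
To make this concrete, below is a minimal sketch of what such user-set terms might look like as a data structure. The schema and field names are our own illustrative assumptions; VRM proposals do not prescribe any particular format.

```python
from dataclasses import dataclass, field

@dataclass
class AccessTerms:
    """User-defined terms for one category of personal data.

    All field names are illustrative assumptions, not part of Searls' proposal.
    """
    category: str                      # e.g., "purchase-history", "location"
    allowed_purposes: set = field(default_factory=set)
    retention_days: int = 0            # 0 = the service may not store the data
    resale_allowed: bool = False

def may_access(terms: list, category: str, purpose: str) -> bool:
    """Check a service's access request against the user's standing terms."""
    return any(
        t.category == category and purpose in t.allowed_purposes
        for t in terms
    )

# Example: the user allows purchase history to be read for recommendations,
# while any other purpose (e.g., ad targeting) is refused by default.
my_terms = [AccessTerms("purchase-history", {"recommendations"}, retention_days=30)]
print(may_access(my_terms, "purchase-history", "recommendations"))  # True
print(may_access(my_terms, "purchase-history", "ad-targeting"))     # False
```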

Searls' idea did not directly translate into a technological solution, but it inspired the work of others. Solid, Tim Berners-Lee's current project, is a personal data hosting solution meant to enable decentralised services and applications that use data in ways that are fully under users' control.
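
Because Solid exposes the contents of a pod as ordinary web resources, an application the user has authorised can, in principle, read them with a plain HTTP request. The sketch below illustrates this model only; the pod URL, file name, and token are hypothetical placeholders, not real Solid endpoints.

```python
import requests

# Hypothetical pod resource: in Solid, data lives in a user-controlled "pod"
# and is served over ordinary HTTP(S), so any authorised app can read it.
POD_RESOURCE = "https://alice.example-pod.net/contacts/friends.ttl"
ACCESS_TOKEN = "token-obtained-via-the-pods-login-flow"  # placeholder

response = requests.get(
    POD_RESOURCE,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "text/turtle",  # Solid resources are typically RDF (Turtle)
    },
    timeout=10,
)
response.raise_for_status()
print(response.text)  # the user's data, readable only by apps they authorise
```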

One other prominent approach is being developed by MyData, which envisions - in the MyData Declaration - an ecosystem in which trusted intermediaries manage users' personal data, with the goal of empowering individuals, helping them and their communities develop knowledge, make informed decisions, and interact more consciously and efficiently with each other as well as with organisations.

One of the key debates on data governance concerns the idea of data - especially personal data - as an asset that can be owned. Luigi Zingales and Guy Rolnik, in a New York Times op-ed, proposed personal data ownership and interoperability as solutions to the challenges of the modern Web (covering personal data, but also data generated by platforms from personal data)[29]. In an influential article titled "Should We Treat Data as Labor?", a group of researchers tied to the RadicalxChange initiative suggests a "Data as Labor" approach that sees data as "user possessions that should primarily benefit their owners"[30]. There is also a growing literature on ways of collectively managing data for the public good. For example, Mariana Mazzucato, in her op-ed "Let's make private data into a public good", proposed a public repository that owns the public's data and sells access to it to the tech giants[31]. The idea of data ownership is opposed by an approach based on data rights, presented clearly by Valentina Pavel in "Our Data Future"[32].

Benefits and costs of interoperability

Interoperability is a structural principle, not a goal in itself: it is a means for attaining other policy goals. At the same time, by virtue of its generative nature, it is a simple principle that creates open systems with complex outcomes.

Most commonly, interoperability is seen as a tool for increasing competition and innovation. Yet our literature review shows a broad range of effects that introducing a principle of interoperability can have. Below is a summary of the arguments for introducing interoperability, followed by the arguments against it and its adverse effects. We have divided the arguments into economic, societal, and legal ones.

In policy debates on interoperability, proposals are often made on the basis of faith in principles rather than evidence - which, admittedly, is usually not available. For example, in the case of dominant online platforms, a persistent lack of data makes it challenging to model the effects of any changes in how data flows are controlled. It is similarly hard to model in detail the effects that decentralised, interoperable alternatives would have on different aspects of, and actors in, the online ecosystem.

Nevertheless, based on our literature review, we believe that it is not enough to have faith in interoperability as such, or to refer to past examples, as these relate to different, much simpler technologies and media ecosystems. The favorite examples of interoperability researchers concern text messages and email, while our concern today is services that manage multiple flows and layers of content and data, often with the use of opaque algorithms.

Economic benefits

Increasing competition (& de-monopolising the Web)

In the world of protocols, you still get the global connectivity benefit, but without the lock-in, control, and silos (and, potentially, the questionable privacy practices). In a world of protocols, there may be a global network, but you get competition at every other level. You can have competitive servers, competitive apps and user interfaces, competitive filters, competitive business models, and competitive forms of data management.
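
RSS syndication is a small, familiar illustration of this dynamic: because the format is an open standard, any reader written by anyone can consume any publisher's feed, with no gatekeeper's permission required. A minimal sketch using only Python's standard library (the feed URL is just an example):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Any client that implements the open RSS format can read any feed.
FEED_URL = "https://shared-digital.eu/rss/"  # example feed

with urllib.request.urlopen(FEED_URL, timeout=10) as response:
    tree = ET.parse(response)

# RSS 2.0 structure: <rss><channel><item><title>...</title></item>...
for item in tree.getroot().iter("item"):
    print(item.findtext("title"))
```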

Urs Gasser argues that internet interoperability is a tool that helps economic innovation, as it lowers lock-in effects and hence decreases entry barriers to certain industries[33]. He gives the example of HBO content distribution, which used to be confined to cable and satellite providers; by extending availability to web browsers and TV-enhancing devices (Chromecast, PlayStation, Roku), HBO increased competition among subscription TV services. Operators of cable, satellite, and online TV subscription services can no longer rely on user lock-in (i.e., the inability to access the content elsewhere).

Gasser also discusses two counter-arguments. Firstly, in some circumstances interoperability can lead to a lack of competition: bilateral agreements or closed standards - even when they enable interoperability - can, over time, prove to be anticompetitive. Interoperability can thus be deployed as a tool for building closed ecosystems.

Secondly, he argues that maximum competitiveness does not imply maximum innovativeness. A different model of innovation assumes that monopolists also have incentives to innovate, and that competition functions through "leapfrogging": replacing incumbent players by designing a new generation of technology that grants temporary dominance.

Crémer et al., in a report for the European Commission, state:

access to a large amount of individual-level data have become key determining factors for innovation and competition in an increasing number of sectors. We will discuss access to individual-level data by platform and within ecosystems later in this chapter, and at some length in the data chapter, including questions of data portability and data interoperability for personal and machine individual-level data, questions of access to privileged and private APIs, and the self-reinforcing role of data in ecosystems and leveraging market power[34].

Increasing innovation

Interoperability is often portrayed as a tool enabling the emergence of new services built on top of pre-existing infrastructure and services, thereby igniting innovation. Urs Gasser makes an argument of this kind, pointing to the early days of the WWW as an example of an interoperable system that spurred innovation. He also cites IoT interoperability as a modern example of a platform with the potential to inspire the creation of new services.

Interoperable ecosystems can also be viewed as fostering science and research activities, as highlighted in the European Commission's Staff Working Document "A Digital Single Market Strategy for Europe - Analysis and Evidence":

[In Europe] neither the scientific community nor industry can systematically access and re-use the research data that is generated by public budgets, despite strong demand[35].

Crémer et al. state in their report:

the entry of new competitors might be facilitated by multi-homing and interoperability. If users can use several platforms at the same time, i.e., multi-home, it will be easier for a new entrant to convince some users to switch to their platform while still being able to conserve the benefits of using the incumbent platform to interact with others.

Lower switching costs on the user- and supply-side

Ben Thompson expands on the hypothesis that interoperability (especially via API disclosure) could help defuse the market power of aggregators. His "Aggregation Theory" describes the unique position of online aggregators (which include many online platforms): they operate with near-zero distribution and transaction costs and rely on a superior customer experience, leading to a network effect that further promotes these already large services[36].

According to Thompson, once an aggregator gains a monopolistic position, it also becomes a monopsony: the only buyer for specialized suppliers[37]:

And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate.

He argues that mandated API disclosure and interoperability (a solution once imposed on, e.g., Microsoft) can address this issue, as these measures decrease switching costs for the consumer. For example, switching costs in the case of Uber remain relatively low despite its large user pool, precisely because Uber does not hold a monopolistic position in relation to customers or drivers: both can easily switch platforms or multi-home, i.e., use multiple platforms simultaneously. Mandated APIs and interoperability could produce a similar effect in cases where network relationships play a crucial role. Thompson also highlights that interoperability could be beneficial on the supply side: not just passengers but also drivers would be able to export their data from Uber to a competing platform.
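
As a rough illustration of what such mandated portability could look like from a developer's perspective, consider the sketch below. The endpoints and payload schema are invented for illustration; no such standard API exists for ride-hailing platforms today.

```python
import requests

# Hypothetical endpoints: a portability mandate could require every platform
# to expose a standard export API that competitors can import from.
EXPORT_URL = "https://api.platform-a.example/v1/portability/export"
IMPORT_URL = "https://api.platform-b.example/v1/portability/import"

def port_account(token_platform_a: str, token_platform_b: str) -> None:
    """At the user's request, move their data (e.g., a driver's ride
    history and ratings) from platform A to platform B."""
    export = requests.get(
        EXPORT_URL,
        headers={"Authorization": f"Bearer {token_platform_a}"},
        timeout=30,
    )
    export.raise_for_status()

    # The payload schema (rides, ratings, earnings) is assumed, not a standard.
    result = requests.post(
        IMPORT_URL,
        headers={"Authorization": f"Bearer {token_platform_b}"},
        json=export.json(),
        timeout=30,
    )
    result.raise_for_status()
```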

Thompson concludes by arguing that technologies with fixed costs that rely on network effects will generate innovation anyway, as simply being first offers enough potential rewards; strict preservation of intellectual property is therefore unnecessary.

Cross-border public services

In the European context, the principle of interoperability can support the cross-border character of online services in the Digital Single Market. The European Commission, in the Staff Working Document "A Digital Single Market Strategy for Europe - Analysis and Evidence"[38], highlights interoperable online services as a tool for enabling cross-border public services, an issue of particular importance in the European context. This notion extends to the cross-border delivery of goods and services in general.

More generally, the lack of interoperability among public entities and private operators restricts the potential for digital end-to-end services, One Stop Shops, the once-only principle, the single data entry principle, the transparency of public services, and the full exploitation of public open data. [...] 25% of firms in the EU state that interoperability issues are a problem for cross-border online sales, with 10% declaring it to be a major problem.

Societal benefits

User autonomy, choice, and accessibility

Effective interoperability gives users the ability to choose a service and to move from one service to another effortlessly[39]. The current strong lock-in effects have not been adequately lowered by the GDPR. "Voting with one's feet" is an essential element of a free-market economy, but its importance extends beyond the economy itself, as it enables users to defend civil rights and liberties (e.g., when changed Terms and Conditions curtail privacy rights).

Single sign-on digital ID infrastructure (possible today with Facebook and Google login) is another example of how interoperability enables access and openness for users, notes Urs Gasser[40].
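
Such logins typically build on the OAuth 2.0 authorization-code flow, an open standard that lets a site accept identities from any compliant provider rather than from a single gatekeeper. A minimal sketch of constructing the login redirect; the endpoint, client_id, and URLs are placeholder assumptions:

```python
from urllib.parse import urlencode

# Placeholders: each identity provider issues its own client_id and
# documents its own authorization endpoint.
AUTH_ENDPOINT = "https://id-provider.example/oauth2/authorize"

params = {
    "response_type": "code",            # the authorization-code flow
    "client_id": "my-registered-app",   # issued by the provider
    "redirect_uri": "https://myapp.example/callback",
    "scope": "openid profile",          # what the app asks to learn
    "state": "random-anti-csrf-value",  # guards against cross-site forgery
}

# The user logs in at this URL; the provider then redirects back with a code
# that the app exchanges for tokens. Any compliant provider works the same way.
login_url = f"{AUTH_ENDPOINT}?{urlencode(params)}"
print(login_url)
```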

Marsden and Nicholls write in "Interoperability: A solution to regulating AI and social media platforms":

There are social benefits of interoperability. It eliminates the consumer need to acquire access to every network or the tendency to a winner-takes-all outcome. This is inelegant from a device design perspective, too: readers may remember when the US had different mobile design standards to the EU (CDMA rather than GSM). In Instant Messaging (IM), arguably the winner-takes-all is Facebook/WhatsApp/Instagram without interoperability – with all IMs inside the corporation becoming interoperable[41].

De-centralising data ownership

Gus Rossi and Charlotte Slaiman argue that interoperability could enable new players and actors to develop data banks[42]. They claim that the current centralization of data ownership in a handful of companies translates into those companies' superior position in AI development. De-centralising data ownership, in turn, could lead to novel approaches to data storage and ownership, and possibly to a wider variety of privacy-focused services.

To conclude, proponents of interoperability highlight its ability to address the lack of competition in many online spaces, to increase the choices available to individuals using online services (especially regarding data privacy), and to strengthen their control over data.

Economic costs

Lowering innovation and incentives to invest

Thomas Lenard criticizes interoperability as a potential solution to the challenges of the Web because, in his view, it would decrease innovation on the internet[43]. His reasoning is based on an extended understanding of data portability (or "interoperability of data", a term stretched to include data inferred about a user and/or their social graph). He argues that greater interoperability would reduce the rents collected by innovators: once costly new features and the outputs of a platform's algorithms become available via API (or can easily be copied), services built alongside or on top of platforms can quickly catch up and draw clients away.

He argues that "winner-takes all" markets (such as the ones we observe on the Web) are always risky investments because of the unpredictability of the outcome. Making it easier for the competition to catch-up and for users to switch for investors would translate into a possibly shorter time to retrieve profits from innovation, hence discouraging them from investing in the first place.

Over-regulation and over-standardising

In his paper, Lenard also considers how mandating interoperability (e.g., via the proposed US ACCESS Act, the "Augmenting Compatibility and Competition by Enabling Service Switching Act of 2019") would effectively favour the existing large platforms, which have the resources to adjust to the regulation, would be the first to use newly available data from the competition, and have the market position to survive. The costs of interoperability proposals, however, would affect firms of all sizes, forcing both big companies and startups to adjust.

In fact, some internet platforms are already cooperating on a form of interoperability in the Data Transfer Project, which is mentioned again in the second part of the essay.

Anti-social innovation

Urs Gasser notes that although innovation - often seen as one of the key outcomes of interoperability - is usually viewed in a positive light, it is a double-edged sword:

Innovation can be bi-directional. Just as interop can help support the development of innovative devices and software that has positive social value, it can also support innovative devices and software with negative social value. On the internet, worms, viruses, spam, and other unwanted activity are in many ways just as "innovative" and just as dependent on interoperability as more positive developments[44].

Gasser gives the example of the Heartbleed vulnerability in OpenSSL, a widely used implementation of the SSL/TLS protocols that enable secure, encrypted communication across the internet - a flaw that, in practice, compromised that communication. But potential hazards are not limited to attacks on the technology stack; they also include disinformation and novel methods of election manipulation.

It is worth noting that for some, this aspect is, if not a benefit, then at least a neutral and necessary part of interoperability. Cory Doctorow argues that "It's possible that people will connect tools to their Big Tech accounts that do ill-advised things they come to regret. That's kind of the point, really"[45].

For Doctorow, the advantages tied to the freedom to make one's own uses of tools exceed the possible costs - especially if the alternative is giving dominant market players the right to decide which uses and innovations are beneficial and which are not.

Societal costs

Privacy & security risks

Security and privacy are among the primary concerns raised about interoperability.

Ben Thompson in "Portability and interoperability" describes the history behind Zuckerberg's 2019 declaration to support interoperability[46]. He highlights that the declaration and indeed the activities in the Data Transfer Project (of which Facebook is a founding member) initiative do not go as far as to include interoperability of what Thompson argues is the valuable data - e.g. information about user's network of friends (also referred to as a "network transfer tool"). Facebook initially supported an API enabling other entities to see user's friends (which was called an Open Graph). Over-exploitation of this API has, however, resulted in the Cambridge Analytica scandal and led Facebook to a decision to close the API.

Furthermore, even before the scandal, Facebook had limited access to the API for services aiming to "replicate our functionality or bootstrap their growth", indicating that the company feared a scenario in which its costly investments could easily be duplicated or exploited by other entities - the argument described above, in the section on lowering innovation and investment incentives.

Decreased reliability of technological systems

The complexity of increasingly standardized systems creates fragility, as there is a chance that some flaws will propagate worldwide - as in the case of the SSL flaws discussed above. People and companies will increasingly depend on standards without having the means to fix them (as they normally could with simpler architectures).

Urs Gasser extends this argument beyond security risks to business reliance:

As interoperability increases, downstream systems may become increasingly reliant on upstream systems. This problem was observed when Twitter's decision to change its open API threatened the business of the downstream systems that were built on top of that API[47].

Hence, businesses that rely on others can have limited control over fundamental elements of their own model. Gasser gives the example of the Kindle lock-in mechanism (the use of the proprietary .mobi format for e-books), which he argues enables Amazon to lower prices almost arbitrarily.
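
A common defensive pattern for such downstream businesses is to hide the upstream dependency behind a narrow internal interface, so that an upstream API change has to be absorbed in one place rather than throughout the codebase. A minimal sketch, assuming a hypothetical upstream client object:

```python
from typing import Protocol

class SocialFeed(Protocol):
    """The narrow interface the rest of the business depends on."""
    def recent_posts(self, user: str) -> list: ...

class UpstreamV1Feed:
    """Adapter for the upstream platform's current API (hypothetical client).

    If the platform changes or restricts its API, only this class changes."""
    def __init__(self, client):
        self._client = client

    def recent_posts(self, user: str) -> list:
        return [post["text"] for post in self._client.get_timeline(user)]

def render_dashboard(feed: SocialFeed, user: str) -> str:
    # Downstream logic never touches the upstream API directly.
    return "\n".join(feed.recent_posts(user))
```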

Increased homogeneity

Network effects similar to those driving today's dominant platforms can also make a very narrow set of interoperability standards dominant. This is not bad per se, as homogeneity has a positive side (e.g., enabling small companies to scale), but it can also limit innovation:

[While the] Internet is a wonderfully interoperable system that has led to tremendous innovation, the protocols that underlie it represent a form of homogeneity. Most of the interconnected systems that people use today rely on TCP/IP. In the same way, email protocols have become a de facto standard, meaning that innovations in security for email must be built upon an existing set of protocols[48].

Legal costs

Accountability and liability

The increasing complexity of systems also demands consideration of the blurring boundaries of responsibility (a similar debate is taking place on the topic of automated decision-making systems). This issue is highlighted by the PrivacyTech whitepaper, published as part of the A New Governance initiative:

Legal aspects of data circulation and portability are not cleared by GDPR entirely. Many companies are still not comfortable with the idea of data circulation. They fear liability issues if they open up their data to other organizations. Data circulation has also become a media issue since Facebook's Cambridge Analytica scandal. Many corporates fear negative press coverage that would result from a misuse of data circulation. We need to define a liability model for data circulation that would be based on GDPR and would reassure all the stakeholders of the ecosystem. For it to be accessible to any company, whatever its resources, it has to be defined as a standard monitored through proper governance.[49]

The proposed Data Governance Act presented by the European Commission aims to address some of these questions by introducing a "supervisory framework for the provision of data sharing services"[50].

Regulatory siloing

Although it is probably more a challenge than a drawback of interoperability, it is worth mentioning the argument raised by the A New Governance initiative:

The energy and the will to build new standards is already there. Many domains and expertise are involved: legal, technical, design, business, policy. Many sectors are involved: Mobility, Healthcare, Administration, Commerce, Finance and insurance, Entertainment, Energy, Telecom, Job market, Education, etc. The main problem is that all those initiatives are partial as they mostly work by country, by sector, by expertise or through closed consortia. Despite our energy and good will, we are recreating silos, which will be highly detrimental to the main goal. Personal data circulation and protection is a cross-sector issue, and data have no boundaries.

A more complex picture is also presented in "Data sharing and interoperability: Fostering innovation and competition through APIs" by Oscar Borgogno and Giuseppe Colangelo:

Analysing the main European regulatory initiatives which have so far surfaced in the realm of data governance (right to personal data portability, free flow of non-personal data, access to customer account data rule, re-use of government data), it seems that the EU legislator is not tackling the matter consistently. Indeed, on one side, all these initiatives share a strong reliance on APIs as a key facilitator to ensure a sound and effective data sharing ecosystem. However, on the other side, all these attempts are inherently different in terms of rationale, scope, and implementation. The article stresses that data sharing via APIs requires a complex implementation process and sound standardization initiatives are essential for its success.[51]

Despite the several legislative initiatives put forward so far by the European Commission, a clear view of who should define APIs, and how they should be defined, is still lacking.

Summarising costs and benefits

As illustrated in the table below, there are widely varying perspectives on the benefits and costs of interoperability - and ultimately on its societal and economic desirability as well.

[Table: Costs and benefits of interoperability - summary]

A substantial share of these arguments takes "innovation" as a proxy for the desirability of interoperability (and of other regulatory interventions). In summarizing the different perspectives, it is important to note that innovation can have both positive and negative effects on society. Judging regulatory interventions solely by their expected or presumed impact on innovation will therefore be insufficient if we are trying to understand their full impact.

In the end, interoperability is a technical principle that remains foundational for the networked information ecosystem. After a long period of centralisation and consolidation of the online ecosystem, interoperability has re-emerged as one of the most promising regulatory tools for shaping this digital space.

The question of how interoperability should be used is a political one. Among (European) policymakers there seems to be a broad consensus that competitive interoperability should be used to ensure more competition and to limit the market power of so-called gatekeeper platforms. The real question is whether the ambition to "Shape Europe's Digital Future" requires a more ambitious approach, one that also embraces the generative potential of interoperability.

Generative and competitive interoperability in European digital policies

The current European Commission has presented a new approach to Europe's digital policymaking in the Communication on Shaping Europe's Digital Future[52]. With this policy document, it moves beyond the previous Commission's market-centered vision of the Digital Single Market. To give an example of this shift: where the previous data strategy focused on a "market for data" perspective, the new vision outlined in the "European Strategy for Data" calls for "a European data space based on European rules and values"[53].

The two-pronged approach that we propose by distinguishing competitive from generative interoperability is clearly visible in these two documents. On the one hand, they clearly define the challenge posed by platforms that acquire significant scale and turn into gatekeepers of the online market; accordingly, policies (especially new competition rules) are envisaged to ensure that markets remain fair and open.

But European policymaking does not stop here. Policies are also being designed not merely to solve the challenge of the platforms, but with the aim of building a trustworthy digital society. In December 2019, European Commission President von der Leyen presented this two-pronged approach by stating that:

It may be too late to replicate hyperscalers, but it is not too late to achieve technological sovereignty in some critical technology areas. To lead the way on next-generation hyperscalers, we will invest in blockchain, high-performance computing, quantum computing, algorithms and tools to allow data sharing and data usage. We will jointly define standards for this new generation of technologies that will become the global norm[54].

In such a strategy, competitive interoperability should be used to curb the power of hyperscalers and mitigate their negative impact. Generative interoperability is, in turn, a key principle for the jointly defined technological standards that ensure sovereignty.

This argument is also at the core of the European Strategy for Data, presented in February 2020 by Commissioner Breton, which aims to support the emergence, growth, and innovation of European data-driven businesses and other initiatives. The strategy presents the following argument:

Currently, a small number of Big Tech firms hold a large part of the world's data. This could reduce the incentives for data-driven businesses to emerge, grow and innovate in the EU today, but numerous opportunities lie ahead. A large part of the data of the future will come from industrial and professional applications, areas of public interest or internet-of-things applications in everyday life, areas where the EU is strong. Opportunities will also arise from technological change, with new perspectives for European business in areas such as cloud at the edge, from digital solutions for safety critical applications, and also from quantum computing. These trends indicate that the winners of today will not necessarily be the winners of tomorrow.

Once again, we see the two-pronged approach, which on one hand curbs the power of monopolists and, on the other, supports new markets and ecosystems.

As we argue in "Interoperability with a purpose", this presents an obvious opportunity to put the idea of generative interoperability into action and to start building a uniquely European digital space built on civic and democratic values:

In our view, the purpose of generative interoperability must be to enable what we call an Interoperable Public Civic Ecosystem. Such an ecosystem would provide an alternative digital public space that is supported by public institutions (public broadcasters, universities and other educational institutions, libraries and other cultural heritage institutions) and civic- and commons-based initiatives. An ecosystem that allows public institutions and civic initiatives to route around the gatekeepers of the commercial internet, without becoming disconnected from their audiences and communities and that allows interaction outside the sphere of digital platform capitalism. (Read the remainder of our call for an Interoperable Public Civic Ecosystem here)


  1. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  2. Chris Marsden and Rob Nicholls, Interoperability: A solution to regulating AI and social media platforms (2019), available at https://www.scl.org/articles/10662-interoperability-a-solution-to-regulating-ai-and-social-media-platforms ↩︎

  3. Ian Brown, Interoperability as a tool for competition regulation (2020), available at https://osf.io/preprints/lawarxiv/fbvxd ↩︎

  4. Competition and Markets Authority, Online platforms and digital advertising market study (2020), available at https://www.gov.uk/cma-cases/online-platforms-and-digital-advertising-market-study ↩︎

  5. Subcommittee on Antitrust, Commercial and Administrative, Investigation of Competition in Digital Markets (2020), available at https://judiciary.house.gov/uploadedfiles/competition_in_digital_markets.pdf ↩︎

  6. Cory Doctorow, Interoperability: Fix the Internet, Not the Tech Companies (2019), available at https://www.eff.org/deeplinks/2019/07/interoperability-fix-internet-not-tech-companies ↩︎

  7. Jonathan Zittrain, The Future of the Internet and How to Stop It (2006), available at https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future+of+the+Internet.pdf ↩︎

  8. Sophie Bloemen, Paul Keller and Alek Tarkowski, A Vision for Shared Digital Europe (2019), available at / ↩︎

  9. acatech, European Public Sphere. Towards Digital Sovereignty for Europe (2020), available at https://en.acatech.de/publication/european-public-sphere/ ↩︎

  10. Katja Bego, Working Paper: A vision for the future internet (2020), available at https://research.ngi.eu/working-paper-a-vision-for-the-future-internet/ ↩︎

  11. Tim Berners-Lee, Information Management: A Proposal (1990), available at https://www.w3.org/History/1989/proposal.html ↩︎

  12. Ethan Zuckerman, The Case for Digital Public Infrastructure (2020), available at https://knightcolumbia.org/content/the-case-for-digital-public-infrastructure ↩︎

  13. Svea Windwehr and Christoph Schmon, Our EU Policy Principles: Interoperability (2020), available at https://www.eff.org/deeplinks/2020/06/our-eu-policy-principles-interoperability ↩︎

  14. Ursula von der Leyen, A Union that strives for more. My agenda for Europe: political guidelines for the next European Commission 2019-2024 (2019), available at https://op.europa.eu/en/publication-detail/-/publication/43a17056-ebf1-11e9-9c4e-01aa75ed71a1 ↩︎

  15. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  16. Bennett Cyphers and Danny O'Brien, Facing Facebook: Data Portability and Interoperability Are Anti-Monopoly Medicine (2018), available at https://www.eff.org/deeplinks/2018/07/facing-facebook-data-portability-and-interoperability-are-anti-monopoly-medicine ↩︎

  17. Jacques Crémer, Yves-Alexandre de Montjoye, Heike Schweitzer, Competition policy for the digital era (2019), available at https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf ↩︎

  18. Viktor Mayer-Schönberger, Emergency Communications: The Quest for Interoperability in the United States and Europe (2002), available at https://www.hks.harvard.edu/publications/emergency-communications-quest-interoperability-united-states-and-europe ↩︎

  19. Viktor Mayer-Schönberger and Thomas Ramge, A Big Choice for Big Tech. Share Data or Suffer the Consequences (2018), available at https://www.foreignaffairs.com/articles/world/2018-08-13/big-choice-big-tech ↩︎

  20. Angela Chen, Making big tech companies share data could do more good than breaking them up (2019), available at https://www.technologyreview.com/2019/06/06/135067/making-big-tech-companies-share-data-could-do-more-good-than-breaking-them-up/ ↩︎

  21. Urs Gasser and John Palfrey, When and How ICT Interoperability Drives Innovation (2005), available at https://cyber.harvard.edu/interop/pdfs/interop-breaking-barriers.pdf ↩︎

  22. Urs Gasser and John Palfrey, Interoperability: The Promise and Perils of Highly Interconnected Systems (2012), available at https://cyber.harvard.edu/publications/2012/interop ↩︎

  23. John Perry Barlow, A Declaration of the Independence of Cyberspace (1996), available at https://www.eff.org/cyberspace-independence ↩︎

  24. Peter Dahlgren, The Internet, Public Spheres, and Political Communication: Dispersion and Deliberation (2006), available at https://www.tandfonline.com/doi/full/10.1080/10584600590933160 ↩︎

  25. Jonathan Zittrain, The Future of the Internet and How to Stop It (2006), available at https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future+of+the+Internet.pdf ↩︎

  26. K. Sabeel Rahman, Regulating Informational Infrastructure: Internet Platforms as the New Public Utilities (2018), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3220737 ↩︎

  27. Alex Pentland, Reality Mining of Mobile Communications: Toward A New Deal On Data (2009), available at https://link.springer.com/chapter/10.1007%2F978-1-4419-0056-2_1 ↩︎

  28. Doc Searls, The Intention Economy: When Customers Take Charge (2012) ↩︎

  29. Luigi Zingales and Guy Rolnik, A Way to Own Your Social-Media Data (2017), available at https://www.nytimes.com/2017/06/30/opinion/social-data-google-facebook-europe.html ↩︎

  30. Imanol Arrieta Ibarra, Leonard Goff, Diego Jiménez Hernández, Jaron Lanier, E. Glen Weyl, Should We Treat Data as Labor? Moving Beyond 'Free' (2017), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3093683 ↩︎

  31. Mariana Mazzucato, Let's make private data into a public good (2018), available at https://www.technologyreview.com/2018/06/27/141776/lets-make-private-data-into-a-public-good/ ↩︎

  32. Valentina Pavel, Our Data Future (2019), available at https://privacyinternational.org/long-read/3088/our-data-future ↩︎

  33. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  34. Jacques Crémer, Yves-Alexandre de Montjoye, Heike Schweitzer, Competition policy for the digital era (2019), available at https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf ↩︎

  35. European Commission, A Digital Single Market Strategy for Europe - Analysis and Evidence (2015), available at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52015SC0100 ↩︎

  36. Ben Thompson, Aggregation Theory (2015), available at https://stratechery.com/2015/aggregation-theory/ ↩︎

  37. Ben Thompson, Antitrust and Aggregation (2016), available at https://stratechery.com/2016/antitrust-and-aggregation/ ↩︎

  38. European Commission, A Digital Single Market Strategy for Europe - Analysis and Evidence (2015), available at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52015SC0100 ↩︎

  39. Gus Rossi and Charlotte Slaiman, Interoperability = Privacy + Competition (2019), available at https://www.publicknowledge.org/blog/interoperability-privacy-competition/ ↩︎

  40. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  41. Chris Marsden and Rob Nicholls, Interoperability: A solution to regulating AI and social media platforms (2019), available at https://www.scl.org/articles/10662-interoperability-a-solution-to-regulating-ai-and-social-media-platforms ↩︎

  42. Gus Rossi and Charlotte Slaiman, Interoperability = Privacy + Competition (2019), available at https://www.publicknowledge.org/blog/interoperability-privacy-competition/ ↩︎

  43. Thomas M. Lenard, Static Ideas of Competition in the Information Age (2020), available at https://www.cato.org/sites/cato.org/files/2020-03/regv43n1-5.pdf ↩︎

  44. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  45. Cory Doctorow, Interoperability: Fix the Internet, Not the Tech Companies (2019), available at https://www.eff.org/deeplinks/2019/07/interoperability-fix-internet-not-tech-companies ↩︎

  46. Ben Thompson, Portability and interoperability (2019), available at https://stratechery.com/2019/portability-and-interoperability/ ↩︎

  47. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  48. Urs Gasser, Interoperability in the Digital Ecosystem (2015), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210 ↩︎

  49. PrivacyTech, Une nouvelle gouvernance pour les données au XXIe siècle (2019), available at https://www.privacytech.fr/livre-blanc-privacytech.pdf ↩︎

  50. European Commission, Proposal for a Regulation on European data governance (Data Governance Act) (2020), available at https://ec.europa.eu/digital-single-market/en/news/proposal-regulation-european-data-governance-data-governance-act ↩︎

  51. Oscar Borgogno, Giuseppe Colangelo, Data sharing and interoperability: Fostering innovation and competition through APIs (2019), available at https://www.sciencedirect.com/science/article/abs/pii/S0267364918304503 ↩︎

  52. European Commission, Communication: Shaping Europe's digital future (2020), available at https://ec.europa.eu/info/publications/communication-shaping-europes-digital-future_en ↩︎

  53. European Commission, Communication: A European strategy for data (2020), available at https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/european-data-strategy ↩︎

  54. Ursula von der Leyen, A Union that strives for more. My agenda for Europe: political guidelines for the next European Commission 2019-2024 (2019), available at https://op.europa.eu/en/publication-detail/-/publication/43a17056-ebf1-11e9-9c4e-01aa75ed71a1 ↩︎

New project: Towards an Interoperable Public Civic Ecosystem
(https://shared-digital.eu/new-project-towards-an-interoperable-public-civic-ecosystem/, 2 June 2020)

In our vision for a Shared Digital Europe, which we published in April 2019, public and civic actors have an important role in safeguarding a digital space that is not dominated by market logic and that functions as a commons.

This space has increasingly been taken over by private platforms that dominate the online communication landscape. In order to break their exclusive grip on online communication, we are exploring how to provide means for public institutions, civic initiatives and other non-market actors to insert themselves into the rich communication fabric enabled by digital technology.

Last month, Commons Network received a grant from the Next Generation Internet Policy Experimentation Fund to research how interoperability mechanisms can create a more sustainable online communication landscape. In line with our core principles, we are particularly interested in how interoperability can empower public institutions and other non-profit actors.

In recent interoperability debates, the focus has been on interoperability as a means of ensuring competition and weakening monopolies. Our contribution will focus on interoperability as a principle enabling an alternative, decentralised public civic ecosystem that creates opportunities for new actors to establish their position.

The research will be undertaken by the Shared Digital Europe core team. We will be analysing interoperability proposals and policy models over the summer, and will follow up with a policy design workshop that brings together experts from our network. We will present our research at the NGI Policy Summit 2020 on 28-29 September.

If you are working on topics such as interoperability, the role of public media in the digital ecosystem, or the digital public sphere - please get in touch. We are curious to learn about your ideas. Please also write to us with any questions or insights, via hello@shared-digital.eu or @shareddigitalEU.

We're also trying something new with this project. "The Interoperability" is a newsletter with work-in-progress notes from our team. The first note will go out in a few days - sign up below to receive it. We will send you updates every fortnight or so.

Thanks!

The Next Generation Internet Policy Experimentation Fund is part of the NGI Forward project, hosted by NESTA and funded by the European Union.

The press publishers right will fail - to support the media we should tax information aggregators instead
(https://shared-digital.eu/as-predicted-the-new-press-publishers-right-is-a-failure-lets-make-information-aggregators-pay-news-media-producers-for-real/, 30 April 2020)

To anyone following the debate, it is becoming increasingly clear that the hard-fought new press publishers' right, which the EU adopted last year as part of the copyright reform package, is a paper tiger that is not worth the paper it is printed on. Granting new rights to press publishers was never going to fix the economic problems faced by press publishers.

Instead of continuing to bet on the failed idea that all publishers need is a right they can leverage against the information aggregators that dominate the information ecosystem, it is time to shift gears and make information aggregators pay up for the loss of advertising revenue that their emergence has brought for professional news media producers. Payments, not in the form of charitable giving or under the guise of supporting innovation in the sector, but based on the realisation that information aggregators must be held accountable for the negative externalities caused by their dominant role in the online environment.

So how did we get here?

Of the two most controversial elements of last year's Copyright Directive, the new press publishers' right has always been the ugly duckling. While not interesting enough to gather as much attention as the infamous Article 13, the idea of granting press publishers a new - made up - right intended to provide them with leverage to get paid by news aggregators has been just as misguided as the ideas underpinning Article 13. Met with near-unanimous rejection by academics, and continuing a legislative lineage that consists of two abject failures (in Germany and Spain), this new right was pushed through against repeated warnings that it would be at best ineffective and at worst dangerous to free access to information.

If we are interested in preserving and nourishing a healthy and diverse media landscape, which is crucial for democratic societies, the intention of the press publishers' right (to make information aggregators pay publishers for the "use" of their content) makes sense. However, the rights-based approach to achieving this objective, which underpins the new publishers' right and its legislative predecessors, has been flawed from the start. As previous attempts in Spain and Germany have shown, granting rights to publishers does not really work as long as the intended targets of these rights are at the same time the main drivers of traffic (and thus online revenue) for the publishers. As long as press publishers cannot afford to lose the traffic brought to them by a few dominant information aggregators, any new right is essentially worthless as a bargaining chip.

Over the past few months, it has been possible to observe this dynamic play out once more in France. After the French legislator eagerly transposed Article 15 of the DSM directive, Google, the main target of the new legislation, did what everybody could have expected it to do. Instead of engaging in licensing negotiations with French press publishers, it offered them a simple choice: give us a license for free, or we will no longer feature your content in our products. Predictably, the majority of press publishers reacted to this Cornelian dilemma by granting Google licenses for free.

This is where the similarities with the situation in Germany and Spain end, for now. In response to complaints by press publishers, the French competition authority recently issued a ruling ordering Google to "engage in good faith negotiations" with any press publisher that requests them. Given that this ruling does not change the underlying negotiation positions, it seems unlikely that the outcome of such negotiations will differ much from the status quo. For Google, it still does not make sense to pay publishers for providing a service that is objectively valuable to them. And while the ruling of the competition authority may validate the sense of entitlement on the publishers' side, it does not alter the fact that the new right does not provide them with any meaningful economic leverage over Google or other dominant information aggregators such as Facebook. The systemic change in the advertising market caused by the emergence of information aggregators is not something that can be negotiated away. And while the additional traffic brought by the aggregators has some benefit for individual publishers, it does not compensate for the systemic shift of advertising from media producers to information aggregators.

Interestingly, the ruling of the French competition authority seems to have inspired a more drastic intervention by the Australian government. At the end of April, the Australian government ordered its competition authority to draw up a mandatory code of conduct to govern the relationship between digital platforms and media businesses. This mandatory code of conduct is expected to require digital platforms to pay media companies for the use of "news content". While the details of the code are as yet unknown, it is notable that the Australian government has come to the same conclusion as the French competition authority (that the dominant position of major information aggregators is harming the producers of news media, and that the former should compensate the latter) without first going through the motions of inventing a new right.

There is a refreshing honesty in the Australian approach: instead of assuming that the complex relationship between media producers and information aggregators can be solved via negotiations between the two sides, it simply attests that the business models of the information aggregators are causing harm to the producers of news media, and calls for an intervention to remedy this harm. This position clearly acknowledges that news media producers are more than mere market participants and that a strong, independent and diverse media sector is an essential element of democratic societies.

The business model of online information aggregators has undermined the economic viability of traditional press publishing (mainly by drawing advertising revenue away from news media publishers), and as a result the diversity and nature of the press is under immense pressure across the globe. While there can be discussion about how far the information aggregators' business depends on content produced by traditional press publishers, the fact that they aggregate this output as part of their business model cannot be disputed.

Given that a strong independent press is an essential part of democratic societies there is an urgent need to find ways to support news media creators of all sorts (including but not limited to traditional press publishers). The most straightforward way to achieve this is to make information aggregators pay up for the negative externalities they cause: Just as most modern societies tax other products and services with negative externalities to cover their societal costs, we should also tax advertising revenues derived from aggregating media products and use the resulting revenues to support the continued creation of news media and journalistic content.

Protecting and nourishing a healthy and diverse media landscape should not depend on the outcome of "good faith negotiations" that take place on the orders of competition authorities, but must be part of a societal contract aimed at subsidising the news media sector because it is an essential element of our democracies. This is not something that the market will take care of, no matter how much we try to prop up the negotiation position of specific classes of media publishers via newly invented rights.

As a matter of principle, we should support news media producers based on the value that they provide for society. Under the current conditions, this would entail taxing information aggregators whose business models rely on the availability of information, and directing these resources back to those who produce it. This approach is long overdue, but there are signs that the current crisis might actually make it happen for real.

Interoperability is the solution to the Zoom fiasco
(https://shared-digital.eu/interoperability-is-the-solution-to-the-zoom-fiasco/, 6 April 2020)

When the pandemic started and stay-at-home orders were given, videoconferencing tools became one of the first items in everyone's crash course in digitisation. Beyond a narrow group of remote workers and tech-savvy people, most users had never participated in a Meetup or a Zoom teleconference.

In two weeks, Zoom became a household name. The platform responded by providing free accounts that allowed calls of unlimited length, up from the previous 40-minute limit. In the last week, I have personally experienced a meeting with friends, my daughter's classroom, and several work meetings on Zoom. Others have been organising parties, concerts or government meetings (like those that Boris Johnson organised).

In the last few days, Zoom turned out to be utterly rotten. Faulty design choices make "Zoom bombing" possible. Vice reports that Zoom has been sharing user data with Facebook without prior consent. And researchers at Citizen Lab have disclosed that Zoom uses non-standard encryption and sends encryption keys to servers in China.

While most commentators have focused on the privacy implications, for me the case is interesting from the perspective of the provision of public goods. At the time of the pandemic, tools for remote communication became not just core infrastructure - they became our society. We need tools like Zoom to run governments, to attempt to emulate cultural experiences, or to keep in touch with close ones across distances. Teleconference tools have become our restaurants, opera halls, workshop rooms and conference venues. Their security and quality of service become a matter of societal resilience.

Of course, there are other teleconferencing options than Zoom, but the list is surprisingly short. Cisco's Webex is a less popular alternative. And teleconferencing functions are baked into multiple remote collaboration suites like Google Suite, Microsoft Teams, Slack, Facebook Workplace, and so on. There are also multiple commercial teleconferencing solutions, usually suited for large-scale use. Zoom, with its easy setup and its ability to scale up by a factor of 20 in response to the crisis without becoming unreliable, had an architecture well suited for popular take-up - and one that clearly sacrificed security and privacy for this purpose.

What lesson can we learn from this fiasco, at a time when users scramble to find the next best solution? On a personal call last evening, we tested Jitsi, the only open source solution available on the market. But we moved back to Zoom, due to the low quality of the call. Together with my friends - who include a therapist, a cultural studies scholar, an architect and a government employee - we grumbled for a moment about the need to be responsible and to protect our privacy. But there was no other easy option.

During the pandemic, challenges become more pronounced. All the faults and harms, and the hard choices we make in relation to them, are thrown into sharp relief. This applies to our digital environment in general, and is highlighted by specific issues like the Zoom security challenge. So now is also the time to pivot away from faulty solutions and establish more resilient, sustainable and sovereign approaches.

In our vision, we connect the need for sovereignty with the need for core public infrastructure. Paul Keller recently wrote on our blog that

At the start of the 2020s, Europe lacks a substantial stake in the two core elements of the current computing paradigm: the mobile devices that we use to access digital services and the hyper scale cloud infrastructures that power these services.

The Zoom fiasco is a case in point. Europe should therefore consider a quick push to support public teleconferencing solutions. It is not acceptable that a European head of state holds cabinet meetings via a US-owned private service that runs part of its infrastructure from China. The teleconferencing space, with its relatively simple service, also provides a great testbed for decentralised solutions. These ensure that users are not locked into specific solutions that are very hard to abandon once the "winner-takes-all" dynamics amplified by the current crisis have driven everyone into the arms of the same service.

Finally, a time of crisis and resource constraints is a good moment to fall back on open source solutions, which are designed to be resilient. But the open source ethic, in its pure form, throws the weight and responsibility of running a service onto the user - imagining that everyone is a hacker. Public institutions should therefore step in and take on the role of supporting sustainable and fair infrastructure for their citizens.

The new European Data Strategy argues that while today's data is held by a small number of Big Tech companies, there are emergent applications where they are not yet dominant and where Europe could lead. Remote teleconferencing is exactly such a space - one where a mix of infrastructural approaches, open source solutions and data sharing principles could sustainably create a crucial layer for remote societies.
