The Internet is busted. Huge corporations are calling the shots, extractive business models prevail. And safe and privacy-friendly online spaces have become scarce. How do we move from extractive practices to regenerative ones? How do we retain public control and move to a people-centered internet?
In our research on a Shared Digital Europe and public-civic spaces, we argue that public and civic actors need to build these alternative spaces. And that interoperability is an essential principle, through which they can together form a bigger ecosystem.
Over the course of the summer, we consulted friends and experts on two questions: how do we get to public-civic spaces, and what role does interoperability play? And what is holding governments and civil society back from making the shift? In this series we share the insights that these conversations produced, one at a time.
Previously featured in this series on interoperability and digital public-civic ecosystems were Nathan Schneider, Geert-Jan Bogaerts, Mai Ishikawa Sutton and natacha roussel.
Jaromil is a free and open source software developer. The name Jaromil is a developer pseudonym; his real name is Denis Roio. He is the founder and CTO of dyne.org, a foundation that produces software not to make a profit, but for its social impact. The work and projects at dyne.org have interoperability as a strong feature on several levels, Jaromil tells us.
When asked about the importance of interoperability, he speaks from his own experience of collaborating with video artists in Amsterdam. Many of them, he says, were using Adobe’s Flash Player as a format, which came free of cost but also without the source code. A few years ago Adobe ‘dismissed’ the Flash Player and, as a result, many artists became unable to show their work. ‘In these situations the work should have been portable between different formats. Interoperability is important because it is about building bridges between systems that can be open.’ Software developers, he says, perhaps aren’t able to offer solutions that set people’s creativity free from ‘closed’ technologies like the Flash Player, but having at least an interoperable layer gives ‘the potential of freedom’, as people can decide to take their projects and move somewhere else.
Do one thing and do it well
The best example of interoperable thinking and practice, according to Jaromil, is the UNIX philosophy and the operating systems it inspired. Key to this philosophy are the guidelines: do one thing and do it well; build on existing software; and don’t try to create big, complicated systems, but make simple, well-designed programmes work together. In other words, create instances of interoperability by linking different technologies together. Perhaps unexpectedly, this has an important ecological dimension, Jaromil reminds us. ‘Using this philosophy we offer solutions that work with existing infrastructures and therefore don’t demand new ones nor new hardware’.
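The composition Jaromil describes can be sketched with a classic shell pipeline (an illustration of ours, not an example from the interview): four single-purpose tools chained together to count word frequencies, something none of them does alone.

```shell
# The UNIX philosophy in practice: each tool does one thing well,
# and the pipe is the interoperable layer that connects them.
echo "open tools stay open when open systems interoperate" |
  tr ' ' '\n' |   # split the line into one word per line
  sort |          # group identical words together
  uniq -c |       # count each group of identical words
  sort -rn        # order by count, most frequent word first
```

The first line of output shows that ‘open’ occurs three times. Each tool knows nothing about the others; they interoperate purely through plain text on standard input and output.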
A practical example of the UNIX philosophy that Jaromil is keen to highlight is LaTeX, an open source document preparation tool used in fields such as mathematics, computer science and cybersecurity. LaTeX was designed as a modular system with interoperability sitting at its core, which is why it has been used to transform different data components into well-readable formats such as PDF.
Jaromil is also the CTO of DECODE, a flagship project for the European Commission exploring and experimenting with data portability and ownership. As a part of it, he and fellow developers work on a software project called Zenroom that allows people to automate operations on data using a human-like language (Zencode), without the need to know technical programming. Interoperability is a core feature for the success of this project, which runs on any PC, on microchips and even inside a web page in the browser: by means of a simple language it allows different people from diverse disciplines to interact and co-design with it.
Time will tell
In the context of Zenroom, interoperability is studied on different levels. First is software design. Zenroom is an ‘input-output machine’, Jaromil explains, that doesn’t open connections outside of its own execution, which makes it more secure. ‘It is interoperable in the sense that it needs someone to operate it in order to function, because it needs input to give a certain output, else it just does nothing. It facilitates interaction and executes just what it is told, a design approach that lowers complexity even in the most difficult distributed computational setups.’
The second level concerns language: as mentioned above, Zenroom can be programmed in a human-like language, so different people, from the tech-savvy to the less technically skilled, can work - or interoperate - with the software too.
Time is another level of interoperability, one which Jaromil thinks is often overlooked. He argues that developers and companies can do a better job of thinking about how to make software interoperable with the infrastructures of the future. Since it’s hard to predict the future, Jaromil is also looking back. Zenroom is written in the venerable programming language C (created around fifty years ago and devised to run on the UNIX operating system). This makes the software interoperable with modern and less modern computers and operating systems, ‘even with 20 year old computers, and my claim is - time will tell - that it will still run 20 years from now’.
Zenroom is interoperable with different hardware platforms too. Jaromil explains that it can run on low-power chips and in an ordinary browser, ‘and everything in between. It occupies only 2 megabytes of RAM, which makes it extremely powerful, environmentally sustainable but also portable’.
Connecting different systems
The portability of software or data relates directly to interoperability. If you want to move your data around, you need technologies and infrastructures that recognize the data and through this become interoperable. Even more so in ‘trustless environments’, says Jaromil, ‘like the member states of the European Union’. Dyne.org was selected by the European Commission as one of the organizations to run a pilot ‘authenticating’ the grades Erasmus exchange students obtained abroad, making them portable and usable in their home country. This requires a level of technical interoperability between the different technologies that universities in diverse European countries can use.
The universities example points to another way in which society could benefit from interoperability: to maintain and improve the social and cultural diversity of our societies. The universities are still able to maintain their own systems, but they can - in the near future - interoperate with each other. That is to say: we don’t necessarily need one-size-fits-all solutions (which the GMAFIA desperately want us to believe we do), but we can have different services and technologies based on our diverse cultures, ideas and the purposes we see fit.
Who am I interoperating with?
Making systems like student grading technologies interoperable or, in the words of Jaromil, ‘opening them up’, is also risky. You can predict human behaviour a little bit, ‘but you never know exactly what people will do with it’. A solution could be to govern interoperable systems, regulating user behaviour and preventing, for instance, commercial capture (see as an example Microsoft’s dubious relationship with the open source operating system GNU/Linux). Jaromil argues that you should not always govern interoperability, but that ‘interoperability is hard to maintain when entirely ungoverned’.
Some will abuse interoperability, especially when it comes to deceiving human perception, he continues. Take a protocol like SMTP. It’s open and interoperable and enables e-mail globally. But over time, abuses started to appear within the system: first there was spam e-mail, and later phishing attacks became a common cybersecurity risk. Jaromil: ‘So the same open and interoperable system that your bank uses to communicate, can be faked by people that want to take your money’. Openness brings risks around perception, abuse and transparency about what, or who, we are operating with.
The interview becomes a two-way street when Jaromil asks us: ‘What is the opposite of interoperability?’ When we struggle to come up with anything better than ‘Closed?’, he rescues us: ‘The opposite is incompatibility, when something doesn’t plug in.’ A small etymological detour follows. ‘The Italian word compatibile derives from a composite of the Latin words cum [meaning: with] and pathos [meaning: to suffer]’, so to suffer with, or to suffer together. While laughing, he adds: ‘If we agree that in some societies suffering is a form of work, compatible [or interoperable] means “does work with” and incompatible means “doesn’t work with”.’
Making a technology interoperable or incompatible can seem like a technical choice, but it often hides political policy choices that affect users. We need better policies, but which ones? We agree with Jaromil on a mixed approach consisting of public investment, better regulation and wide adoption of interoperable systems. He states: ‘If the public sector doesn’t take their role, their role will be lost’, and lobbying is a big obstacle to taking that role. But above all, he continues, we must have a clear common goal of having sovereignty over our digital infrastructures, and not having multinational corporations dominate the whole field. ‘This is our culture and society, we are communicating through computers!’
When we are in Brussels, it is striking how many people are trying to come up with grand digital schemes that are supposed to replace the earlier grand, ‘worse’ ones. Not Jaromil. The UNIX philosophy of doing one small thing well and connecting things one at a time shows through once again: ‘I would never challenge a big system with another big system, but I would start opening bits of it and add better bits’.