New products and services stream onto the market, affecting our daily lives and often bringing the adverse effects of variables that were overlooked in their design. Constructive digital innovation seeks to bring those variables to the forefront of the design process, integrating the ethical needs of the end user into the design. Ownership of data, control of privacy and freedom of choice were central themes during this workshop led by Design Academy Eindhoven & Greenberry.
By Natalja Heybroek
This interactive session was based on six ideas presented by students of the Design Academy Eindhoven and then discussed in groups. Each idea was approached from both a utopian and a dystopian perspective, with the aim of landing somewhere in the middle by understanding which restrictions were relevant for that idea. Ethical, privacy and social concerns were discussed.
Engaging with the future
The ideas varied widely in their application, but all included a digital element. First, there was the Energetic Blue Light, which, like coffee, could give you energy throughout the day. Another was the Personalised Shower, helping you start and end every day getting the best out of yourself. The next was about Sleep Tracking, in order to understand sleeping habits for a better night's rest. Another revolved around the ability to influence your surroundings through a Virtual Alternative Reality. Lastly, the Political Tracker would, based on collected data and an algorithm, be able to tell you more about your political opinion.
As a starting point, each group was asked: How do we want to engage with these interconnected futures?
Danielle Arets (Design Academy Eindhoven) introduces the workshop
Controlling data
For example, what guidelines or restrictions would you need to make the Political Tracker a desirable product? The aim of the product is to widen the scope of political participation, making decision-making easier and more accessible. The restrictions that were discussed revolved around the voluntary choice to take part and transparency about what data are collected, how they are stored and how the algorithms come to their conclusions.
The issue of privacy would play a large part, as it is important that the data are not used by political parties or any other third party. Finally, much of the focus revolved around ownership of the data, so that the customer would always remain the owner and could reclaim them at any time. As Bas Raijmakers, creative director of STBY, said: ‘We need to think about what kind of conversations we want to have with algorithms.’
Design Academy students present their concepts
The human element
The underlying human values that underpin the challenge of a technological future are trust and reliability. Data often lose their context, making it difficult to draw accurate conclusions from algorithms. Furthering this discussion, Sunny Dolat of The Nest Collective raised concerns about what happens when a country starts relying on technology. In Kenya, where he is from, a single company has grown into a monopoly in charge of widespread, daily-used technologies; when it faces a technological problem, the running of the whole country can be affected. All participants left the workshop asking themselves: Can we trust the humans who control data? Can we trust the data? And can we trust the technology?
Natalja Heybroek is a self-employed communications specialist at Pineapple Communications