
Non-discrimination by design | 2021


Concept & Research

 

Making the best, most objective, and most efficient decision: that is the goal. So we optimize choices with Artificial Intelligence: decisions based on data, mathematics, and statistics rather than on human emotions, prejudice, or mood. In theory, digital decision-making should lead to more equality, because human bias is eliminated. Unfortunately, reality has proven different. In digital decision-making, minorities are systematically disadvantaged, excluded, or even punished. Organizations do this consciously or unconsciously, and it is often invisible to both the victims and the authorities.

 

The Ministry of the Interior and Kingdom Relations commissioned an investigation into the causes of discrimination in Artificial Intelligence. A team of experts from Tilburg University, Eindhoven University of Technology, Vrije Universiteit Brussel, and The Netherlands Institute for Human Rights collaborated on a guideline explaining how organizations can prevent their systems from discriminating. To that end, the team distinguished the technical, legal, and organizational conditions that should be met before, during, and after creating an AI system.

 

Janssen designed a concept to visually translate the dangers of bias in algorithms, based on three conditions: technical (an incomplete or selective dataset), legal (a blurred or one-dimensional perspective), and organizational (a guiding definition of success). The design is a system of manipulated, biased ‘o’s that change shape, color, and composition throughout the document, bringing the content to life and guiding the reader through the complex legal matter.

 

The result is a visually unique document in which text and design reinforce each other – a representation of an interdisciplinary collaboration between politicians, lawyers, scientists, and tech-engineers, and of the artist/designer’s role in tackling social issues.

 

This project was commissioned by the Dutch Ministry of the Interior and Kingdom Relations.

 

Research and content by: Bart van der Sloot (Tilburg University), Esther Keymolen (Tilburg University), Merel Noorman (Tilburg University), The Netherlands Institute for Human Rights, Hilde Weerts (Eindhoven University of Technology), Yvette Wagensveld (Tilburg University), Bram Visser (Vrije Universiteit Brussel).

 

Concept, book design, and artwork by Julia Janssen

Recommended episode

The HyperClick Podcast, episode 3: Cathy O'Neil – Victims of Algorithms


2017 | PANEL

Julia Janssen participated in a panel debate on Unfair Algorithms with Cathy O'Neil, Astrid Oostenbrug, Mirko Tobias Schäfer, and Bart Jacobs.

Reading tips

Weapons of Math Destruction | Cathy O'Neil

Team Human | Douglas Rushkoff

