ELSA Lab for Smart and Responsible Mobility

Published on: 29 April 2022

With the rise of innovative forms of mobility, we are realising more and more that current legislation, which is aimed at people, does not always apply to smart vehicles. It is crucially important that we decide as a society what the boundaries are within which these systems can operate, and where ethical goals should be defined.

What social challenge in AI is being tackled?

The ELSA Lab for Smart and Responsible Mobility is helping create a responsible and future-proof mobility system: one where people are safe and feel safe, where vehicles act responsibly, where mobility is available to all, and where not only fundamental rights such as privacy, personal autonomy and non-discrimination but also values such as justice, solidarity and transparency are guaranteed. The public needs to be properly informed about new technology and involved in its design and implementation, and the mobility system must meet both individual and social needs; only then is trust in smart mobility justified.

What type of solution is offered to the end user?

The ELSA Lab’s key goal is not necessarily to develop new AI technology, but to involve AI researchers and their expertise in the studies, so that the Lab can address the ethical, legal and social aspects in particular, together with developers, governments, industry (which has to work with these developments) and the general public.

With the rise of innovative forms of mobility, for instance, we are realising more and more that current legislation, which is aimed at people, does not always apply to smart vehicles. This can be seen where the law uses human terms such as ‘blame’ and ‘criminal intent’. Such concepts cannot apply to AI systems or autonomous vehicles, which have no awareness, desires or intentions of their own and do not understand ethics or the concept of right and wrong.

What AI methods or technologies are used in the studies?

To optimise positive and ethically responsible AI-designed solutions in mobility, it is important that the effects of the various AI methods on the outcomes are widely discussed. Machine learning, supervised learning, unsupervised learning and reinforcement learning are the general terms for the ways in which an AI system is made to optimise a specific learning objective. Ultimately, the goal of the ELSA Lab for Smart and Responsible Mobility is to lead a discussion about the responsible implementation of AI and how it contributes to social values and a human-centric character, as well as to take steps towards creating a mobility system that meets society’s wishes.

This ELSA Lab therefore follows the basic rule that technology follows society. AI is not a goal in itself: AI methods and technology should always serve what society demands and must meet the relevant ethical and legal norms. The ELSA Lab should encourage working towards a variety of AI utility functions and ethical goal functions (moral programming), so that it becomes clear what steps governments need to take, what rules and legal frameworks must be set up, what weightings or ‘values’ must be defined and by whom, how these are to be implemented by the programmers who optimise the utilities, and whether they are having the desired social effects. This will be a hybrid approach (hybrid AI) that, depending on the specific uses in mobility, draws on heuristics, TSP (travelling salesman problem) methods, classification (support vector machines), ANNs (artificial neural networks), MDPs (Markov decision processes) and self-explaining neural networks.
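To make the idea of weighted utility and ethical goal functions more concrete, below is a minimal, purely illustrative sketch, not the ELSA Lab's actual model: the objective names, weights and candidate options are hypothetical and only show how weightings or ‘values’ turn contradictory interests into a single score that an optimiser or an MDP reward signal could use.

```python
# Illustrative sketch only: the objectives, weights and candidate options are
# hypothetical examples, not the ELSA Lab's actual model or data.
from dataclasses import dataclass


@dataclass
class RouteOption:
    """A candidate mobility decision, scored on several (possibly conflicting) objectives."""
    name: str
    safety: float        # 0..1, higher is safer
    travel_time: float   # 0..1, higher means less travel time
    fairness: float      # 0..1, higher means benefits are spread more evenly
    privacy: float       # 0..1, higher means less personal data is needed


# The weights encode the 'values' the text refers to; who sets them, and how,
# is exactly the governance question the ELSA Lab wants to put up for debate.
WEIGHTS = {"safety": 0.4, "travel_time": 0.2, "fairness": 0.25, "privacy": 0.15}


def utility(option: RouteOption, weights: dict) -> float:
    """Weighted sum of objective scores; could serve as the reward in an MDP."""
    return (weights["safety"] * option.safety
            + weights["travel_time"] * option.travel_time
            + weights["fairness"] * option.fairness
            + weights["privacy"] * option.privacy)


if __name__ == "__main__":
    candidates = [
        RouteOption("highway", safety=0.6, travel_time=0.9, fairness=0.5, privacy=0.8),
        RouteOption("residential", safety=0.9, travel_time=0.5, fairness=0.8, privacy=0.8),
    ]
    for option in candidates:
        print(f"{option.name}: utility = {utility(option, WEIGHTS):.2f}")
    # Changing the weights changes which option wins, making the trade-offs explicit.
    best = max(candidates, key=lambda o: utility(o, WEIGHTS))
    print(f"Selected option: {best.name}")
```

In such a sketch, changing the weights changes which option is selected; the weightings are value judgements rather than technical details, which is precisely why the question of what weightings must be defined, and by whom, belongs in the public and legal debate rather than with programmers alone.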

Is there cooperation with other sectors?

In the broad sense, mobility is an exceptionally complex problem where many ethical or morally relevant choices must be made – choices that demand a balance or judgement between different and sometimes contradictory interests such as individual freedom versus collective responsibility, economy versus ecology, or affordability of services versus data privacy.

These issues cannot be solved by simply optimising a single goal function; they require a wider social debate, scientifically responsible gathering and distribution of information, and making the consequences of choices or compromises visible and tangible. Setting up the ELSA Lab for Smart and Responsible Mobility from within the Brainport AI Hub has allowed parties from the traditional triple helix (centres of expertise, the commercial sector and government) to make contact with each other easily. This is complemented by the fourth helix element: the residents.

What is the ultimate success this ELSA Lab can achieve?

Setting up an ecosystem for the ethical, legal and societal aspects of individuals’ mobility. An important part of this is the Ethics Experience Centre, where we let various groups of ‘mobilists’ get to know new technology in everyday situations and actively involve them in the dialogue about the dilemmas of new mobility technology.

The basic assumption is that technology and society are interwoven, and that legal, ethical and social issues are part of the core of every technological innovation process. It is crucially important that we decide now, as a society, what the legal and ethical frameworks are within which these systems can operate, how those frameworks are interrelated, and what approach could be used to make sure they effectively steer technological innovation in the right direction. The ELSA Lab for Smart and Responsible Mobility is working on exactly that.

Awarded the NL AIC Label


The Netherlands AI Coalition has developed the NL AIC Label to underline its vision for the development and application of AI in the Netherlands. An NL AIC Label formally recognises an activity that is in line with the aims and strategic goals of the NL AIC and/or the quality of that activity. The NL AIC would like to congratulate the ELSA Lab for Smart and Responsible Mobility!

More information?

If you’re interested in how this ELSA Lab develops further or have any questions, you can contact the following people:

Marieke Martens
Kevin van Lierop

If you would like more information about human-centric AI and the ELSA concept, please go to this page.
