Postdoc in ethics of autonomous vehicles, with special attention to ethical analysis of human factors in trust in autonomous vehicles

Job Type: Full-time
Deadline: 01 Jul 2022

The Philosophy Section of the University of Twente in the Netherlands is looking for a postdoc (19-20-month position) in ethics of autonomous vehicles, with special attention to ethical analysis of human factors in trust in autonomous vehicles, as part of a large European project on the ethics of emerging technologies.

The Challenge

The principal aim of this postdoc will be to develop ethical guidance tools for the assessment of trust in autonomous vehicles, to help engineers, designers, regulators, civil society organizations, and researchers assess the trustworthiness of autonomous vehicles and of the different components and elements in autonomous vehicle systems. This will involve, among other things, the development of a ‘trust relationship tool’ that allows actors to identify trust relationships that exist, or are likely to emerge, between key stakeholders in autonomous vehicle systems. This tool will complement technical assessments of trust in autonomous vehicles by looking at human and social factors in trust.

This postdoc project will also develop a novel approach to the ethical assessment and guidance of autonomous vehicle systems, which will build in part on the above-mentioned tools. Autonomous vehicles, and the socio-technical systems that support them, must factor in values like privacy, explicability, and human agency in addition to values like human safety and efficiency. The postdoc will draw on an Ethics by Design approach to show how values like privacy, explicability, and human agency can increase the overall trustworthiness of a particular system. Making an autonomous vehicle system trustworthy requires taking a range of different values into account, and may require tradeoffs between them. The postdoc will work closely with other members of the project to see how technical aspects of trust are being built into the system, and what this means for developing a system that is worthy of human and social trust.

Additional time will be devoted to project coordination and research dissemination tasks. This will involve writing reports for the project (often together with others), and you will also have some time to work on academic publications.

The CONNECT Project

The CONNECT project, Continuous and Efficient Cooperative Trust Management for Resilient Cooperative, Connected and Automated Mobility (CCAM), is a prestigious three-year international project funded by the European Union’s Horizon Europe program. It has a budget of €6 million and will start in September 2022. It has 17 participating institutions, including the University of Twente. The project is led by Technikon Forschungs- Und Planungsgesellschaft MBH in Austria (coordinator: Ms Lisa Burgstaller).

CONNECT addresses the convergence of security and safety in CCAM by assessing dynamic trust relationships and defining a trust reasoning framework on the basis of which involved entities can establish trust for cooperatively executing safety-critical functions. This will enable both a) cyber-secure data sharing between data sources in the CCAM ecosystem that had no or insufficient pre-existing trust relationship, and b) outsourcing tasks to the MEC and the cloud in a trustworthy way. This work is quite technical, looking at the specific factors and features that allow CCAM elements and components to assess and respond to other components as trustworthy or not. However, autonomous vehicle systems also require that humans find the system trustworthy, and that the system is itself worthy of that trust.

CONNECT seeks not just to design trust into autonomous vehicle systems, but to find ways to assess and assure human users, designers, regulators, and so on, that the system is worthy of that trust. The project will draw on the more technical elements of trust to show that a range of values is necessary for a system to be trustworthy.

The postdoc will engage in ethical analysis, some legal analysis, and stakeholder engagement activities. The project will develop/refine/extend (as desirable and applicable to the three technologies) existing or proposed ethics frameworks, operational guidelines, or codes (e.g., those developed in SIENNA, SHERPA, PANELFIT, SATORI, and other projects) to enable the effective ethics governance of the technologies. It will reconcile the needs of research and innovation with the legitimate concerns of society, stimulating innovation while reducing ethical risks. To do so, it will engage especially with researchers and innovators, research ethics committees (RECs), research integrity (RI) bodies, civil society organisations (CSOs), policy makers, and the public.


