PhD position in Philosophy

The University of Luxembourg is an international research university with a distinctly multilingual and interdisciplinary character. The University was founded in 2003 and counts more than 6,700 students and more than 2,000 employees from around the world. The University’s faculties and interdisciplinary centres focus on research in the areas of Computer Science and ICT Security, Materials Science, European and International Law, Finance and Financial Innovation, Education, Contemporary and Digital History. In addition, the University focuses on cross-disciplinary research in the areas of Data Modelling and Simulation as well as Health and System Biomedicine. Times Higher Education ranks the University of Luxembourg #3 worldwide for its “international outlook,” #20 in the Young University Ranking 2021 and among the top 250 universities worldwide. 

The Faculty of Humanities, Education and Social Sciences (FHSE) brings together expertise from the humanities, linguistics, cognitive sciences, and the social and educational sciences. People from across 20 disciplines work within the Faculty. Alongside this disciplinary approach, the Faculty has developed an ambitious interdisciplinary research culture. The FHSE offers four Bachelor's and 15 Master's degrees as well as a doctoral school, providing students with the knowledge and high-level skills they need to succeed in their future careers.

The Institute of Philosophy (part of the Faculty of Humanities, Education and Social Sciences) at the University of Luxembourg seeks applications for a four-year (48-month) doctoral position. The position is part of the FNR-funded project ‘The Epistemology of AI systems’ (C22/SC/17111440), starting on September 1st, 2023. The PI for the project is Thomas Raleigh (Associate Professor of Philosophy), the Co-PI is Leon Van Der Torre (Professor of Computer Science), and the project's Post-Doc will be Dr Aleks Knoks. The PhD position is envisaged to focus on the topic of Testimony and AI, though there is some scope for flexibility concerning the PhD topic, as long as it fits with the themes of the project.

The overall aim of the project is to apply philosophical concepts and tools from recent work in philosophy of science and social epistemology to investigate how to rationally use and respond to AI technology, and how to improve human-machine interactions from an epistemic point of view. We will focus in particular on questions concerning the understanding of ‘opaque’ AI systems via simplified explanatory models, trust in AI-generated testimony, the conditions under which (opaque) AI systems are epistemically authoritative, and how their authority differs from that of human experts.


