12 fully-funded PhD studentships in Safe and Trusted AI

Location: London, England
Job Type: Full time
Deadline: 11 Apr 2022

Qualification Type: PhD

Location: King’s College London, Imperial College London

Funding for: UK students, international students

Funding amount: Tuition fees, tax-free stipend set at the UKRI rate plus London weighting, and a travel allowance.

Hours: Full time

The UKRI Centre for Doctoral Training (CDT) in Safe and Trusted Artificial Intelligence (STAI) brings together world-leading experts from King’s College London and Imperial College London to train a new generation of researchers in methods of safe and trusted artificial intelligence (AI).

AI technologies are increasingly ubiquitous in modern society, with the potential to fundamentally change all aspects of our lives. While there is great interest in deploying AI in existing and new applications, serious concerns remain about the safety and trustworthiness of current AI technologies. These concerns are well-founded: there is now ample evidence in several application domains (autonomous vehicles, image recognition, etc.) that some current AI systems can be unsafe, allowing undesired and sometimes incorrect behaviour, which in turn undermines trust in their actions and decisions.

An AI system is considered safe when we can provide some assurance about the correctness of its behaviour, and it is considered trusted if the average user can have confidence in the system and its decision making.

What we offer 

We offer a unique four-year PhD programme, focused on the use of symbolic AI techniques for safe and trusted AI, which provide an explicit language for representing, analysing and reasoning about systems and their behaviours. Explicit models can be verified, solutions based on them can be guaranteed to be safe and correct, and they can provide human-understandable explanations and support user collaboration and interaction.

As a student at the CDT, you will engage in various training activities alongside your individual PhD project, both in state-of-the-art AI techniques and in the ethical, societal, and legal implications of AI in research and industrial settings.

You will graduate as an expert in safe and trusted AI, able to consider the implications of AI systems, to recognise safety and trust as a key part of the AI development process, and equipped to meet the needs of industry, academia, and the public sector.

Funding 

The CDT will fund approximately 12 students to join the programme in September 2022. Our fully-funded studentships are four-year awards that include tuition fees, a tax-free stipend set at the UKRI rate plus London weighting, and a generous allowance for research consumables and conference travel.

How to Apply 

Applications are now open! The CDT will consider applications in several rounds, until all places have been filled.  

Round A: Application deadline 8 December 2021; notification expected February 2022

Round B: Application deadline 15 February 2022; notification expected April 2022

Round C: Application deadline 11 April 2022; notification expected June 2022

Committed to providing an inclusive environment in which diverse students can thrive, we particularly encourage applications from women, disabled candidates, and Black, Asian and Minority Ethnic (BAME) candidates, who are currently under-represented in the sector.

For further details on how to apply visit: 

https://safeandtrustedai.org/apply-now/ 

For queries please contact: 

[email protected]  
