Research Associate in Trustworthy Machine Learning for Malware Detection

Location: London, England
Job Type: Full Time
Deadline: 14 Apr 2024

The Department of Informatics of King’s College London is looking to appoint a Research Associate in Trustworthy Machine Learning for Malware Detection, to work on the EPSRC project “XAdv: Robust Explanations for Malware Detection”, at the intersection of AI and Systems Security, led by Dr. Fabio Pierazzi (PI). 

Project Objectives 

The EPSRC project XAdv ("X" for explanation, "Adv" for adversarial robustness) aims to design "robust explanations" for malware detection, i.e., explanations of model decisions that are easy for security analysts to understand and visualize (to support faster verification of maliciousness and development of patches), and that remain trustworthy and reliable even in the presence of malware evolution over time and evasive malware authors.

Robustness of explanations will be evaluated from two main perspectives: concept drift (i.e., malware evolution over time) and adversarial ML (i.e., ML-aware attackers who carefully craft malicious samples to evade detection systems).

Moreover, the project will explore how robust explanations can be used to automatically adapt ML-based malware detection models to new threats over time (e.g., through novel active learning strategies), and to integrate domain knowledge from security analysts' feedback on the explanations to improve detection accuracy.

Application Details

Applicants should explain in their cover letter how they plan to address the research challenges posed in the XAdv EPSRC project, and how their profile and expertise fit the scope of the project and the essential/desirable criteria.

Shortlisted candidates will be invited to online interviews, where they will give a short research presentation followed by a Q&A with the panel. Detailed instructions will be shared with shortlisted candidates (tentatively) by April 19. Interviews will (tentatively) be held between May 1 and May 3.

The Role

The candidate will be responsible for leading the co-design and development of novel solutions for robust explanations for malware detection that remain reliable in the presence of concept drift and adversarial attacks.

They will also be expected to lead top security/ML publications (e.g., at USENIX Security, S&P, TOPS, ICML, ICLR, AISec, DLSP) under the guidance of Dr. Fabio Pierazzi and the advisory team as required. They will develop prototypes and code frameworks (e.g., in Python) for large-scale experimental evaluations of their research ideas, and will also co-lead user studies and methodologies to verify the accuracy of the explanation methods.

The candidate will become part of the Cybersecurity Group (CYS) in Informatics at King's (https://www.kcl.ac.uk/research/cys ), and work more closely with the team led by Dr. Fabio Pierazzi (https://fabio.pierazzi.com/team/ ), currently consisting of 6 Ph.D. students. The wider team of collaborators of this project, including industrial partners such as NCC Group and Avast (now Gen Digital), and universities such as UIUC, TU Berlin, and University of Cagliari, will ensure excellent development opportunities throughout the project. The researcher will be allowed to support the supervision of PhD, MSc and BSc students associated with the project, to increase their competitiveness for future academic and industry roles.

For further information about the Department of Informatics at King’s, please see https://nms.kcl.ac.uk/luc.moreau/informatics/overview.pdf.

Contract type

This post will be offered on a fixed-term contract for 18 months*, not exceeding 30th September 2026, which is the end date of the grant.

Anticipated June/July 2024 start

This is a full-time post (100% full-time equivalent).

*There is potential to extend, depending on funding availability and the grant end date (30th September 2026).


