Funded PhD Opportunity AI4N: AI for Next-generation Neurotechnology

Location: Londonderry, Northern Ireland
Job Type: Full-time
Deadline: 11 Oct 2022

Summary

The use of implantable and wearable neurotechnology is expected to increase dramatically in the coming years, with applications including movement-independent control and communication, rehabilitation, disease treatment, health improvement, recreation (neurogaming) and sport. There are multiple driving forces: continued advances in the underlying science and technology; increasing demand for solutions to repair the nervous system; a growing ageing population worldwide, creating a need for solutions to age-related neurodegenerative disorders and for “assistive” brain-computer interface (BCI) technologies; and commercial demand for non-medical BCIs. The potential for the UK economy to benefit from neurotechnology R&D has been recognised in the recent transformative roadmap for neurotechnology in the UK [1].

Wearable neurotechnologies which perform electroencephalography (EEG) non-invasively present an excellent challenge for Artificial Intelligence. First, EEG has a low signal-to-noise ratio (SNR): the brain activity measured is often buried under multiple sources of environmental, physiological and activity-specific noise of similar or greater amplitude, often referred to as ‘artifacts’. Secondly, EEG is a non-stationary signal (i.e., its statistics vary across time); as a result, a classifier trained on a temporally limited amount of user data may generalize poorly to data recorded at a different time from the same individual. Thirdly, there is high inter-subject (user) variability arising from physiological differences between individuals, which varies in magnitude but can severely affect the performance of models intended to generalize across subjects. AI must therefore be developed to cope with the non-stationarity of biological signals (drifting dynamics) and to support online, real-time adaptation in continually evolving systems. Roy et al. [2] systematically assessed the prospects of deep learning for EEG-based BCIs and showed that deep learning approaches, whilst promising, have yet to deliver the same gains in processing neural signals as they have in, for example, natural language processing (NLP) and computer vision (CV), largely due to two factors: the limited data available and the much lower SNR compared with NLP and CV.
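To make the non-stationarity challenge concrete, the short Python sketch below is purely illustrative (the synthetic “band-power” features, the constant-offset drift model and the LDA classifier are assumptions for demonstration, not part of the proposed framework): a classifier calibrated on one recording session loses accuracy on a later session whose feature statistics have drifted.

  # Illustrative only: synthetic two-class features stand in for EEG band power.
  # A constant baseline shift between sessions mimics cross-session non-stationarity.
  import numpy as np
  from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

  rng = np.random.default_rng(0)

  def simulate_session(n_trials=200, n_features=8, class_gap=1.0, drift=0.0):
      # Two-class Gaussian features; `drift` adds a session-wide baseline offset.
      y = rng.integers(0, 2, n_trials)
      means = np.where(y[:, None] == 1, class_gap, -class_gap)
      X = means + rng.normal(0.0, 2.0, (n_trials, n_features)) + drift
      return X, y

  X_train, y_train = simulate_session(drift=0.0)  # session 1 (calibration)
  X_test, y_test = simulate_session(drift=2.0)    # session 2 (later recording)

  clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
  print("Within-session accuracy:", clf.score(X_train, y_train))
  print("Cross-session accuracy :", clf.score(X_test, y_test))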

The proposed research programme, which is aligned with a UKRI Turing AI Fellowship (tinyurl.com/TuringAIFellow) and an Ulster spinout company (neuroconcise.co.uk), aims to address these factors with innovations in deep neural networks trained through neuroevolution approaches.

The proposed AI framework for Next Generation Neurotechnology (AI4N) will involve long short-term memory (LSTM) based recurrent networks in the neural time series prediction pre-processing (NTSPP) framework [3], a hybrid capsule neural network [4] to replace traditional feature extraction and classification, and reinforcement learning-enhanced multi-objective evolutionary algorithms (MOEAs) [5] and co-evolutionary algorithms (CoEAs) [6] for hyperparameter optimisation. All of these approaches will be available for integration.
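As an informal illustration of how the NTSPP idea could be prototyped (a minimal sketch under assumed shapes and hyperparameters, written here in PyTorch; it is not the project's actual implementation of [3]), the code below trains one LSTM predictor per class to forecast the next sample of segments from its own class, and uses the per-class prediction errors on a test segment as discriminative features.

  # Illustrative sketch only: class-specific LSTM predictors whose prediction
  # errors act as features, loosely in the spirit of NTSPP [3].
  import torch
  import torch.nn as nn

  class OneStepPredictor(nn.Module):
      # LSTM that predicts the next sample of a single-channel segment.
      def __init__(self, hidden_size=32):
          super().__init__()
          self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
          self.head = nn.Linear(hidden_size, 1)

      def forward(self, x):                 # x: (batch, time, 1)
          out, _ = self.lstm(x)
          return self.head(out[:, -1, :])   # predicted next sample: (batch, 1)

  def train_predictor(model, segments, epochs=200, lr=1e-2):
      # Fit one predictor on segments of a single class; segments: (N, T, 1).
      opt = torch.optim.Adam(model.parameters(), lr=lr)
      inputs, targets = segments[:, :-1, :], segments[:, -1, :]
      for _ in range(epochs):
          opt.zero_grad()
          loss = nn.functional.mse_loss(model(inputs), targets)
          loss.backward()
          opt.step()
      return model

  def ntspp_features(models, segment):
      # Per-class squared prediction errors for one test segment of shape (T, 1).
      x, target = segment[None, :-1, :], segment[None, -1, :]
      with torch.no_grad():
          return torch.stack([(m(x) - target).pow(2).mean() for m in models])

  # Toy usage with synthetic signals standing in for two-class EEG segments.
  torch.manual_seed(0)
  t = torch.linspace(0, 6.28, 65)
  class_a = torch.sin(t).repeat(32, 1)[..., None] + 0.1 * torch.randn(32, 65, 1)
  class_b = torch.cos(2 * t).repeat(32, 1)[..., None] + 0.1 * torch.randn(32, 65, 1)

  models = [train_predictor(OneStepPredictor(), seg) for seg in (class_a, class_b)]
  # The matching class's predictor should typically yield the smaller error.
  print(ntspp_features(models, class_a[0]))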

Successful applicants will have access to the Spatial Computing and Neurotechnology Innovation Hub (SCANi-hub), the Northern Ireland Functional Brain Mapping Facility and the Northern Ireland High Performance Computing Facility, and will join a team of researchers trialling neurotechnology with end-users, including those with brain injuries and people living with stroke, spinal injury and post-traumatic stress disorder, as well as in advanced applications such as silent speech decoding [7], [8] and 3D arm motion trajectory prediction [9] in augmented and virtual reality paradigms.

Essential criteria

Applicants should hold, or expect to obtain, a First or Upper Second Class Honours Degree in a subject relevant to the proposed area of study.

We may also consider applications from those who hold equivalent qualifications, for example, a Lower Second Class Honours Degree plus a Master’s Degree with Distinction.

In exceptional circumstances, the University may consider a portfolio of evidence from applicants who have appropriate professional experience which is equivalent to the learning outcomes of an Honours degree in lieu of academic qualifications.

  • Experience using research methods or other approaches relevant to the subject domain
  • Research proposal of 1500 words detailing aims, objectives, milestones and methodology of the project
  • A demonstrable interest in the research area associated with the studentship

Desirable criteria

If the University receives a large number of applications for the project, the following desirable criteria may be applied to shortlist applicants for interview.

  • First Class Honours (1st) Degree
  • Master's at 70%
  • For VCRS Awards, Master's at 75%
  • Experience using research methods or other approaches relevant to the subject domain
  • Work experience relevant to the proposed project
  • Publications - peer-reviewed
  • Experience of presentation of research findings

Funding and eligibility

The University offers the following levels of support:

Department for the Economy (DfE)

The scholarship will cover tuition fees at the Home rate and provide a maintenance allowance of £16,062 (tbc) per annum for three years (subject to satisfactory academic performance). It also includes a research training support grant (RTSG) allocation of £900 per annum for three years to help support the PhD researcher.

  • Candidates with pre-settled or settled status under the EU Settlement Scheme, who also satisfy a three-year residency requirement in the UK prior to the start of the course for which a Studentship is held, MAY receive a Studentship covering fees and maintenance.
  • Republic of Ireland (ROI) nationals who satisfy three years’ residency in the UK prior to the start of the course MAY receive a Studentship covering fees and maintenance (ROI nationals do not need to have pre-settled or settled status under the EU Settlement Scheme to qualify).
  • Other (non-ROI) EU applicants are classed as ‘International’ and are not eligible for this source of funding.
  • Applicants who already hold a doctoral degree or who have been registered on a programme of research leading to the award of a doctoral degree on a full-time basis for more than one year (or part-time equivalent) are NOT eligible to apply for an award.

Due consideration should be given to financing your studies. Further information on the cost of living is available.

Recommended reading

  • K. Mathieson, T. Denison, and C. Winkworth-Smith, “A transformative roadmap for neurotechnology in the UK,” 2021. [Online]. Available: ktn-uk.org/wp-content/uploads/2021/06/A-transformative-roadmap-for-neurotechnology-in-the-UK.pdf
  • Y. Roy, H. Banville, I. Albuquerque, A. Gramfort, T. H. Falk, and J. Faubert, “Deep learning-based electroencephalography analysis: a systematic review,” J. Neural Eng., vol. 16, no. 5, p. 051001, Aug. 2019, doi: 10.1088/1741-2552/ab260c.
  • D. Coyle, “Neural Network Based Auto Association and Time-Series Prediction,” IEEE Comput. Intell. Mag., pp. 47–59, Nov. 2009.
  • M. Khodadadzadeh, X. Ding, P. Chaurasia, and D. Coyle, “A Hybrid Capsule Network for Hyperspectral Image Classification,” IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., vol. 14, pp. 11824–11839, 2021, doi: 10.1109/JSTARS.2021.3126427.
  • A. Zhou et al., “Multiobjective evolutionary algorithms: A survey of the state of the art,” Swarm Evol. Comput., vol. 1, no. 1, pp. 32–49, 2011, doi: 10.1016/j.swevo.2011.03.001.
  • L. Miguel Antonio and C. A. Coello Coello, “Coevolutionary Multiobjective Evolutionary Algorithms: Survey of the State-of-the-Art,” IEEE Trans. Evol. Comput., vol. 22, no. 6, pp. 851–865, 2018, doi: 10.1109/TEVC.2017.2767023.
  • C. Cooney, R. Folli, and D. Coyle, “Neurolinguistics for Continuous Direct-Speech Brain-Computer Interfaces,” iScience, vol. 8, pp. 103–125, 2018, doi: 10.1016/j.isci.2018.09.016.
  • C. Cooney, R. Folli, and D. Coyle, “A bimodal deep learning architecture for EEG-fNIRS decoding of overt and imagined speech.”
  • A. Korik, R. Sosnik, N. Siddique, and D. Coyle, “Decoding Imagined 3D Hand Movement Trajectories From EEG: Evidence to Support the Use of Mu, Beta, and Low Gamma Oscillations,” Front. Neurosci., vol. 12, pp. 1–16, Mar. 2018, doi: 10.3389/fnins.2018.00130.