PhD Machine Learning for Brain-Computer Interface (BCI) based on Electroencephalography (EEG)

Closing Date
1 Apr 2019
Salary
£15,009
Address
Centre for Accountable, Responsible and Transparent AI, University of Bath
Duration
4 years

A brain-computer interface (BCI) provides a potential way for paralysed people to communicate with the outside world and to restore motor function impaired by devastating neuromuscular disorders. In general, there are two major types of BCIs based on electroencephalography (EEG): non-independent BCIs and independent BCIs. 

Regarding non-independent BCIs, steady-state visual evoked potentials (SSVEPs) are particularly attractive due to their high signal-to-noise ratio (SNR) and robustness. An SSVEP is a resonance phenomenon observed mainly at electrodes over the occipital and parietal lobes of the brain when a subject looks at a light source flickering at a constant frequency. In this case, the amplitude of the EEG increases at the flickering frequency and its harmonics, and various methods exist to extract these frequency components of the SSVEP. The first task of this project is to study SSVEP-based BCI. The goal is to develop novel machine-learning algorithms that achieve stable and accurate classification of multiple human commands. 
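As a minimal sketch of the idea behind SSVEP frequency extraction (one simple single-bin DFT approach, not necessarily the method this project will develop; the signal, sampling rate, and stimulus frequencies below are illustrative assumptions):

```python
import math

def dft_power(signal, freq, fs):
    """Power of `signal` at a single frequency `freq` (Hz), Goertzel-style."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return (re * re + im * im) / n

def classify_ssvep(signal, fs, stimulus_freqs, n_harmonics=2):
    """Pick the stimulus frequency whose fundamental plus harmonics carry the most power."""
    scores = {f: sum(dft_power(signal, f * h, fs) for h in range(1, n_harmonics + 1))
              for f in stimulus_freqs}
    return max(scores, key=scores.get)

# Demo: a synthetic "SSVEP" at 10 Hz with a weaker 2nd harmonic at 20 Hz
fs = 256
t = [i / fs for i in range(fs * 2)]  # 2 s of data
eeg = [math.sin(2 * math.pi * 10 * x) + 0.3 * math.sin(2 * math.pi * 20 * x) for x in t]
print(classify_ssvep(eeg, fs, [8.0, 10.0, 12.0]))  # -> 10.0
```

Real SSVEP decoders typically use multi-channel methods such as canonical correlation analysis rather than a single-channel power comparison, but the principle is the same: the stimulus frequency is the one whose fundamental and harmonics dominate the spectrum.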

Regarding independent BCIs, motor imagery (MI) is a very popular paradigm. The neurophysiological phenomenon of event-related desynchronization/synchronization (ERD/ERS), which accompanies both real and imagined body-part movement, provides the fundamental mechanism for classification in MI-based BCIs. Because brain dynamics are inherently complex and ERD/ERS rhythms are highly non-stationary and subject-specific, classification accuracy is generally not high enough and varies dramatically across subjects. Hence, widespread use of MI-based BCIs outside the laboratory remains out of reach. The second task of this project is to study MI-based BCI. The goal is to develop novel machine-learning algorithms that achieve stable and accurate recognition of human intentions. 
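To make the ERD/ERS mechanism concrete, here is a minimal sketch of how ERD is commonly quantified: the percentage drop in mu-band (roughly 8-12 Hz) power during imagery relative to rest. The synthetic epochs, sampling rate, and band edges are illustrative assumptions, not the project's actual pipeline:

```python
import math

def band_power(epoch, fs, f_lo, f_hi):
    """Approximate band power by summing single-bin DFT power at 1 Hz steps."""
    n = len(epoch)
    total = 0.0
    f = f_lo
    while f <= f_hi:
        re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(epoch))
        im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(epoch))
        total += (re * re + im * im) / n
        f += 1.0
    return total

def erd_percent(rest_epoch, task_epoch, fs, f_lo=8.0, f_hi=12.0):
    """ERD as percentage mu-band power drop relative to rest (positive = desynchronization)."""
    p_rest = band_power(rest_epoch, fs, f_lo, f_hi)
    p_task = band_power(task_epoch, fs, f_lo, f_hi)
    return 100.0 * (p_rest - p_task) / p_rest

# Demo: a strong 10 Hz mu rhythm at rest that is attenuated during imagery
fs = 128
t = [i / fs for i in range(fs)]  # 1 s epochs
rest = [math.sin(2 * math.pi * 10 * x) for x in t]
task = [0.5 * math.sin(2 * math.pi * 10 * x) for x in t]
print(round(erd_percent(rest, task, fs)))  # -> 75
```

An MI classifier then has to decide, from features like these over several channels, which body part the subject imagined moving; the non-stationarity mentioned above means such band-power features drift across sessions and differ strongly between subjects.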

The AI novelty in this project is that advanced machine-learning techniques will be developed to process brain signals and improve the performance of traditional BCIs. 

This project aims to realize practical, high-performance BCI-assisted systems that can enhance quality of life for disabled people and the ageing population. 

This project is associated with the UKRI CDT in Accountable, Responsible and Transparent AI (ART-AI), which is looking for its first cohort of at least 10 students to start in September 2019. Students will be fully funded for 4 years (stipend, UK/EU tuition fees and research support budget). Further details can be found at: http://www.bath.ac.uk/research-centres/ukri-centre-for-doctoral-training-in-accountable-responsible-and-transparent-ai/

Desirable qualities in candidates include intellectual curiosity, a strong background in maths, and programming experience. 

Applicants should hold, or expect to receive, a First Class or good Upper Second Class Honours degree. A master’s level qualification would also be advantageous. 

Informal enquiries about the project should be directed to Dr Dingguo Zhang at dz492@bath.ac.uk

Enquiries about the application process should be sent to art-ai-applications@bath.ac.uk

Formal applications should be made via the University of Bath’s online application form for a PhD in Computer Science: https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP01&code2=0013 

Start date: 23 September 2019.

Funding Notes

ART-AI CDT studentships are available on a competition basis for UK and EU students for up to 4 years. Funding will cover UK/EU tuition fees as well as providing maintenance at the UKRI doctoral stipend rate (£15,009 per annum for 2019/20) and a training support fee of £1,000 per annum. 

We also welcome all-year-round applications from self-funded candidates and candidates who can source their own funding.

References

1. Wang, M., Li, R., Zhang, R., Li, G. & Zhang, D. (2018). A Wearable SSVEP-Based BCI System for Quadcopter Control Using Head-Mounted Device. IEEE Access, 6, 26789-26798. 

2. Mashat, M. E. M., Li, G. & Zhang, D. (2017). Human-to-human closed-loop control based on brain-to-brain interface and muscle-to-muscle interface. Scientific Reports, 7(1), 11001. 

3. Gui, K., Liu, H. & Zhang, D. (2017). Toward Multimodal Human-Robot Interaction to Enhance Active Participation of Users in Gait Rehabilitation. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(11), 2054-2066. 

4. Li, G. & Zhang, D. (2016). Brain-Computer Interface Controlled Cyborg: Establishing a Functional Information Transfer Pathway from Human Brain to Cockroach Brain. PLoS ONE, 11(3), e0150667. 


Contact Details

Dr Dingguo Zhang: dz492@bath.ac.uk
art-ai-applications@bath.ac.uk