PhD studentship: Brain mechanisms underlying the interaction between language and visual attention (University of Aberdeen)

Closing Date
5 Jan 2020
Competition funded (UK/European students only).

School of Medicine, Medical Sciences & Nutrition, University of Aberdeen

About This PhD Project

Project Description


Dr Joost Rommers (University of Aberdeen)

Dr Soren K Andersen (University of Aberdeen)

Human language is the most advanced communication system that evolution has produced. It unfolds on a rapid time scale (listeners process around 200 syllables per minute), necessitating research methods with a high temporal resolution. Previous work has successfully applied one such method, eye-tracking, to study language using the ‘visual world’ paradigm wherein listeners hear spoken words or sentences and view a visual display of objects (e.g., Tanenhaus et al., 1995). This paradigm has revealed that listeners quickly relate language to the surrounding world: upon hearing a spoken word, the eyes fixate on a corresponding object or on related objects within a few hundred milliseconds.

However, speakers and listeners can process objects even before fixating on them, because attention can also be allocated covertly, without moving the eyes. Furthermore, the neural mechanisms underlying language-mediated shifts of attention are unknown, and it is anticipated that uncovering these mechanisms will provide deeper insights into how language influences attention. For example, does language enhance attention to relevant objects in a display, or does it suppress attention to irrelevant objects?

Fortunately, electrical brain activity (EEG) provides direct measures of covert attention allocation that distinguish between enhancement and suppression. These include time-domain measures of event-related potential components as well as spectro-temporal measures based on steady-state visual evoked potentials elicited by stimuli flickering at different frequencies (e.g., Andersen & Müller, 2010). Recent work has shown that it is feasible to exploit these techniques even in speaking participants, despite the added noise from speech muscle activity (Rommers, Meyer, & Praamstra, 2017).
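To give a flavour of the frequency-tagging logic behind such steady-state measures, the sketch below simulates an EEG trace containing responses to two stimuli flickering at different rates, where attention boosts the response to one of them. This is purely an illustrative simulation in Python, not project code (the project itself uses Matlab and R), and all signal parameters are assumptions chosen for clarity.

```python
# Illustrative simulation (not project code): SSVEP frequency tagging.
# Two stimuli flicker at distinct rates; attending to one enhances the
# EEG oscillation at that stimulus's flicker frequency.
import numpy as np

fs = 500.0                           # sampling rate in Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)      # 4 s of simulated EEG
f_attended, f_ignored = 12.0, 15.0   # flicker frequencies in Hz (assumed)

# Attended stimulus drives a larger oscillation than the ignored one,
# buried in broadband noise.
eeg = (2.0 * np.sin(2 * np.pi * f_attended * t)
       + 0.8 * np.sin(2 * np.pi * f_ignored * t)
       + np.random.default_rng(0).normal(0.0, 1.0, t.size))

# Amplitude spectrum; with a 4 s window the tagged frequencies fall on
# exact FFT bins (0.25 Hz resolution), so no spectral leakage.
spectrum = np.abs(np.fft.rfft(eeg)) * 2.0 / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

amp_attended = spectrum[np.argmin(np.abs(freqs - f_attended))]
amp_ignored = spectrum[np.argmin(np.abs(freqs - f_ignored))]
assert amp_attended > amp_ignored    # attention enhances the tagged response
```

Because each stimulus is "tagged" with its own flicker frequency, the amplitude at each frequency indexes how much attention that stimulus receives, even when the eyes never move toward it; this is what makes the technique a covert-attention measure.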

The present project will focus on the listener side. Specifically, the PhD candidate will use advanced temporal and spectral analyses of electrophysiological signals to investigate how the brain relates ongoing language input to the surrounding world. The project involves developing a high level of expertise in digital signal processing and programming (Matlab, R), skills that are highly useful in both industry and academia. It would suit candidates with a background in psychology, neuroscience, or engineering who have interests in language, attention, and electrophysiology.

Application Procedure:

Please send your completed EASTBIO application form, along with academic transcripts and a CV, to Alison McLeod. Two references should be provided by the deadline using the EASTBIO reference form; please advise your referees to return the completed form to her.


Further Information

Funding Notes
This 4-year PhD project is part of a competition funded by the EASTBIO BBSRC Doctoral Training Partnership. Due to restrictions imposed by the funding body, this opportunity is only open to UK nationals (or EU students who have been resident in the UK for 3+ years immediately prior to the programme start date). For queries on eligibility, please email Alison McLeod.

Candidates should have (or expect to achieve) a minimum of a First Class Honours degree in a relevant subject. Applicants with a minimum of a 2:1 Honours degree may be considered provided they have a Distinction at Masters level.
References

Andersen, S. K., & Müller, M. M. (2010). Behavioral performance follows the time course of neural facilitation and suppression during cued shifts of feature-selective attention. Proceedings of the National Academy of Sciences, 107(31), 13878–13882.

Rommers, J., Meyer, A. S., & Praamstra, P. (2017). Lateralized electrical brain activity reveals covert attention allocation during speaking. Neuropsychologia, 95, 101–110.

Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science, 268(5217), 1632–1634.