Dear Users,
This Thursday (starting at 4.30 pm in the YNiC open plan area) there will be 2 project proposal presentations:
1) Markus van Ackeren "Integrating multimodal semantic knowledge through language: An MEG study"
Abstract: Research from the past decade has shown that retrieving semantic knowledge about objects in our environment engages a widely distributed cortical network. For example, understanding words denoting visual information (green, round) engages visual cortical areas. In contrast, understanding words denoting auditory information (crunching) recruits auditory regions. So far, a plethora of studies have investigated words that are strongly associated with a single modality. However, semantic knowledge about most objects in the world is inherently multimodal. For example, even a single word like "apple" is associated with visual (green, round), auditory (crunchy), haptic (smooth, sticky), and gustatory (sweet) properties. In my research I aim to understand how the brain orchestrates the simultaneous retrieval of multimodal semantic knowledge. I will present behavioural and EEG findings showing that a) there is a processing cost to integrating information from multiple modalities, and b) multimodal integration is accompanied by local power changes in a low frequency band (4-8 Hz). In the current project, I would like to employ MEG to pinpoint the neural generators of the theta power modulation during multimodal integration, and to investigate interactions between multimodal and unimodal areas in the brain.
2) Samantha Strong "The Functional Sub-Divisions of the Human Motion Sensitive Visual Cortex: An fMRI Guided TMS Study"
Everyone is welcome to attend, and refreshments will be provided afterwards.
Best wishes,
Rebecca