Welcome to the Time in Brain and Behaviour Laboratory at the University of Melbourne.
Our lab investigates time in the brain, from a neural, cognitive, and behavioural perspective. We use computational methods and neuroimaging techniques, particularly multivariate EEG decoding, as well as psychophysical and behavioural approaches, to study how the brain works over time.
We are always looking for bright, enthusiastic (honours) students and PhD candidates to contribute to the work in the lab. Take a look at our Projects page to see more about what we work on.
Update (17-03-2020): We are very proud of Tessel's latest paper, just out in PNAS, using EEG to show that our brains predict the present!
- Dr Hinze Hogendoorn
- Jim Maarseveen, MSc
- Tessel Blom, MSc
- Philippa Johnson, MSc
- Sidney Davies
- Lysha Lee
- Kate Coffey
- Jane Yook
- Nicky Rickerby
- Deanna Spinks
- Prof. Allison McKendrick – Optometry and Vision Sciences, University of Melbourne
- Prof. Anthony Burkitt – Biomedical Engineering, University of Melbourne
- Dr. Stefan Bode – Decision Neuroscience Lab, University of Melbourne
- Dr. Daniel Feuerriegel – Decision Neuroscience Lab, University of Melbourne
- A/Prof Geoff Stuart – University of Melbourne
- A/Prof Thomas Carlson – University of Sydney
- Prof. Frans Verstraten – University of Sydney
- Prof. David Alais – University of Sydney
- Dr. Hamish MacDougall – University of Sydney
- Prof. Patrick Cavanagh – Université Paris Descartes, France
- Dr. Rufin VanRullen – CNRS Toulouse, France
- Elle van Heusden
- Elektra Schubert
- Adam Ryde
- Dominic Yip
- Chelsea Liang
- Duy Dao
- Ryan Maloney
- Ahmad Al-Dhalaan
Predicting the Present
The brain needs time to process sensory input, meaning that our conscious experience of the world is based on information that is outdated by the time we perceive it. One way that the brain might compensate for its internal delays is by prediction. This project uses time-resolved EEG recordings to study the role of anticipatory neural activity in “predicting” the present.
Shortcuts to consciousness
Although the brain needs time to process sensory input, rapidly detecting and responding to events in the outside world is vital for any organism's survival. Because speed sometimes matters more than accuracy, our brains have evolved to 'fast-track' crucial information so that we can act on it quickly. To do so, the brain makes assumptions, and when these assumptions are violated, visual illusions result: we see things that are not really happening. This project uses visual illusions such as the flash-lag effect to understand the shortcuts the brain takes when compensating for its own delays.
When predictions fail
The brain uses predictive strategies to compensate for its own delays, allowing us to "see" objects where they are right now, even if the sensory information about "right now" has not yet been processed. But what happens if events unfold unpredictably, such as when a moving object changes direction? The brain would need time to detect that its prediction was wrong, and during that time it would continue to extrapolate the object along its initial trajectory. What happens to these failed predictions? How does the brain revise its old prediction? This project uses a combination of multivariate EEG decoding and behavioural techniques to understand how the brain corrects its failed predictions.
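Several of the projects above rely on time-resolved multivariate EEG decoding: a classifier is trained separately at each timepoint on the pattern of activity across electrodes, yielding a timecourse of decoding accuracy that reveals when information about a stimulus becomes available in the neural signal. The sketch below illustrates the idea on simulated data (all numbers and the nearest-class-mean classifier are illustrative assumptions, not our actual analysis pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated EEG epochs: trials x channels x timepoints (toy data).
n_trials, n_channels, n_times = 40, 8, 50
X = rng.normal(size=(n_trials, n_channels, n_times))
y = np.repeat([0, 1], n_trials // 2)

# Inject a class-dependent signal from timepoint 20 onward,
# mimicking a stimulus-evoked difference between two conditions.
X[y == 1, :, 20:] += 1.0

def decode_timecourse(X, y):
    """Leave-one-trial-out nearest-class-mean decoding at each timepoint."""
    n_trials, _, n_times = X.shape
    acc = np.zeros(n_times)
    for t in range(n_times):
        correct = 0
        for i in range(n_trials):
            train = np.arange(n_trials) != i  # hold out trial i
            m0 = X[train & (y == 0), :, t].mean(axis=0)
            m1 = X[train & (y == 1), :, t].mean(axis=0)
            # Classify the held-out trial by its nearer class mean.
            pred = int(np.linalg.norm(X[i, :, t] - m1)
                       < np.linalg.norm(X[i, :, t] - m0))
            correct += pred == y[i]
        acc[t] = correct / n_trials
    return acc

acc = decode_timecourse(X, y)
print(acc[:20].mean(), acc[20:].mean())  # near chance early, high once the signal appears
```

In real analyses the same logic is applied with stronger classifiers (e.g. linear discriminant analysis) and proper cross-validation, but the shape of the result is the same: the moment decoding accuracy rises above chance tells us when the brain starts representing the decoded information.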
If you are interested in contributing to any of these projects, or have another idea you would like to discuss, please get in touch!
Blom, T., Johnson, P., Feuerriegel, D., Bode, S., & Hogendoorn, H. (2020). Predictions drive neural representations of visual events ahead of incoming sensory information. Proceedings of the National Academy of Sciences USA. DOI: 10.1073/pnas.1917777117. (pdf)
Coffey, K., Adamian, N., Blom, T., van Heusden, E., Cavanagh, P., & Hogendoorn, H. (2019). Expecting the unexpected: Temporal expectation increases the flash-grab effect. Journal of Vision 19(9). (pdf)
Hogendoorn, H., & Burkitt, A.N. (2019). Predictive coding with neural transmission delays: a real-time temporal alignment hypothesis. eNeuro. DOI: 10.1523/ENEURO.0412-18.2019. (pdf)
Maarseveen, J., Paffen, C.L.E., Verstraten, F.A.J., & Hogendoorn, H. (2019). The duration aftereffect does not reflect adaptation to perceived duration. PLOS ONE 14(3): e0213163. (pdf)
Blom, T., Liang, Q., & Hogendoorn, H. (2019). When predictions fail: correction for extrapolation in the flash-grab effect. Journal of Vision 19(3). (pdf)
Van Heusden, E., Harris, A.M., Garrido, M., & Hogendoorn, H. (2019). Predictive coding of visual motion in both monocular and binocular human visual processing. Journal of Vision 19(3). (pdf)
Goddard, E., Klein, C., Solomon, S.G., Hogendoorn, H., & Carlson, T.A. (2018). Interpreting the dimensions of neural feature representations revealed by dimensionality reduction. NeuroImage 180A, 41-67. (pdf)
Van Heusden, E., Rolfs, M., Cavanagh, P., & Hogendoorn, H. (2018). Motion extrapolation for eye movements predicts perceived motion-induced position shifts. Journal of Neuroscience 38(38), 8243-8250. (pdf)
Maarseveen, J., Hogendoorn, H., Verstraten, F.A.J., & Paffen, C.L.E. (2018). Attention Gates the Selective Encoding of Duration. Scientific Reports 8, 2522. (pdf)
Hogendoorn, H., & Burkitt, A.N. (2018). Predictive coding of visual object position ahead of moving objects revealed by time-resolved EEG decoding. NeuroImage 171, 55-61. (pdf)
Maarseveen, J., Paffen, C.L.E., Verstraten, F.A.J., & Hogendoorn, H. (2017). Representing dynamic stimulus information during occlusion. Vision Research 138, 40-49. (pdf)
Hogendoorn, H., Alais, D., MacDougall, H., & Verstraten, F.A.J. (2017). Velocity perception in a moving observer. Vision Research 138, 12-17. (pdf)
Fahrenfort, J.J., van Leeuwen, J., Olivers, C., & Hogendoorn, H. (2017). Perceptual Integration without Conscious Access. Proceedings of the National Academy of Sciences USA 114(14), 3744-3749. (pdf)
Maarseveen, J., Hogendoorn, H., Verstraten, F.A.J., & Paffen, C.L.E. (2017). An investigation of the spatial selectivity of the duration after-effect. Vision Research 130, 67-75. (pdf)
Hogendoorn, H., Verstraten, F.A.J., MacDougall, H., & Alais, D. (2017). Vestibular signals of self-motion modulate global motion perception. Vision Research 130, 22-30. (pdf)
Hogendoorn, H. (2016). Voluntary saccadic eye movements ride the attentional rhythm. Journal of Cognitive Neuroscience 28(10), 1625-1635. (pdf)
Hogendoorn, H., Kammers, M.P.M., Haggard, P., & Verstraten, F.A.J. (2015). Self-touch modulates the somatosensory evoked P100. Experimental Brain Research 233(10), 2845-2858. (pdf)
Hogendoorn, H., Verstraten, F.A.J., & Cavanagh, P. (2015). Strikingly rapid neural basis of motion-induced position shifts revealed by high temporal-resolution EEG pattern classification. Vision Research 113, 1-10. (pdf)
Hogendoorn, H. (2015). From sensation to perception: Using multivariate classification of visual illusions to identify neural correlates of conscious awareness in space and time. Perception 44, 71-78. (pdf)
Dr. Hinze Hogendoorn
Redmond Barry Building, Room 806
Parkville, VIC 3010, Australia
Tel: +61 (0)3 8344 3895
Personal website: http://www.hinzehogendoorn.com
Staff website: http://findanexpert.unimelb.edu.au/display/person797181
Follow us on Twitter: https://twitter.com/hinzehogendoorn