RESEARCH ASSISTANT

PROJECT TITLE:  Uniting Deep Learning and VR to Understand Human Gaze Behavior

 

PROJECT DESCRIPTION:  How do we allocate our attention in a complex, 360-degree scene? This project will use head-mounted VR to immerse individuals in novel panoramic environments and deep learning (convolutional neural networks, CNNs) to model their eye-gaze behavior, with the goal of understanding how the brain represents real-world scenes.
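For a flavor of the kind of modeling involved, below is a minimal, illustrative sketch (not the lab's actual pipeline): a toy convolutional encoder-decoder in PyTorch that maps an equirectangular 360-degree frame to a per-pixel gaze-density map. All layer sizes, image dimensions, and the random stand-in data are assumptions chosen only for demonstration.

    # Illustrative only: a toy CNN that maps an equirectangular (360-degree)
    # RGB frame to a per-pixel gaze-density map. Layer sizes, image dimensions,
    # and the random stand-in data below are assumptions for demonstration,
    # not the lab's actual models or data.
    import torch
    import torch.nn as nn

    class GazeSaliencyCNN(nn.Module):
        def __init__(self):
            super().__init__()
            # Encoder: downsample the panorama while increasing channel depth.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            )
            # Decoder: upsample back to the input resolution; the single output
            # channel is read as the probability of a fixation at each pixel.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
            )

        def forward(self, x):
            return torch.sigmoid(self.decoder(self.encoder(x)))

    if __name__ == "__main__":
        model = GazeSaliencyCNN()
        frames = torch.rand(2, 3, 128, 256)                      # stand-in 360 frames (B, C, H, W)
        fixations = (torch.rand(2, 1, 128, 256) > 0.98).float()  # stand-in binary fixation maps
        pred = model(frames)
        loss = nn.functional.binary_cross_entropy(pred, fixations)
        loss.backward()  # gradient for one step; a real pipeline would loop over recorded gaze data
        print("predicted gaze map:", tuple(pred.shape), "loss:", round(loss.item(), 3))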

 

PREREQUISITES: Seeking a student in Engineering or CS with previous experience working with convolutional neural networks (CNNs). We ask for a commitment of at least 8-10 hours/week for two terms. This project is a great opportunity to apply your background in deep learning in a fast-paced, scientific environment.

 

OUR LAB: The Robertson Lab at Dartmouth is a vibrant, multidisciplinary environment.  We use neuroimaging techniques (fMRI and magnetic resonance spectroscopy, MRS), psychophysics, and Virtual Reality (VR) to tackle research questions in areas such as:

  • The neurobiology of autism (fMRI, MRS, EEG)

  • Real-world scene perception (fMRI, head-mounted VR)

  • Neuroplasticity and learning (MRS, fMRI)

HOW TO APPLY: Please contact Dr. Caroline Robertson (cerw@dartmouth.edu) with a brief description of your interest in the position, a summary of your background, and a coding portfolio consisting of two well-annotated sample scripts.

OUR LAB

3 Maynard Street, Moore Hall

Dartmouth College

Hanover, NH 03755

Email: cerw@dartmouth.edu
