Please note: projects with Healthy Brains, Healthy Lives are open to all disciplines and take place from mid-May to mid-August, rather than over June and July as with other projects. Students who take part in these projects will be awarded an additional $1,000 on top of the $5,000 standard award offered through the program.
Professor Stuart Trenholm
While much has been discovered over the last 100 years about how our brain allows us to perceive the world around us through the visual system, much remains to be discovered. In this project, we will use behavioral tests in mice to examine their visual perception, probing their ability to perceive and generalize different types of visual features.
Student Responsibilities: Students will be expected to perform behavioral tests using a touchscreen operant reward system to examine visual perception in mice. Students will also be expected to analyze their data, generate figures and perform statistical tests to describe their results, and present their findings in lab meetings.
Prerequisites: Be friendly and motivated! Be interested in asking questions.
Deliverable: Students will learn how to design and perform a scientific experiment, analyze data (developing some coding skills along the way), select and perform appropriate statistical tests, and generate meaningful figures explaining their results (also involving coding). Students will receive instruction on publicly presenting their research and be included in journal club meetings, where they will learn to read and discuss primary scientific literature.
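To give a flavor of the kind of statistical analysis involved, here is a minimal sketch of a paired t-test comparing behavioral accuracy across two stimulus conditions. The numbers and condition names are hypothetical placeholders, not data from this lab:

```python
import math
import statistics

# Hypothetical per-mouse accuracy (fraction correct) in two stimulus
# conditions -- illustrative numbers only, not real data.
easy = [0.88, 0.84, 0.90, 0.82, 0.86, 0.87, 0.83, 0.89, 0.85, 0.86]
hard = [0.66, 0.62, 0.70, 0.60, 0.65, 0.68, 0.63, 0.67, 0.64, 0.66]

# Paired t-test by hand: t = mean(diff) / (sd(diff) / sqrt(n)),
# since each mouse is measured in both conditions.
diffs = [e - h for e, h in zip(easy, hard)]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
print(f"paired t({n - 1}) = {t_stat:.2f}")
```

In practice a library routine (and an appropriate check of the test's assumptions) would replace the hand calculation; the point is only that each analysis step maps to a few lines of code.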
PROJECT: Behavioral studies with mice in virtual reality
Professor Arjun Krishnaswamy
This project will involve training mice to detect visual stimuli in virtual reality in order to obtain behavioral measures of visual attention.
Student responsibilities: Students will work together with a graduate student and a postdoctoral fellow, as well as other undergraduates, to train a cohort of mice, collect data, and, if time permits, perform analysis in MATLAB.
Prerequisites: Knowledge of basic biology and math.
Deliverable: A behavioral dataset that characterizes the effects of attention during visually guided behavior.
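Detection performance in tasks like this is often summarized with signal-detection measures such as d′ (sensitivity). As a rough illustration only (hypothetical trial counts; the project's actual analysis pipeline is in MATLAB and may use different measures):

```python
from statistics import NormalDist

# Hypothetical counts from one session -- illustrative only.
hits, misses = 78, 22               # stimulus-present trials
false_alarms, correct_rej = 15, 85  # stimulus-absent trials

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)

# d' = z(hit rate) - z(false-alarm rate), where z is the
# inverse CDF of the standard normal distribution.
z = NormalDist().inv_cdf
d_prime = z(hit_rate) - z(fa_rate)
print(f"hit rate = {hit_rate:.2f}, FA rate = {fa_rate:.2f}, d' = {d_prime:.2f}")
```

Comparing d′ between attended and unattended conditions is one standard way to quantify an attentional effect on detection.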
PROJECT: Segmenting pyramidal neurons in visual cortex
Professor Blake Richards
For a larger project in the lab, we need to identify pyramidal neurons in the brains of mice from fluorescence microscopy images. These neurons have large, complex, tree-like structures that reach up to the surface of the brain (see the example figure below). In the long run, we hope to train artificial intelligence systems to do this identification for us. But to do that, we need to provide human-labeled data, which requires someone to carefully trace these neuronal “trees” by hand.
Student responsibilities: The student will be required to trace pyramidal neurons in 3D fluorescence microscopy images taken from the brains of mice.
Prerequisites: None, though a strong interest in neuroscience would be desirable.
Deliverable: This work will produce a new dataset for training artificial intelligence systems to identify pyramidal neurons in fluorescence microscopy images.
PROJECT: Tests of human-like cognition in artificial intelligence
Professor Blake Richards
In this project we will build “video games” for both humans and artificial intelligence (AI) agents to “play.” We will use these games to test for some of the cognitive capabilities that humans and animals have (such as the ability to remember what you did when), and then examine how AI agents are or are not capable of passing these tests.
Student responsibilities: The student will be required to use SilicoLabs (https://www.silicolabs.ca/) to create little games/puzzles that people can solve but which will be difficult for current AI systems.
Prerequisites: Some knowledge of computer programming, even just the basics, would be important.
Deliverable: We will develop specific tests/games that probe for an understanding of one’s past and the ability to identify key steps required to solve a problem/puzzle. If time allows, we will collect initial data demonstrating the difference between human and AI performance in these tests.