Computational Neuroscience Seminar

POSTPONED: How to learn a cognitive map for offline learning during sleep

Speaker: Daniel Levenstein, Yale School of Medicine

Location: Warren Weaver Hall 1314

Videoconference link: https://nyu.zoom.us/j/9726507138

Date: Tuesday, February 24, 2026, 4 p.m.

Synopsis:

This talk has been postponed to a later date.

Like learning, sleep changes the brain to improve its future performance. Unlike learning, these changes occur in the absence of overt behavior or sensory input. This “offline learning” thus poses a mystery: how does internally generated activity improve the brain’s cognitive function? Leading theories of memory consolidation and policy learning posit that the hippocampus acts as a generative world model, or cognitive map, which can be used to simulate plausible “replay” episodes to learn from offline. It has recently been found that, like the hippocampus, artificial neural networks trained to predict sensory inputs develop spatially tuned cells. However, whether predictive learning can account for the ability to produce replay, or for the hippocampus’s representation of non-spatial variables, is unknown. In this talk I will discuss recent work in which we find that spatially tuned cells, which robustly emerge from all forms of predictive learning, do not guarantee the presence of a cognitive map able to generate replay. Offline simulations emerged only in networks that used recurrent connections and head-direction information to predict multi-step observation sequences, which promoted the formation of a continuous attractor reflecting the geometry of the environment. These offline trajectories showed wake-like statistics, autonomously replayed recently experienced locations, and could be directed by a virtual head-direction signal. In addition, these networks developed sweeping representations of future positions and represented non-spatial, task-relevant variables, as observed in the hippocampus. These results demonstrate how the hippocampal operations underlying offline learning can emerge from sequential predictive learning.