Predictive Attractor Models

Advances in Neural Information Processing Systems (NeurIPS'24)

Ramy Mounir, Sudeep Sarkar

University of South Florida
[Overview figure: resrc/overview.png]

Paper | Code | Video | BibTeX

Abstract


Sequential memory, the ability to form and accurately recall a sequence of events or stimuli in the correct order, is a fundamental prerequisite for biological and artificial intelligence, as it underpins numerous cognitive functions (e.g., language comprehension, planning, and episodic memory formation). However, existing methods for sequential memory suffer from catastrophic forgetting, limited capacity, slow iterative learning procedures, low-order Markov memory, and, most importantly, the inability to represent and generate multiple valid future possibilities stemming from the same context. Inspired by biologically plausible neuroscience theories of cognition, we propose Predictive Attractor Models (PAM), a novel sequence memory architecture with desirable generative properties. PAM is a streaming model that learns a sequence in an online, continuous manner by observing each input only once. Additionally, we find that PAM avoids catastrophic forgetting by uniquely representing past context through lateral inhibition in cortical minicolumns, which prevents new memories from overwriting previously learned knowledge. PAM generates future predictions by sampling from a union set of predicted possibilities; this generative ability is realized through an attractor model trained alongside the predictor. We show that PAM is trained with local computations through Hebbian plasticity rules in a biologically plausible framework. Other desirable traits (e.g., noise tolerance, CPU-based learning, capacity scaling) are discussed throughout the paper. Our findings suggest that PAM represents a significant step forward in the pursuit of biologically plausible and computationally efficient sequential memory models, with broad implications for cognitive science and artificial intelligence research.
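To make the learning mechanism concrete, below is a minimal, hypothetical Python sketch of single-pass Hebbian sequence learning with sparse binary codes. The code size, sparsity level, coding scheme, and the names sparse_code, hebbian_step, and predicted_union are illustrative assumptions, not PAM's actual minicolumn architecture, which is described in the paper.

import numpy as np

rng = np.random.default_rng(0)
N, K = 1024, 32                          # assumed code size and active-bit count

codes = {}
def sparse_code(key):
    """Random K-sparse binary code for a symbol (cached per symbol)."""
    if key not in codes:
        c = np.zeros(N, dtype=np.float32)
        c[rng.choice(N, size=K, replace=False)] = 1.0
        codes[key] = c
    return codes[key]

W = np.zeros((N, N), dtype=np.float32)   # transition weights, learned online

def hebbian_step(prev, nxt, lr=1.0):
    """Local one-shot update: strengthen prev -> nxt co-activations."""
    global W
    W += lr * np.outer(nxt, prev)        # purely local, no backpropagation

def predicted_union(prev):
    """All bits driven by prev: the union of learned successors."""
    return (W @ prev > 0).astype(np.float32)

# Stream each transition exactly once; "A" has two valid futures.
for a, b in [("A", "B"), ("A", "C")]:
    hebbian_step(sparse_code(a), sparse_code(b))

union = predicted_union(sparse_code("A"))   # covers the bits of both B and C

Because each update is a local outer product, a transition is learned from a single observation. Note that this toy version keys predictions on the current symbol alone; PAM instead disambiguates repeated symbols through context-dependent minicolumn activity with lateral inhibition.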

Online Presentation (Coming Soon)


Poster


[Poster: poster.png]

Approach




[Figure: generation.png]


Sequence Generation. (Left): Offline generation by sampling a single possibility (i.e., an attractor point) from a union of predicted possibilities. (Right): Online generation by removing noise from an observation using prior beliefs about the observed state. The Markov blanket separates the agent's latent variables from the world's observable states.
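The two generation modes in the caption can be illustrated with a toy cleanup memory that stores two competing futures. The winner-take-all attractor_cleanup below is a stand-in assumption for PAM's attractor network, not its actual dynamics, and all names and sizes are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
N, K = 1024, 32                               # assumed code size and sparsity

def random_code():
    c = np.zeros(N, dtype=np.float32)
    c[rng.choice(N, size=K, replace=False)] = 1.0
    return c

stored = [random_code() for _ in range(2)]    # two valid futures of one context
union = np.clip(stored[0] + stored[1], 0, 1)  # union of predicted possibilities

def attractor_cleanup(state, patterns):
    """Settle onto the stored pattern with the largest overlap."""
    overlaps = np.array([p @ state for p in patterns])
    return patterns[int(overlaps.argmax())]

# Offline generation: jitter the union with noise, keep the top-K bits,
# then let the attractor settle on a single sampled possibility.
seed = np.zeros(N, dtype=np.float32)
seed[np.argsort(union * rng.random(N))[-K:]] = 1.0
sample = attractor_cleanup(seed, stored)      # equals stored[0] or stored[1]

# Online generation: a corrupted observation is pulled back to the
# nearest predicted state, i.e., denoised using prior beliefs.
obs = stored[1].copy()
flip = rng.choice(N, size=12, replace=False)
obs[flip] = 1.0 - obs[flip]                   # flip a few observed bits
denoised = attractor_cleanup(obs, stored)     # recovers stored[1]

Resampling with different noise draws yields different attractor points from the same union, which is the sense in which generation is multi-valued rather than an averaged prediction.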

BibTeX


If you find this work useful for your research, please cite:
@inproceedings{mounir2024predictive,
  title={Predictive Attractor Models},
  author={Mounir, Ramy and Sarkar, Sudeep},
  booktitle={Thirty-eighth Conference on Neural Information Processing Systems},
  year={2024}
}

Further information


If you like this project, please check out other related works from our group.

Acknowledgements


This research was supported by the US National Science Foundation Grants CNS 1513126 and IIS 1956050.

© This webpage was in part inspired by this template.