Mamba is a deep learning architecture focused on sequence modeling. It was developed by researchers from Carnegie Mellon University and Princeton University to address some limitations of transformer models, especially in processing long sequences. It is based on the Structured State Space sequence (S4) model.[2][3][4]
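As general background (not detailed in this excerpt), S4-style models process a sequence through a linear state-space recurrence. A minimal illustrative sketch of the discretized form, with \(\bar{A}\), \(\bar{B}\), and \(C\) standing for learned parameters, is:

\[
h_t = \bar{A}\, h_{t-1} + \bar{B}\, x_t, \qquad y_t = C\, h_t,
\]

where \(x_t\) is the input at step \(t\), \(h_t\) the hidden state, and \(y_t\) the output. This is a generic state-space formulation rather than a statement of Mamba's exact parameterization.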