Selective State Spaces — Topic Summaries

AI-powered summaries of 4 videos about Selective State Spaces.

Mamba vs. Transformers: The Future of LLMs? | Paper Overview & Google Colab Code & Mamba Chat

Venelin Valkov · 3 min read

Mamba’s core pitch is a way to make large language models handle much longer inputs without paying Transformers’ usual attention cost. Transformers...

Mamba Architecture · Selective State Spaces · Long-Context LLMs

Mamba sequence model - part 1

West Coast Machine Learning · 2 min read

Mamba’s core pitch is that sequence models can match Transformer-quality results on language and other modalities while scaling linearly with...

Selective State Spaces · Structured State Space Models · S4 and HiPPO

Mamba part 2 - Can it replace Transformers?

West Coast Machine Learning · 3 min read

Mamba’s core pitch is simple: it aims to match—and in some settings surpass—Transformer-style language modeling while scaling linearly with sequence...

Mamba vs Transformers · Selective State Spaces · S4 State Space Models

Mamba part 3 - Details of Mamba and Structured State Space

West Coast Machine Learning · 3 min read

Mamba’s core pitch is that sequence modeling can be made both fast and selective without attention’s quadratic cost. The approach builds on state...

State Space Models · S4 Discretization · Selective State Spaces
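The linear-time recurrence these summaries keep returning to can be sketched in a few lines. The toy Python below is an illustration only, not Mamba's actual implementation: it uses scalar parameters and a softplus-based, input-dependent step size (both simplifying assumptions) to show why an SSM processes a sequence in O(L) time instead of attention's O(L²).

```python
import math

def selective_ssm_scan(xs, a=-1.0, b=1.0, c=1.0):
    """Run a tiny input-selective SSM over a 1-D sequence xs.

    The hidden state h is updated once per token, so the whole
    sequence costs O(L) rather than attention's O(L^2). Real Mamba
    uses learned matrices and a hardware-aware parallel scan.
    """
    h, ys = 0.0, []
    for x in xs:
        # "Selectivity": the step size depends on the current input,
        # so the model can choose to remember or forget per token.
        delta = math.log1p(math.exp(x))  # softplus keeps delta > 0
        a_bar = math.exp(delta * a)      # discretized state transition
        b_bar = delta * b                # Euler-style discretized input gain
        h = a_bar * h + b_bar * x        # linear recurrence: O(1) per token
        ys.append(c * h)
    return ys
```

Because each step only reads the previous hidden state, inference memory stays constant in sequence length, which is the source of the long-context advantage the videos discuss.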