10:45 - 11:15
Over the past four years, several innovative variants of SSM-based deep learning architectures have been proposed, including S3, S4, H3, and S5, each bringing its own contributions and advances. Among these, Mamba, also known as S6, stands out as a cutting-edge SSM recently introduced by researchers Albert Gu and Tri Dao in their paper "Mamba: Linear-Time Sequence Modeling with Selective State Spaces." Mamba is designed specifically to handle sequences with highly complex structure.
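To give a flavor of what "selective state spaces" means, the sketch below walks through the linear recurrence such models are built on: a state-space system h'(t) = A h(t) + B x(t), y(t) = C h(t), discretized and scanned over the sequence, with the selection mechanism making B, C, and the step size depend on the current input. This is a minimal, didactic illustration under simplifying assumptions (one channel, diagonal A, a plain Python loop); the parameter names are illustrative and it is not the paper's hardware-aware parallel-scan implementation.

```python
import numpy as np

def selective_ssm_channel(x, A, W_B, W_C, W_dt):
    """Didactic scan over one channel of a selective SSM.

    Continuous system: h'(t) = A h(t) + B x(t),  y(t) = C h(t),
    discretized with a zero-order hold. Making B, C and the step
    size dt functions of the input x is the "selective" part.

    Shapes (all names are illustrative assumptions):
      x: (T,) scalar input sequence for this channel
      A: (n,) diagonal state matrix, negative entries for stability
      W_B, W_C: (n,) projections producing input-dependent B and C
      W_dt: scalar projection producing an input-dependent step size
    """
    T, n = x.shape[0], A.shape[0]
    h = np.zeros(n)                                 # hidden state
    y = np.zeros(T)
    for t in range(T):
        dt = np.logaddexp(0.0, W_dt * x[t])         # softplus -> positive step size
        B = W_B * x[t]                              # input-dependent input matrix
        C = W_C * x[t]                              # input-dependent output matrix
        A_bar = np.exp(dt * A)                      # ZOH discretization of diagonal A
        B_bar = (A_bar - 1.0) / A * B               # ZOH discretization of B
        h = A_bar * h + B_bar * x[t]                # linear recurrence in the state
        y[t] = C @ h                                # read out the output
    return y

# Toy usage: one channel, state size 4, random inputs and projections.
rng = np.random.default_rng(0)
A = -np.linspace(1.0, 4.0, 4)                       # negative diagonal for stability
out = selective_ssm_channel(rng.standard_normal(64), A,
                            rng.standard_normal(4), rng.standard_normal(4), 0.5)
```

Because the recurrence is linear in the state, it can be evaluated with a parallel scan rather than this sequential loop, which is what gives Mamba its linear-time, hardware-efficient behavior on long sequences.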
Mike Erlihson is a seasoned AI professional currently leading AI development at a stealth company, leveraging his PhD in Mathematics and extensive expertise in deep learning and data science. As a prolific scientific content creator and lecturer, he has reviewed over 250 deep learning papers and hosted more than 20 recorded podcasts in the field, building a following of over 50K on LinkedIn. In addition to his professional work, Mike is committed to education and knowledge-sharing in the AI community, making complex topics accessible through his various content platforms.