TII Launches Falcon Mamba 7B as Top Open-Source SSLM
The Technology Innovation Institute (TII), a key pillar of Abu Dhabi’s Advanced Technology Research Council (ATRC) renowned for its cutting-edge scientific research, has introduced its latest model: the Falcon Mamba 7B. The new model is the leading open-source State Space Language Model (SSLM) globally, a ranking independently verified by Hugging Face. Unlike previous Falcon models, which used transformer-based architectures, the Falcon Mamba 7B adopts the state-space approach, and its release underscores TII’s ongoing commitment to pioneering research and to delivering groundbreaking tools and products in open source.

H.E. Faisal Al Bannai, Secretary General of ATRC and Adviser to the UAE President for Strategic Research and Advanced Technology Affairs, commented: “The Falcon Mamba 7B signifies TII’s fourth consecutive top-ranked AI model, solidifying Abu Dhabi’s position as a global leader in AI research and development. This achievement is a testament to the UAE’s steadfast dedication to innovation.”

Against transformer-based models, the Falcon Mamba 7B outperforms Meta’s Llama 3.1 8B and Llama 3 8B as well as Mistral’s 7B on the new benchmarks introduced by Hugging Face. Among SSLMs, the Falcon Mamba 7B leads all other open-source models on the previous benchmarks and is poised to top Hugging Face’s forthcoming, more challenging benchmark leaderboard.

Dr. Najwa Aaraj, Chief Executive of TII, stated: “With the Falcon Mamba 7B, the Technology Innovation Institute continues to redefine the limits of technology through its Falcon series of AI models. This model exemplifies pioneering innovation and sets the stage for future AI breakthroughs that will enhance human capabilities and improve lives.”

State Space models excel at processing long, evolving contexts, such as entire books, without requiring additional memory that grows with the amount of information they take in. Transformer models, by contrast, are highly effective at recalling and reusing information from earlier in a sequence, which makes them well suited to tasks like content generation, but they demand substantial computational resources because they compare every word in the context against every other.
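For readers who want a concrete picture of that contrast, the toy Python sketch below sets a fixed-size state-space recurrence next to attention-style pairwise comparisons. The matrices, dimensions, and the plain linear recurrence are illustrative assumptions only and do not reflect Falcon Mamba’s actual architecture.

import numpy as np

# Illustrative sketch only: a toy linear state-space recurrence vs. toy causal attention.
d_state, d_model, seq_len = 16, 8, 1000
rng = np.random.default_rng(0)
A = rng.normal(scale=0.1, size=(d_state, d_state))   # state transition
B = rng.normal(scale=0.1, size=(d_state, d_model))   # input projection
C = rng.normal(scale=0.1, size=(d_model, d_state))   # output projection
tokens = rng.normal(size=(seq_len, d_model))         # toy token embeddings

# State-space recurrence: memory is a fixed-size state h, regardless of sequence length.
h = np.zeros(d_state)
ssm_outputs = []
for x in tokens:
    h = A @ h + B @ x            # update the constant-size hidden state
    ssm_outputs.append(C @ h)    # emit an output from the current state only

# Attention-style processing: every token is compared against every earlier token,
# so compute and memory grow with the square of the sequence length.
scores = tokens @ tokens.T                          # (seq_len x seq_len) pairwise scores
scores = scores - scores.max(axis=1, keepdims=True) # numerical stability
mask = np.tril(np.ones((seq_len, seq_len)))         # causal mask: attend only to the past
weights = np.exp(scores) * mask
weights /= weights.sum(axis=1, keepdims=True)
attn_outputs = weights @ tokens

The difference in the two loops is the point: the recurrence carries one small state forward, while attention materializes a full sequence-by-sequence comparison table.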

SSLMs have broad applications in fields such as estimation, forecasting, and control tasks. They also match transformer models in Natural Language Processing, with potential uses extending to machine translation, text summarization, computer vision, and audio processing.

Dr. Hakim Hacid, Acting Chief Researcher of TII’s AI Cross-Center Unit, remarked: “The launch of the Falcon Mamba 7B reflects the collaborative spirit at TII that fueled its development. This release marks a significant advancement, sparking new insights and driving the quest for intelligent systems forward. At TII, we are pushing the envelope in both SSLM and transformer models to further innovation in generative AI.”


The Falcon LLMs have achieved over 45 million downloads, highlighting the models’ exceptional success. The Falcon Mamba 7B will be available under the TII Falcon License 2.0, a permissive Apache 2.0-based license that includes an acceptable use policy promoting responsible AI practices. For more details about the new model, visit FalconLLM.TII.ae.
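As a practical note, loading the released model through the Hugging Face transformers library might look like the minimal sketch below. The model identifier "tiiuae/falcon-mamba-7b" and out-of-the-box library support are assumptions to verify on FalconLLM.TII.ae or the Hugging Face Hub.

# Minimal sketch, assuming the model is published on the Hugging Face Hub
# under an ID such as "tiiuae/falcon-mamba-7b" (verify the exact identifier).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The capital of the UAE is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))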
