AI21 Debuts Hybrid Structure Generative AI Model Jamba

Generative AI developer AI21 Labs has released a new large language model named Jamba with a sizeable 256,000-token context window. Jamba is also a unique hybrid of state space model (SSM) and transformer architectures, a first for a production-grade model, according to the startup.

[Image: AI21 Jamba hybrid architecture diagram]

Jamba specifically combines the Mamba SSM architecture with the kind of transformer infrastructure used in popular generative AI models like OpenAI’s GPT-3. The hybrid design aims to address the limitations of pure SSM and pure transformer approaches: a transformer’s memory and compute costs balloon as the context grows, while SSMs process long sequences efficiently but have historically trailed transformers on tasks that require attending to the full context. Jamba interleaves transformer, Mamba state space, and mixture-of-experts (MoE) layers in a single architecture optimized for memory usage, throughput, and overall performance. Compared to transformer models of the same size, AI21 says Jamba delivers three times higher throughput on the long-context tasks critical to enterprise use cases. You can see how the layers fit together in the diagram above.
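
For readers who want a concrete picture of how such a hybrid stack fits together, the sketch below interleaves attention blocks, a deliberately simplified state-space block standing in for Mamba, and a top-1 mixture-of-experts feed-forward layer. Every dimension, layer count, and routing choice here is an illustrative assumption, not AI21’s published Jamba configuration.

```python
# Illustrative sketch only: a simplified hybrid stack that interleaves
# attention, a toy state-space (SSM) block standing in for Mamba, and a
# mixture-of-experts (MoE) feed-forward layer. Dimensions, depths, and the
# SSM recurrence are placeholders, not AI21's actual Jamba implementation.
import torch
import torch.nn as nn


class ToySSMBlock(nn.Module):
    """Minimal diagonal state-space recurrence (a stand-in for Mamba)."""

    def __init__(self, dim: int, state_size: int = 16):
        super().__init__()
        self.in_proj = nn.Linear(dim, state_size)
        self.out_proj = nn.Linear(state_size, dim)
        # Learnable per-state decay, squashed into (0, 1) at run time.
        self.decay = nn.Parameter(torch.zeros(state_size))

    def forward(self, x):                         # x: (batch, seq, dim)
        u = self.in_proj(x)
        a = torch.sigmoid(self.decay)
        h = torch.zeros_like(u[:, 0])             # fixed-size running state
        outs = []
        for t in range(u.size(1)):                # linear-time scan over tokens
            h = a * h + u[:, t]
            outs.append(h)
        return self.out_proj(torch.stack(outs, dim=1))


class MoEFeedForward(nn.Module):
    """Route each token to its top-1 expert MLP (simplified MoE routing)."""

    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x):
        choice = self.router(x).argmax(dim=-1)    # (batch, seq) expert index per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = choice == e
            if mask.any():
                out[mask] = expert(x[mask])
        return out


class HybridBlockStack(nn.Module):
    """Alternate attention and SSM token mixers; each block ends in an MoE FFN."""

    def __init__(self, dim: int = 64, depth: int = 4, heads: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(depth):
            mixer = (nn.MultiheadAttention(dim, heads, batch_first=True)
                     if i % 2 == 0 else ToySSMBlock(dim))
            self.layers.append(nn.ModuleList(
                [nn.LayerNorm(dim), mixer, nn.LayerNorm(dim), MoEFeedForward(dim)]))

    def forward(self, x):
        for norm1, mixer, norm2, moe in self.layers:
            h = norm1(x)
            if isinstance(mixer, nn.MultiheadAttention):
                h, _ = mixer(h, h, h, need_weights=False)
            else:
                h = mixer(h)
            x = x + h                              # residual around the token mixer
            x = x + moe(norm2(x))                  # residual around the MoE layer
        return x


if __name__ == "__main__":
    tokens = torch.randn(2, 32, 64)                # (batch, seq_len, dim)
    print(HybridBlockStack()(tokens).shape)        # torch.Size([2, 32, 64])
```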

AI21 has released Jamba’s model weights as open source under an Apache 2.0 license to foster innovation. The company has also integrated Jamba with Nvidia’s AI catalog so it can be deployed as an inference microservice in enterprise settings. Despite its massive 256,000-token context window, AI21 says Jamba matches or outperforms other state-of-the-art models of comparable scale across a wide range of benchmarks.
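
Because the weights are openly licensed, they can be loaded with standard open-source tooling. The snippet below is a hedged sketch using Hugging Face Transformers; the model identifier "ai21labs/Jamba-v0.1", the trust_remote_code flag, and the generation settings are assumptions to verify against AI21’s release notes rather than details confirmed in this article.

```python
# Hedged sketch: loading the open Jamba weights with Hugging Face Transformers.
# The model ID and the flags below are assumptions based on how open-weight
# releases are typically published; check AI21's release notes for the exact
# identifier and hardware requirements.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # spread layers across available GPUs
    torch_dtype="auto",       # use the checkpoint's native precision
    trust_remote_code=True,   # hybrid architectures may ship custom modeling code
)

prompt = "Summarize the key ideas behind hybrid SSM-transformer models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```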

Scalability is another key advantage. A single high-powered GPU with about 80GB of memory running Jamba can handle a 140,000-token context window on its own, which makes deployment more accessible and lowers the barrier for AI experimentation across industries. That’s not the largest context window around, but it’s far from the smallest, either. For instance, Gemini 1.5 from Google boasts a context window of up to a million tokens, while Meta’s Llama 2 offers just 4,096 tokens. On the other hand, Llama 2’s smallest version can run on a GPU with as little as about 8GB of memory.
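
A rough back-of-the-envelope calculation helps explain why the hybrid design stretches a single GPU this far: an attention layer’s key-value cache grows linearly with context length, while an SSM layer carries only a fixed-size state. The numbers below (layer counts, head counts, precision) are illustrative assumptions, not Jamba’s published configuration.

```python
# Back-of-the-envelope sketch with illustrative numbers (not Jamba's published
# configuration): an attention layer's key-value cache grows linearly with the
# context length, so swapping most attention layers for constant-state SSM
# layers shrinks the long-context memory footprint dramatically.

def kv_cache_gib(attn_layers: int, context_len: int, kv_heads: int = 8,
                 head_dim: int = 128, bytes_per_value: int = 2) -> float:
    """Approximate KV-cache size in GiB for a single sequence."""
    # Two cached tensors (keys and values) per attention layer, each of shape
    # (context_len, kv_heads, head_dim), stored at bytes_per_value precision.
    total_bytes = 2 * attn_layers * context_len * kv_heads * head_dim * bytes_per_value
    return total_bytes / 2**30


context = 140_000  # the context length a single 80GB GPU reportedly handles
print(f"Pure transformer, 32 attention layers: {kv_cache_gib(32, context):.1f} GiB")
print(f"Hybrid stack, 4 attention layers:      {kv_cache_gib(4, context):.1f} GiB")
```

The cache is only one part of the memory budget (the model weights themselves still dominate), but it illustrates why trimming the number of attention layers matters so much at long context lengths.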

“We are excited to introduce Jamba, a groundbreaking hybrid architecture that combines the best of Mamba and Transformer technologies,” AI21 vice president of product Or Dagan said. “This allows Jamba to offer unprecedented efficiency, throughput, and scalability, empowering developers and businesses to deploy critical use cases in production at record speed in the most cost-effective way.”
