Open Source Generative AI Model Developer Together Raises $102.5M
Generative AI model startup Together has raised $102.5 million in a Series A funding round led by Kleiner Perkins and joined by Nvidia and others. Together has created a cloud platform for designing generative AI apps with open source models and tools, with plans to leverage the new money to add new options and widen access to its large language models (LLMs) and services.
Open Source Together
Together pitches itself as a decentralized cloud for generative AI where businesses can find and use open source AI models that suit their needs. They can use the cloud platform to fine-tune and run these models at what the company claims are significantly lower costs than Google Cloud, Microsoft Azure, or Amazon Web Services (AWS), while still being more secure and easier to customize.
Together is known for its RedPajama-V2 dataset, the largest open dataset for training LLMs, which has seen more than 1.2 million downloads in the past month. The RedPajama project began with a 1.2 trillion token dataset that enterprises can use to pre-train models available for licensing. The name and outfit reference the popular children’s book Llama Llama Red Pajama.

Meanwhile, FlashAttention v2, a joint effort by Together AI Chief Scientist Tri Dao and collaborators, is now used by major players in the industry, including OpenAI and Meta. The company’s work on inference, showcased in its Medusa and Flash-Decoding techniques, underpins what it describes as the fastest inference stack for transformer models. This stack is accessible through the Together Inference API, which offers quick access to more than 100 open models.
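For readers curious what "quick access" looks like in practice, here is a minimal sketch of calling the Together Inference API over HTTP. The endpoint URL, the TOGETHER_API_KEY environment variable, the model name, and the response layout below are assumptions for illustration rather than details from the article; Together's documentation is the authoritative reference for the current interface.

```python
# Minimal sketch of querying the Together Inference API.
# Assumed (not from the article): an OpenAI-style chat completions endpoint
# at api.together.xyz, an API key in the TOGETHER_API_KEY environment
# variable, and the example model name below.
import os
import requests

API_URL = "https://api.together.xyz/v1/chat/completions"  # assumed endpoint
API_KEY = os.environ["TOGETHER_API_KEY"]                   # assumed key variable

payload = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # hypothetical model choice
    "messages": [
        {"role": "user", "content": "Summarize the RedPajama dataset in one sentence."}
    ],
    "max_tokens": 200,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()

# Print the generated text, assuming an OpenAI-style response structure.
print(response.json()["choices"][0]["message"]["content"])
```

Swapping the model string is, in principle, all it takes to move between the open models the platform hosts, which is the appeal of an API surface like this.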
“At Together AI, we believe the future of AI is open source. So we are creating a comprehensive cloud platform to allow developers everywhere to build on open and custom AI models,” Together CEO Vipul Ved Prakash explained in a blog post. “Open source AI provides a strong foundation for these applications with increasingly powerful generative models being released almost weekly. The Together AI platform allows developers to quickly and easily integrate leading open source models or create their own models through pre-training or fine-tuning. Our customers choose to bring their generative AI workloads to Together AI owing to our industry leading performance and reliability; while still having comfort that they own the result of their investment in AI and are always free to run their model on any platform.”
In addition to research, Together AI has focused on building robust compute infrastructure. The company is expanding its AI infrastructure to 20 exaflops across multiple data centers in the US and EU. Its custom-designed cloud infrastructure, built on Nvidia GPUs and networking and hosted across partners like Crusoe Cloud and Vultr, is tailored for high-performance AI applications. The new funding, roughly five times the $20 million Together raised in May, gives the startup some runway to prove its vision of open source generative AI has a place in the market.