AI Chip Maker Graphcore Raises $222M

Artificial intelligence chip manufacturer Graphcore has closed a $222 million funding round led by Ontario Teachers’ Pension Plan Board. The new funding values the British company at $2.77 billion and sets it up to compete with Nvidia, Qualcomm, and other makers of AI chips.

More for Graphcore

Graphcore’s advanced chips, referred to as Intelligence Processing Units (IPUs), are used in various machine learning operations and hardware, including robots and driverless cars. The chips are designed to make AI more efficient and run faster, using multiple cores to improve how well a system handles natural language processing and related tasks. Microsoft Azure started using Graphcore chips for machine intelligence and NLP in 2018, and the company revealed its second generation of IPUs this past summer. The new money, which comes from new sources like Fidelity International and Schroders as well as existing investors Baillie Gifford and Draper Esprit, is earmarked to help the company expand its sales and develop new technologies. With the latest round, Graphcore has raised over $710 million in the four years since it was founded.

“Having the backing of such respected institutional investors says something very powerful about how the markets now view Graphcore,” Graphcore CEO and co-founder Nigel Toon said in a statement. “The confidence that they have in us comes from the competence we have demonstrated building our products and our business. We have created a technology that dramatically outperforms legacy processors such as GPUs, a powerful set of software tools that are tailored to the needs of AI developers, and a global sales operation that is bringing our products to market.”

Chipping In

The power offered by Graphcore’s chips gives the company a chance to compete with mainstays of the industry like Nvidia and Qualcomm. The industry is only growing more competitive when it comes to chips useful for AI, especially voice AI. Amazon recently started moving most of its Alexa operations onto the Inferentia computer chip produced in-house and away from the Nvidia chips Alexa has used until now. Amazon claims the move will increase speed and reduce Alexa’s energy demands, but the extra control the company will gain is probably no small inducement. Rumored for a long time, Inferentia chips are built specifically for the machine learning tasks that comprise much of Alexa’s work, like recognizing language and images and generating an appropriate response.

Google is also supposedly developing a processor, partnering with Samsung to design the chip, which may be named Whitechapel, for Pixel smartphones and Chromebook computers. This chip would replace the Qualcomm-built chips Google uses right now and make it easier for Google to improve Google Assistant’s functioning. Apple keeps its cards close to its chest, but could be engaged in a similar quest, hence acquisitions like that of edge-based artificial intelligence startup Xnor for a reported $200 million. Xnor’s low-power machine learning technology is designed to run continuously without the cloud, which would mean more efficient, quicker operations.
